CN115797244A - Image fusion method based on multi-scale direction co-occurrence filter and intensity transmission - Google Patents
- Publication number: CN115797244A (application number CN202310069737.8A)
- Authority: CN (China)
- Prior art keywords: image, scale, layer sub-image, infrared, base layer
- Legal status: Pending (the legal status is an assumption and is not a legal conclusion; Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed)
Abstract
An image fusion method based on a multi-scale direction co-occurrence filter and intensity transfer relates to the field of image fusion and comprises the following steps: acquiring and preprocessing source infrared and visible light image data; iteratively applying a co-occurrence filter to perform multi-scale decomposition of the preliminarily decomposed images; performing multi-directional decomposition of the multi-scale infrared base layer sub-images and multi-scale visible light base layer sub-images with the discrete tight support shear wave transform; fusing the detail layer sub-images; fusing the base layer sub-images; and reconstructing the fused multi-scale detail layer sub-images and the fused multi-scale multi-directional base layer sub-images in the different directions to obtain the fused image. The invention effectively retains the salient information of thermal radiation targets and more of the effective detail information while reducing edge blurring; it effectively distinguishes detail information at different scales and provides directional representation; and it markedly improves the overall contrast and sharpness of the fused image, improving the overall fusion effect.
Description
Technical Field
The invention relates to the technical field of image fusion, in particular to an image fusion method based on a multi-scale direction co-occurrence filter and intensity transmission.
Background
In recent years, demand for multi-sensor image fusion technology, and for infrared and visible light image fusion in particular, has grown rapidly in fields such as reconnaissance, driver assistance and geological monitoring. Image fusion aims to combine multiple groups of source images from multi-modal sensors into a single image, so as to display the image information comprehensively and enhance scene understanding. In infrared and visible light image fusion, an infrared sensor can effectively capture thermal radiation targets in weak-light or occluded environments, which a visible light sensor cannot; at the same time, the high resolution of the visible light sensor effectively complements the infrared sensor. Infrared and visible light image fusion can therefore exploit the complementary information of the two while fully removing redundant information. According to the relevant literature, current infrared and visible light image fusion methods divide into traditional methods and deep-learning-based methods. Fusion algorithms based on coupled pulse neural networks are representative of the deep-learning family, and researchers have transferred them to this task with good results; but their large demand for training data and their hard-to-interpret network structures currently restrict them to specific applications. Traditional algorithms avoid these problems. Among the traditional algorithms, the multi-scale transform fusion methods represented by the wavelet family and multi-scale geometric analysis perform best; because their fusion mechanism matches human vision and fully expresses the inherent characteristics of images, they are the most widely adopted. However, traditional algorithms do not discriminate among high-frequency detail information and still struggle to separate large-scale contours from texture in regions of small gradient change. For example, the document "Aishwarya, N., and C. Bennila Thangammal. 'Visible and infrared image fusion using DTCWT and adaptive combined clustered dictionary.' Infrared Physics & Technology 93 (2018): 300-309" fuses infrared and visible light images with a modified dual-tree complex wavelet transform (DTCWT), a typical representative algorithm of multi-scale geometric analysis; it addresses weak target saliency in the fusion result by decomposing the source images with the modified DTCWT and applying targeted fusion strategies to the decomposed high- and low-frequency sub-bands. However, because this method decomposes all the detail information of the image into high-frequency sub-bands, it burdens the direction representation and transmission of the subsequent directional filters; the final fusion result suffers both detail loss and edge blurring, and the overall fusion is slow.
Similarly, the Chinese patent "Infrared and visible light image fusion method under non-downsampling shear wave transform domain", publication number CN114549379A, designs a low-frequency fusion strategy that extracts the detail information remaining in the low-frequency sub-band and a high-frequency strategy with guided-filtering-optimized weighting, alleviating partial artifacts and improving detail transmission; but detail loss and edge blurring still occur, because the non-downsampled shear wave transform it adopts fails to effectively distinguish the different types of detail information. A multi-scale fusion method based on the co-occurrence filter effectively addresses the detail loss, edge blurring and slow fusion of such results, but it lacks directional description capability and cannot adjust the contrast of the final fused image, so the overall fusion effect is poor; the loss of directional detail characteristics and the difficulty of highlighting targets in complex scenes likewise remain unimproved and still need to be solved.
Disclosure of Invention
The invention provides an image fusion method based on a multi-scale direction co-occurrence filter and intensity transmission, aiming to solve the problems that the existing co-occurrence-filter-based multi-scale fusion method lacks directional description capability, cannot adjust the contrast of the final fused image, yields a poor overall fusion effect, loses directional detail characteristics and fails to make salient targets prominent after fusion.
The technical scheme adopted by the invention for solving the technical problem is as follows:
the invention discloses an image fusion method based on a multi-scale direction co-occurrence filter and intensity transmission, which comprises the following steps of:
s1, acquiring and preprocessing source infrared and visible light image data;
s2, filtering the preprocessed source infrared and visible light images by using a co-occurrence filter to realize preliminary decomposition; iteratively using a co-occurrence filter to carry out multi-scale decomposition on the preliminarily decomposed image to obtain a multi-scale infrared detail layer sub-image, a multi-scale infrared basic layer sub-image, a multi-scale visible light detail layer sub-image and a multi-scale visible light basic layer sub-image;
s3, carrying out multi-directional decomposition on the multi-scale infrared basic layer sub-image and the multi-scale visible light basic layer sub-image by adopting discrete tight support shear wave transformation to obtain a multi-scale multi-directional infrared basic layer sub-image and a multi-scale multi-directional visible light basic layer sub-image;
s4, constructing a fusion weight strategy by adopting a fusion strategy based on phase consistency and intensity measurement to perform detail layer sub-image fusion to obtain a fused multi-scale detail layer sub-image;
s5, designing a self-adaptive intensity transfer function as a fusion criterion to perform base layer sub-image fusion to obtain fused multi-scale and multi-direction base layer sub-images in different directions;
and S6, reconstructing the fused multi-scale detail layer sub-image and the multi-scale multi-direction basic layer sub-images in different directions to obtain a final fused image.
Further, the specific operation steps of step S1 are as follows:
s1.1, respectively acquiring a source infrared image and a source visible light image from an infrared camera and a visible light camera;
s1.2, respectively carrying out image denoising operation and image enhancement operation on the source infrared image and the source visible light image.
Further, the image denoising operation adopts a denoising algorithm based on filtering; the image enhancement operation adopts an image enhancement algorithm based on scene fitting.
Further, the specific operation steps of step S2 are as follows:
s2.1, performing primary decomposition;
distributing pixel weights by utilizing co-occurrence information in the preprocessed source infrared and visible light images, and respectively adopting a co-occurrence filter to carry out filtering operation on the preprocessed source infrared and visible light images to realize preliminary decomposition to obtain a 0-level infrared basic layer sub-image and a 0-level visible light basic layer sub-image; subtracting the preprocessed source infrared image from the 0-level infrared basic layer sub-image to obtain a 0-level infrared detail layer sub-image, and simultaneously subtracting the preprocessed source visible light image from the 0-level visible light basic layer sub-image to obtain a 0-level visible light detail layer sub-image;
s2.2, carrying out multi-scale decomposition;
respectively and iteratively using the co-occurrence filtering and level-subtraction operators to carry out multi-scale decomposition of the 0-level infrared base layer sub-image and the 0-level visible light base layer sub-image, obtaining the multi-scale infrared base layer sub-images B_IR^i, multi-scale visible light base layer sub-images B_VI^i, multi-scale infrared detail layer sub-images D_IR^i and multi-scale visible light detail layer sub-images D_VI^i, i = 1 … k, where k denotes the number of decomposition levels.
Further, the specific operation steps of step S3 are as follows:
respectively utilizing the constructed horizontal and vertical discrete shear wave transforms to decompose the multi-scale infrared base layer sub-images B_IR^i and the multi-scale visible light base layer sub-images B_VI^i in K directions horizontally and vertically, obtaining the K horizontal-direction multi-scale multi-directional infrared base layer sub-images, K vertical-direction multi-scale multi-directional infrared base layer sub-images, K horizontal-direction multi-scale multi-directional visible light base layer sub-images and K vertical-direction multi-scale multi-directional visible light base layer sub-images, alongside the K multi-scale infrared detail layer sub-images D_IR^i and K multi-scale visible light detail layer sub-images D_VI^i.
Further, the calculation formula of the horizontal discrete shear wave transformation operation is shown in formula (6), and the calculation formula of the vertical discrete shear wave transformation operation is shown in formula (7);
where k denotes the decomposition level; n_1 and n_2 denote two translation factors, with (n_1, n_2) ranging over the square of the integer field Z²; s denotes the direction factor, s = 1, 0, -1, where s = 1 corresponds to the 45° direction, s = 0 to the 90° direction and s = -1 to the 180° direction; ⌊·⌋ denotes the round-down operation; the two transform operators denote the horizontal and vertical discrete shear wave transforms respectively; the remaining symbols denote, at the k-th decomposition level, the translation transform applied to the infrared base layer sub-image, the equivalent transform of the infrared base layer sub-image under the horizontal discrete shear wave transform, the translation transform applied to the visible light base layer sub-image, and the equivalent transforms of the visible light base layer sub-image under the horizontal and vertical discrete shear wave transforms.
Further, the specific operation steps of step S4 are as follows:
s4.1, applying a phase consistency operation to the K multi-scale infrared detail layer sub-images D_IR^i obtained in step S2 and the K multi-scale visible light detail layer sub-images D_VI^i to obtain the phase consistency measure operator P_l(·); its calculation formula is as follows;
where θ_k denotes the direction angle at decomposition level k; the amplitude term denotes the amplitude of the n-th Fourier series component at direction angle θ_k; [e_{n,θ_k}, o_{n,θ_k}] is the result of convolving the image with a log-Gabor filter at position (x, y); Σ_k denotes summation over the variable k and Σ_n summation over the variable n; and ε is a small positive constant;
s4.2, designing an intensity measurement strategy by a windowing method, wherein the intensity measurement strategy is shown as formula (10);
where N_l(·) denotes the intensity measurement operator at decomposition level l; I_l(x, y) denotes the multi-scale detail layer sub-image pixel value at position (x, y) under decomposition level l; I_l(x_0, y_0) denotes the multi-scale detail layer sub-image pixel value at position (x_0, y_0) under decomposition level l; Ω denotes the local windowed area centered at position (x_0, y_0); and (x_0, y_0) denotes the center pixel within the windowed area;
s4.3, a fusion weight strategy is constructed by combining a phase consistency method and an intensity measurement strategy, multi-scale detail layer sub-image fusion is realized, and a series of fused multi-scale detail layer sub-images are obtained; the fusion weight strategy is shown as formula (11);
where D_F(x, y) denotes the fused multi-scale detail layer sub-image, D_IR(x, y) the infrared detail layer sub-image, D_VI(x, y) the visible light detail layer sub-image, and P_l(·) the phase consistency measure operator at decomposition level l.
Further, the specific operation steps of step S5 are as follows:
s5.1, calculating local energy of the image pixel by pixel according to the K multi-scale multi-direction infrared base layer sub-images in the horizontal direction and the K multi-scale multi-direction infrared base layer sub-images in the vertical direction obtained in the step S3;
where E(x, y) denotes the local energy of the image; W_le(i, j) denotes a local energy window of size i × j, with 1 ≤ i ≤ 3 and 1 ≤ j ≤ 3; and B_IR(x + i, y + j) denotes the pixel-by-pixel traversal of the image;
s5.2, carrying out normalization operation on the local energy of the image to obtain a feature distribution expression operator P of the subimage of the basic layer;
where E_max(x, y) denotes the local energy maximum computed over the local energy window, and E(x + i, y + j) denotes traversal of the local energy values within the local energy window;
s5.3, introducing a proportion parameter into the base layer sub-image feature distribution expression operator P, constructing an adaptive intensity transfer function, and fusing the multi-scale multi-direction base layer sub-images in the horizontal direction and the vertical direction respectively to obtain a fused multi-scale multi-direction infrared base layer sub-image in the horizontal direction, a multi-scale multi-direction visible light base layer sub-image in the horizontal direction, a multi-scale multi-direction infrared base layer sub-image in the vertical direction and a multi-scale multi-direction visible light base layer sub-image in the vertical direction; the calculation formula of the fusion operation is shown as formula (14);
where the left-hand side denotes the multi-scale multi-directional base layer sub-image fusion weight; γ denotes the introduced proportion parameter; arctan(·) denotes the arctangent operator; and P denotes the base layer sub-image feature distribution representation operator;
s5.4 the expression of the final fused multi-scale multi-direction base layer subimages in different directions is as follows:
where the fused multi-scale multi-directional base layer sub-images in the different directions are indexed by a: when a = 0, the expression gives the fused horizontal-direction multi-scale multi-directional base layer sub-image, composed of the fused horizontal-direction multi-scale multi-directional infrared and visible light base layer sub-images; when a = 1, it gives the fused vertical-direction multi-scale multi-directional base layer sub-image, composed of the fused vertical-direction multi-scale multi-directional infrared and visible light base layer sub-images; K denotes the direction factor.
Further, the specific operation steps of step S6 are as follows:
reconstructing the fused multi-scale multi-directional base layer sub-images in the different directions using the inverse of the discrete tight support shear wave transform to obtain the fused base layer image B_F, then adding the fused base layer image B_F to the fused multi-scale detail layer sub-images to obtain the final fused image F.
Further, the calculation formula of the fused base layer image B_F is shown in formula (16), and the calculation formula of the fused image F is shown in formula (17);
where the first operator denotes the inverse of the discrete tight support shear wave transform; D_F^i denotes the fused i-th level multi-scale detail layer sub-image; and the remaining terms denote the fused horizontal-direction and the fused vertical-direction multi-scale multi-directional base layer sub-images.
The invention has the beneficial effects that:
1) The invention effectively retains the salient information of thermal radiation targets and more of the effective detail information while markedly reducing edge blurring; it effectively balances the discrimination of detail information at different scales, intensity adjustment and the saliency of thermal radiation targets;
2) In the decomposition stage, the invention designs the multi-scale direction co-occurrence filter by jointly applying the co-occurrence filter and the discrete tight support shear wave transform, fully combining the advantages of the co-occurrence filter and the directional shear wave transform; it effectively distinguishes detail information at different scales while providing directional representation, transmitting more effective detail information while reducing edge blurring;
3) By designing the adaptive intensity transfer function as the base layer fusion strategy, the invention markedly improves the overall contrast and sharpness of the fused image, realizes effective fusion of the infrared and visible light images, enhances image quality and improves the overall fusion effect;
4) The fusion results of the invention excel in retaining detail information and adjusting overall contrast, providing a high-quality enhanced fused image for subsequent high-level visual image processing systems;
5) The invention can also accelerate the processing of the whole high-level visual image processing system.
Drawings
FIG. 1 is a flow chart of an image fusion method based on multi-scale direction co-occurrence filter and intensity transfer according to the present invention;
FIG. 2 is a schematic diagram of an embodiment of the image fusion method based on multi-scale direction co-occurrence filter and intensity transfer according to the present invention;
FIG. 3 shows the result of fusing a pair of source infrared and visible light image sequences describing a real urban highway scene in the RoadScene public dataset using the image fusion method based on the multi-scale direction co-occurrence filter and intensity transfer of the present invention;
FIG. 4 shows the result of fusing a pair of source infrared and visible light image sequences describing a real road scene in rain in the RoadScene public dataset using the image fusion method based on the multi-scale direction co-occurrence filter and intensity transfer of the present invention;
FIG. 5 shows the result of fusing a pair of source infrared and visible light image sequences describing a real field defense scene in the TNO public dataset using the image fusion method based on the multi-scale direction co-occurrence filter and intensity transfer of the present invention.
Detailed Description
The present invention will be described in further detail below with reference to the accompanying drawings.
The invention relates to an image fusion method based on a multi-scale direction co-occurrence filter and intensity transmission, which comprises the following steps of: acquiring and preprocessing source infrared and visible light image data → performing filtering operation on the preprocessed source infrared and visible light images by adopting a co-occurrence filter to realize preliminary decomposition; iteratively performing multi-scale decomposition on the preliminarily decomposed image by using a co-occurrence filter to obtain a multi-scale infrared detail layer sub-image, a multi-scale infrared base layer sub-image, a multi-scale visible light detail layer sub-image and a multi-scale visible light base layer sub-image → performing multi-direction decomposition on the multi-scale infrared base layer sub-image and the multi-scale visible light base layer sub-image by using discrete tight support shear wave transformation to obtain a multi-scale multi-direction infrared base layer sub-image and a multi-scale multi-direction visible light base layer sub-image → constructing a fusion weight strategy based on phase consistency and intensity measurement to perform detail layer sub-image fusion to obtain a fused multi-scale detail layer sub-image → designing an adaptive intensity transfer function as a fusion criterion to perform base layer sub-image fusion to obtain a fused multi-scale multi-direction base layer sub-image in different directions → reconstructing the fused multi-scale detail layer sub-image and the multi-scale base layer sub-image in different directions to obtain a final fused image.
The invention discloses an image fusion method based on a multi-scale direction co-occurrence filter and intensity transmission, which comprises the following specific operation processes:
s1, acquiring and preprocessing source infrared and visible light image data; the specific operation steps are as follows:
s1.1, acquiring a source initial image, specifically acquiring a source infrared image from an infrared camera, and acquiring a source visible light image from a visible light camera;
s1.2, respectively performing the preprocessing operations on the acquired source infrared image and source visible light image; the preprocessing comprises an image denoising operation and an image enhancement operation; the denoising may specifically adopt a filter-based denoising algorithm, and the enhancement a scene-fitting-based image enhancement algorithm; denoising and enhancing the source initial images removes part of their noise and preliminarily improves their contrast and sharpness; specifically, the preprocessing of the source initial images, including the preliminary denoising and enhancement, can be realized through formula (1);
In formula (1), I denotes the source infrared and visible light image data matrix after the preprocessing operation, T(·) denotes the preprocessing operation, and I′ denotes the source infrared and visible light images.
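The patent leaves the concrete denoiser and enhancer open (any filter-based denoising and scene-fitting enhancement qualify). A minimal Python sketch of T(·) in formula (1), assuming a bilateral filter for the denoising step and CLAHE as a stand-in for the scene-fitting enhancement:

```python
# Minimal sketch of the preprocessing I = T(I') in formula (1).
# Assumes an 8-bit grayscale input; the bilateral filter and CLAHE are
# illustrative stand-ins, not the patent's mandated algorithms.
import cv2
import numpy as np

def preprocess(img: np.ndarray) -> np.ndarray:
    denoised = cv2.bilateralFilter(img, d=5, sigmaColor=25, sigmaSpace=5)  # filter-based denoising
    clahe = cv2.createCLAHE(clipLimit=2.0, tileGridSize=(8, 8))            # contrast enhancement
    return clahe.apply(denoised)
```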
S2, filtering the preprocessed source infrared and visible light images by using a co-occurrence filter to realize preliminary decomposition; iteratively using a co-occurrence filter to carry out multi-scale decomposition on the preliminarily decomposed image to obtain a multi-scale infrared detail layer sub-image, a multi-scale infrared basic layer sub-image, a multi-scale visible light detail layer sub-image and a multi-scale visible light basic layer sub-image; the specific operation steps are as follows:
s2.1, performing primary decomposition;
distributing pixel weights using the co-occurrence information in the preprocessed source infrared and visible light images, and filtering the preprocessed source infrared and visible light images respectively with a co-occurrence filter to realize the preliminary decomposition, obtaining the 0-level infrared base layer sub-image B_IR^0 and the 0-level visible light base layer sub-image B_VI^0; subtracting the 0-level infrared base layer sub-image B_IR^0 from the preprocessed source infrared image I_IR gives the 0-level infrared detail layer sub-image D_IR^0, as shown in formula (2), and likewise subtracting the 0-level visible light base layer sub-image B_VI^0 from the preprocessed source visible light image I_VI gives the 0-level visible light detail layer sub-image D_VI^0, as shown in formula (3);
In formulas (2) and (3), CoF(·) denotes filtering an image with the co-occurrence filter; B_IR^0 denotes the 0-level infrared base layer sub-image; B_VI^0 the 0-level visible light base layer sub-image; D_IR^0 the 0-level infrared detail layer sub-image; D_VI^0 the 0-level visible light detail layer sub-image; I_IR the preprocessed source infrared image; and I_VI the preprocessed source visible light image;
s2.2, carrying out multi-scale decomposition;
then iteratively applying the co-occurrence filtering and level-subtraction operators to the obtained 0-level infrared base layer sub-image B_IR^0 and 0-level visible light base layer sub-image B_VI^0 respectively to carry out the multi-scale decomposition, finally obtaining the multi-scale infrared base layer sub-images B_IR^i, multi-scale visible light base layer sub-images B_VI^i, multi-scale infrared detail layer sub-images D_IR^i (i = 1 … k) and multi-scale visible light detail layer sub-images D_VI^i (i = 1 … k), where k denotes the number of decomposition levels;
Taking a k-level decomposition as an example: apply the co-occurrence filtering operation to the (i-1)-th level infrared base layer sub-image B_IR^{i-1} to obtain the i-th level infrared base layer sub-image B_IR^i, and at the same time apply it to the (i-1)-th level visible light base layer sub-image B_VI^{i-1} to obtain the i-th level visible light base layer sub-image B_VI^i; then subtract the i-th level infrared base layer sub-image B_IR^i from the (i-1)-th level infrared base layer sub-image B_IR^{i-1} to obtain the i-th level infrared detail layer sub-image D_IR^i, and likewise subtract B_VI^i from B_VI^{i-1} to obtain the i-th level visible light detail layer sub-image D_VI^i; repeat these steps k times to finally obtain the multi-scale base and detail layer sub-images above; the base layer recursion is shown in formula (4), and the detail layer sub-images D_IR^i and D_VI^i are calculated as in formula (5);
In formulas (4) and (5), B_IR^i denotes the i-th level infrared base layer sub-image, B_IR^{i-1} the (i-1)-th level infrared base layer sub-image, B_VI^i the i-th level visible light base layer sub-image, B_VI^{i-1} the (i-1)-th level visible light base layer sub-image, D_IR^i the i-th level infrared detail layer sub-image and D_VI^i the i-th level visible light detail layer sub-image.
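To make the recursion in formulas (2)-(5) concrete, the following sketch runs the base/detail iteration in Python. cv2.bilateralFilter stands in for the co-occurrence filter CoF(·); the actual filter weights pixels by gray-level co-occurrence statistics (Jevnisek and Avidan, "Co-occurrence Filter", CVPR 2017), which this sketch does not reproduce.

```python
# Iterative multi-scale decomposition per formulas (2)-(5):
#   B^0 = CoF(I),        D^0 = I - B^0
#   B^i = CoF(B^(i-1)),  D^i = B^(i-1) - B^i,  i = 1..k
import cv2
import numpy as np

def cof(img: np.ndarray) -> np.ndarray:
    # Stand-in for the co-occurrence filter (edge-aware smoothing).
    return cv2.bilateralFilter(img, d=9, sigmaColor=50, sigmaSpace=9)

def multiscale_decompose(img: np.ndarray, k: int):
    """Return base layers [B^0..B^k] and detail layers [D^0..D^k]."""
    prev = img.astype(np.float32)
    bases, details = [], []
    for _ in range(k + 1):
        b = cof(prev)
        bases.append(b)
        details.append(prev - b)   # detail = previous base minus new base
        prev = b
    return bases, details
```

Running this once on the preprocessed infrared image and once on the visible light image yields the four sub-image families used in steps S3-S5.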
S3, further performing multi-directional decomposition of the multi-scale infrared base layer sub-images B_IR^i and the multi-scale visible light base layer sub-images B_VI^i with the discrete tight support shear wave transform to obtain the multi-scale multi-directional infrared base layer sub-images and multi-scale multi-directional visible light base layer sub-images; to remedy the lack of directionality of the multi-scale decomposition in step S2, the discrete tight support shear wave transform decomposes the multi-scale base layer sub-images in multiple directions and extracts the large-scale contour edges of the image; the specific operation steps are as follows:
s3.1, applying the constructed horizontal and vertical discrete shear wave transforms respectively to the multi-scale infrared base layer sub-images B_IR^i and the multi-scale visible light base layer sub-images B_VI^i obtained in step S2, decomposing each in K directions (K denotes the direction factor) horizontally and vertically; the calculation formula of the horizontal discrete shear wave transform operation is shown in formula (6), and that of the vertical discrete shear wave transform operation in formula (7);
In formulas (6) and (7), k denotes the decomposition level; n_1 and n_2 denote two translation factors, with (n_1, n_2) ranging over the square of the integer field Z²; s denotes the direction factor, s = 1, 0, -1, where s = 1 corresponds to the 45° direction, s = 0 to the 90° direction and s = -1 to the 180° direction; ⌊·⌋ denotes the round-down operation; the two transform operators denote the horizontal and vertical discrete shear wave transforms respectively; the remaining symbols denote, at the k-th decomposition level, the translation transform applied to the infrared base layer sub-image, the equivalent transform of the infrared base layer sub-image under the horizontal discrete shear wave transform, the translation transform applied to the visible light base layer sub-image, and the equivalent transforms of the visible light base layer sub-image under the horizontal and vertical discrete shear wave transforms;
s3.2, after the horizontal and vertical discrete shear wave transforms, the multi-scale infrared base layer sub-images B_IR^i and the multi-scale visible light base layer sub-images B_VI^i are each divided into 2K directions, namely K horizontal and K vertical directions; the multi-scale multi-directional base layer sub-image data set obtained after the multi-directional decomposition comprises the multi-scale multi-directional infrared base layer sub-images obtained from the horizontal and vertical discrete shear wave transforms and the multi-scale multi-directional visible light base layer sub-images obtained from the horizontal and vertical discrete shear wave transforms;
s3.3, taking three directions each horizontally and vertically, i.e. the direction factor K = 3 with s = 1, 0, -1; after the multi-directional decomposition, the K horizontal-direction multi-scale multi-directional infrared base layer sub-images, the K vertical-direction multi-scale multi-directional infrared base layer sub-images, the K horizontal-direction multi-scale multi-directional visible light base layer sub-images and the K vertical-direction multi-scale multi-directional visible light base layer sub-images are obtained, alongside the K multi-scale infrared detail layer sub-images D_IR^i and the K multi-scale visible light detail layer sub-images D_VI^i.
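The following toy sketch illustrates the shear-then-filter idea behind formulas (6)-(7): each base layer is sheared by the integer direction factor s ∈ {1, 0, -1} along one axis and then high-pass filtered along that axis, giving 2K = 6 directional components. It only illustrates the mechanism; the patent's discrete tight support shear wave transform uses the specific translation factors n_1, n_2 and compactly supported generators, which this sketch does not implement.

```python
# Toy directional decomposition: shear by s, then 1-D high-pass.
import numpy as np

HP = np.array([-0.5, 1.0, -0.5], dtype=np.float32)  # simple 1-D high-pass kernel

def shear(img: np.ndarray, s: int, horizontal: bool) -> np.ndarray:
    out = np.empty_like(img)
    n = img.shape[0] if horizontal else img.shape[1]
    for idx in range(n):
        if horizontal:
            out[idx, :] = np.roll(img[idx, :], s * idx)   # shear rows
        else:
            out[:, idx] = np.roll(img[:, idx], s * idx)   # shear columns
    return out

def directional_decompose(base: np.ndarray) -> dict:
    comps = {}
    for s in (1, 0, -1):                 # direction factor, K = 3
        for horiz in (True, False):      # horizontal / vertical cones
            sheared = shear(base.astype(np.float32), s, horiz)
            axis = 1 if horiz else 0
            comps[(s, "h" if horiz else "v")] = np.apply_along_axis(
                lambda r: np.convolve(r, HP, mode="same"), axis, sheared)
    return comps
```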
S4, fusing the detail layer sub-images;
applying the fusion strategy based on phase consistency and intensity measurement to the K multi-scale infrared detail layer sub-images D_IR^i and the K multi-scale visible light detail layer sub-images D_VI^i to obtain a series of fused multi-scale detail layer sub-images, so as to transfer more texture details; the specific operation steps are as follows:
s4.1, first, applying the phase consistency operation commonly used in engineering to the K multi-scale infrared detail layer sub-images D_IR^i and the K multi-scale visible light detail layer sub-images D_VI^i obtained above, yielding the phase consistency measure operator used to design the subsequent fusion strategy; the calculation formula of the phase consistency measure operator P_l(·) is given in formulas (8) and (9);
In formulas (8) and (9), θ_k denotes the direction angle at decomposition level k; the amplitude term denotes the amplitude of the n-th Fourier series component at direction angle θ_k; [e_{n,θ_k}, o_{n,θ_k}] is the result of convolving the image (the multi-scale infrared or visible light detail layer sub-image) with a log-Gabor filter at position (x, y); Σ_k denotes summation over the variable k and Σ_n summation over the variable n; and ε is a small positive number preventing the denominator from being 0;
s4.2, designing an intensity measurement strategy by a windowing method, as shown in formula (10);
In formula (10), N_l(·) denotes the intensity measurement operator at decomposition level l; I_l(x, y) denotes the multi-scale detail layer sub-image pixel value at position (x, y) under decomposition level l; I_l(x_0, y_0) denotes the multi-scale detail layer sub-image pixel value at position (x_0, y_0) under decomposition level l; Ω denotes the local windowed area centered at position (x_0, y_0); and (x_0, y_0) denotes the center pixel within the windowed area;
s4.3, finally, a fusion weight strategy is constructed by combining a phase consistency method and an intensity measurement strategy, so that the fusion of the sub-images of the multi-scale detail layer is realized, a series of fused sub-images of the multi-scale detail layer are obtained, and more texture details are transmitted; the fusion weight strategy of the multi-scale detail layer sub-image is shown as a formula (11);
In formula (11), D_F(x, y) denotes the fused multi-scale detail layer sub-image, D_IR(x, y) the infrared detail layer sub-image, D_VI(x, y) the visible light detail layer sub-image, and P_l(·) the phase consistency measure operator at decomposition level l.
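A sketch of the detail-layer rule built from formulas (10)-(11): each pixel keeps the source whose phase-consistency-weighted intensity activity is larger. The windowed mean-absolute-deviation measure and the choose-max form are plausible readings of formulas (10)-(11), not verbatim reproductions; pc_ir and pc_vi are phase consistency (phase congruency) maps computed by any standard implementation, e.g. Kovesi's.

```python
# Detail-layer fusion: activity = phase consistency x windowed intensity measure.
import numpy as np
from scipy.ndimage import uniform_filter

def intensity_measure(detail: np.ndarray, win: int = 3) -> np.ndarray:
    """Windowed activity: local mean absolute deviation (one reading of formula (10))."""
    mean = uniform_filter(detail, size=win)
    return uniform_filter(np.abs(detail - mean), size=win)

def fuse_detail(d_ir, d_vi, pc_ir, pc_vi) -> np.ndarray:
    a_ir = pc_ir * intensity_measure(d_ir)
    a_vi = pc_vi * intensity_measure(d_vi)
    return np.where(a_ir >= a_vi, d_ir, d_vi)   # choose-max, per formula (11)
```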
S5, fusing the base layer sub-images;
designing an adaptive intensity transfer function as the fusion criterion for the multi-scale multi-directional base layer sub-images; the pixel intensity distribution information of the infrared image base layer is used to adjust the intensity distribution of the multi-scale multi-directional base layer sub-images, highlighting the salient information of thermal radiation targets; the fusion of the multi-scale multi-directional base layer sub-images in the different directions is completed with the intensity distribution of the infrared base layer sub-images guiding the final fusion result, so that the fused base layer sub-images have high-contrast characteristics; the specific operation steps are as follows:
s5.1, first, for the K horizontal-direction multi-scale multi-directional infrared base layer sub-images and the K vertical-direction multi-scale multi-directional infrared base layer sub-images obtained in step S3, computing the local energy of the image pixel by pixel, as in formula (12);
In formula (12), E(x, y) denotes the local energy of the image; W_le(i, j) denotes a local energy window of size i × j, with 1 ≤ i ≤ 3 and 1 ≤ j ≤ 3; and B_IR(x + i, y + j) denotes the pixel-by-pixel traversal of the image;
s5.2, carrying out normalization operation on the local energy obtained by calculation to obtain a base layer sub-image feature distribution expression operator P;
In formula (13), E_max(x, y) denotes the local energy maximum computed over the local energy window, and E(x + i, y + j) denotes traversal of the local energy values within the local energy window;
s5.3, introducing a proportion parameter into the base layer sub-image feature distribution expression operator P, constructing an adaptive intensity transfer function to fuse the multi-scale multi-direction base layer sub-images, adjusting the intensity distribution of the multi-scale multi-direction base layer sub-images, namely respectively fusing the multi-scale multi-direction base layer sub-images in the horizontal direction and the vertical direction to obtain a fused multi-scale multi-direction infrared base layer sub-image in the horizontal direction, a fused multi-scale multi-direction visible light base layer sub-image in the horizontal direction, a fused multi-scale multi-direction infrared base layer sub-image in the vertical direction and a fused multi-scale multi-direction visible light base layer sub-image in the vertical direction; the specific calculation formula of the fusion operation is shown as formula (14);
In formula (14), the left-hand side denotes the multi-scale multi-directional base layer sub-image fusion weight; γ denotes the introduced proportion parameter, used to better regulate the fusion weight of the multi-scale multi-directional base layer sub-images; arctan(·) denotes the arctangent operator; and P denotes the base layer sub-image feature distribution representation operator;
s5.4, the expression of the finally fused multi-scale and multi-direction base layer sub-images in different directions is as follows:
In formula (15), the fused multi-scale multi-directional base layer sub-images in the different directions are indexed by a: when a = 0, the expression gives the fused horizontal-direction multi-scale multi-directional base layer sub-image, composed of the fused horizontal-direction multi-scale multi-directional infrared base layer sub-image and the fused horizontal-direction multi-scale multi-directional visible light base layer sub-image; when a = 1, it gives the fused vertical-direction multi-scale multi-directional base layer sub-image, composed of the fused vertical-direction multi-scale multi-directional infrared base layer sub-image and the fused vertical-direction multi-scale multi-directional visible light base layer sub-image; K denotes the direction factor.
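A sketch of the base-layer chain in formulas (12)-(15): infrared local energy, normalization to the feature distribution operator P, an arctan-based transfer weight, and a weighted combination. The (2/π)·arctan(γ·P) weight and the weighted sum are plausible readings of the adaptive intensity transfer function, not the patent's exact expressions.

```python
# Base-layer fusion via an arctan intensity-transfer weight.
import numpy as np
from scipy.ndimage import uniform_filter, maximum_filter

def fuse_base(b_ir: np.ndarray, b_vi: np.ndarray, gamma: float = 5.0) -> np.ndarray:
    energy = uniform_filter(b_ir.astype(np.float64) ** 2, size=3)  # local energy, 3x3 window (formula (12))
    e_max = maximum_filter(energy, size=3)
    p = energy / (e_max + 1e-12)                                   # normalized feature operator P (formula (13))
    w = (2.0 / np.pi) * np.arctan(gamma * p)                       # adaptive transfer weight (formula (14))
    return w * b_ir + (1.0 - w) * b_vi                             # fused base sub-image (formula (15))
```

Larger γ pushes the weight toward 1 wherever the infrared local energy is high, which is how the infrared intensity distribution guides the fused result.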
S6, reconstructing the fused multi-scale detail layer sub-images (the result of step S4) and the fused multi-scale multi-directional base layer sub-images in the different directions (the result of step S5) to obtain the final fused image F; the specific operation steps are as follows:
Reconstruct the fused multi-scale multi-directional base layer sub-images in the different directions using the inverse of the discrete tight support shear wave transform to obtain the fused base layer image B_F; finally, add the fused base layer image B_F to the fused multi-scale detail layer sub-images to obtain the final fused image F, with the specific calculation formulas shown below;
In formulas (16) and (17), the first operator denotes the inverse of the discrete tight support shear wave transform; D_F^i denotes the fused i-th level multi-scale detail layer sub-image; and the remaining terms denote the fused horizontal-direction and the fused vertical-direction multi-scale multi-directional base layer sub-images.
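A sketch of the reconstruction in formulas (16)-(17), matched to the toy directional decomposition sketched after step S3: the fused directional components are unsheared and averaged as a crude stand-in for the inverse transform, and the fused detail layers are then added. A real implementation would apply the exact inverse of the discrete tight support shear wave transform instead.

```python
# Reconstruction: F = B_F + sum of fused detail layers (formulas (16)-(17)).
import numpy as np

def unshear(img: np.ndarray, s: int, horizontal: bool) -> np.ndarray:
    out = np.empty_like(img)
    n = img.shape[0] if horizontal else img.shape[1]
    for idx in range(n):
        if horizontal:
            out[idx, :] = np.roll(img[idx, :], -s * idx)
        else:
            out[:, idx] = np.roll(img[:, idx], -s * idx)
    return out

def reconstruct(fused_dirs: dict, fused_details: list) -> np.ndarray:
    """fused_dirs: {(s, 'h'|'v'): fused directional base sub-image}; fused_details: [D_F^i]."""
    unsheared = [unshear(c, s, cone == "h") for (s, cone), c in fused_dirs.items()]
    b_f = np.mean(unsheared, axis=0)             # crude inverse transform -> B_F
    return b_f + np.sum(fused_details, axis=0)   # final fused image F
```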
In order to verify the effectiveness of the image fusion method based on the multi-scale direction co-occurrence filter and the intensity transfer, the following verification test is performed.
As shown in fig. 2, according to the image fusion method based on multi-scale direction co-occurrence filter and intensity transfer of the present invention, (1) a source infrared image is first acquired from an infrared camera (a in fig. 2), and a source visible light image is simultaneously acquired from a visible light camera (b in fig. 2); (2) Performing filtering operation on the preprocessed source infrared and visible light images by using a co-occurrence filter to realize preliminary decomposition, and then performing multi-scale decomposition on the preliminarily decomposed images by iteratively using the co-occurrence filter to obtain a multi-scale infrared basic layer sub-image (c in figure 2), a multi-scale visible light basic layer sub-image (d in figure 2), a multi-scale infrared detail layer sub-image (e in figure 2) and a multi-scale visible light detail layer sub-image (f in figure 2); (3) Carrying out multi-directional decomposition on the multi-scale infrared basic layer sub-image and the multi-scale visible light basic layer sub-image by adopting discrete tight support shear wave transformation to obtain a multi-scale multi-directional infrared basic layer sub-image and a multi-scale multi-directional visible light basic layer sub-image; (4) Constructing a fusion weight strategy by adopting a fusion strategy based on phase consistency and intensity measurement to perform detail layer sub-image fusion on a multi-scale infrared detail layer sub-image (e in figure 2) and a multi-scale visible light detail layer sub-image (f in figure 2) to obtain a fused multi-scale detail layer sub-image (h in figure 2); (5) Designing a self-adaptive intensity transfer function as a fusion criterion to perform base layer sub-image fusion on the multi-scale infrared base layer sub-image (c in figure 2) and the multi-scale visible light base layer sub-image (d in figure 2) to obtain fused multi-scale multi-direction base layer sub-images (g in figure 2) in different directions; (6) And reconstructing the fused multi-scale detail layer sub-image and the multi-scale multi-direction base layer sub-images in different directions to obtain a final fused image (F in fig. 2).
The image fusion method based on the multi-scale direction co-occurrence filter and intensity transfer is used to fuse a source infrared and visible light image sequence pair describing a real urban highway scene in the RoadScene public dataset; the result is shown in FIG. 3, where the first row is the source infrared image, the second row the source visible light image and the third row the final fused image.
The image fusion method based on the multi-scale direction co-occurrence filter and intensity transfer is used to fuse a source infrared and visible light image sequence pair describing a real road scene in rain in the RoadScene public dataset; the result is shown in FIG. 4, where the first row is the source infrared image, the second row the source visible light image and the third row the final fused image.
The image fusion method based on the multi-scale direction co-occurrence filter and the intensity transfer is utilized to fuse the source infrared and visible light image sequence pair describing the real field defense scene in the TNO public data set, and the result is shown in figure 5, wherein the first row is the source infrared image describing the real field defense scene, the second row is the source visible light image describing the real field defense scene, and the third row is the final fused image.
In conclusion, the multi-scale multi-direction decomposition is carried out on the source initial image by designing the multi-direction co-occurrence filter, so that large-scale contour information and small-gradient texture information in an image area can be distinguished, and the method has good direction representation capability; the whole contrast and definition of the image can be effectively adjusted by designing a self-adaptive intensity transfer function and fusing the multi-direction base layer sub-image; in the aspect of detail layer sub-image fusion, a strategy based on phase consistency and intensity measurement is adopted to construct fusion weight, and more texture details are transmitted to a final fusion result; in addition, the method of the invention is simple and easy to operate, can balance the quality and efficiency of the algorithm by only adjusting a few parameters, and can provide necessary technical support for the subsequent advanced visual processing process.
The above description is only a preferred embodiment of the present invention and does not limit the present invention in any way; any equivalent substitutions or modifications that a person skilled in the art makes to the technical solutions and technical contents disclosed herein, without departing from them, still fall within the protection scope of the present invention.
Claims (10)
1. The image fusion method based on the multi-scale direction co-occurrence filter and the intensity transmission is characterized by comprising the following steps of:
s1, acquiring and preprocessing source infrared and visible light image data;
s2, filtering the preprocessed source infrared and visible light images by using a co-occurrence filter to realize preliminary decomposition; iteratively using a co-occurrence filter to carry out multi-scale decomposition on the preliminarily decomposed image to obtain a multi-scale infrared detail layer sub-image, a multi-scale infrared basic layer sub-image, a multi-scale visible light detail layer sub-image and a multi-scale visible light basic layer sub-image;
s3, carrying out multi-directional decomposition on the multi-scale infrared basic layer sub-image and the multi-scale visible light basic layer sub-image by adopting discrete tight support shear wave transformation to obtain a multi-scale multi-directional infrared basic layer sub-image and a multi-scale multi-directional visible light basic layer sub-image;
s4, constructing a fusion weight strategy by adopting a fusion strategy based on phase consistency and intensity measurement to perform detail layer sub-image fusion to obtain a fused multi-scale detail layer sub-image;
s5, designing a self-adaptive intensity transfer function as a fusion criterion to perform base layer sub-image fusion to obtain fused multi-scale and multi-direction base layer sub-images in different directions;
and S6, reconstructing the fused multi-scale detail layer sub-image and the multi-scale multi-direction basic layer sub-images in different directions to obtain a final fused image.
2. The image fusion method based on multi-scale direction co-occurrence filter and intensity transfer according to claim 1, wherein the specific operation steps of step S1 are as follows:
s1.1, respectively acquiring a source infrared image and a source visible light image from an infrared camera and a visible light camera;
s1.2, respectively carrying out image denoising operation and image enhancement operation on the source infrared image and the source visible light image.
3. The method for image fusion based on multi-scale direction co-occurrence filter and intensity transfer as claimed in claim 2, wherein the image denoising operation adopts a filter-based denoising algorithm; the image enhancement operation adopts an image enhancement algorithm based on scene fitting.
4. The image fusion method based on multi-scale direction co-occurrence filter and intensity transfer as claimed in claim 2, wherein the specific operation steps of step S2 are as follows:
s2.1, performing primary decomposition;
distributing pixel weights by utilizing co-occurrence information in the preprocessed source infrared and visible light images, and respectively adopting a co-occurrence filter to carry out filtering operation on the preprocessed source infrared and visible light images to realize preliminary decomposition to obtain a 0-level infrared basic layer sub-image and a 0-level visible light basic layer sub-image; subtracting the preprocessed source infrared image from the 0-level infrared basic layer sub-image to obtain a 0-level infrared detail layer sub-image, and simultaneously subtracting the preprocessed source visible light image from the 0-level visible light basic layer sub-image to obtain a 0-level visible light detail layer sub-image;
s2.2, carrying out multi-scale decomposition;
respectively and iteratively using the co-occurrence filtering and level-subtraction operators to carry out multi-scale decomposition of the 0-level infrared base layer sub-image and the 0-level visible light base layer sub-image, obtaining the multi-scale infrared base layer sub-images B_IR^i, multi-scale visible light base layer sub-images B_VI^i, multi-scale infrared detail layer sub-images D_IR^i and multi-scale visible light detail layer sub-images D_VI^i, i = 1 … k, where k denotes the number of decomposition levels.
5. The image fusion method based on multi-scale direction co-occurrence filter and intensity transfer according to claim 4, wherein the specific operation steps of step S3 are as follows:
respectively utilizing the constructed horizontal and vertical discrete shear wave transforms to decompose the multi-scale infrared base layer sub-images B_IR^i and the multi-scale visible light base layer sub-images B_VI^i in K directions horizontally and vertically, obtaining the K horizontal-direction multi-scale multi-directional infrared base layer sub-images, K vertical-direction multi-scale multi-directional infrared base layer sub-images, K horizontal-direction multi-scale multi-directional visible light base layer sub-images and K vertical-direction multi-scale multi-directional visible light base layer sub-images, alongside the K multi-scale infrared detail layer sub-images D_IR^i and K multi-scale visible light detail layer sub-images D_VI^i.
6. The image fusion method based on multi-scale direction co-occurrence filter and intensity transfer according to claim 5, wherein the calculation formula of the horizontal discrete shear wave transformation operation is shown in formula (6), and the calculation formula of the vertical discrete shear wave transformation operation is shown in formula (7);
where k denotes the decomposition level; n_1 and n_2 denote two translation factors, with (n_1, n_2) ranging over the square of the integer field Z²; s denotes the direction factor, s = 1, 0, -1, where s = 1 corresponds to the 45° direction, s = 0 to the 90° direction and s = -1 to the 180° direction; ⌊·⌋ denotes the round-down operation; the two transform operators denote the horizontal and vertical discrete shear wave transforms respectively; the remaining symbols denote, at the k-th decomposition level, the translation transform applied to the infrared base layer sub-image, the equivalent transform of the infrared base layer sub-image under the horizontal discrete shear wave transform, the translation transform applied to the visible light base layer sub-image, and the equivalent transforms of the visible light base layer sub-image under the horizontal and vertical discrete shear wave transforms.
7. The image fusion method based on multi-scale direction co-occurrence filter and intensity transfer according to claim 5, wherein the specific operation steps of step S4 are as follows:
S4.1, applying a phase congruency operation to the K multi-scale infrared detail layer sub-images and the K multi-scale visible light detail layer sub-images obtained in step S2 to obtain a phase congruency measure operator PC(x, y), calculated as follows;
In the formula, θ_k denotes the direction angle at decomposition level k; A_{n,θ_k} denotes the amplitude of the n-th Fourier series component at direction angle θ_k; [e_{n,θ_k}, o_{n,θ_k}] is the result of convolving the image with a log-Gabor filter at position (x, y); Σ_k denotes summation over the variable k, and Σ_n denotes summation over the variable n; ε is a small positive constant;
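A compact Python sketch of a phase congruency measure built from log-Gabor quadrature pairs [e, o], as described above; the normalization of local energy by total amplitude plus ε follows the standard phase congruency construction, which is assumed here since the patent's exact formula is not reproduced, and the noise threshold and frequency-spread weighting of the full method are omitted.

```python
import numpy as np

def phase_congruency(img, n_scales=3, n_orients=4, min_wl=6.0, mult=2.0,
                     sigma_f=0.55, eps=1e-4):
    """PC(x, y): local energy of summed quadrature responses divided by the
    summed amplitudes A_{n,theta_k} plus a small positive constant eps."""
    h, w = img.shape
    F = np.fft.fft2(img)
    fy = np.fft.fftfreq(h)[:, None]
    fx = np.fft.fftfreq(w)[None, :]
    radius = np.hypot(fx, fy)
    radius[0, 0] = 1.0                          # avoid log(0) at the DC term
    theta = np.arctan2(-fy, fx)
    energy = np.zeros((h, w))
    amplitude = np.zeros((h, w))
    for o in range(n_orients):
        angle = o * np.pi / n_orients           # direction angle theta_k
        d_theta = np.abs(np.angle(np.exp(1j * (theta - angle))))
        spread = np.exp(-d_theta ** 2 / (2 * (np.pi / n_orients) ** 2))
        sum_e = np.zeros((h, w))
        sum_o = np.zeros((h, w))
        for n in range(n_scales):
            f0 = 1.0 / (min_wl * mult ** n)     # centre frequency of scale n
            lg = np.exp(-np.log(radius / f0) ** 2 / (2 * np.log(sigma_f) ** 2))
            lg[0, 0] = 0.0                      # zero response at DC
            # one-sided orientation filter => complex response whose real and
            # imaginary parts are the even / odd quadrature pair [e, o]
            eo = np.fft.ifft2(F * lg * spread)
            e, od = eo.real, eo.imag
            sum_e += e
            sum_o += od
            amplitude += np.hypot(e, od)        # A_{n,theta_k}
        energy += np.hypot(sum_e, sum_o)        # local energy per orientation
    return energy / (amplitude + eps)
```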
S4.2, designing an intensity measurement strategy by a windowing method, the intensity measurement strategy being shown in formula (10);
In the formula, the intensity measurement operator is defined at decomposition level l; I^l(x, y) denotes the multi-scale detail layer sub-image pixel value at position (x, y) at decomposition level l; I^l(x_0, y_0) denotes the multi-scale detail layer sub-image pixel value at position (x_0, y_0) at decomposition level l; Ω denotes a local windowed region centered at position (x, y), with (x, y) being the center pixel of the windowed region;
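Since formula (10) is not reproduced above, the sketch below assumes a simple aggregation: the mean absolute detail value over the window Ω. The window size and the aggregation rule are assumptions of this sketch.

```python
import numpy as np
from scipy.ndimage import uniform_filter

def intensity_measure(detail, win=3):
    """Windowed intensity measurement: local mean of absolute detail-layer
    pixel values over the window Omega centred at each position (x, y)."""
    return uniform_filter(np.abs(detail), size=win, mode='nearest')
```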
S4.3, constructing a fusion weight strategy by combining the phase congruency method and the intensity measurement strategy, realizing multi-scale detail layer sub-image fusion and obtaining a series of fused multi-scale detail layer sub-images; the fusion weight strategy is shown in formula (11);
In the formula, the fused multi-scale detail layer sub-image is computed from D_IR(x, y), the infrared detail layer sub-image, and D_VI(x, y), the visible light detail layer sub-image; PC^l(·) denotes the phase congruency measure operator at decomposition level l.
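One plausible realization of the fusion weight strategy, combining the phase congruency measure with the windowed intensity measure in a choose-max rule; the exact combination in formula (11) is the patent's, and this sketch reuses the hypothetical `intensity_measure` helper above.

```python
import numpy as np

def fuse_details(d_ir, d_vi, pc_ir, pc_vi, win=3):
    """At each pixel, keep the detail coefficient of the source image whose
    combined phase-congruency / intensity activity is larger."""
    act_ir = pc_ir * intensity_measure(d_ir, win)
    act_vi = pc_vi * intensity_measure(d_vi, win)
    return np.where(act_ir >= act_vi, d_ir, d_vi)  # fused detail sub-image
```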
8. The image fusion method based on multi-scale direction co-occurrence filter and intensity transfer according to claim 7, wherein the specific operation steps of step S5 are as follows:
S5.1, calculating the local energy of the image pixel by pixel for the K horizontal multi-scale multi-directional infrared base layer sub-images and the K vertical multi-scale multi-directional infrared base layer sub-images obtained in step S3;
In the formula, E(x, y) denotes the local energy of the image; W_le(i, j) denotes a local energy window of size i × j, with 1 ≤ i ≤ 3 and 1 ≤ j ≤ 3; B_IR(x + i, y + j) denotes the pixel-by-pixel traversal of the image within the window;
S5.2, normalizing the local energy of the image to obtain the base layer sub-image feature distribution operator P;
In the formula, E_max(x, y) denotes the local energy maximum calculated over the local energy window, and E(x + i, y + j) denotes the traversal of the local energy values within the local energy window;
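Steps S5.1 and S5.2 can be sketched together as below; the 3 × 3 windowed sum of squared pixel values for E and the division by the windowed maximum E_max for P are assumed forms of formulas (12) and (13).

```python
import numpy as np
from scipy.ndimage import uniform_filter, maximum_filter

def feature_distribution(base, win=3):
    """Local energy E (windowed sum of squares) normalized by its local
    maximum E_max, giving the feature distribution operator P in (0, 1]."""
    base = np.asarray(base, dtype=np.float64)
    E = uniform_filter(base ** 2, size=win, mode='nearest') * win * win
    E_max = maximum_filter(E, size=win, mode='nearest')
    return E / np.maximum(E_max, 1e-12)
```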
S5.3, introducing a proportion parameter into the base layer sub-image feature distribution operator P, constructing an adaptive intensity transfer function, and fusing the horizontal and vertical multi-scale multi-directional base layer sub-images respectively, to obtain a fused horizontal multi-scale multi-directional infrared base layer sub-image, a fused horizontal multi-scale multi-directional visible light base layer sub-image, a fused vertical multi-scale multi-directional infrared base layer sub-image, and a fused vertical multi-scale multi-directional visible light base layer sub-image; the calculation formula of the fusion operation is shown in formula (14);
In the formula, the left-hand side denotes the multi-scale multi-directional base layer sub-image fusion weight; γ denotes the introduced proportion parameter; arctan(·) denotes the arctangent operator; P denotes the base layer sub-image feature distribution operator;
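An assumed form of the adaptive intensity transfer function of formula (14) is sketched below: the arctangent squashes the difference of the infrared and visible feature distributions into a weight in (0, 1), with the proportion parameter γ controlling the steepness. It reuses the hypothetical `feature_distribution` helper above; the patent's exact weight expression may differ.

```python
import numpy as np

def fuse_base(b_ir, b_vi, gamma=8.0):
    """Weighted combination of infrared and visible multi-scale
    multi-directional base layer sub-images (S5.3-S5.4 sketch)."""
    p_ir = feature_distribution(b_ir)
    p_vi = feature_distribution(b_vi)
    w = 0.5 + np.arctan(gamma * (p_ir - p_vi)) / np.pi  # weight in (0, 1)
    return w * b_ir + (1.0 - w) * b_vi
```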
S5.4, the finally fused multi-scale multi-directional base layer sub-images in the different directions are expressed as follows:
In the formula, the left-hand side denotes the fused multi-scale multi-directional base layer sub-image in the corresponding direction; when a = 0, it denotes the fused horizontal multi-scale multi-directional base layer sub-image, obtained from the fused horizontal multi-scale multi-directional infrared base layer sub-image and the fused horizontal multi-scale multi-directional visible light base layer sub-image; when a = 1, it denotes the fused vertical multi-scale multi-directional base layer sub-image, obtained from the fused vertical multi-scale multi-directional infrared base layer sub-image and the fused vertical multi-scale multi-directional visible light base layer sub-image; K denotes the direction factor.
9. The method for image fusion based on multi-scale direction co-occurrence filter and intensity transfer according to claim 8, wherein the specific operation steps of step S6 are as follows:
reconstructing the fused multi-scale multi-directional base layer sub-images in the different directions by the inverse operation of the discrete tightly-supported shear wave transformation to obtain a fused base layer image, and adding the fused base layer image to the fused multi-scale detail layer sub-images to obtain the final fused image F.
10. The image fusion method based on multi-scale direction co-occurrence filter and intensity transfer according to claim 9, wherein the calculation formula of the fused base layer image is shown in formula (16), and the calculation formula of the fused image F is shown in formula (17);
In the formulas, the operators denote, in turn: the inverse operator of the discrete tightly-supported shear wave transformation; the fused i-th-level multi-scale detail layer sub-image; the fused horizontal multi-scale multi-directional base layer sub-image; and the fused vertical multi-scale multi-directional base layer sub-image.
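Finally, a sketch of the reconstruction in step S6, reusing the hypothetical `horizontal_shear` / `vertical_shear` helpers above: each directional band is un-sheared (shearing by −s inverts the periodic shear) and the results are averaged, then the fused detail layers are added per formula (17). The exact inverse of the discrete tightly-supported shear wave transformation is given by formula (16); this illustration only approximates it.

```python
def reconstruct_base(horiz_bands, vert_bands, k):
    """Approximate inverse directional transform: un-shear each band with
    the opposite direction factor and average over all directions."""
    recon = [horizontal_shear(b, -s, k) for s, b in zip((1, 0, -1), horiz_bands)]
    recon += [vertical_shear(b, -s, k) for s, b in zip((1, 0, -1), vert_bands)]
    return sum(recon) / len(recon)

def final_fusion(fused_base, fused_details):
    """Formula (17): F = fused base layer image + sum of the fused
    multi-scale detail layer sub-images."""
    return fused_base + sum(fused_details)
```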
Priority Applications (1)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
CN202310069737.8A CN115797244A (en) | 2023-02-07 | 2023-02-07 | Image fusion method based on multi-scale direction co-occurrence filter and intensity transmission |
Publications (1)
Publication Number | Publication Date |
---|---|
CN115797244A true CN115797244A (en) | 2023-03-14 |
Family
ID=85430098
Family Applications (1)
Application Number | Title | Priority Date | Filing Date |
---|---|---|---|
CN202310069737.8A Pending CN115797244A (en) | 2023-02-07 | 2023-02-07 | Image fusion method based on multi-scale direction co-occurrence filter and intensity transmission |
Country Status (1)
Country | Link |
---|---|
CN (1) | CN115797244A (en) |
Patent Citations (4)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
CN104809734A (en) * | 2015-05-11 | 2015-07-29 | 中国人民解放军总装备部军械技术研究所 | Infrared image and visible image fusion method based on guide filtering |
CN112017139A (en) * | 2020-09-14 | 2020-12-01 | 南昌航空大学 | Infrared and visible light image perception fusion method |
CN113222877A (en) * | 2021-06-03 | 2021-08-06 | 北京理工大学 | Infrared and visible light image fusion method and application thereof in airborne photoelectric video |
CN114897751A (en) * | 2022-04-12 | 2022-08-12 | 北京理工大学 | Infrared and visible light image perception fusion method based on multi-scale structural decomposition |
Non-Patent Citations (1)
Title |
---|
韩玺钰 (Han Xiyu): "Research on infrared and visible image fusion algorithms based on multi-scale and salient region analysis" *
Similar Documents
Publication | Publication Date | Title |
---|---|---|
CN110119780B (en) | Hyperspectral image super-resolution reconstruction method based on generative adversarial network | |
CN112767251B (en) | Image super-resolution method based on multi-scale detail feature fusion neural network | |
CN110443768B (en) | Single-frame image super-resolution reconstruction method based on multiple consistency constraints | |
CN104318569B (en) | Space salient region extraction method based on depth variation model | |
CN113284051B (en) | Face super-resolution method based on frequency decomposition and multi-attention mechanism | |
CN107123089A (en) | Remote sensing images super-resolution reconstruction method and system based on depth convolutional network | |
CN108830796A (en) | Hyperspectral image super-resolution reconstruction method based on spatial-spectral combination and gradient-domain loss | |
CN112001843B (en) | Infrared image super-resolution reconstruction method based on deep learning | |
CN109272447A (en) | Depth map super-resolution method | |
CN111738948B (en) | Underwater image enhancement method based on double U-nets | |
CN110675462A (en) | Gray level image colorizing method based on convolutional neural network | |
CN102243711A (en) | Neighbor embedding-based image super-resolution reconstruction method | |
CN109472743A (en) | The super resolution ratio reconstruction method of remote sensing images | |
Yang et al. | Image super-resolution based on deep neural network of multiple attention mechanism | |
CN114581560A (en) | Attention mechanism-based multi-scale neural network infrared image colorizing method | |
CN112163998A (en) | Single-image super-resolution analysis method matched with natural degradation conditions | |
Zhu et al. | Super-resolving commercial satellite imagery using realistic training data | |
CN101739670B (en) | Non-local mean space domain time varying image filtering method | |
CN115326809A (en) | Apparent crack detection method and detection device for tunnel lining | |
CN117593188B (en) | Super-resolution method based on unsupervised deep learning and corresponding equipment | |
CN117953310A (en) | Remote sensing multi-mode image classification method based on continuous scale feature network | |
CN112686804B (en) | Image super-resolution reconstruction method and device for mine low-light environment | |
CN113724139A (en) | Unsupervised infrared single-image super-resolution based on a dual-discriminator generative adversarial network | |
CN111652826B (en) | Method for homogenizing multiple/hyperspectral remote sensing images based on Wallis filtering and histogram matching | |
CN115797244A (en) | Image fusion method based on multi-scale direction co-occurrence filter and intensity transmission |
Legal Events
Date | Code | Title | Description |
---|---|---|---|
PB01 | Publication | ||
SE01 | Entry into force of request for substantive examination | ||
RJ01 | Rejection of invention patent application after publication | | Application publication date: 20230314 |