CN109410161B - Fusion method of infrared polarization images based on YUV and multi-feature separation

- Publication number: CN109410161B
- Application: CN201811180887.1A
- Authority: CN (China)
- Legal status: Active (the legal status is an assumption and is not a legal conclusion; Google has not performed a legal analysis)
Classifications

- G06T5/50 — Image enhancement or restoration using two or more images, e.g. averaging or subtraction
- G06T2207/10024 — Color image
- G06T2207/10048 — Infrared image
- G06T2207/20221 — Image fusion; Image merging
Abstract
The invention discloses an infrared polarization image fusion method based on YUV and multi-feature separation, comprising the following steps: first, the polarization state of light is expressed with a Stokes vector, and a polarization degree image and a polarization angle image are calculated; the unique parts P' and R' of the polarization degree and polarization angle images are then solved; the images P' and R' are fused with a multi-feature separation method based on the dark primary color theory, and the fusion result is fused with the total intensity image I in YUV space to obtain the final infrared polarization image fusion result. Because the method fuses in the YUV color space, which matches human visual characteristics, it effectively improves the visualization of the polarization image fusion result, enhances detail information such as image edges, improves image contrast, and fuses the complementary information among the polarization quantities, making the fused scene richer and aiding the identification of camouflaged targets.
Description
Technical Field
The invention relates to the technical field of image processing, in particular to an infrared polarization image fusion method based on YUV and multi-feature separation.
Background
The traditional infrared imaging system mainly images the infrared radiation intensity of a scene, which is determined chiefly by the scene's temperature and radiance. When a noise source with the same temperature is placed around a target object, an existing thermal infrared imager cannot identify the camouflaged target, so infrared imaging technology faces serious limitations and challenges. Compared with traditional infrared imaging, polarization imaging can reduce the degrading influence of a complex scene and can additionally recover the structure and distance information of the scene. Because polarization characterizes an object independently of its radiation, targets with the same radiation may have different degrees of polarization, so useful signals can be detected in a complex background by polarization means.
The polarization state of light can be described by the Stokes vector, and the individual polarization quantities are both redundant and complementary in expressing polarization information. Existing polarization image fusion often considers only a single difference feature, which cannot effectively describe all the uncertain and randomly varying image features; some valuable information is therefore lost during fusion, and fusion and recognition fail. At the same time, the visual quality of the fused image affects the resolving power of the human eye and the recognition of camouflaged targets.
Disclosure of Invention
The invention aims to overcome the defects of the prior art by providing an infrared polarization image fusion method based on YUV and multi-feature separation. The method fuses the polarization components and the complementary information among them, enriches the fused image scene, enhances detail information such as image edges, improves image contrast, and, by fusing in the YUV color space that matches human visual characteristics, effectively improves the visualization of the polarization image fusion result.
The technical scheme of the invention is that an infrared polarization image fusion method based on YUV and multi-feature separation comprises the following steps:
S1, representing the polarization state of light with a Stokes vector, and calculating the polarization degree image P and the polarization angle image R from the Stokes vector;
S2, solving the unique part P' of the polarization degree image and the unique part R' of the polarization angle image from the common part Co of the two images;
S3, fusing the images P' and R' with a multi-feature separation method based on the dark primary color theory;
S4, fusing the result of step S3 with the total intensity image I in YUV space to obtain the final fusion result.
Further, the Stokes vector S in step S1 is expressed as formula (1):

S = [I, Q, U, V]^T = [I1 + I3, I1 − I3, I2 − I4, V]^T (1)

where I1, I2, I3 and I4 respectively denote the acquired light intensity images with polarization directions of 0°, 45°, 90° and 135°; I is the total light intensity; Q is the intensity difference between the horizontal and vertical polarization directions; U is the intensity difference between the 45° and 135° polarization directions; and V is the intensity difference between the left- and right-hand circularly polarized components of the light.

The polarization degree image P and the polarization angle image R of the polarized light are expressed as:

P = √(Q² + U²) / I (2)

R = (1/2) · arctan(U / Q) (3)
Further, in step S2, the operation of formula (4) is applied to each pixel of the images P and R to obtain their common part Co:

Co = min(P, R) (4)
the unique part P 'of the polarization degree image and the unique part R' of the polarization angle image are obtained using equations (5) and (6), respectively:
P'=P-Co (5)
R'=R-Co (6)
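The separation into common and unique parts (eqs. 4-6) is a pixel-wise operation; a short sketch with an illustrative function name:

```python
import numpy as np

def split_common_unique(P, R):
    """Common part Co = min(P, R) (eq. 4) and unique parts
    P' = P - Co, R' = R - Co (eqs. 5-6), pixel by pixel."""
    Co = np.minimum(P, R)
    return Co, P - Co, R - Co
```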
Still further, the step S3 comprises the following steps:
S3.1, performing multi-feature separation on the images P' and R' based on the dark primary color theory to obtain the dark feature image, bright feature image and detail feature image of each of P' and R';
S3.2, fusing the bright feature images of P' and R' by a matching method based on local-region energy features;
S3.3, fusing the dark feature images of P' and R' by a matching method based on local-region weighted variance features;
S3.4, fusing the detail feature images of P' and R' driven by fuzzy logic and feature differences;
S3.5, fusing the results of steps S3.2, S3.3 and S3.4 to obtain the fusion result of images P' and R'.
It should be noted that steps S3.2, S3.3 and S3.4 have no fixed time order and are separated only for clarity of description: they may be performed synchronously; any two may be performed simultaneously followed by the third; any one may be performed first followed by the other two synchronously; or all three may be performed one after another.
Still further, in step S3.1 above, the dark primary color (dark channel) was used by He et al. to estimate the transmittance in the atmospheric scattering model and realize fast defogging of natural images. The dark channel map is solved as shown in formula (7):

L_dark(x) = min_{y∈N(x)} ( min_{C∈{R,G,B}} L_C(y) ) (7)

where C indexes the three color channels R, G, B of the image; N(x) is the window region centered on pixel x; L_C(y) is a color channel map of the image; and L_dark(x) is the dark channel image, which reflects the haze level of the image: for a haze-free image the dark channel value L_dark(x) tends to 0, while for a hazy image it is large.
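For a grayscale image such as P' or R', the inner minimum over color channels in eq. (7) is the identity, so the dark channel reduces to a local minimum filter. A minimal sketch (illustrative name; production code would use a morphological erosion):

```python
import numpy as np

def dark_channel(img, win=3):
    """Single-channel analogue of eq. (7): the dark channel value at x is
    the minimum over the window N(x) centred on x. Edges are handled by
    replication padding, an assumption made for this sketch."""
    t = win // 2
    padded = np.pad(img, t, mode='edge')
    out = np.empty_like(img, dtype=float)
    h, w = img.shape
    for m in range(h):
        for n in range(w):
            out[m, n] = padded[m:m + win, n:n + win].min()
    return out
```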
obtaining a dark primary color image of the unique portion P' of the polarization degree image by equation (7)
Obtaining a dark primary color image of the unique portion R' of the polarization angle image by equation (7)
The bright features, dark features and detail features of the two images are obtained by multi-feature separation based on the dark primary color theory as follows:

S3.1.1, obtain the inverted image of P' and the inverted image of R' using formulas (8) and (9); then fuse the dark primary color images P'_dark and R'_dark with the corresponding inverted images according to the rule of the smaller absolute value, obtaining the dark feature image D_P' of image P' and the dark feature image D_R' of image R', as shown in formulas (10) and (11):

P̄' = 255 − P' (8)
R̄' = 255 − R' (9)
D_P' = min(P'_dark, P̄') (10)
D_R' = min(R'_dark, R̄') (11)

S3.1.2, take the difference between the inverted images P̄' and R̄' and their corresponding dark feature images D_P' and D_R' to obtain the bright feature image L_P' of image P' and the bright feature image L_R' of image R', as shown in formulas (12) and (13):

L_P' = P̄' − D_P' (12)
L_R' = R̄' − D_R' (13)

S3.1.3, take the difference between the images P' and R' and their dark primary color images P'_dark and R'_dark to obtain the detail feature image P_P' of image P' and the detail feature image P_R' of image R', as shown in formulas (14) and (15):

P_P' = P' − P'_dark (14)
P_R' = R' − R'_dark (15)
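The separation of S3.1.1-S3.1.3 can be sketched as follows. Note that the concrete forms of eqs. (8)-(15) are reconstructed from the surrounding text (inversion as 255 − img, smaller-absolute-value fusion, plain differences), so this is an assumption rather than the patent's verbatim formulas:

```python
import numpy as np

def separate_features(img, dark):
    """Dark/bright/detail separation of S3.1 for one image, given its
    dark primary color image `dark` (from the dark channel operator)."""
    inv = 255.0 - img                                             # eqs. (8)-(9)
    # dark feature: keep the value with the smaller absolute value (10)-(11)
    dark_feat = np.where(np.abs(dark) <= np.abs(inv), dark, inv)
    bright_feat = inv - dark_feat                                 # eqs. (12)-(13)
    detail_feat = img - dark                                      # eqs. (14)-(15)
    return dark_feat, bright_feat, detail_feat
```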
still further, in step S3.2, the fusion is carried outClosed-up feature image LP'And bright feature image LR'The method comprises the following steps:
s3.2.1, the gaussian weighted local energy function is used to obtain the gaussian weighted local energy of the two bright feature images, and the gaussian weighted local energy function is shown in formula (16):
in the formula (I), the compound is shown in the specification,representing a gaussian-weighted local energy centered at point (m, n); w (i, j) is a Gaussian filter matrix; n is the size of the region; t ═ N-1)/2; k represents P 'or R';
the bright feature images L are obtained by the formula (16)P'Gaussian weighted local energy ofAnd bright feature image LR'Gaussian weighted local energy of
S3.2.2, solve the matching degree of the Gaussian-weighted local energies of images L_P' and L_R':

M_E(m, n) = 2 Σ_{i=−t}^{t} Σ_{j=−t}^{t} W(i, j) · L_P'(m+i, n+j) · L_R'(m+i, n+j) / [E_P'(m, n) + E_R'(m, n)] (17)

S3.2.3, fuse the images L_P' and L_R' using the Gaussian-weighted local energies and their matching degree, obtaining the bright feature fusion result F_L.

The fusion rule for images L_P' and L_R' is as follows: T_l is the threshold for judging the similarity of the brightness features, taken in the range 0 to 0.5. If M_E(m, n) < T_l, the regions of L_P' and L_R' centered at point (m, n) are not similar, and the one of the two images with the larger Gaussian-weighted local energy is selected as the fusion result; otherwise, the fusion result is a coefficient-weighted average of the two.
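A sketch of the S3.2 rule. Eqs. (16)-(17) follow the text above; the weighted-average coefficients used in the similar case are an assumption borrowed from the standard region-energy fusion rule, since the patent's formula images for the averaging step are missing:

```python
import numpy as np

def gauss_kernel(N=3, sigma=1.0):
    """Normalised Gaussian filter matrix W(i, j) of size N x N."""
    t = (N - 1) // 2
    y, x = np.mgrid[-t:t + 1, -t:t + 1]
    w = np.exp(-(x**2 + y**2) / (2 * sigma**2))
    return w / w.sum()

def fuse_bright(LP, LR, N=3, Tl=0.5):
    """Select-or-average fusion of two bright feature images using
    Gaussian-weighted local energy (eq. 16) and its matching degree (eq. 17)."""
    W = gauss_kernel(N)
    t = (N - 1) // 2
    PL = np.pad(LP, t, mode='edge')
    PRm = np.pad(LR, t, mode='edge')
    h, w = LP.shape
    F = np.empty_like(LP, dtype=float)
    for m in range(h):
        for n in range(w):
            wp = PL[m:m + N, n:n + N]
            wr = PRm[m:m + N, n:n + N]
            EP = (W * wp**2).sum()
            ER = (W * wr**2).sum()
            ME = 2 * (W * wp * wr).sum() / (EP + ER + 1e-12)
            if ME < Tl:  # regions dissimilar: keep larger-energy pixel
                F[m, n] = LP[m, n] if EP >= ER else LR[m, n]
            else:        # regions similar: coefficient-weighted average
                wmin = 0.5 - 0.5 * (1 - ME) / (1 - Tl)
                wmax = 1 - wmin
                if EP >= ER:
                    F[m, n] = wmax * LP[m, n] + wmin * LR[m, n]
                else:
                    F[m, n] = wmin * LP[m, n] + wmax * LR[m, n]
    return F
```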
Still further, in step S3.3, fusing the dark feature image D_P' and the dark feature image D_R' comprises the following steps:

S3.3.1, obtain the local-region weighted variance energy of each of the two dark feature images using the local-region weighted variance energy function shown in formula (20):

V_k(m, n) = Σ_{i=−t}^{t} Σ_{j=−t}^{t} W(i, j) · [D_k(m+i, n+j) − μ_k(m, n)]² (20)

where V_k(m, n) is the local-region weighted variance energy centered at point (m, n); W(i, j) is a Gaussian filter matrix; N is the size of the region; t = (N − 1)/2; μ_k(m, n) is the local-region mean centered at point (m, n); and k denotes P' or R'.

The local-region weighted variance energies V_P'(m, n) of the dark feature image D_P' and V_R'(m, n) of the dark feature image D_R' are obtained by formula (20).
S3.3.2, solve the matching degree M_V(m, n) of the local-region weighted variance energies of the two images D_P' and D_R', obtained by formula (21):

M_V(m, n) = 2 Σ_{i=−t}^{t} Σ_{j=−t}^{t} W(i, j) · [D_P'(m+i, n+j) − μ_P'(m, n)] · [D_R'(m+i, n+j) − μ_R'(m, n)] / [V_P'(m, n) + V_R'(m, n)] (21)

S3.3.3, fuse the two images D_P' and D_R' using the local weighted variance energies and their matching degree, obtaining the dark feature fusion result F_D.

The fusion rule for the two images D_P' and D_R' is as follows: T_h is the threshold for judging the similarity of the darkness features, taken in the range 0.5 to 1. If M_V(m, n) < T_h, the regions of D_P' and D_R' centered at point (m, n) are not similar, and the one of the two images with the larger local-region weighted variance energy is selected as the fusion result; otherwise, the fusion result is a coefficient-weighted average of the two.
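The local-region weighted variance energy of eq. (20) can be sketched as follows (illustrative name; the matching and select-or-average steps then mirror the bright-feature case with V in place of E):

```python
import numpy as np

def local_weighted_variance(img, W):
    """Eq. (20): at each pixel, the Gaussian-weighted variance of the
    surrounding window, computed against the weighted local mean."""
    N = W.shape[0]
    t = (N - 1) // 2
    padded = np.pad(img, t, mode='edge')
    h, w = img.shape
    V = np.empty((h, w))
    for m in range(h):
        for n in range(w):
            win = padded[m:m + N, n:n + N]
            mu = (W * win).sum()                # weighted local mean
            V[m, n] = (W * (win - mu)**2).sum()
    return V
```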
Still further, step S3.4 fuses the detail features of the polarization degree and polarization angle images by a feature-difference-driven method based on local gradients and local variances, and comprises the following steps:

S3.4.1, find the local gradients of the detail feature image P_P' of image P' and the detail feature image P_R' of image R';

the local gradient is solved by formula (24):

∇T_k(m, n) = √( G_h(m, n)² + G_v(m, n)² ) (24)

where ∇T_k(m, n) is the local gradient at pixel (m, n); k denotes P' or R'; and G_h and G_v are the horizontal and vertical edge images obtained by convolving the detail feature image with the horizontal and vertical templates of the Sobel operator.

The local gradients ∇T_P'(m, n) of the detail feature image P_P' and ∇T_R'(m, n) of the detail feature image P_R' are obtained by formula (24).
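A sketch of the Sobel-based local gradient of eq. (24) (illustrative name; replication padding at the edges is an assumption):

```python
import numpy as np

def sobel_gradient(img):
    """Eq. (24): gradient magnitude from the horizontal and vertical
    edge images produced by the Sobel templates."""
    kx = np.array([[-1, 0, 1], [-2, 0, 2], [-1, 0, 1]], dtype=float)
    ky = kx.T
    padded = np.pad(img, 1, mode='edge')
    h, w = img.shape
    gx = np.empty((h, w)); gy = np.empty((h, w))
    for m in range(h):
        for n in range(w):
            win = padded[m:m + 3, n:n + 3]
            gx[m, n] = (kx * win).sum()
            gy[m, n] = (ky * win).sum()
    return np.sqrt(gx**2 + gy**2)
```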
S3.4.2, find the local weighted variances of the detail feature images P_P' and P_R';

the local weighted variance is the same as formula (20), i.e., as shown in formula (25):

V_k(m, n) = Σ_{i=−t}^{t} Σ_{j=−t}^{t} W(i, j) · [P_k(m+i, n+j) − μ_k(m, n)]² (25)

where V_k(m, n) is the local-region weighted variance energy centered at point (m, n); W(i, j) is a Gaussian filter matrix; N is the size of the region; t = (N − 1)/2; μ_k(m, n) is the local-region mean centered at point (m, n); and k denotes P' or R'.

The local weighted variances V_P'(m, n) of the detail feature image P_P' and V_R'(m, n) of the detail feature image P_R' are obtained by formula (25).
S3.4.3, obtain the local gradient matching degree M_T(m, n), the local weighted variance matching degree M_V(m, n), the local difference gradient ΔT(m, n) = ∇T_P'(m, n) − ∇T_R'(m, n), and the local difference variance ΔV(m, n) = V_P'(m, n) − V_R'(m, n) of the two detail feature images.
S3.4.4, obtain the pixel-based decision map PDG(m, n) from the local difference gradient ΔT(m, n) and the local difference variance ΔV(m, n), and obtain the feature-difference-degree decision map DDG from the local gradient matching degree M_T(m, n) and the local weighted variance matching degree M_V(m, n), as follows:

(1) obtain the pixel-based decision map PDG(m, n) from ΔT(m, n) and ΔV(m, n): when ΔT(m, n) > 0 and ΔV(m, n) > 0, let PDG(m, n) = p1; when ΔT(m, n) < 0 and ΔV(m, n) < 0, PDG(m, n) = p2; when ΔT(m, n) > 0 and ΔV(m, n) < 0, PDG(m, n) = p3; when ΔT(m, n) < 0 and ΔV(m, n) > 0, PDG(m, n) = p4; when ΔT(m, n) = 0 and ΔV(m, n) > 0, PDG(m, n) = p5; when ΔT(m, n) = 0 and ΔV(m, n) < 0, PDG(m, n) = p6; when ΔT(m, n) > 0 and ΔV(m, n) = 0, PDG(m, n) = p7; when ΔT(m, n) < 0 and ΔV(m, n) = 0, PDG(m, n) = p8; when ΔT(m, n) = 0 and ΔV(m, n) = 0, PDG(m, n) = p9; where p1 to p9 denote decision maps whose value is 1 at the pixel positions satisfying the corresponding condition and 0 elsewhere;

(2) obtain the feature-difference-degree decision map DDG from the local gradient matching degree M_T(m, n) and the local weighted variance matching degree M_V(m, n), as shown in formula (30), where d1 and d2 are maps whose value is 1 at the pixel positions satisfying formula (30) and 0 elsewhere.
S3.4.5, judging the determined region and the uncertain region according to the decision graph PDG (m, n) based on the pixel and the feature difference degree decision graph DDG;
it can be judged by the PDG that p satisfying the (1) corresponding condition of step S3.4.41、p2、p5、p6、p7And p8To determine the area: since for p1And p2For the two situations, the two difference characteristics can reflect whether the gray value of the corresponding pixel point is reserved in the fused image; for p5、p6、p7And p8For the four conditions, whether the gray value of the corresponding pixel point is reserved in the fusion image can be reflected by using one difference characteristic;
p can be judged by PDG and DDG3And p4To determine the area: because the DDG can determine the difference degree of the local features of the two images and then select the difference feature with larger difference degree, the difference feature can reflect whether the gray value of the corresponding pixel point is reserved in the fused image;
however for region p9The PDG and DDG judgment method cannot be carried out according to the two decision graphs, and the area belongs to an uncertainty area;
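The pixel-based decision map of S3.4.4(1) follows directly from the signs of ΔT and ΔV; in the sketch below, integer labels 1 to 9 stand in for the indicator maps p1 to p9:

```python
import numpy as np

def pixel_decision_map(dT, dV):
    """Label each pixel 1..9 according to the signs of the local
    difference gradient dT and local difference variance dV."""
    pdg = np.zeros(dT.shape, dtype=int)
    pdg[(dT > 0) & (dV > 0)] = 1
    pdg[(dT < 0) & (dV < 0)] = 2
    pdg[(dT > 0) & (dV < 0)] = 3
    pdg[(dT < 0) & (dV > 0)] = 4
    pdg[(dT == 0) & (dV > 0)] = 5
    pdg[(dT == 0) & (dV < 0)] = 6
    pdg[(dT > 0) & (dV == 0)] = 7
    pdg[(dT < 0) & (dV == 0)] = 8
    pdg[(dT == 0) & (dV == 0)] = 9   # the uncertain region
    return pdg
```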
S3.4.6, fuse the determined regions of the two detail feature images using feature-difference driving:

take the product of the local difference gradient ΔT(m, n) and the local difference variance ΔV(m, n) as the fusion driving factor of the determined regions, denoted DIF(m, n), as shown in formula (31):

DIF(m, n) = ΔT(m, n) · ΔV(m, n) (31)

then use DIF(m, n) to drive the fusion of the determined regions, obtaining the fused image of the determined regions as shown in formula (32),

where "·" denotes the product of the values at corresponding pixel positions of the matrices.
S3.4.7, fuse the uncertain region of the two detail feature images using fuzzy logic theory:

let the membership functions of "the local gradient of detail feature image P_P' (respectively P_R') is large" be μ_T(P_P'(m, n)) and μ_T(P_R'(m, n)), and the membership functions of "the local weighted variance of detail feature image P_P' (respectively P_R') is large" be μ_V(P_P'(m, n)) and μ_V(P_R'(m, n)), as shown in formulas (33) and (34), where k denotes P' or R'.
Using the fuzzy-logic intersection (AND) operation, the membership functions describing how strongly the pixel values of the two detail feature images P_P' and P_R' at position (m, n) should contribute to the fused image of the uncertain region, denoted μ_{T∩V}(P_P'(m, n)) and μ_{T∩V}(P_R'(m, n)), are computed as shown in formula (35):

μ_{T∩V}(P_k(m, n)) = min[μ_T(P_k(m, n)), μ_V(P_k(m, n))] (35)

where k denotes P' or R'.

The fusion result of the uncertain region of the two detail feature images is then given by formula (36):

[μ_{T∩V}(P_P'(m, n)) · P_P'(m, n) + μ_{T∩V}(P_R'(m, n)) · P_R'(m, n)] ./ [μ_{T∩V}(P_P'(m, n)) + μ_{T∩V}(P_R'(m, n))] (36)

where "·" denotes the product, and "./" the division, of the values at corresponding pixel positions of the matrices.
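A sketch of the uncertain-region fusion of eqs. (35)-(36); the membership functions themselves (eqs. 33-34) are not reproduced in this text and are taken here as given inputs:

```python
import numpy as np

def fuse_uncertain(Pp, Pr, muT_p, muV_p, muT_r, muV_r):
    """Fuzzy AND (min) of the 'gradient is large' and 'variance is large'
    memberships (eq. 35), then a membership-weighted average of the two
    detail images over the uncertain region (eq. 36)."""
    mu_p = np.minimum(muT_p, muV_p)   # eq. (35) for P'
    mu_r = np.minimum(muT_r, muV_r)   # eq. (35) for R'
    return (mu_p * Pp + mu_r * Pr) / (mu_p + mu_r + 1e-12)
```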
S3.4.8, fuse the determined-region result of step S3.4.6 and the uncertain-region result of step S3.4.7 to obtain the fusion result of the two detail feature images, F_DIF(m, n), and perform a consistency check on F_DIF(m, n):

the consistency check moves a window of size 3 × 3 over the image F_DIF(m, n) and verifies the central pixel against the surrounding pixels in the window.
Still further, fusing the results of steps S3.2, S3.3 and S3.4 in step S3.5 means fusing the bright feature image F_L(m, n), the dark feature image F_D(m, n) and the detail feature image F_DIF(m, n); the fusion formula is shown in formula (38):

F = αF_L(m, n) + βF_D(m, n) + γF_DIF(m, n) (38)

where α, β and γ are fusion weight coefficients with values in the range [0, 1]; preferably, α = 1, β = 0.3 and γ = 1, which avoids oversaturating the fused image and improves the contrast.
Still further, the method of fusing the result of step S3 with the total intensity image I in YUV space in step S4 is: input the total intensity image I into the Y channel of the YUV space; invert the fusion result F obtained in step S3 to obtain the image F' = 255 − F and input F' into the U channel; finally, input the fusion result F into the V channel to obtain the final fused image.
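The S4 channel mapping can be sketched as follows; the YUV-to-RGB conversion (BT.601 coefficients, with U and V re-centred on 128) is an assumption added for display purposes, since the patent does not specify a conversion:

```python
import numpy as np

def yuv_compose(I, F):
    """Channel mapping of S4: Y = total intensity I, U = 255 - F,
    V = F, followed by an assumed BT.601 YUV -> RGB conversion."""
    Y = I.astype(float)
    U = 255.0 - F          # inverted fusion result F'
    V = F.astype(float)    # fusion result F
    u = U - 128.0          # re-centre chroma channels
    v = V - 128.0
    r = Y + 1.402 * v
    g = Y - 0.344136 * u - 0.714136 * v
    b = Y + 1.772 * u
    return np.clip(np.stack([r, g, b], axis=-1), 0, 255)
```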
Compared with the prior art, the invention has the following beneficial effects:
1. The method fuses all the polarization quantities of the polarization images, effectively resolves the information redundancy among the polarization quantities during fusion, fuses their complementary information, enriches the fused image scene, and aids the identification of camouflaged targets.
2. In the fusion process, the method integrates the local-region energy feature, which characterizes the correlation between adjacent pixels and the local brightness difference and helps improve image contrast; the local-region variance feature, which reflects local gray-level variation and helps improve image clarity; and the local-region gradient, which reflects pixel-level detail variation and helps enhance image detail. The method thereby improves image clarity and contrast and enhances detail information such as image edges.
3. By fusing in the YUV color space, which matches the visual characteristics of the human eye, the method effectively improves the visualization of the polarization image fusion result.
Drawings
These and/or other aspects and advantages of the present invention will become more apparent and more readily appreciated from the following detailed description of the embodiments of the invention, taken in conjunction with the accompanying drawings of which:
FIG. 1 is a flow chart of a fusion method of infrared polarization images based on YUV and multi-feature separation according to an embodiment of the present invention;
fig. 2 is a flowchart of a detail feature image fusion method of images P 'and R' driven by fuzzy logic and feature difference in the fusion method of infrared polarization images based on YUV and multi-feature separation according to an embodiment of the present invention.
Detailed Description
In order that those skilled in the art will better understand the present invention, the following detailed description of the invention is provided in conjunction with the accompanying drawings and the detailed description of the invention.
Example 1
A specific operation flow of the fusion method of the infrared polarization images based on YUV and multi-feature separation is shown in FIG. 1, and the fusion method comprises the following steps S1-S4.
S1, representing the polarization state of light by using Stokes vectors, and calculating a polarization degree image P and a polarization angle image R according to the Stokes vectors;
the infrared radiation intensity imaging is mainly related to the temperature, radiance and the like of a scenery, when a noise source with the same temperature is placed around a target object, an existing thermal infrared imager cannot identify the target, and the infrared imaging technology faces serious limitations and challenges. Compared with the traditional infrared imaging, the light polarization imaging can reduce the degradation influence of the complex scene, and the light polarization imaging can reduce the degradation influence of the complex scene, so that the identification of the disguised target is facilitated.
Polarization is a basic characteristic of light that cannot be observed directly by the human eye, so polarization information must be displayed in some form to be perceived by the eye or processed conveniently by a computer. The polarization state of light is represented by the Stokes vector, which describes the polarization state and intensity of light with four Stokes parameters; all four are time averages of light intensity, have the dimension of intensity, and can be detected directly by a detector. The Stokes vector S is represented as:

S = [I, Q, U, V]^T = [I1 + I3, I1 − I3, I2 − I4, V]^T (1)

where I1, I2, I3 and I4 respectively denote the acquired light intensity images with polarization directions of 0°, 45°, 90° and 135°; I is the total light intensity; Q is the intensity difference between the horizontal and vertical polarization directions; U is the intensity difference between the 45° and 135° polarization directions; and V is the intensity difference between the left- and right-hand circularly polarized components of the light.

In practical polarization measurement, no phase retarder is needed; the Stokes parameters can be obtained simply by rotating a linear polarizer. The polarization degree image P and the polarization angle image R of the polarized light can be expressed as:

P = √(Q² + U²) / I (2)

R = (1/2) · arctan(U / Q) (3)
S2, solving the unique part P' of the polarization degree image and the unique part R' of the polarization angle image according to their common part Co;
Redundant and mutually complementary information exists between the polarization degree and polarization angle images. The common part Co of images P and R is obtained by applying the operation of formula (4) to each pixel of P and R:

Co = min(P, R) (4)
Unique portions of the polarization degree and polarization angle images can be obtained using equations (5) and (6), respectively.
P'=P-Co (5)
R'=R-Co (6)
S3, fusing the images P' and R' by a multi-feature separation method based on the dark primary color theory;

S3.1, perform multi-feature separation on the images P' and R' based on the dark primary color theory to obtain the dark feature image, bright feature image and detail feature image of each of P' and R'.
The dark primary color (dark channel) was used by He et al. to estimate the transmittance in the atmospheric scattering model and realize fast defogging of natural images. The dark channel map is solved as shown in formula (7):

L_dark(x) = min_{y∈N(x)} ( min_{C∈{R,G,B}} L_C(y) ) (7)

where C indexes the three color channels R, G, B of the image; N(x) is the window region centered on pixel x; L_C(y) is a color channel map of the image; and L_dark(x) is the dark channel image, which reflects the haze level of the image: for a haze-free image the dark channel value L_dark(x) tends to 0, while for a hazy image it is large. In a natural image, the regions obviously affected by haze are generally the brightest pixels in the dark primary color, while the pixel values of haze-free regions in the dark primary color are very low. For a gray-scale image, therefore, the dark primary color image contains the bright regions of the original image and reflects its low-frequency part; that is, regions of the original image with relatively smooth gray-level variation are retained, making the difference between bright and dark features more prominent, while local regions with sharp gray-level variation and high contrast, especially edge detail information, are lost.
The dark primary color image P'_dark of the unique portion P' of the polarization degree image and the dark primary color image R'_dark of the unique portion R' of the polarization angle image are both obtained by formula (7).
The method comprises the following steps of respectively obtaining bright features, dark features and detail features of two images by adopting multi-feature separation based on a dark primary color theory:
s3.1.1 obtaining the image after inversion of the image P' by using the formulas (8) and (9), respectivelyAnd the image after the inversion of the image RThen the dark primary color image is processedAndrespectively associated with the imagesAndfusing according to the rule that the absolute value is small to obtain the dark feature image D of the image PP'Dark feature image D of sum image RR'As shown in equations (10) and (11);
s3.1..2 willAndimages D respectively corresponding theretoP'And DRTaking the difference to obtain a bright feature image L of the image PP'And bright feature image L of image RR'As shown in equations (12) and (13);
S3.1.3 Take the difference between images P' and R' and their respective dark primary color images, obtaining the detail feature image P_P' of image P' and the detail feature image P_R' of image R', as shown in equations (14) and (15);
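The three separation steps above can be sketched as follows. The inversion in equations (8)/(9) is assumed to be 255 − image (8-bit range), and the "smaller absolute value" rule of equations (10)/(11) is implemented as a pixel-wise selection of the value of smaller magnitude; both are assumptions where the patent's formula images are not reproduced:

```python
import numpy as np

def dark_channel(img, win=7):
    # sliding-window minimum (gray-scale dark channel, equation (7))
    pad = win // 2
    padded = np.pad(img, pad, mode="edge")
    h, w = img.shape
    out = np.empty_like(img)
    for i in range(h):
        for j in range(w):
            out[i, j] = padded[i:i + win, j:j + win].min()
    return out

def separate_features(img, win=7):
    inv = 255.0 - img                            # eqs. (8)/(9): inversion (assumed)
    dark_inv = dark_channel(inv, win)            # dark primary color of the inverted image
    # eqs. (10)/(11): pixel-wise minimum-absolute-value fusion (assumed form)
    dark_feat = np.where(np.abs(dark_inv) <= np.abs(inv), dark_inv, inv)
    bright_feat = inv - dark_feat                # eqs. (12)/(13): difference with D
    detail_feat = img - dark_channel(img, win)   # eqs. (14)/(15): image minus its dark channel
    return dark_feat, bright_feat, detail_feat
```

By construction the dark and bright features sum back to the inverted image, and the detail feature is non-negative.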
It should also be noted that the following steps S3.2, S3.3 and S3.4 are not time-ordered; the step numbers are used only for convenience of description. Steps S3.2, S3.3 and S3.4 may be performed simultaneously; any two may be performed simultaneously followed by the third; one may be performed first followed by the other two simultaneously; or all three may be performed in sequence.
S3.2, fusing the bright feature images of images P' and R' by a matching method based on local region energy features;
The bright feature information concentrates the bright areas of the original image and reflects its low-frequency components. Fusing the bright feature images L_P' and L_R' comprises the following steps:
S3.2.1 Obtain the Gaussian-weighted local energy of each of the two bright feature images using the Gaussian-weighted local energy function;
the gaussian weighted local energy function is shown in equation (16):
where the left-hand side represents the Gaussian-weighted local energy centered at point (m, n); w(i, j) is the Gaussian filter matrix; N is the size of the region; t = (N − 1)/2; k denotes P' or R'. The Gaussian-weighted local energies of the bright feature images L_P' and L_R' can then be obtained from equation (16).
S3.2.2 Solve for the matching degree of the Gaussian-weighted local energies of images L_P' and L_R';
the matching degree of the Gaussian-weighted local energies of images L_P' and L_R' is:
S3.2.3 Fuse images L_P' and L_R' using the Gaussian-weighted local energy and its matching degree, obtaining the bright feature image fusion result F_L;
the fusion rule for images L_P' and L_R' is:
where T_l is the threshold for judging bright-feature similarity, taken between 0 and 0.5; if M_E(m, n) < T_l, the regions of the two images L_P' and L_R' centered at point (m, n) are dissimilar, and the image with the larger Gaussian-weighted local energy is selected as the fusion result; otherwise the fusion result of L_P' and L_R' is a coefficient-weighted average.
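Steps S3.2.1–S3.2.3 can be sketched as below. Since the patent's equations (16)–(19) are not reproduced, the local energy is assumed to be the Gaussian-weighted sum of squared pixel values, the matching degree the standard normalized cross term, and the similar-region weight the common form 0.5 + 0.5·(1 − M)/(1 − T_l); all three are assumptions:

```python
import numpy as np

def gaussian_kernel(n=3, sigma=1.0):
    t = (n - 1) / 2
    y, x = np.mgrid[-t:t + 1, -t:t + 1]
    w = np.exp(-(x**2 + y**2) / (2 * sigma**2))
    return w / w.sum()

def local_energy(img, w):
    # Gaussian-weighted sum of squared pixels in each window (assumed eq. (16))
    n = w.shape[0]; pad = n // 2
    p = np.pad(img, pad, mode="edge")
    h, wd = img.shape
    out = np.zeros((h, wd))
    for i in range(h):
        for j in range(wd):
            out[i, j] = np.sum(w * p[i:i + n, j:j + n] ** 2)
    return out

def fuse_bright(L1, L2, w, Tl=0.4):
    E1, E2 = local_energy(L1, w), local_energy(L2, w)
    n = w.shape[0]; pad = n // 2
    p1 = np.pad(L1, pad, mode="edge"); p2 = np.pad(L2, pad, mode="edge")
    h, wd = L1.shape
    M = np.zeros((h, wd))
    for i in range(h):
        for j in range(wd):
            cross = np.sum(w * p1[i:i + n, j:j + n] * p2[i:i + n, j:j + n])
            M[i, j] = 2 * cross / (E1[i, j] + E2[i, j] + 1e-12)  # matching degree
    Wmax = 0.5 + 0.5 * (1 - M) / (1 - Tl)      # weight of the larger-energy image (assumed)
    big = E1 >= E2
    return np.where(M < Tl,
                    np.where(big, L1, L2),     # dissimilar: take the larger energy
                    np.where(big, Wmax * L1 + (1 - Wmax) * L2,
                                  Wmax * L2 + (1 - Wmax) * L1))
```

For two identical inputs the matching degree approaches 1 and the weighted average returns the input unchanged, which is a quick sanity check on the rule.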
S3.3, fusing the dark feature images of images P' and R' by a matching method based on local region weighted variance features;
Dark feature images lack the bright areas of the source image, but can still be regarded as an approximation of it: they contain the main energy of the image and its basic outline. Fusing the dark feature images D_P' and D_R' comprises the following steps:
S3.3.1 Obtain the local region weighted variance energy of each of the two dark feature images using the local region weighted variance energy function;
the weighted variance energy function for a local region is shown in equation (20):
where the left-hand side represents the local region weighted variance energy centered at point (m, n); w(i, j) is the Gaussian filter matrix; N is the size of the region; t = (N − 1)/2; the local mean term represents the local region average centered at point (m, n); k denotes P' or R'.
The local region weighted variance energies of the dark feature images D_P' and D_R' can then be obtained from equation (20).
S3.3.2 Solve for the matching degree of the local region weighted variance energies of the two images D_P' and D_R';
the matching degree of the local region weighted variance energies of images D_P' and D_R' is:
S3.3.3 Fuse the two images D_P' and D_R' using the local weighted variance energy and its matching degree, obtaining the dark feature image fusion result F_D;
the fusion rule for the two images D_P' and D_R' is:
where T_h is the threshold for judging dark-feature similarity, taken between 0.5 and 1; if M_E(m, n) < T_h, the regions of the two images D_P' and D_R' centered at point (m, n) are dissimilar, and the image with the larger local region weighted variance energy is selected as the fusion result; otherwise the fusion result of D_P' and D_R' is a coefficient-weighted average.
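The local region weighted variance energy of equation (20) can be sketched as below; taking the local mean as the plain window average is an assumption, since the patent's exact mean definition is not reproduced:

```python
import numpy as np

def local_weighted_variance(img, w):
    """Gaussian-weighted variance in an N x N window centered at each
    pixel (sketch of equation (20))."""
    n = w.shape[0]; pad = n // 2
    p = np.pad(img, pad, mode="edge")
    h, wd = img.shape
    out = np.zeros((h, wd))
    for i in range(h):
        for j in range(wd):
            win = p[i:i + n, j:j + n]
            out[i, j] = np.sum(w * (win - win.mean()) ** 2)  # weighted spread about the mean
    return out
```

A flat region yields zero variance energy, while a region crossing an intensity step yields a positive value, which is exactly why this measure favors contour-carrying regions of the dark feature images.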
S3.4 Fuse the detail feature images of images P' and R' driven by fuzzy logic and feature differences.
The local gradient and the local variance reflect the detail information of an image well and characterize its sharpness. To retain as much of the detail information of the detail feature images as possible and to improve sharpness, a feature-difference-driven method based on the local gradient and local variance is used to fuse the detail features of the polarization degree and polarization angle images, as shown in fig. 2; the specific steps are as follows:
S3.4.1 Find the local gradients of the detail feature image P_P' of image P' and the detail feature image P_R' of image R';
the formula for solving the local gradient is:
where the left-hand side represents the local gradient at pixel (m, n); k denotes P' or R'; the two convolution terms represent the horizontal and vertical edge images obtained by convolving the horizontal and vertical templates of the Sobel operator with the detail feature image.
The local gradients of the detail feature images P_P' and P_R' are obtained using equation (24).
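A minimal sketch of the Sobel-based local gradient of equation (24), assuming the usual magnitude form sqrt(Gx² + Gy²) and edge-replication padding:

```python
import numpy as np

SOBEL_X = np.array([[-1, 0, 1], [-2, 0, 2], [-1, 0, 1]], dtype=float)
SOBEL_Y = SOBEL_X.T

def conv2_same(img, k):
    # 3x3 "same" convolution with edge replication
    p = np.pad(img, 1, mode="edge")
    h, w = img.shape
    kf = k[::-1, ::-1]          # flip the kernel for true convolution
    out = np.zeros((h, w))
    for i in range(h):
        for j in range(w):
            out[i, j] = np.sum(kf * p[i:i + 3, j:j + 3])
    return out

def local_gradient(img):
    gx = conv2_same(img, SOBEL_X)   # horizontal edge image
    gy = conv2_same(img, SOBEL_Y)   # vertical edge image
    return np.sqrt(gx**2 + gy**2)   # gradient magnitude (assumed form of eq. (24))
```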
S3.4.2 Find the local weighted variances of the detail feature images P_P' and P_R';
the local weighted variance is obtained in the same way as equation (20):
where the left-hand side represents the local region weighted variance energy centered at point (m, n); w(i, j) is the Gaussian filter matrix; N is the size of the region; t = (N − 1)/2; the local mean term represents the local region average centered at point (m, n); k denotes P' or R'.
The local weighted variances of the detail feature images P_P' and P_R' are obtained using equation (25).
S3.4.3 Obtain the local difference gradient and the local difference variance of the two detail feature images, together with the matching degrees of the local gradient feature and the local weighted variance feature;
S3.4.4 Obtain a pixel-based decision map from the feature differences and a feature difference degree decision map from the feature matching degrees;
(1) From the local difference gradient ΔT(m, n) and the local difference variance ΔV(m, n), a pixel-based decision map PDG(m, n) is obtained: when ΔT(m, n) > 0 and ΔV(m, n) > 0, let PDG(m, n) = p1; when ΔT(m, n) < 0 and ΔV(m, n) < 0, let PDG(m, n) = p2; when ΔT(m, n) > 0 and ΔV(m, n) < 0, let PDG(m, n) = p3; when ΔT(m, n) < 0 and ΔV(m, n) > 0, let PDG(m, n) = p4; when ΔT(m, n) = 0 and ΔV(m, n) > 0, let PDG(m, n) = p5; when ΔT(m, n) = 0 and ΔV(m, n) < 0, let PDG(m, n) = p6; when ΔT(m, n) > 0 and ΔV(m, n) = 0, let PDG(m, n) = p7; when ΔT(m, n) < 0 and ΔV(m, n) = 0, let PDG(m, n) = p8; when ΔT(m, n) = 0 and ΔV(m, n) = 0, let PDG(m, n) = p9. Here p1 to p9 denote decision maps in which a pixel position satisfying the corresponding condition is 1 and all other pixel positions are 0;
(2) From the local gradient matching degree M_T(m, n) and the local weighted variance matching degree M_V(m, n), the feature difference degree decision map DDG is obtained, as shown in equation (30):
where d1 and d2 denote decision maps in which a pixel position satisfying the corresponding condition of equation (30) is 1 and all other pixel positions are 0.
S3.4.5 Determine the determined regions and the uncertain region from the pixel-based decision map and the feature difference degree decision map;
From the PDG, the cases p1, p2, p5, p6, p7 and p8 satisfying condition (1) of step S3.4.4 can be judged as determined regions. For p1 and p2, both difference features indicate whether the gray value of the corresponding pixel should be retained in the fused image; for p5, p6, p7 and p8, one of the difference features suffices to indicate this.
The cases p3 and p4 can be judged as determined regions using the PDG together with the DDG: the DDG identifies which of the two local features differs more strongly between the images, and the feature with the larger difference degree then indicates whether the gray value of the corresponding pixel is retained in the fused image.
Region p9, however, cannot be judged from either decision map, and belongs to the uncertain region.
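The nine-way sign classification of step S3.4.4 can be sketched as a small label map; encoding each sign pair as 3·sign(ΔT) + sign(ΔV) gives a unique code per case:

```python
import numpy as np

def pixel_decision(dT, dV):
    """Label each pixel 1..9 according to the signs of the local
    difference gradient dT and local difference variance dV,
    matching the p1..p9 cases of the pixel-based decision map."""
    s = 3 * np.sign(dT) + np.sign(dV)   # unique code per (sign dT, sign dV) pair
    codes = {4: 1, -4: 2, 2: 3, -2: 4, 1: 5, -1: 6, 3: 7, -3: 8, 0: 9}
    out = np.zeros(dT.shape, dtype=int)
    for code, label in codes.items():
        out[s == code] = label
    return out
```

Labels 1, 2 and 5–8 mark determined regions decidable from the PDG alone, 3 and 4 need the DDG as a tie-breaker, and 9 is the uncertain region handled by fuzzy logic.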
S3.4.6 fusing the determined regions of the two detail feature images using feature difference drive;
taking the product of the local difference gradient Δ T (m, n) and the local difference variance Δ V (m, n) as a fusion driving factor for determining the region, which is denoted as DIF (m, n), as shown in the following formula:
DIF(m,n)=ΔT(m,n)·ΔV(m,n) (31)
DIF(m, n) then drives the fusion of the determined region; the fused determined-region image is obtained as:
where "·" denotes the product of the values at corresponding pixel positions in the matrices.
S3.4.7 fusing uncertain areas of the two detail characteristic images by using a fuzzy logic theory;
To fuse the uncertain region using fuzzy logic theory, membership functions are constructed for the detail feature images P_P' and P_R' according to whether the local gradient of the detail feature image is large and whether its local weighted variance is large. Let the membership functions of "the local gradient of detail feature image P_P' (resp. P_R') is large" be μ_T(P_P'(m, n)) and μ_T(P_R'(m, n)), and the membership functions of "the local weighted variance of detail feature image P_P' (resp. P_R') is large" be μ_V(P_P'(m, n)) and μ_V(P_R'(m, n)), as shown in formulas (33) and (34):
where k denotes P' or R'.
Using the fuzzy-logic intersection operation, the membership functions describing the importance of the pixel values of the two detail feature images P_P' and P_R' at position (m, n) to the fused image of the uncertain region can be computed, denoted μ_{T∩V}(P_P'(m, n)) and μ_{T∩V}(P_R'(m, n)), as shown in formula (35):
μT∩V(Pk(m,n))=min[μT(Pk(m,n)),μV(Pk(m,n))] (35)
in the formula, k represents P 'or R'.
Then, the fusion result of the uncertain region of the two image detail feature images is:
where "·" represents the product of the values at the corresponding pixel locations in the matrix and "·/" represents the division of the values at the corresponding pixel locations in the matrix.
S3.4.8 Fuse the determined-region and uncertain-region results to obtain the fusion result F_DIF(m, n) of the two detail feature images, and perform a consistency check on F_DIF(m, n);
The consistency check moves a window of size 3 × 3 over the image F_DIF(m, n) and verifies the central pixel against the surrounding window pixels. If the central pixel comes from one of the images P_R' or P_P' while s (4 < s < 8) of its surrounding pixels come from the other image, the central pixel value is changed to the pixel value of the other image at that position; the window traverses the entire image F_DIF(m, n) to obtain the corrected F_DIF(m, n).
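The 3 × 3 consistency check can be sketched as below. The boolean source map `src` (True where the fused pixel was taken from image A) is assumed bookkeeping produced during fusion, and the neighbour threshold `s = 5` is one admissible choice for the patent's 4 < s < 8 condition:

```python
import numpy as np

def consistency_check(F, src, A, B, s=5):
    """Replace a fused pixel by the other image's value when at least
    s of its 8 neighbours were taken from that other image."""
    out = F.copy()
    h, w = F.shape
    for i in range(1, h - 1):
        for j in range(1, w - 1):
            nb = src[i - 1:i + 2, j - 1:j + 2]
            from_a = nb.sum() - (1 if src[i, j] else 0)  # neighbours from A
            if src[i, j] and (8 - from_a) >= s:
                out[i, j] = B[i, j]      # isolated A-pixel in a B region
            elif (not src[i, j]) and from_a >= s:
                out[i, j] = A[i, j]      # isolated B-pixel in an A region
    return out
```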
S3.5 Fuse the results of steps S3.2, S3.3 and S3.4 to obtain the fusion result of images P' and R'.
Fusing images F containing bright featuresL(m, n), image F of dark featuresD(m, n) and image F of detail featureDIF(m, n), the fusion formula is as follows:
F=αFL(m,n)+βFD(m,n)+γFDIF(m,n) (38)
where α, β and γ are fusion weight coefficients with value range [0, 1]. To reduce oversaturation of the fused image and improve contrast, α = 1, β = 0.3 and γ = 1 are taken.
S4, fusing the result of step S3 with the total intensity image I in YUV space to obtain the final fusion result;
The total intensity image I is input into the Y channel of the YUV space; the fusion result of step S3 is inverted to obtain image F', i.e. F' = 255 − F, and F' is input into the U channel; finally, the fusion result F of step S3 is input into the V channel to obtain the final fused image.
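The channel assembly of step S4 can be sketched as below, assuming I and F are 8-bit gray-scale arrays of equal shape (a display device or library would still need to convert the resulting YUV planes to RGB):

```python
import numpy as np

def assemble_yuv(I, F):
    """Step S4: total intensity I -> Y, inverted fusion F' = 255 - F -> U,
    fusion result F -> V, stacked as an (H, W, 3) YUV image."""
    Fp = 255.0 - F                       # F' = 255 - F
    return np.stack([I, Fp, F], axis=-1)  # Y, U, V planes
```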
Having described embodiments of the present invention, the foregoing description is intended to be exemplary, not exhaustive, and not limited to the embodiments disclosed. Many modifications and variations will be apparent to those of ordinary skill in the art without departing from the scope and spirit of the described embodiments. Therefore, the protection scope of the present invention shall be subject to the protection scope of the claims.
Claims (2)
1. A fusion method of infrared polarization images based on YUV and multi-feature separation is characterized by comprising the following steps:
S1, representing the polarization state of light by a Stokes vector, then calculating the polarization degree image P and the polarization angle image R from the Stokes vector;
S2, solving the unique portion P' of the polarization degree image and the unique portion R' of the polarization angle image from the common portion Co of the polarization degree and polarization angle images;
S3, fusing images P' and R' by a multi-feature separation method based on the dark primary color theory;
S4, fusing the result of step S3 with the total intensity image I in YUV space to obtain the final fusion result;
the representation of the Stokes vector S in step S1 is shown in formula (1):
where I_1, I_2, I_3 and I_4 respectively represent the acquired light intensity images with polarization directions of 0°, 45°, 90° and 135°; I represents the total light intensity; Q represents the intensity difference between horizontal and vertical polarization; U represents the intensity difference between the 45° and 135° polarization directions; V represents the intensity difference between the left-hand and right-hand circular polarization components of the light;
the polarization degree image P and the polarization angle image R of the polarized light are expressed as:
In step S2,
the common portion Co of images P and R is obtained for each pixel of P and R using equation (4):
Co=min(P,R) (4)
then, the unique part P 'of the polarization degree image and the unique part R' of the polarization angle image are obtained using equations (5) and (6), respectively:
P'=P-Co (5)
R'=R-Co (6);
the step S3 includes the steps of:
S3.1, performing multi-feature separation on images P' and R' based on the dark primary color theory to obtain the dark feature image, bright feature image and detail feature image of each;
S3.2, fusing the bright feature images of images P' and R' by a matching method based on local region energy features; fusing the dark feature images of images P' and R' by a matching method based on local region weighted variance features; and fusing the detail feature images of images P' and R' driven by fuzzy logic and feature differences;
S3.3, fusing the bright feature, dark feature and detail feature results of step S3.2 to obtain the fusion result of images P' and R';
In step S3.1,
the dark primary color is used to estimate the transmittance in the atmospheric scattering model for fast defogging of natural images; the dark channel map is solved as shown in formula (7), of the standard form L_dark(x) = min_{C∈{R,G,B}} min_{y∈N(x)} L_C(y):
where C indexes the three color channels R, G, B of the image; N(x) is the window region centered at pixel x; L_C(y) is a color channel of the image; L_dark(x) is the dark channel image and reflects the fog level of the image: the dark channel L_dark(x) of a fog-free image tends to 0, while for a foggy image the L_dark(x) values are large;
the dark primary color image of the unique portion P' of the polarization degree image is obtained by formula (7);
the bright, dark and detail features of the two images are then obtained by multi-feature separation based on the dark primary color theory, in the following steps:
S3.1.1, obtain the inverted image of image P' and the inverted image of image R' using equations (8) and (9), respectively; then fuse the dark primary color images with the corresponding inverted images according to the rule of smaller absolute value, obtaining the dark feature image D_P' of image P' and the dark feature image D_R' of image R', as shown in equations (10) and (11);
S3.1.2, take the difference between the inverted images and their corresponding dark feature images D_P' and D_R', obtaining the bright feature image L_P' of image P' and the bright feature image L_R' of image R', as shown in equations (12) and (13);
S3.1.3, take the difference between images P' and R' and their respective dark primary color images, obtaining the detail feature image P_P' of image P' and the detail feature image P_R' of image R', as shown in equations (14) and (15);
In step S3.2, the bright feature images of images P' and R' are fused by a matching method based on local region energy features, comprising the following steps:
S3.2.1, obtain the Gaussian-weighted local energy of each of the two bright feature images using the Gaussian-weighted local energy function shown in formula (16):
where the left-hand side represents the Gaussian-weighted local energy centered at point (m, n); w(i, j) is the Gaussian filter matrix; N is the size of the region; t = (N − 1)/2; k denotes P' or R';
the Gaussian-weighted local energies of the bright feature images L_P' and L_R' are obtained by formula (16);
S3.2.2, solve for the matching degree of the Gaussian-weighted local energies of images L_P' and L_R';
the matching degree of the Gaussian-weighted local energies of images L_P' and L_R' is:
S3.2.3, fuse images L_P' and L_R' using the Gaussian-weighted local energy and its matching degree, obtaining the bright feature image fusion result F_L;
the fusion rule for images L_P' and L_R' is:
where T_l is the threshold for judging bright-feature similarity, taken between 0 and 0.5; if M_E(m, n) < T_l, the regions of the two images L_P' and L_R' centered at point (m, n) are dissimilar, and the image with the larger Gaussian-weighted local energy is selected as the fusion result; otherwise the fusion result of L_P' and L_R' is a coefficient-weighted average;
In step S3.2, the dark feature images of images P' and R' are fused by a matching method based on local region weighted variance features, comprising the following steps:
S3.2.4, obtain the local region weighted variance energy of each of the two dark feature images using the local region weighted variance energy function shown in formula (20):
where the left-hand side represents the local region weighted variance energy centered at point (m, n); w(i, j) is the Gaussian filter matrix; N is the size of the region; t = (N − 1)/2; the local mean term represents the local region average centered at point (m, n); k denotes P' or R';
the local region weighted variance energies of the dark feature images D_P' and D_R' are obtained by formula (20);
S3.2.5, solve for the matching degree of the local region weighted variance energies of the two images D_P' and D_R';
the matching degree of the local region weighted variance energies of images D_P' and D_R' is obtained by formula (21):
S3.2.6, fuse the two images D_P' and D_R' using the local weighted variance energy and its matching degree, obtaining the dark feature image fusion result F_D;
the fusion formula for the two images D_P' and D_R' is:
where T_h is the threshold for judging dark-feature similarity, taken between 0.5 and 1; if M_E(m, n) < T_h, the regions of the two images D_P' and D_R' centered at point (m, n) are dissimilar, and the image with the larger local region weighted variance energy is selected as the fusion result; otherwise the fusion result of D_P' and D_R' is a coefficient-weighted average;
In step S3.2, the detail feature images of images P' and R' are fused driven by fuzzy logic and feature differences, comprising the following steps:
S3.2.7, find the local gradients of the detail feature image P_P' of image P' and the detail feature image P_R' of image R';
the formula for solving the local gradient is:
where the left-hand side represents the local gradient at pixel (m, n); k denotes P' or R'; the two convolution terms represent the horizontal and vertical edge images obtained by convolving the horizontal and vertical templates of the Sobel operator with the detail feature image;
the local gradients of the detail feature images P_P' and P_R' are obtained by formula (24);
S3.2.8, find the local weighted variances of the detail feature images P_P' and P_R';
the local weighted variance is shown in formula (25):
where the left-hand side represents the local region weighted variance energy centered at point (m, n); w(i, j) is the Gaussian filter matrix; N is the size of the region; t = (N − 1)/2; the local mean term represents the local region average centered at point (m, n); k denotes P' or R';
the local weighted variances of the detail feature images P_P' and P_R' are obtained by formula (25);
S3.2.9, obtain the local gradient matching degree, the local weighted variance matching degree, the local difference gradient and the local difference variance of the two detail feature images;
S3.2.10, obtain a pixel-based decision map PDG(m, n) from the local difference gradient ΔT(m, n) and the local difference variance ΔV(m, n), and a feature difference degree decision map DDG from the local gradient matching degree M_T(m, n) and the local weighted variance matching degree M_V(m, n), comprising the following steps:
(1) From the local difference gradient ΔT(m, n) and the local difference variance ΔV(m, n), a pixel-based decision map PDG(m, n) is obtained: when ΔT(m, n) > 0 and ΔV(m, n) > 0, let PDG(m, n) = p1; when ΔT(m, n) < 0 and ΔV(m, n) < 0, let PDG(m, n) = p2; when ΔT(m, n) > 0 and ΔV(m, n) < 0, let PDG(m, n) = p3; when ΔT(m, n) < 0 and ΔV(m, n) > 0, let PDG(m, n) = p4; when ΔT(m, n) = 0 and ΔV(m, n) > 0, let PDG(m, n) = p5; when ΔT(m, n) = 0 and ΔV(m, n) < 0, let PDG(m, n) = p6; when ΔT(m, n) > 0 and ΔV(m, n) = 0, let PDG(m, n) = p7; when ΔT(m, n) < 0 and ΔV(m, n) = 0, let PDG(m, n) = p8; when ΔT(m, n) = 0 and ΔV(m, n) = 0, let PDG(m, n) = p9; here p1 to p9 denote decision maps in which a pixel position satisfying the corresponding condition is 1 and all other pixel positions are 0;
(2) From the local gradient matching degree M_T(m, n) and the local weighted variance matching degree M_V(m, n), the feature difference degree decision map DDG is obtained, as shown in equation (30):
where d1 and d2 denote decision maps in which a pixel position satisfying the corresponding condition of equation (30) is 1 and all other pixel positions are 0;
S3.2.11, determine the determined regions and the uncertain region from the pixel-based decision map PDG(m, n) and the feature difference degree decision map DDG;
s3.2.12, fusing the determined areas of the two detail feature images by using feature difference drive;
taking the product of the local difference gradient Δ T (m, n) and the local difference variance Δ V (m, n) as the fusion driving factor for determining the region, which is denoted as DIF (m, n), as shown in equation (31):
DIF(m,n)=ΔT(m,n)·ΔV(m,n) (31)
DIF(m, n) then drives the fusion of the determined region; the fused determined-region image is obtained as:
where "·" denotes the product of the values at corresponding pixel positions in the matrices;
S3.2.13, fuse the uncertain region of the two detail feature images using fuzzy logic theory;
let the membership functions of "the local gradient of detail feature image P_P' (resp. P_R') is large" be μ_T(P_P'(m, n)) and μ_T(P_R'(m, n)), and the membership functions of "the local weighted variance of detail feature image P_P' (resp. P_R') is large" be μ_V(P_P'(m, n)) and μ_V(P_R'(m, n)), as shown in formulas (33) and (34):
where k denotes P' or R';
using the fuzzy-logic intersection operation, the membership functions describing the importance of the pixel values of the two detail feature images P_P' and P_R' at position (m, n) to the fused image of the uncertain region can be computed, denoted μ_{T∩V}(P_P'(m, n)) and μ_{T∩V}(P_R'(m, n)), as shown in formula (35):
μT∩V(Pk(m,n))=min[μT(Pk(m,n)),μV(Pk(m,n))] (35)
wherein k represents P 'or R';
then, the fusion result of the uncertain region of the two image detail feature images is:
wherein "·" represents the product of the values at the corresponding pixel locations in the matrix, "·/" represents the division of the values at the corresponding pixel locations in the matrix;
S3.2.14, fuse the determined-region and uncertain-region results to obtain the fusion result F_DIF(m, n) of the two detail feature images, and perform a consistency check on F_DIF(m, n);
the consistency check on F_DIF(m, n) moves a window of size 3 × 3 over the image F_DIF(m, n) and verifies the central pixel against the surrounding window pixels;
the method of step S4 for fusing the result of step S3 with the total intensity image I in YUV space comprises:
inputting the total intensity image I into the Y channel of the YUV space; inverting the fusion result of step S3 to obtain image F', i.e. F' = 255 − F; inputting F' into the U channel; and finally inputting the fusion result F of step S3 into the V channel to obtain the final fused image.
2. The method for fusing infrared polarization images based on YUV and multi-feature separation as claimed in claim 1, wherein in step S3.3 the bright feature fusion image F_L(m, n), the dark feature fusion image F_D(m, n) and the detail feature fusion image F_DIF(m, n) obtained in step S3.2 are fused, the fusion formula being shown in formula (38):
F=αFL(m,n)+βFD(m,n)+γFDIF(m,n) (38)
where α, β and γ are fusion weight coefficients, with α = 1, β = 0.3 and γ = 1, so as to reduce oversaturation of the fused image and improve contrast.
Priority Applications (1)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
CN201811180887.1A CN109410161B (en) | 2018-10-09 | 2018-10-09 | Fusion method of infrared polarization images based on YUV and multi-feature separation |
Publications (2)
Publication Number | Publication Date |
---|---|
CN109410161A CN109410161A (en) | 2019-03-01 |
CN109410161B true CN109410161B (en) | 2020-11-13 |
Family
ID=65467525
Legal Events
Date | Code | Title | Description |
---|---|---|---|
PB01 | Publication | ||
SE01 | Entry into force of request for substantive examination | ||
GR01 | Patent grant | ||