CN109410161B - Fusion method of infrared polarization images based on YUV and multi-feature separation - Google Patents


Info

Publication number
CN109410161B
CN109410161B (application CN201811180887.1A)
Authority
CN
China
Prior art keywords
image
images
local
feature
dark
Prior art date
Legal status
Active
Application number
CN201811180887.1A
Other languages
Chinese (zh)
Other versions
CN109410161A (en)
Inventor
宋斌
冉骏
陈蓉
Current Assignee
Hunan Yuanxin Optoelectronics Technology Co ltd
Original Assignee
Hunan Yuanxin Optoelectronics Technology Co ltd
Priority date
Filing date
Publication date
Application filed by Hunan Yuanxin Optoelectronics Technology Co ltd
Priority to CN201811180887.1A
Publication of CN109410161A
Application granted
Publication of CN109410161B
Status: Active

Classifications

    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06T IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T5/00 Image enhancement or restoration
    • G06T5/50 Image enhancement or restoration using two or more images, e.g. averaging or subtraction
    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06T IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T2207/00 Indexing scheme for image analysis or image enhancement
    • G06T2207/10 Image acquisition modality
    • G06T2207/10024 Color image
    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06T IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T2207/00 Indexing scheme for image analysis or image enhancement
    • G06T2207/10 Image acquisition modality
    • G06T2207/10048 Infrared image
    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06T IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T2207/00 Indexing scheme for image analysis or image enhancement
    • G06T2207/20 Special algorithmic details
    • G06T2207/20212 Image combination
    • G06T2207/20221 Image fusion; Image merging

Landscapes

  • Physics & Mathematics (AREA)
  • General Physics & Mathematics (AREA)
  • Engineering & Computer Science (AREA)
  • Theoretical Computer Science (AREA)
  • Image Processing (AREA)
  • Image Analysis (AREA)

Abstract

The invention discloses an infrared polarization image fusion method based on YUV and multi-feature separation, which comprises the following steps: first, the polarization state of light is expressed with a Stokes vector, and a polarization degree image and a polarization angle image are calculated; then the unique parts P' and R' of the polarization degree image and the polarization angle image are solved; the images P' and R' are fused by a multi-feature separation method based on the dark primary color theory, and the fusion result is fused with the total intensity image I in YUV space to obtain the final infrared polarization image fusion result. Because fusion is performed in the YUV color space, which matches human visual characteristics, the method effectively improves the visualization of the polarization image fusion result, enhances detail information such as image edges, improves image contrast, and fuses the complementary information among the polarization quantities, making the fused image scene richer and aiding the identification of camouflaged targets.

Description

Fusion method of infrared polarization images based on YUV and multi-feature separation
Technical Field
The invention relates to the technical field of image processing, in particular to an infrared polarization image fusion method based on YUV and multi-feature separation.
Background
A traditional infrared imaging system mainly images the infrared radiation intensity of a scene, which is chiefly determined by the temperature, radiance and the like of the scene. When a noise source at the same temperature is placed around a target object, an existing thermal infrared imager cannot identify the camouflaged target, so infrared imaging technology faces serious limitations and challenges. Compared with traditional infrared imaging, polarization imaging can reduce the degrading influence of a complex scene and can additionally obtain structure and distance information of the scene. Precisely because polarization is a characteristic of an object distinct from its radiation, targets with the same radiation may have different degrees of polarization, and useful signals can therefore be detected in a complex background by polarization means.
The polarization state of light can be described by Stokes vectors, and the individual polarization quantities are both redundant and complementary in expressing polarization information. In existing polarization image fusion, often only a single difference feature is considered, which cannot effectively describe all of the uncertain and randomly varying features in an image; as a result, valuable information is lost during fusion, and fusion and identification fail. Meanwhile, the visual effect of the fused image affects the resolving power of the human eye and the recognition of camouflaged targets.
Disclosure of Invention
The invention aims to overcome the defects of the prior art by providing an infrared polarization image fusion method based on YUV and multi-feature separation. The method fuses the polarization quantities and the complementary information among them, enriching the fused image scene; it enhances detail information such as image edges and improves image contrast; and by performing the fusion in the YUV color space, which matches human visual characteristics, it effectively improves the visualization of the polarization image fusion result.
The technical scheme of the invention is that an infrared polarization image fusion method based on YUV and multi-feature separation comprises the following steps:
s1, representing the polarization state of the light by using a Stokes vector, and calculating a polarization degree image P and a polarization angle image R according to the Stokes vector;
s2, solving the unique part P 'of the polarization degree image and the unique part R' of the polarization angle image according to the common part Co of the polarization degree and the polarization angle image;
s3, fusing images P 'and R' by a multi-feature separation method based on a dark primary color theory;
and S4, fusing the result of step S3 with the total intensity image I in YUV space to obtain a final fusion result.
Further, the representation of the Stokes vector S in step S1 is shown in formula (1):
S = [I, Q, U, V]^T, where I = I_1 + I_3, Q = I_1 - I_3, U = I_2 - I_4   (1)
In the formula, I_1, I_2, I_3 and I_4 respectively represent the acquired light intensity images with polarization directions of 0°, 45°, 90° and 135°; I represents the total intensity of light; Q represents the intensity difference between horizontal and vertical polarization; U represents the intensity difference between the 45° and 135° polarization directions; V represents the intensity difference between the left- and right-hand circularly polarized components of the light;
the polarization degree image P and the polarization angle image R of the polarized light are expressed as:
P = sqrt(Q² + U²) / I   (2)
R = (1/2)·arctan(U / Q)   (3)
further, in the above step S2, each pixel in the images P and R is operated on by the formula (4), and the common portion Co of the images P and R is obtained:
Co = min(P, R) (4)
the unique part P 'of the polarization degree image and the unique part R' of the polarization angle image are obtained using equations (5) and (6), respectively:
P'=P-Co (5)
R'=R-Co (6)
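A minimal NumPy sketch of steps S1 and S2 under the formulas above (the function name, the use of arctan2 instead of arctan, and the eps guard against division by zero are illustrative assumptions, not part of the patent):

```python
import numpy as np

def stokes_and_unique_parts(i0, i45, i90, i135, eps=1e-6):
    """Steps S1-S2 (sketch): Stokes components, polarization degree/angle images,
    and the unique parts P', R'."""
    I = i0 + i90                              # total intensity, formula (1)
    Q = i0 - i90
    U = i45 - i135
    P = np.sqrt(Q**2 + U**2) / (I + eps)      # polarization degree image, formula (2)
    R = 0.5 * np.arctan2(U, Q)                # polarization angle image, formula (3)
    Co = np.minimum(P, R)                     # common part, formula (4)
    return P, R, P - Co, R - Co               # P, R and unique parts P', R', formulas (5)-(6)
```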
still further, the step S3 includes the following steps:
s3.1, performing multi-feature separation on the images P 'and R' based on a dark primary color theory to obtain dark feature images, bright feature images and detail feature images of the images P 'and R' respectively;
s3.2, fusing bright feature images of the images P 'and R' by adopting a matching method based on local region energy features;
s3.3, fusing dark feature images of the images P 'and R' by adopting a matching method based on local area weighted variance features;
s3.4, driving and fusing the detail characteristic images of the images P 'and R' by using fuzzy logic and characteristic difference;
s3.5, fusing the results of the steps S3.2, S3.3 and S3.4 to obtain a fused result of the images P 'and R'.
It should be noted that steps S3.2, S3.3 and S3.4 are not ordered in time and are numbered separately only for clarity of the following description: steps S3.2, S3.3 and S3.4 may be performed simultaneously; any two of them may be performed simultaneously and then the remaining one; any one may be performed first and the remaining two simultaneously; or they may be performed one after another.
Still further, in step S3.1 above,
the dark primary color is He and the like, and is used for estimating the transmittance in the atmospheric scattering model to realize the rapid defogging of the natural image, and the solving method of the dark channel map is shown as the formula (7):
L_dark(x) = min_{y∈N(x)} ( min_{C∈{R,G,B}} L^C(y) )   (7)
where C indexes the three color channels R, G, B of the image; N(x) is the window region centered on pixel point x; L^C(y) is a color channel map of the image; L_dark(x) is the dark channel image and reflects the haze level of the image: for a haze-free image the dark channel value L_dark(x) tends to 0, while for a hazy image L_dark(x) is large;
obtaining a dark primary color image of the unique portion P' of the polarization degree image by equation (7)
Figure BDA0001822600850000042
Figure BDA0001822600850000043
Obtaining a dark primary color image of the unique portion R' of the polarization angle image by equation (7)
Figure BDA0001822600850000044
Figure BDA0001822600850000045
The bright features, dark features and detail features of the two images are then obtained by multi-feature separation based on the dark primary color theory, as follows:
S3.1.1, obtain the inverted image P'_inv of the image P' and the inverted image R'_inv of the image R' using formulas (8) and (9); then fuse the dark primary color images P'_dark and R'_dark with the inverted images P'_inv and R'_inv respectively, according to the rule of taking the value with the smaller absolute value, to obtain the dark feature image D_P' of the image P' and the dark feature image D_R' of the image R', as shown in formulas (10) and (11):
P'_inv = 255 - P'   (8)
R'_inv = 255 - R'   (9)
D_P'(m,n) = min( P'_dark(m,n), P'_inv(m,n) )   (10)
D_R'(m,n) = min( R'_dark(m,n), R'_inv(m,n) )   (11)
S3.1.2, take the difference between the dark primary color images P'_dark and R'_dark and their corresponding dark feature images D_P' and D_R' to obtain the bright feature image L_P' of the image P' and the bright feature image L_R' of the image R', as shown in formulas (12) and (13):
L_P' = P'_dark - D_P'   (12)
L_R' = R'_dark - D_R'   (13)
S3.1.3, take the difference between the images P' and R' and their dark primary color images P'_dark and R'_dark respectively to obtain the detail feature image P_P' of the image P' and the detail feature image P_R' of the image R', as shown in formulas (14) and (15):
P_P' = P' - P'_dark   (14)
P_R' = R' - R'_dark   (15)
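A compact sketch of the multi-feature separation of step S3.1 for a single grayscale image, under the reconstruction above (the window size and helper names are assumptions; scipy's grey_erosion plays the role of the local-minimum operator of formula (7)):

```python
import numpy as np
from scipy.ndimage import grey_erosion

def separate_features(img, win=15):
    """Step S3.1 (sketch): split an 8-bit grayscale image into dark, bright and detail features."""
    dark_channel = grey_erosion(img.astype(np.float64), size=(win, win))  # formula (7), one channel
    inv = 255.0 - img                                  # inverted image, formulas (8)-(9)
    dark_feat = np.minimum(dark_channel, inv)          # smaller-absolute-value rule, formulas (10)-(11)
    bright_feat = dark_channel - dark_feat             # bright features, formulas (12)-(13)
    detail_feat = img - dark_channel                   # detail features, formulas (14)-(15)
    return dark_feat, bright_feat, detail_feat
```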
still further, in step S3.2, the fusion is carried outClosed-up feature image LP'And bright feature image LR'The method comprises the following steps:
S3.2.1, obtain the Gaussian-weighted local energy of each of the two bright feature images using the Gaussian-weighted local energy function shown in formula (16):
E_K(m,n) = Σ_{i=-t}^{t} Σ_{j=-t}^{t} w(i,j)·[L_K(m+i, n+j)]²   (16)
where E_K(m,n) represents the Gaussian-weighted local energy centered at point (m,n); w(i,j) is a Gaussian filter matrix; N is the size of the region; t = (N-1)/2; K represents P' or R';
the Gaussian-weighted local energy E_P'(m,n) of the bright feature image L_P' and the Gaussian-weighted local energy E_R'(m,n) of the bright feature image L_R' are obtained by formula (16);
S3.2.2, solve the matching degree of the Gaussian-weighted local energy of the image L_P' and the image L_R';
the matching degree of the Gaussian-weighted local energy of the images L_P' and L_R' is:
M_E(m,n) = 2·Σ_{i=-t}^{t} Σ_{j=-t}^{t} w(i,j)·L_P'(m+i,n+j)·L_R'(m+i,n+j) / ( E_P'(m,n) + E_R'(m,n) )   (17)
S3.2.3, fuse the images L_P' and L_R' using the Gaussian-weighted local energies and their matching degree to obtain the bright feature image fusion result F_L;
The fusion rule for the images L_P' and L_R' is:
F_L(m,n) = L_P'(m,n),                             if M_E(m,n) < T_l and E_P'(m,n) ≥ E_R'(m,n)
F_L(m,n) = L_R'(m,n),                             if M_E(m,n) < T_l and E_P'(m,n) < E_R'(m,n)
F_L(m,n) = W_max·L_max(m,n) + W_min·L_min(m,n),   if M_E(m,n) ≥ T_l   (18)
where L_max and L_min denote, of L_P' and L_R', the one with the larger and the smaller Gaussian-weighted local energy at (m,n), and the weighting coefficients are
W_min = 1/2 - (1/2)·(1 - M_E(m,n))/(1 - T_l),  W_max = 1 - W_min   (19)
In the formula, T_l is the threshold for judging the similarity of the bright features, taken between 0 and 0.5; if M_E(m,n) < T_l, the regions of the two images L_P' and L_R' centered on the point (m,n) are not similar, and of the two images L_P' and L_R' the one with the larger Gaussian-weighted local energy is selected as the fusion result; otherwise, the fusion result of the two images L_P' and L_R' is the coefficient-weighted average.
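A sketch of the local-energy match-and-fuse rule of step S3.2 under the reconstructed formulas (16)-(19); the Gaussian window (expressed through sigma) and the weighted-average coefficients follow the common Burt-style rule and are assumptions:

```python
import numpy as np
from scipy.ndimage import gaussian_filter

def fuse_by_energy(a, b, sigma=1.5, t_l=0.4):
    """Step S3.2 (sketch): fuse two bright feature images by Gaussian-weighted local energy."""
    ea = gaussian_filter(a * a, sigma)                               # formula (16) for L_P'
    eb = gaussian_filter(b * b, sigma)                               # formula (16) for L_R'
    match = 2.0 * gaussian_filter(a * b, sigma) / (ea + eb + 1e-12)  # formula (17)
    w_min = 0.5 - 0.5 * (1.0 - match) / (1.0 - t_l)                  # formula (19)
    big = np.where(ea >= eb, a, b)                                   # higher-energy source
    small = np.where(ea >= eb, b, a)
    weighted = (1.0 - w_min) * big + w_min * small                   # coefficient-weighted average
    return np.where(match < t_l, big, weighted)                      # formula (18)
```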
Still further, in step S3.3 above, fusing the dark feature image D_P' and the dark feature image D_R' comprises the following steps:
S3.3.1, obtain the local region weighted variance energy of each of the two dark feature images using the local region weighted variance energy function; the local region weighted variance energy function is shown in formula (20):
V_K(m,n) = Σ_{i=-t}^{t} Σ_{j=-t}^{t} w(i,j)·[ D_K(m+i, n+j) - u_K(m,n) ]²   (20)
where V_K(m,n) represents the local region weighted variance energy centered at point (m,n); w(i,j) is a Gaussian filter matrix; N is the size of the region; t = (N-1)/2; u_K(m,n) represents the local region mean centered at point (m,n); K represents P' or R';
the local region weighted variance energy V_P'(m,n) of the dark feature image D_P' and the local region weighted variance energy V_R'(m,n) of the dark feature image D_R' are obtained by formula (20);
S3.3.2, solve the local region weighted variance energy matching degree of the two images D_P' and D_R';
the matching degree of the local region weighted variance energy of the images D_P' and D_R' is obtained by formula (21):
M_E(m,n) = 2·Σ_{i=-t}^{t} Σ_{j=-t}^{t} w(i,j)·[D_P'(m+i,n+j) - u_P'(m,n)]·[D_R'(m+i,n+j) - u_R'(m,n)] / ( V_P'(m,n) + V_R'(m,n) )   (21)
S3.3.3, fuse the two images D_P' and D_R' using the local weighted variance energies and their matching degree to obtain the dark feature image fusion result F_D;
the fusion formula for the two images D_P' and D_R' is:
F_D(m,n) = D_P'(m,n),                             if M_E(m,n) < T_h and V_P'(m,n) ≥ V_R'(m,n)
F_D(m,n) = D_R'(m,n),                             if M_E(m,n) < T_h and V_P'(m,n) < V_R'(m,n)
F_D(m,n) = W_max·D_max(m,n) + W_min·D_min(m,n),   if M_E(m,n) ≥ T_h   (22)
where D_max and D_min denote, of D_P' and D_R', the one with the larger and the smaller local region weighted variance energy at (m,n), and
W_min = 1/2 - (1/2)·(1 - M_E(m,n))/(1 - T_h),  W_max = 1 - W_min   (23)
In the formula, T_h is the threshold for judging the similarity of the dark features, taken between 0.5 and 1; if M_E(m,n) < T_h, the regions of the two images D_P' and D_R' centered on the point (m,n) are not similar, and of the two images D_P' and D_R' the one with the larger local region weighted variance energy is selected as the fusion result; otherwise, the fusion result of the two images D_P' and D_R' is the coefficient-weighted average.
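A corresponding sketch for the variance-based rule of step S3.3, reusing the same match-and-fuse pattern (the pointwise local mean is an approximation of the fixed-centre mean in formula (20), and the window parameters are assumptions):

```python
import numpy as np
from scipy.ndimage import gaussian_filter

def local_weighted_variance(x, sigma=1.5):
    """Formula (20) (sketch): Gaussian-weighted local variance energy."""
    mean = gaussian_filter(x, sigma)                  # local region mean
    return gaussian_filter((x - mean) ** 2, sigma)

def fuse_by_variance(a, b, sigma=1.5, t_h=0.75):
    """Step S3.3 (sketch): fuse two dark feature images by local weighted variance energy."""
    va, vb = local_weighted_variance(a, sigma), local_weighted_variance(b, sigma)
    cov = gaussian_filter((a - gaussian_filter(a, sigma)) * (b - gaussian_filter(b, sigma)), sigma)
    match = 2.0 * cov / (va + vb + 1e-12)             # formula (21)
    w_min = 0.5 - 0.5 * (1.0 - match) / (1.0 - t_h)   # formula (23)
    big = np.where(va >= vb, a, b)
    small = np.where(va >= vb, b, a)
    return np.where(match < t_h, big, (1.0 - w_min) * big + w_min * small)  # formula (22)
```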
Still further, the step S3.4 of fusing detail features of the polarization degree and polarization angle images by using a feature difference driving method based on local gradients and local variances includes the following steps:
S3.4.1, find the local gradient of the detail feature image P_P' of the image P' and of the detail feature image P_R' of the image R';
the formula for solving the local gradient is:
T_K(m,n) = sqrt( Gx_K(m,n)² + Gy_K(m,n)² )   (24)
where T_K(m,n) represents the local gradient at the pixel point (m,n); K represents P' or R'; Gx_K and Gy_K respectively represent the horizontal and vertical edge images obtained by convolving the horizontal and vertical templates of the Sobel operator with the detail feature image;
the local gradient T_P'(m,n) of the detail feature image P_P' and the local gradient T_R'(m,n) of the detail feature image P_R' are obtained using formula (24);
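A sketch of the local gradient of formula (24); scipy's sobel is used for the two directional convolutions, which is an assumption about the exact templates:

```python
import numpy as np
from scipy.ndimage import sobel

def local_gradient(detail):
    """Formula (24) (sketch): gradient magnitude of a detail feature image via Sobel templates."""
    gx = sobel(detail, axis=1)   # horizontal edge image
    gy = sobel(detail, axis=0)   # vertical edge image
    return np.hypot(gx, gy)
```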
S3.4.2, find the local weighted variance of the detail feature images P_P' and P_R';
the local weighted variance is computed in the same way as formula (20), i.e. as shown in formula (25):
V_K(m,n) = Σ_{i=-t}^{t} Σ_{j=-t}^{t} w(i,j)·[ P_K(m+i, n+j) - u_K(m,n) ]²   (25)
where V_K(m,n) represents the local region weighted variance energy centered at point (m,n); w(i,j) is a Gaussian filter matrix; N is the size of the region; t = (N-1)/2; u_K(m,n) represents the local region mean centered at point (m,n); K represents P' or R';
the local weighted variance V_P'(m,n) of the detail feature image P_P' and the local weighted variance V_R'(m,n) of the detail feature image P_R' are obtained using formula (25);
S3.4.3, obtaining the local gradient matching degree, the local weighted variance matching degree, the local difference gradient and the local difference variance of the two detail characteristic images:
local gradient matching degree:
M_T(m,n) = 2·T_P'(m,n)·T_R'(m,n) / ( T_P'(m,n)² + T_R'(m,n)² )   (26)
local weighted variance matching degree:
M_V(m,n) = 2·V_P'(m,n)·V_R'(m,n) / ( V_P'(m,n)² + V_R'(m,n)² )   (27)
local difference gradient:
ΔT(m,n) = T_P'(m,n) - T_R'(m,n)   (28)
local difference variance:
ΔV(m,n) = V_P'(m,n) - V_R'(m,n)   (29)
S3.4.4, obtain a pixel-based decision map PDG(m,n) from the local difference gradient ΔT(m,n) and the local difference variance ΔV(m,n), and obtain a feature difference degree decision map DDG from the local gradient matching degree M_T(m,n) and the local weighted variance matching degree M_V(m,n), comprising the following steps:
(1) Obtain the pixel-based decision map PDG(m,n) from the local difference gradient ΔT(m,n) and the local difference variance ΔV(m,n): when ΔT(m,n) > 0 and ΔV(m,n) > 0, let PDG(m,n) = p1; when ΔT(m,n) < 0 and ΔV(m,n) < 0, let PDG(m,n) = p2; when ΔT(m,n) > 0 and ΔV(m,n) < 0, let PDG(m,n) = p3; when ΔT(m,n) < 0 and ΔV(m,n) > 0, let PDG(m,n) = p4; when ΔT(m,n) = 0 and ΔV(m,n) > 0, let PDG(m,n) = p5; when ΔT(m,n) = 0 and ΔV(m,n) < 0, let PDG(m,n) = p6; when ΔT(m,n) > 0 and ΔV(m,n) = 0, let PDG(m,n) = p7; when ΔT(m,n) < 0 and ΔV(m,n) = 0, let PDG(m,n) = p8; when ΔT(m,n) = 0 and ΔV(m,n) = 0, let PDG(m,n) = p9; where p1 to p9 denote decision maps in which the pixel positions satisfying the corresponding condition are 1 and all other pixel positions are 0;
(2) From the local gradient matching degree M_T(m,n) and the local weighted variance matching degree M_V(m,n), the feature difference degree decision map DDG can be obtained, as shown in formula (30):
DDG(m,n) = d1 if M_T(m,n) ≤ M_V(m,n);  DDG(m,n) = d2 if M_T(m,n) > M_V(m,n)   (30)
In the formula, d1 and d2 are decision maps in which the pixel positions satisfying the corresponding condition of formula (30) are 1 and the other pixel positions are 0.
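A sketch of the two decision maps of step S3.4.4 as boolean masks; the reading of formula (30) used here (d1 when the gradient feature is the less matched, i.e. more different, one) is an assumption:

```python
import numpy as np

def decision_maps(dT, dV, mT, mV):
    """Step S3.4.4 (sketch): pixel-based decision maps p1..p9 and the DDG maps d1, d2."""
    p = {
        1: (dT > 0) & (dV > 0),  2: (dT < 0) & (dV < 0),
        3: (dT > 0) & (dV < 0),  4: (dT < 0) & (dV > 0),
        5: (dT == 0) & (dV > 0), 6: (dT == 0) & (dV < 0),
        7: (dT > 0) & (dV == 0), 8: (dT < 0) & (dV == 0),
        9: (dT == 0) & (dV == 0),
    }
    d1 = mT <= mV      # gradient feature differs more, decide by the gradient
    d2 = ~d1           # variance feature differs more, decide by the variance
    return p, d1, d2
```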
S3.4.5, judging the determined region and the uncertain region according to the decision graph PDG (m, n) based on the pixel and the feature difference degree decision graph DDG;
it can be judged by the PDG that p satisfying the (1) corresponding condition of step S3.4.41、p2、p5、p6、p7And p8To determine the area: since for p1And p2For the two situations, the two difference characteristics can reflect whether the gray value of the corresponding pixel point is reserved in the fused image; for p5、p6、p7And p8For the four conditions, whether the gray value of the corresponding pixel point is reserved in the fusion image can be reflected by using one difference characteristic;
p can be judged by PDG and DDG3And p4To determine the area: because the DDG can determine the difference degree of the local features of the two images and then select the difference feature with larger difference degree, the difference feature can reflect whether the gray value of the corresponding pixel point is reserved in the fused image;
however for region p9The PDG and DDG judgment method cannot be carried out according to the two decision graphs, and the area belongs to an uncertainty area;
s3.4.6, fusing the determined areas of the two detail feature images by using feature difference drive;
taking the product of the local difference gradient Δ T (m, n) and the local difference variance Δ V (m, n) as a fusion driving factor for determining the region, which is denoted as DIF (m, n), as shown in the following formula:
DIF(m,n)=ΔT(m,n)·ΔV(m,n) (31)
DIF(m,n) is then used to drive the fusion of the determined region; the image obtained after fusing the determined region is denoted F_det(m,n) and is given by formula (32), where "·" represents the product of values at corresponding pixel locations in the matrices;
s3.4.7, fusing uncertain areas of the two detail characteristic images by using a fuzzy logic theory;
suppose "detail feature image PP'And PR'The local gradient of (a) is large is a membership function of [ mu ] respectivelyT(Pp'(m, n)) and μT(PR'(m, n)), "detail feature image PP'And PR'The local weighted variance of (1) is large, and the membership functions of (1) are respectively muV(PP'(m, n)) and μV(PR'(m, n)), as shown in formulas (33) and (34):
Figure BDA0001822600850000103
Figure BDA0001822600850000104
in the formula, k represents P 'or R'.
Two detail characteristic images P can be respectively calculated by using fuzzy logic traffic operation ruleP'And PR'The membership functions of the pixel values at the position (m, n) to the importance degree of the fused image of the uncertain region are respectively marked as muT∩V(PP'(m, n)) and μT∩V(PR'(m, n)), as shown in formula (35):
μT∩V(Pk(m,n))=min[μT(Pk(m,n)),μV(Pk(m,n))] (35)
wherein k represents P 'or R';
then, the fusion result of the uncertain region of the two image detail feature images is:
Figure BDA0001822600850000111
wherein "·" represents the product of the values at the corresponding pixel locations in the matrix, "·/" represents the division of the values at the corresponding pixel locations in the matrix;
S3.4.8, fuse F_det(m,n) and F_unc(m,n) to obtain the fusion result of the two detail feature images, F_DIF(m,n), and perform a consistency check on F_DIF(m,n):
F_DIF(m,n) = F_det(m,n) + F_unc(m,n)   (37)
The consistency check on F_DIF(m,n) is performed with a window of size 3 × 3 that is moved over the image F_DIF(m,n), verifying the central pixel against the pixels surrounding it in the window.
Still further, fusing the results of steps S3.2, S3.3 and S3.4 in step S3.5 above means fusing the image F_L(m,n) containing the bright features, the image F_D(m,n) of the dark features and the image F_DIF(m,n) of the detail features; the fusion formula is shown as formula (38):
F = αF_L(m,n) + βF_D(m,n) + γF_DIF(m,n)   (38)
In the formula, α, β and γ are fusion weight coefficients with value range [0, 1]; preferably, α = 1, β = 0.3 and γ = 1, so as to reduce oversaturation of the fused image and improve the contrast.
Still further, fusing the result of step S3 with the total intensity image I in YUV space in step S4 comprises: inputting the total intensity image I into the Y channel of the YUV space; inverting the fusion result F obtained in step S3 to obtain an image F', i.e. F' = 255 - F, and inputting F' into the U channel; and finally inputting the fusion result F obtained in step S3 into the V channel to obtain the final fused image.
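A sketch of steps S3.5 and S4 with the preferred weights; 8-bit images and OpenCV's YUV-to-BGR conversion (used only to display the result) are assumptions, since the patent itself only specifies the channel assignment:

```python
import numpy as np
import cv2

def yuv_fusion(total_intensity, f_bright, f_dark, f_detail,
               alpha=1.0, beta=0.3, gamma=1.0):
    """Steps S3.5 and S4 (sketch): weighted feature combination, then YUV channel assembly."""
    F = alpha * f_bright + beta * f_dark + gamma * f_detail       # formula (38)
    F = np.clip(F, 0, 255).astype(np.uint8)
    Y = np.clip(total_intensity, 0, 255).astype(np.uint8)         # Y channel <- I
    U = 255 - F                                                   # U channel <- F' = 255 - F
    V = F                                                         # V channel <- F
    return cv2.cvtColor(cv2.merge([Y, U, V]), cv2.COLOR_YUV2BGR)  # display conversion only
```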
Compared with the prior art, the invention has the following beneficial effects:
1. the method disclosed by the invention fuses all the polarization quantities of the polarization images, effectively solves the problem of information redundancy among all the polarization quantities in the polarization image fusion process, fuses the complementary information among all the polarization quantities, enriches the fused image scene and is beneficial to identifying the disguised target.
2. In the image fusion process, the method combines the local region energy feature, which represents the correlation between adjacent pixels and the brightness difference of a local region; the local region variance feature, which reflects the difference in gray-level variation within a local region; and the local region gradient, which reflects the detail variation of image pixels. The local region energy feature helps improve image contrast, the local region variance feature helps improve image sharpness, and the local region gradient helps enhance image detail information; the method therefore improves image sharpness and contrast and enhances detail information such as image edges.
3. The method disclosed by the invention is used for fusing based on the human eye visual characteristic YUV color space, so that the visualization effect of the polarization image fusion result is effectively improved.
Drawings
These and/or other aspects and advantages of the present invention will become more apparent and more readily appreciated from the following detailed description of the embodiments of the invention, taken in conjunction with the accompanying drawings of which:
FIG. 1 is a flow chart of a fusion method of infrared polarization images based on YUV and multi-feature separation according to an embodiment of the present invention;
fig. 2 is a flowchart of a detail feature image fusion method of images P 'and R' driven by fuzzy logic and feature difference in the fusion method of infrared polarization images based on YUV and multi-feature separation according to an embodiment of the present invention.
Detailed Description
In order that those skilled in the art will better understand the present invention, the following detailed description of the invention is provided in conjunction with the accompanying drawings and the detailed description of the invention.
Example 1
A specific operation flow of the fusion method of the infrared polarization images based on YUV and multi-feature separation is shown in FIG. 1, and the fusion method comprises the following steps S1-S4.
S1, representing the polarization state of light by using Stokes vectors, and calculating a polarization degree image P and a polarization angle image R according to the Stokes vectors;
the infrared radiation intensity imaging is mainly related to the temperature, radiance and the like of a scenery, when a noise source with the same temperature is placed around a target object, an existing thermal infrared imager cannot identify the target, and the infrared imaging technology faces serious limitations and challenges. Compared with the traditional infrared imaging, the light polarization imaging can reduce the degradation influence of the complex scene, and the light polarization imaging can reduce the degradation influence of the complex scene, so that the identification of the disguised target is facilitated.
Polarization is a basic characteristic of light and cannot be directly observed by human eyes, so that polarization information needs to be displayed in a certain form and is perceived by human eyes or convenient for computer processing. The polarization state of the light is represented by a Stokes vector, and the Stokes vector describes the polarization state and the intensity of the light by four Stokes parameters, which are all time average values of light intensity, have intensity dimensions and can be directly detected by a detector. The representation of the Stokes vector S is as follows:
S = [I, Q, U, V]^T, where I = I_1 + I_3, Q = I_1 - I_3, U = I_2 - I_4   (1)
In the formula, I_1, I_2, I_3 and I_4 respectively represent the acquired light intensity images with polarization directions of 0°, 45°, 90° and 135°; I represents the total intensity of light; Q represents the intensity difference between horizontal and vertical polarization; U represents the intensity difference between the 45° and 135° polarization directions; V represents the intensity difference between the left- and right-hand circularly polarized components of the light.
In actual polarization, a phase retarder is not needed, and the Stokes parameters can be obtained only by rotating a linear polarizer. The polarization degree image P and the polarization angle image R of the polarized light can be expressed as:
P = sqrt(Q² + U²) / I   (2)
R = (1/2)·arctan(U / Q)   (3)
s2, solving a unique part P 'of the polarization degree and a unique part R' of the polarization angle image according to the common part Co of the polarization degree and the polarization angle image;
redundant and mutually complementary information exists between the polarization degree and polarization angle images, and the common part Co of the images P and R can be obtained by performing the operation of the formula (4) on each pixel of the images P and R.
Co = min(P, R) (4)
Unique portions of the polarization degree and polarization angle images can be obtained using equations (5) and (6), respectively.
P'=P-Co (5)
R'=R-Co (6)
S3 fusing the images P 'and R' by a multi-feature separation method based on the dark primary color theory;
s3.1, performing multi-feature separation on the images P 'and R' based on a dark primary color theory to obtain dark feature images, bright feature images and detail feature images of the images P 'and R' respectively,
the dark primary color is He and the like used for estimating the transmittance in the atmospheric scattering model, and quick defogging of the natural image is realized. The solution method of the dark channel map is shown in equation (7).
L_dark(x) = min_{y∈N(x)} ( min_{C∈{R,G,B}} L^C(y) )   (7)
where C indexes the three color channels R, G, B of the image; N(x) is the window region centered on pixel point x; L^C(y) is a color channel map of the image; L_dark(x) is the dark channel image and reflects the haze level of the image: for a haze-free image the dark channel value L_dark(x) tends to 0, while for a hazy image L_dark(x) is large. In a natural image, the regions obviously affected by haze are generally the brightest pixels of the dark primary color image, while the pixel values of haze-free regions in the dark primary color image are very low. Therefore, for a gray-scale image, the dark primary color image contains the bright areas of the original image and reflects its low-frequency part, i.e. the regions of the original image where the gray level changes relatively smoothly are retained, which makes the difference between the bright and dark features more prominent, while the local region information with relatively sharp gray-level changes and high contrast, especially edge detail information, is lost.
The dark primary color image P'_dark of the unique part P' of the polarization degree image and the dark primary color image R'_dark of the unique part R' of the polarization angle image are both obtained by formula (7).
The bright features, dark features and detail features of the two images are then obtained by multi-feature separation based on the dark primary color theory, as follows:
S3.1.1, obtain the inverted image P'_inv of the image P' and the inverted image R'_inv of the image R' using formulas (8) and (9); then fuse the dark primary color images P'_dark and R'_dark with the inverted images P'_inv and R'_inv respectively, according to the rule of taking the value with the smaller absolute value, to obtain the dark feature image D_P' of the image P' and the dark feature image D_R' of the image R', as shown in formulas (10) and (11):
P'_inv = 255 - P'   (8)
R'_inv = 255 - R'   (9)
D_P'(m,n) = min( P'_dark(m,n), P'_inv(m,n) )   (10)
D_R'(m,n) = min( R'_dark(m,n), R'_inv(m,n) )   (11)
S3.1.2, take the difference between the dark primary color images P'_dark and R'_dark and their corresponding dark feature images D_P' and D_R' to obtain the bright feature image L_P' of the image P' and the bright feature image L_R' of the image R', as shown in formulas (12) and (13):
L_P' = P'_dark - D_P'   (12)
L_R' = R'_dark - D_R'   (13)
S3.1.3, take the difference between the images P' and R' and their dark primary color images P'_dark and R'_dark respectively to obtain the detail feature image P_P' of the image P' and the detail feature image P_R' of the image R', as shown in formulas (14) and (15):
P_P' = P' - P'_dark   (14)
P_R' = R' - R'_dark   (15)
It should also be noted that the following steps S3.2, S3.3 and S3.4 are not ordered in time and are numbered only for convenience of the following description: steps S3.2, S3.3 and S3.4 may be performed simultaneously; any two of them may be performed simultaneously and then the remaining one; any one may be performed first and the remaining two simultaneously; or they may be performed one after another.
S3.2, fusing bright feature images of the images P 'and R' by adopting a matching method based on local region energy features;
the bright feature information concentrates the bright area in the original image, reflects the low-frequency component in the original image, and fuses the bright feature image LP'And bright feature image LR'Comprises the following steps:
S3.2.1, obtain the Gaussian-weighted local energy of each of the two bright feature images using the Gaussian-weighted local energy function;
the Gaussian-weighted local energy function is shown in formula (16):
E_K(m,n) = Σ_{i=-t}^{t} Σ_{j=-t}^{t} w(i,j)·[L_K(m+i, n+j)]²   (16)
where E_K(m,n) represents the Gaussian-weighted local energy centered at point (m,n); w(i,j) is a Gaussian filter matrix; N is the size of the region; t = (N-1)/2; K represents P' or R'. The Gaussian-weighted local energy E_P'(m,n) of the bright feature image L_P' and the Gaussian-weighted local energy E_R'(m,n) of the bright feature image L_R' can then be obtained from formula (16).
S3.2.2, solve the matching degree of the Gaussian-weighted local energy of the image L_P' and the image L_R';
the matching degree of the Gaussian-weighted local energy of the images L_P' and L_R' is:
M_E(m,n) = 2·Σ_{i=-t}^{t} Σ_{j=-t}^{t} w(i,j)·L_P'(m+i,n+j)·L_R'(m+i,n+j) / ( E_P'(m,n) + E_R'(m,n) )   (17)
S3.2.3, fuse the images L_P' and L_R' using the Gaussian-weighted local energies and their matching degree to obtain the bright feature image fusion result F_L;
the fusion rule for the images L_P' and L_R' is:
F_L(m,n) = L_P'(m,n),                             if M_E(m,n) < T_l and E_P'(m,n) ≥ E_R'(m,n)
F_L(m,n) = L_R'(m,n),                             if M_E(m,n) < T_l and E_P'(m,n) < E_R'(m,n)
F_L(m,n) = W_max·L_max(m,n) + W_min·L_min(m,n),   if M_E(m,n) ≥ T_l   (18)
where L_max and L_min denote, of L_P' and L_R', the one with the larger and the smaller Gaussian-weighted local energy at (m,n), and
W_min = 1/2 - (1/2)·(1 - M_E(m,n))/(1 - T_l),  W_max = 1 - W_min   (19)
In the formula, T_l is the threshold for judging the similarity of the bright features, taken between 0 and 0.5; if M_E(m,n) < T_l, the regions of the two images L_P' and L_R' centered on the point (m,n) are not similar, and the one with the larger Gaussian-weighted local energy is selected as the fusion result; otherwise, the fusion result of the two images L_P' and L_R' is the coefficient-weighted average.
S3.3, fusing dark feature images of the images P 'and R' by adopting a matching method based on local region weighted variance features;
The dark feature image lacks the bright areas of the source image, but can still be regarded as an approximation of the source image: it contains the main energy of the image and its basic outline. Fusing the dark feature image D_P' and the dark feature image D_R' comprises the following steps:
S3.3.1, obtain the local region weighted variance energy of each of the two dark feature images using the local region weighted variance energy function;
the local region weighted variance energy function is shown in formula (20):
V_K(m,n) = Σ_{i=-t}^{t} Σ_{j=-t}^{t} w(i,j)·[ D_K(m+i, n+j) - u_K(m,n) ]²   (20)
where V_K(m,n) represents the local region weighted variance energy centered at point (m,n); w(i,j) is a Gaussian filter matrix; N is the size of the region; t = (N-1)/2; u_K(m,n) represents the local region mean centered at point (m,n); K represents P' or R'.
The local region weighted variance energy V_P'(m,n) of the dark feature image D_P' and the local region weighted variance energy V_R'(m,n) of the dark feature image D_R' can then be obtained from formula (20).
S3.3.2, solve the local region weighted variance energy matching degree of the two images D_P' and D_R';
the local region weighted variance energy matching degree of the images D_P' and D_R' is:
M_E(m,n) = 2·Σ_{i=-t}^{t} Σ_{j=-t}^{t} w(i,j)·[D_P'(m+i,n+j) - u_P'(m,n)]·[D_R'(m+i,n+j) - u_R'(m,n)] / ( V_P'(m,n) + V_R'(m,n) )   (21)
S3.3.3, fuse the two images D_P' and D_R' using the local weighted variance energies and their matching degree to obtain the dark feature image fusion result F_D;
the fusion rule for the two images D_P' and D_R' is:
F_D(m,n) = D_P'(m,n),                             if M_E(m,n) < T_h and V_P'(m,n) ≥ V_R'(m,n)
F_D(m,n) = D_R'(m,n),                             if M_E(m,n) < T_h and V_P'(m,n) < V_R'(m,n)
F_D(m,n) = W_max·D_max(m,n) + W_min·D_min(m,n),   if M_E(m,n) ≥ T_h   (22)
where D_max and D_min denote, of D_P' and D_R', the one with the larger and the smaller local region weighted variance energy at (m,n), and
W_min = 1/2 - (1/2)·(1 - M_E(m,n))/(1 - T_h),  W_max = 1 - W_min   (23)
In the formula, T_h is the threshold for judging the similarity of the dark features, taken between 0.5 and 1; if M_E(m,n) < T_h, the regions of the two images D_P' and D_R' centered on the point (m,n) are not similar, and the one with the larger local region weighted variance energy is selected as the fusion result; otherwise, the fusion result of the two images D_P' and D_R' is the coefficient-weighted average.
S3.4 driving the detail feature images of the fused images P 'and R' using fuzzy logic and feature differences.
The local gradient and the local variance reflect the detail information of an image well and express its sharpness. In order to retain as much of the detail information of the detail feature images as possible and to improve sharpness, a feature difference driving method based on the local gradient and the local variance is adopted to fuse the detail features of the polarization degree and polarization angle images, as shown in fig. 2; the specific steps are as follows:
S3.4.1, find the local gradient of the detail feature image P_P' of the image P' and of the detail feature image P_R' of the image R';
the formula for solving the local gradient is:
T_K(m,n) = sqrt( Gx_K(m,n)² + Gy_K(m,n)² )   (24)
where T_K(m,n) represents the local gradient at the pixel point (m,n); K represents P' or R'; Gx_K and Gy_K respectively represent the horizontal and vertical edge images obtained by convolving the horizontal and vertical templates of the Sobel operator with the detail feature image.
The local gradient T_P'(m,n) of the detail feature image P_P' and the local gradient T_R'(m,n) of the detail feature image P_R' are obtained using formula (24).
S3.4.2, find the local weighted variance of the detail feature images P_P' and P_R';
the local weighted variance is obtained in the same way as formula (20):
V_K(m,n) = Σ_{i=-t}^{t} Σ_{j=-t}^{t} w(i,j)·[ P_K(m+i, n+j) - u_K(m,n) ]²   (25)
where V_K(m,n) represents the local region weighted variance energy centered at point (m,n); w(i,j) is a Gaussian filter matrix; N is the size of the region; t = (N-1)/2; u_K(m,n) represents the local region mean centered at point (m,n); K represents P' or R'.
The local weighted variance V_P'(m,n) of the detail feature image P_P' and the local weighted variance V_R'(m,n) of the detail feature image P_R' are obtained using formula (25).
S3.4.3, obtaining local difference gradient and local difference variance of the two detail feature images, and matching degree of the local gradient feature and the local weighted variance feature;
local gradient matching degree:
M_T(m,n) = 2·T_P'(m,n)·T_R'(m,n) / ( T_P'(m,n)² + T_R'(m,n)² )   (26)
local weighted variance matching degree:
M_V(m,n) = 2·V_P'(m,n)·V_R'(m,n) / ( V_P'(m,n)² + V_R'(m,n)² )   (27)
local difference gradient:
ΔT(m,n) = T_P'(m,n) - T_R'(m,n)   (28)
local difference variance:
ΔV(m,n) = V_P'(m,n) - V_R'(m,n)   (29)
S3.4.4, obtain a pixel-based decision map from the feature differences, and a feature difference degree decision map from the feature matching degrees:
(1) Obtain the pixel-based decision map PDG(m,n) from the local difference gradient ΔT(m,n) and the local difference variance ΔV(m,n): when ΔT(m,n) > 0 and ΔV(m,n) > 0, let PDG(m,n) = p1; when ΔT(m,n) < 0 and ΔV(m,n) < 0, let PDG(m,n) = p2; when ΔT(m,n) > 0 and ΔV(m,n) < 0, let PDG(m,n) = p3; when ΔT(m,n) < 0 and ΔV(m,n) > 0, let PDG(m,n) = p4; when ΔT(m,n) = 0 and ΔV(m,n) > 0, let PDG(m,n) = p5; when ΔT(m,n) = 0 and ΔV(m,n) < 0, let PDG(m,n) = p6; when ΔT(m,n) > 0 and ΔV(m,n) = 0, let PDG(m,n) = p7; when ΔT(m,n) < 0 and ΔV(m,n) = 0, let PDG(m,n) = p8; when ΔT(m,n) = 0 and ΔV(m,n) = 0, let PDG(m,n) = p9. Here p1 to p9 denote decision maps in which the pixel positions satisfying the corresponding condition are 1 and all other pixel positions are 0;
(2) From the local gradient matching degree M_T(m,n) and the local weighted variance matching degree M_V(m,n), the feature difference degree decision map DDG can be obtained, as shown in formula (30):
DDG(m,n) = d1 if M_T(m,n) ≤ M_V(m,n);  DDG(m,n) = d2 if M_T(m,n) > M_V(m,n)   (30)
In the formula, d1 and d2 are decision maps in which the pixel positions satisfying the corresponding condition of formula (30) are 1 and the other pixel positions are 0.
S3.4.5, determine the determined region and the uncertain region according to the pixel-based decision map and the feature difference degree decision map.
The regions p1, p2, p5, p6, p7 and p8 satisfying the conditions of item (1) of step S3.4.4 can be judged as determined regions from the PDG alone: for the two cases p1 and p2, both difference features agree on whether the gray value of the corresponding pixel should be retained in the fused image; for the four cases p5, p6, p7 and p8, one of the difference features suffices to decide whether the gray value of the corresponding pixel is retained in the fused image.
The regions p3 and p4 can be judged as determined regions using the PDG together with the DDG: the DDG determines which of the two local features differs more between the two images, and the difference feature with the larger degree of difference then decides whether the gray value of the corresponding pixel is retained in the fused image.
For the region p9, however, no judgment can be made from the two decision maps PDG and DDG, and this region belongs to the uncertain region.
S3.4.6 fusing the determined regions of the two detail feature images using feature difference drive;
taking the product of the local difference gradient Δ T (m, n) and the local difference variance Δ V (m, n) as a fusion driving factor for determining the region, which is denoted as DIF (m, n), as shown in the following formula:
DIF(m,n)=ΔT(m,n)·ΔV(m,n) (31)
DIF(m,n) is then used to drive the fusion of the determined region; the image obtained after fusing the determined region is denoted F_det(m,n) and is given by formula (32), where "·" denotes the product of the values at corresponding pixel locations in the matrices.
S3.4.7 fusing uncertain areas of the two detail characteristic images by using a fuzzy logic theory;
The uncertain region is fused using fuzzy logic theory. For the detail feature images P_P' and P_R', whether the local gradient of the detail feature image is large and whether its local weighted variance is large are considered, and the membership functions are constructed according to this group of relations. Suppose the membership functions of "the local gradient of the detail feature image P_P' (respectively P_R') is large" are μ_T(P_P'(m,n)) and μ_T(P_R'(m,n)), and the membership functions of "the local weighted variance of the detail feature image P_P' (respectively P_R') is large" are μ_V(P_P'(m,n)) and μ_V(P_R'(m,n)), as defined in formulas (33) and (34), in which k represents P' or R'.
Using the fuzzy-logic intersection operation rule, the membership of the pixel value of each of the two detail feature images P_P' and P_R' at position (m,n) to the degree of importance for the fused image of the uncertain region can be calculated; these memberships are denoted μ_{T∩V}(P_P'(m,n)) and μ_{T∩V}(P_R'(m,n)), as shown in formula (35):
μ_{T∩V}(P_k(m,n)) = min[ μ_T(P_k(m,n)), μ_V(P_k(m,n)) ]   (35)
In the formula, k represents P' or R'.
Then, the fusion result of the uncertain region of the two detail feature images is:
F_unc(m,n) = [ μ_{T∩V}(P_P'(m,n))·P_P'(m,n) + μ_{T∩V}(P_R'(m,n))·P_R'(m,n) ] ./ [ μ_{T∩V}(P_P'(m,n)) + μ_{T∩V}(P_R'(m,n)) ]   (36)
where "·" represents the product of the values at corresponding pixel locations in the matrices, and "./" represents division of the values at corresponding pixel locations in the matrices.
S3.4.8, fuse F_det(m,n) and F_unc(m,n) to obtain the fusion result of the two detail feature images, F_DIF(m,n), and perform a consistency check on F_DIF(m,n):
F_DIF(m,n) = F_det(m,n) + F_unc(m,n)   (37)
The consistency check on F_DIF(m,n) is performed with a window of size 3 × 3 that is moved over the image F_DIF(m,n), verifying the central pixel against the pixels surrounding it in the window. If the central pixel comes from one of the images P_R' and P_P' while s (4 < s < 8) of the pixels surrounding it in the window come from the other image, the central pixel value is changed to the pixel value of the other image at that location; the window traverses the entire image F_DIF(m,n) to obtain the corrected F_DIF(m,n).
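A sketch of the 3 × 3 consistency check described above, using a label map recording which source each pixel of F_DIF came from (the 0/1 label convention is an assumption):

```python
import numpy as np

def consistency_check(fused, src_a, src_b, origin, s_min=5, s_max=7):
    """Consistency check (sketch): if the centre pixel's origin disagrees with
    s (4 < s < 8) of its neighbours in a 3x3 window, replace it with the other source."""
    out = fused.copy()
    h, w = origin.shape
    for m in range(1, h - 1):
        for n in range(1, w - 1):
            other = 1 - origin[m, n]
            window = origin[m - 1:m + 2, n - 1:n + 2]
            neighbours_other = int(np.sum(window == other))   # centre never counts: it differs
            if s_min <= neighbours_other <= s_max:
                out[m, n] = (src_b if other == 1 else src_a)[m, n]
    return out
```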
S3.5 the results of steps S3.2, S3.3 and S3.4 are fused to obtain a fused result of images P 'and R'.
The image F_L(m,n) containing the bright features, the image F_D(m,n) of the dark features and the image F_DIF(m,n) of the detail features are fused; the fusion formula is as follows:
F=αFL(m,n)+βFD(m,n)+γFDIF(m,n) (38)
In the formula, α, β and γ are fusion weight coefficients with value range [0, 1]. To reduce oversaturation of the fused image and improve the contrast, α = 1, β = 0.3 and γ = 1 are taken.
S4, fusing the fusion result and the total intensity image I in the YUV space by using the fusion result in the step S3 to obtain a final fusion result;
The total intensity image I is input into the Y channel of the YUV space; the fusion result obtained in step S3 is inverted to obtain an image F', i.e. F' = 255 - F, and F' is input into the U channel; finally, the fusion result F obtained in step S3 is input into the V channel to obtain the final fused image.
Having described embodiments of the present invention, the foregoing description is intended to be exemplary, not exhaustive, and not limited to the embodiments disclosed. Many modifications and variations will be apparent to those of ordinary skill in the art without departing from the scope and spirit of the described embodiments. Therefore, the protection scope of the present invention shall be subject to the protection scope of the claims.

Claims (2)

1. A fusion method of infrared polarization images based on YUV and multi-feature separation is characterized by comprising the following steps:
s1, representing the polarization state of the light by using a Stokes vector, and then calculating a polarization degree image P and a polarization angle image R according to the Stokes vector;
s2, solving a unique part P 'of the polarization degree image and a unique part R' of the polarization angle image according to the common part Co of the polarization degree and the polarization angle image;
s3, fusing images P 'and R' by a multi-feature separation method based on a dark primary color theory;
s4, fusing the fusion result and the total intensity image I in the YUV space by using the fusion result in the step S3 to obtain a final fusion result;
the representation of the Stokes vector S in step S1 is shown in formula (1):
S = [I, Q, U, V]^T, where I = I_1 + I_3, Q = I_1 - I_3, U = I_2 - I_4   (1)
In the formula, I_1, I_2, I_3 and I_4 respectively represent the acquired light intensity images with polarization directions of 0°, 45°, 90° and 135°; I represents the total intensity of light; Q represents the intensity difference between horizontal and vertical polarization; U represents the intensity difference between the 45° and 135° polarization directions; V represents the intensity difference between the left- and right-hand circularly polarized components of the light;
the polarization degree image P and the polarization angle image R of the polarized light are expressed as:
P = sqrt(Q² + U²) / I   (2)
R = (1/2)·arctan(U / Q)   (3)
in the above step S2,
the common portion Co of the images P and R is obtained for each pixel in the images P and R using equation (4):
Co=min(P,R) (4)
then, the unique part P 'of the polarization degree image and the unique part R' of the polarization angle image are obtained using equations (5) and (6), respectively:
P'=P-Co (5)
R'=R-Co (6);
the step S3 includes the steps of:
s3.1, performing multi-feature separation on the images P 'and R' based on a dark primary color theory to obtain dark feature images, bright feature images and detail feature images of the images P 'and R' respectively;
s3.2, fusing bright feature images of the images P 'and R' by adopting a matching method based on local region energy features; fusing dark feature images of the images P 'and R' by adopting a matching method based on local area weighted variance features; driving the detail feature images of the fused images P 'and R' by using fuzzy logic and feature difference;
s3.3, fusing the results of the bright characteristic image, the dark characteristic image and the detail characteristic image in the step S3.2 to obtain a fusion result of the images P 'and R';
in said step S3.1,
the dark primary color is used for estimating the transmittance in the atmospheric scattering model to realize the fast defogging of the natural image, and the solving method of the dark channel map is shown as the formula (7):
L_dark(x) = min_{y∈N(x)} ( min_{C∈{R,G,B}} L^C(y) )   (7)
where C indexes the three color channels R, G, B of the image; N(x) is the window region centered on pixel point x; L^C(y) is a color channel map of the image; L_dark(x) is the dark channel image and reflects the haze level of the image: for a haze-free image the dark channel value L_dark(x) tends to 0, while for a hazy image L_dark(x) is large;
the dark primary color image P'_dark of the unique part P' of the polarization degree image and the dark primary color image R'_dark of the unique part R' of the polarization angle image are both obtained by formula (7);
The bright features, dark features and detail features of the two images are then obtained by multi-feature separation based on the dark primary color theory, as follows:
S3.1.1, obtain the inverted image P'_inv of the image P' and the inverted image R'_inv of the image R' using formulas (8) and (9); then fuse the dark primary color images P'_dark and R'_dark with the inverted images P'_inv and R'_inv respectively, according to the rule of taking the value with the smaller absolute value, to obtain the dark feature image D_P' of the image P' and the dark feature image D_R' of the image R', as shown in formulas (10) and (11):
P'_inv = 255 - P'   (8)
R'_inv = 255 - R'   (9)
D_P'(m,n) = min( P'_dark(m,n), P'_inv(m,n) )   (10)
D_R'(m,n) = min( R'_dark(m,n), R'_inv(m,n) )   (11)
S3.1.2, take the difference between the dark primary color images P'_dark and R'_dark and their corresponding dark feature images D_P' and D_R' to obtain the bright feature image L_P' of the image P' and the bright feature image L_R' of the image R', as shown in formulas (12) and (13):
L_P' = P'_dark - D_P'   (12)
L_R' = R'_dark - D_R'   (13)
S3.1.3, take the difference between the images P' and R' and their dark primary color images P'_dark and R'_dark respectively to obtain the detail feature image P_P' of the image P' and the detail feature image P_R' of the image R', as shown in formulas (14) and (15):
P_P' = P' - P'_dark   (14)
P_R' = R' - R'_dark   (15)
in the step S3.2, a matching method based on local region energy features is adopted to fuse bright feature images of the images P 'and R', and the method includes the following steps:
S3.2.1, obtain the Gaussian-weighted local energy of each of the two bright feature images using the Gaussian-weighted local energy function shown in formula (16):
E_K(m,n) = Σ_{i=-t}^{t} Σ_{j=-t}^{t} w(i,j)·[L_K(m+i, n+j)]²   (16)
where E_K(m,n) represents the Gaussian-weighted local energy centered at point (m,n); w(i,j) is a Gaussian filter matrix; N is the size of the region; t = (N-1)/2; K represents P' or R';
the Gaussian-weighted local energy E_P'(m,n) of the bright feature image L_P' and the Gaussian-weighted local energy E_R'(m,n) of the bright feature image L_R' are obtained by formula (16);
S3.2.2, solve the matching degree of the Gaussian-weighted local energy of the image L_P' and the image L_R';
the matching degree of the Gaussian-weighted local energy of the images L_P' and L_R' is:
M_E(m,n) = 2·Σ_{i=-t}^{t} Σ_{j=-t}^{t} w(i,j)·L_P'(m+i,n+j)·L_R'(m+i,n+j) / ( E_P'(m,n) + E_R'(m,n) )   (17)
S3.2.3, fuse the images L_P' and L_R' using the Gaussian-weighted local energies and their matching degree to obtain the bright feature image fusion result F_L;
the fusion rule for the images L_P' and L_R' is:
F_L(m,n) = L_P'(m,n),                             if M_E(m,n) < T_l and E_P'(m,n) ≥ E_R'(m,n)
F_L(m,n) = L_R'(m,n),                             if M_E(m,n) < T_l and E_P'(m,n) < E_R'(m,n)
F_L(m,n) = W_max·L_max(m,n) + W_min·L_min(m,n),   if M_E(m,n) ≥ T_l   (18)
where L_max and L_min denote, of L_P' and L_R', the one with the larger and the smaller Gaussian-weighted local energy at (m,n), and
W_min = 1/2 - (1/2)·(1 - M_E(m,n))/(1 - T_l),  W_max = 1 - W_min   (19)
In the formula, T_l is the threshold for judging the similarity of the bright features, taken between 0 and 0.5; if M_E(m,n) < T_l, the regions of the two images L_P' and L_R' centered on the point (m,n) are not similar, and the one with the larger Gaussian-weighted local energy is selected as the fusion result; otherwise, the fusion result of the two images L_P' and L_R' is the coefficient-weighted average;
in the step S3.2, the dark feature images of the images P 'and R' are fused by using a matching method based on local area weighted variance features, which includes the following steps:
S3.2.4, respectively obtaining the local-area weighted variance energy of the two dark feature images by using the local-area weighted variance energy function; the weighted variance energy function for a local region is shown in equation (20):

V_k(m,n) = Σ_{i=−t}^{t} Σ_{j=−t}^{t} w(i,j)·[D_k(m+i,n+j) − μ_k(m,n)]²  (20)

where V_k(m,n) denotes the local-region weighted variance energy centered at point (m,n); w(i,j) is a Gaussian filter matrix; N is the size of the region; t = (N−1)/2; μ_k(m,n) denotes the local-area average centered at point (m,n); k denotes P' or R';

Equation (20) gives the local-area weighted variance energy V_P'(m,n) of the dark feature image D_P' and the local-area weighted variance energy V_R'(m,n) of the dark feature image D_R';
S3.2.5, solving the matching degree of the local-region weighted variance energies of the two images D_P' and D_R';

The matching degree of the local-area weighted variance energies of images D_P' and D_R' is obtained by equation (21):

M_E(m,n) = 2·Σ_{i=−t}^{t} Σ_{j=−t}^{t} w(i,j)·[D_P'(m+i,n+j) − μ_P'(m,n)]·[D_R'(m+i,n+j) − μ_R'(m,n)] / [V_P'(m,n) + V_R'(m,n)]  (21)

S3.2.6, fusing the two images D_P' and D_R' by using the local weighted variance energies and their matching degree to obtain the fusion result F_D of the dark feature images;

The fusion formula for the two images D_P' and D_R' is given by equation (22), with the weighting coefficients given by equation (23).

In the formulas, T_h is the threshold for judging the similarity of the dark features, selected in the range 0.5–1. If M_E(m,n) < T_h, the regions of the two images D_P' and D_R' centered on point (m,n) are not similar, and the pixel from the image with the larger local-area weighted variance energy is selected as the fusion result; otherwise the fusion result of the two images D_P' and D_R' is a coefficient-weighted average whose coefficients are derived from the matching degree;
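A corresponding sketch for steps S3.2.4–S3.2.6, computing the local weighted variance energy with a normalised Gaussian window via Var = E[x²] − E[x]²; the averaging coefficients are again an assumed form of equations (22)–(23):

```python
import numpy as np
from scipy.ndimage import gaussian_filter

def fuse_dark(D_p, D_r, sigma=1.0, T_h=0.75):
    # Local means and local-region weighted variance energies, eq. (20).
    mu_p, mu_r = gaussian_filter(D_p, sigma), gaussian_filter(D_r, sigma)
    V_p = gaussian_filter(D_p**2, sigma) - mu_p**2
    V_r = gaussian_filter(D_r**2, sigma) - mu_r**2
    cov = gaussian_filter(D_p * D_r, sigma) - mu_p * mu_r
    M = 2 * cov / (V_p + V_r + 1e-12)            # matching degree, eq. (21)

    choose_p = V_p >= V_r
    selected = np.where(choose_p, D_p, D_r)      # dissimilar regions: larger variance energy wins
    w_min = 0.5 * (1 - (1 - M) / (1 - T_h))      # assumed weight form
    w_max = 1 - w_min
    blended = np.where(choose_p, w_max * D_p + w_min * D_r,
                                 w_max * D_r + w_min * D_p)
    return np.where(M >= T_h, blended, selected)
```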
In step S3.2, the detail feature images of images P' and R' are fused using fuzzy logic and feature-difference driving, comprising the following steps:
S3.2.7, finding the local gradient of the detail feature image P_P' of image P' and of the detail feature image P_R' of image R';

The local gradient is computed as:

T_k(m,n) = sqrt[ G_k,x(m,n)² + G_k,y(m,n)² ]  (24)

where T_k(m,n) denotes the local gradient at pixel point (m,n); k denotes P' or R'; G_k,x and G_k,y denote the horizontal and vertical edge images obtained by convolving the detail feature image with the horizontal and vertical templates of the Sobel operator;

Equation (24) gives the local gradient T_P'(m,n) of the detail feature image P_P' and the local gradient T_R'(m,n) of the detail feature image P_R';
S3.2.8, finding the local weighted variance of the detail feature images P_P' and P_R';

The local weighted variance is shown in equation (25):

V_k(m,n) = Σ_{i=−t}^{t} Σ_{j=−t}^{t} w(i,j)·[P_k(m+i,n+j) − μ_k(m,n)]²  (25)

where V_k(m,n) denotes the local-region weighted variance energy centered at point (m,n); w(i,j) is a Gaussian filter matrix; N is the size of the region; t = (N−1)/2; μ_k(m,n) denotes the local-area average centered at point (m,n); k denotes P' or R';

Equation (25) gives the local weighted variance V_P'(m,n) of the detail feature image P_P' and the local weighted variance V_R'(m,n) of the detail feature image P_R';
S3.2.9, obtaining the local gradient matching degree, the local weighted variance matching degree, the local difference gradient and the local difference variance of the two detail feature images:

Local gradient matching degree:

M_T(m,n) = 2·T_P'(m,n)·T_R'(m,n) / [T_P'(m,n)² + T_R'(m,n)²]  (26)

Local weighted variance matching degree:

M_V(m,n) = 2·V_P'(m,n)·V_R'(m,n) / [V_P'(m,n)² + V_R'(m,n)²]  (27)

Local difference gradient:

ΔT(m,n) = T_P'(m,n) − T_R'(m,n)  (28)

Local difference variance:

ΔV(m,n) = V_P'(m,n) − V_R'(m,n)  (29)
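A sketch of the quantities in steps S3.2.7–S3.2.9, using scipy's Sobel templates and a Gaussian weighting window; it assumes the ratio/difference forms of equations (26)–(29) as written above:

```python
import numpy as np
from scipy.ndimage import sobel, gaussian_filter

def detail_feature_measures(P_p, P_r, sigma=1.0, eps=1e-12):
    # Local gradient, eq. (24): magnitude of the Sobel horizontal/vertical responses.
    T_p = np.hypot(sobel(P_p, axis=0), sobel(P_p, axis=1))
    T_r = np.hypot(sobel(P_r, axis=0), sobel(P_r, axis=1))

    # Local weighted variance, eq. (25), with a Gaussian weighting window.
    def local_var(x):
        mu = gaussian_filter(x, sigma)
        return gaussian_filter(x**2, sigma) - mu**2

    V_p, V_r = local_var(P_p), local_var(P_r)

    M_T = 2 * T_p * T_r / (T_p**2 + T_r**2 + eps)   # local gradient matching degree, eq. (26)
    M_V = 2 * V_p * V_r / (V_p**2 + V_r**2 + eps)   # local weighted variance matching, eq. (27)
    dT = T_p - T_r                                  # local difference gradient, eq. (28)
    dV = V_p - V_r                                  # local difference variance, eq. (29)
    return M_T, M_V, dT, dV
```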
S3.2.10, obtaining a pixel-based decision graph PDG(m,n) from the local difference gradient ΔT(m,n) and the local difference variance ΔV(m,n), and obtaining a feature difference degree decision graph DDG from the local gradient matching degree M_T(m,n) and the local weighted variance matching degree M_V(m,n), comprising the following steps:

(1) Obtain the pixel-based decision graph PDG(m,n) from the local difference gradient ΔT(m,n) and the local difference variance ΔV(m,n): when ΔT(m,n) > 0 and ΔV(m,n) > 0, let PDG(m,n) = p1; when ΔT(m,n) < 0 and ΔV(m,n) < 0, let PDG(m,n) = p2; when ΔT(m,n) > 0 and ΔV(m,n) < 0, let PDG(m,n) = p3; when ΔT(m,n) < 0 and ΔV(m,n) > 0, let PDG(m,n) = p4; when ΔT(m,n) = 0 and ΔV(m,n) > 0, let PDG(m,n) = p5; when ΔT(m,n) = 0 and ΔV(m,n) < 0, let PDG(m,n) = p6; when ΔT(m,n) > 0 and ΔV(m,n) = 0, let PDG(m,n) = p7; when ΔT(m,n) < 0 and ΔV(m,n) = 0, let PDG(m,n) = p8; when ΔT(m,n) = 0 and ΔV(m,n) = 0, let PDG(m,n) = p9; where p1–p9 denote decision graphs in which the pixel positions satisfying the corresponding condition are 1 and all other pixel positions are 0;

(2) From the local gradient matching degree M_T(m,n) and the local weighted variance matching degree M_V(m,n), the feature difference degree decision graph DDG is obtained as shown in equation (30), which assigns each pixel to one of two classes d1 and d2 according to the two matching degrees; d1 and d2 denote decision graphs in which the pixel positions satisfying the corresponding condition of equation (30) are 1 and all other pixel positions are 0;

S3.2.11, determining the determined region and the uncertain region according to the pixel-based decision graph PDG(m,n) and the feature difference degree decision graph DDG;
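The pixel-based decision graphs p1–p9 of step S3.2.10(1) are simple sign tests on ΔT and ΔV, sketched below (exact zero comparisons are kept as stated in the claims; in practice a small tolerance may be preferable for floating-point data):

```python
import numpy as np

def pixel_decision_graphs(dT, dV):
    # Binary decision graphs p1..p9: 1 where the corresponding sign condition holds.
    conditions = [
        (dT > 0) & (dV > 0), (dT < 0) & (dV < 0),
        (dT > 0) & (dV < 0), (dT < 0) & (dV > 0),
        (dT == 0) & (dV > 0), (dT == 0) & (dV < 0),
        (dT > 0) & (dV == 0), (dT < 0) & (dV == 0),
        (dT == 0) & (dV == 0),
    ]
    return [c.astype(np.uint8) for c in conditions]
```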
S3.2.12, fusing the determined regions of the two detail feature images by using feature-difference driving;

The product of the local difference gradient ΔT(m,n) and the local difference variance ΔV(m,n) is taken as the fusion driving factor of the determined region, denoted DIF(m,n), as shown in equation (31):
DIF(m,n)=ΔT(m,n)·ΔV(m,n) (31)
Then DIF(m,n) is used to drive the fusion of the determined region, and the image obtained after fusing the determined region is given by equation (32), in which "·" represents the product of the values at corresponding pixel locations in the matrices;
S3.2.13, fusing the uncertain regions of the two detail feature images by using fuzzy logic theory;

Suppose the fuzzy statement "the local gradient of detail feature image P_P' (respectively P_R') is large" has membership functions μ_T(P_P'(m,n)) and μ_T(P_R'(m,n)), and the fuzzy statement "the local weighted variance of detail feature image P_P' (respectively P_R') is large" has membership functions μ_V(P_P'(m,n)) and μ_V(P_R'(m,n)), as shown in equations (33) and (34), where k denotes P' or R';

Using the fuzzy-logic intersection operation rule, the membership degrees of the pixel values of the two detail feature images P_P' and P_R' at position (m,n) with respect to their importance for the fused image of the uncertain region, denoted μ_T∩V(P_P'(m,n)) and μ_T∩V(P_R'(m,n)), can be calculated as shown in equation (35):

μ_T∩V(P_k(m,n)) = min[μ_T(P_k(m,n)), μ_V(P_k(m,n))]  (35)

where k denotes P' or R';
Then the fusion result of the uncertain region of the two detail feature images is given by equation (36), which weights P_P'(m,n) and P_R'(m,n) by their membership degrees μ_T∩V and normalises by the sum of the membership degrees; in equation (36), "·" represents the product of the values at corresponding pixel locations in the matrices and "./" represents the division of the values at corresponding pixel locations in the matrices;
S3.2.14, fusing the determined-region result and the uncertain-region result to obtain the fusion result F_DIF(m,n) of the two detail feature images, as shown in equation (37), and performing a consistency check on F_DIF(m,n);

The consistency check on F_DIF(m,n) is performed with a window of size 3×3 that is moved over the image F_DIF(m,n), verifying the central pixel against the pixels surrounding it in the window;
In step S4, the fusion result of step S3 and the total intensity image I are fused in YUV space as follows:

The total intensity image I is input to the Y channel of the YUV space; the fusion result F obtained in step S3 is inverted to obtain an image F', i.e. F' = 255 − F, and F' is input to the U channel; finally, the fusion result F obtained in step S3 is input to the V channel, giving the final fused image.
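A sketch of the step S4 channel assignment, assuming 8-bit single-band inputs and using OpenCV's generic YUV-to-BGR conversion for display (the claims do not prescribe a particular colour-space implementation):

```python
import numpy as np
import cv2

def yuv_fuse(I_total, F):
    # Y channel: total intensity image I; U channel: inverted fusion result F' = 255 - F;
    # V channel: fusion result F (step S4).
    Y = np.clip(I_total, 0, 255).astype(np.uint8)
    U = np.clip(255 - F, 0, 255).astype(np.uint8)
    V = np.clip(F, 0, 255).astype(np.uint8)
    yuv = cv2.merge([Y, U, V])
    return cv2.cvtColor(yuv, cv2.COLOR_YUV2BGR)   # convert for display
```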
2. The method for fusing infrared polarization images based on YUV and multi-feature separation as claimed in claim 1, wherein in step S3.3 the bright feature image F_L(m,n), the dark feature image F_D(m,n) and the detail feature image F_DIF(m,n) obtained by the fusion in step S3.2 are combined according to the fusion formula shown in equation (38):

F = αF_L(m,n) + βF_D(m,n) + γF_DIF(m,n)  (38)

where α, β and γ are fusion weight coefficients, taken as α = 1, β = 0.3 and γ = 1, so that the fused image is not oversaturated and its contrast is improved.
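Equation (38) is a direct weighted sum of the three fused feature images; a one-function sketch (the final clipping to an 8-bit range is an added assumption, not part of the claim):

```python
import numpy as np

def combine_features(F_L, F_D, F_DIF, alpha=1.0, beta=0.3, gamma=1.0):
    # Weighted combination of bright, dark and detail fused images, eq. (38).
    F = alpha * F_L + beta * F_D + gamma * F_DIF
    return np.clip(F, 0, 255)
```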