CN109410160B - Infrared polarization image fusion method based on multi-feature and feature difference driving - Google Patents


Info

Publication number
CN109410160B
Authority
CN
China
Prior art keywords
image
feature
local
images
polarization
Prior art date
Legal status
Active
Application number
CN201811180813.8A
Other languages
Chinese (zh)
Other versions
CN109410160A (en)
Inventor
冉骏
宋斌
陈蓉
Current Assignee
Hunan Yuanxin Optoelectronics Technology Co ltd
Original Assignee
Hunan Yuanxin Optoelectronics Technology Co ltd
Priority date
Filing date
Publication date
Application filed by Hunan Yuanxin Optoelectronics Technology Co ltd filed Critical Hunan Yuanxin Optoelectronics Technology Co ltd
Priority to CN201811180813.8A
Publication of CN109410160A
Application granted
Publication of CN109410160B
Legal status: Active

Classifications

    • G: PHYSICS
    • G06: COMPUTING; CALCULATING OR COUNTING
    • G06T: IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T 5/00: Image enhancement or restoration
    • G06T 5/50: Image enhancement or restoration by the use of more than one image, e.g. averaging, subtraction
    • G06T 2207/00: Indexing scheme for image analysis or image enhancement
    • G06T 2207/10: Image acquisition modality
    • G06T 2207/10024: Color image
    • G06T 2207/10048: Infrared image
    • G06T 2207/20: Special algorithmic details
    • G06T 2207/20172: Image enhancement details
    • G06T 2207/20192: Edge enhancement; Edge preservation
    • G06T 2207/20212: Image combination
    • G06T 2207/20221: Image fusion; Image merging

Abstract

The invention provides an infrared polarization image fusion method driven by multiple features and feature differences, comprising the following steps: represent the polarization of light by a Stokes vector and calculate a degree-of-polarization image P and a polarization angle image R; linearly weight the polarization angle image R and the image U to obtain an image R'; calculate the unique portions of the images R', I and P, excluding their common portion, denoted R1, I1 and P1; map the images P1, I1 and R1 to the R, G and B channels of RGB space to obtain an RGB image, convert the RGB image into a YUV image, and extract the luminance component Y; fuse the images I1 and P1 by a method based on multi-feature separation to obtain F; replace Y with F to obtain a replaced YUV image, and inversely transform the replaced YUV image into an RGB image, which is the polarization image fusion result. Several polarization quantities of the infrared polarization images are fused, so that the fused image scene is richer and camouflaged targets can be identified. The invention is applied to the field of computer vision.

Description

Infrared polarization image fusion method based on multi-feature and feature difference driving
Technical Field
The invention relates to the field of computer vision, in particular to an infrared polarization image fusion method based on multi-feature and feature difference driving.
Background
At present, with the rapid development of infrared imaging detection technology in application fields such as the military, medicine, security and earth observation, traditional infrared detection technology struggles to keep up with advances in precision, complex environments and camouflage technology. A traditional infrared imaging system mainly images the infrared radiation intensity of a scene, which is chiefly related to the temperature, radiance and similar properties of the scene. When a noise source with the same temperature is placed around a target object, an existing thermal infrared imager cannot identify the target, so infrared imaging technology faces serious limitations and challenges.
Compared with traditional infrared imaging, polarization imaging of light can reduce the degrading influence of a complex scene while also obtaining structure and distance information of the scene. Infrared polarization imaging technology can detect not only the infrared intensity information of a target scene but also its polarization information; it can markedly improve the contrast between a target object and the natural background and can reflect the outline and details of the object, thereby improving the quality of the infrared image, and by polarization means a useful signal can be detected against a complex background.
Polarization is a basic characteristic of light that cannot be directly observed by the human eye, so polarization information must be displayed in some form to be perceived by people or conveniently processed by a computer. The polarization state of light is represented by a Stokes vector, which describes the polarization state and intensity of light with four Stokes parameters; all four are time-averaged values of light intensity, have the dimension of intensity, and can be directly detected by a detector. The Stokes vector S is represented as:

$$S=\begin{bmatrix}I\\Q\\U\\V\end{bmatrix}=\begin{bmatrix}I_1+I_3\\I_1-I_3\\I_2-I_4\\I_{RC}-I_{LC}\end{bmatrix}$$

where I1, I2, I3 and I4 respectively denote the acquired light-intensity images with polarization directions of 0°, 45°, 90° and 135°; I denotes the total light intensity; Q denotes the intensity difference between the horizontal and vertical polarization; U denotes the intensity difference between the 45° and 135° polarization directions; and V denotes the intensity difference between the left- and right-hand circularly polarized components of the light (I_LC and I_RC).
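For illustration, a minimal Python sketch of how the four captures map onto the linear Stokes parameters follows (the function name is ours; returning V as zeros reflects that a linear polarizer alone cannot measure the circular component, as the detailed description notes):

```python
import numpy as np

def stokes_from_polarizer_images(i0, i45, i90, i135):
    """Assemble the linear Stokes images from four polarizer captures.

    i0, i45, i90, i135: float arrays, the intensity images I1..I4
    acquired with the polarizer at 0, 45, 90 and 135 degrees.
    """
    I = i0 + i90           # total light intensity
    Q = i0 - i90           # horizontal vs. vertical difference
    U = i45 - i135         # 45 deg vs. 135 deg difference
    V = np.zeros_like(i0)  # not measurable with a linear polarizer alone
    return I, Q, U, V
```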
A polarization angle image better describes different surface orientations, while a degree-of-polarization image contains the polarization information of an object, better represents man-made targets and improves the contrast between target and background; the total light intensity image reflects the intensity information of the scene. Existing polarization image fusion methods consider only fusion algorithms for a single difference feature and cannot effectively describe all the uncertain, randomly varying image features in an image, so valuable information is lost during fusion and fusion and identification fail; at the same time, the fusion process struggles to take contrast, bright features and edge-detail features into account simultaneously.
Disclosure of Invention
Aiming at the problem that the prior art considers only single-difference-feature image fusion algorithms and therefore cannot effectively describe all the uncertain, randomly varying image features in an image, the invention provides an infrared polarization image fusion method driven by multiple features and feature differences. Several polarization quantities of the infrared polarization image are fused, the polarization quantities comprising the image Q, the image U, the image V, the total light intensity image I, the degree-of-polarization image P and the polarization angle image R, so that the fused image scene is richer. Several uncertain, randomly varying image features are integrated at the same time, so that multiple image features are taken into account during fusion, the edge-detail information of the image is enhanced and the contrast of the image is improved, and the fusion result is favorable for identifying camouflaged targets. Meanwhile, by taking the unique part of each polarization image during fusion, the problem of information redundancy among the polarization quantities is effectively solved.
The technical scheme adopted by the invention is as follows:
An infrared polarization image fusion method based on multi-feature and feature difference driving specifically comprises the following steps:
S1, representing the polarization of light by a Stokes vector, i.e., S = (I, Q, U, V), and calculating the degree-of-polarization image P and the polarization angle image R from the S vector;
S2, linearly weighting the polarization angle image R and the image U to obtain an image R';
S3, calculating the unique parts of the image R', the total light intensity image I and the degree-of-polarization image P, excluding their common part, and recording them as R1, I1 and P1 respectively;
S4, mapping the images P1, I1 and R1 to the R, G and B channels of RGB space respectively to obtain an RGB image, converting the RGB image into a YUV image, and extracting the luminance component Y;
S5, fusing the images I1 and P1 by a method based on multi-feature separation to obtain a fusion result F;
S6, replacing the luminance component Y of step S4 with the fusion result F of step S5 to obtain a replaced YUV image, and then inversely transforming the replaced YUV image to obtain an RGB image, namely the final polarization image fusion result.
As a further improvement of the above technical solution, step S1 specifically includes:
S11, calculating the degree-of-polarization image P:

$$P=\frac{\sqrt{Q^2+U^2}}{I}$$

where Q denotes the intensity difference between the horizontal and vertical polarization, U denotes the intensity difference between the 45° and 135° polarization directions, and I denotes the total light intensity image;
S12, calculating the polarization angle image R:

$$R=\frac{1}{2}\arctan\frac{U}{Q}$$
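A minimal numpy sketch of S11 and S12 (the eps guard and the use of arctan2 are our own robustness choices, not from the patent):

```python
import numpy as np

def polarization_degree_and_angle(I, Q, U, eps=1e-9):
    # S11: P = sqrt(Q^2 + U^2) / I; eps guards zero-intensity pixels
    P = np.sqrt(Q ** 2 + U ** 2) / (I + eps)
    # S12: R = 0.5 * arctan(U / Q); arctan2 keeps Q == 0 well defined
    R = 0.5 * np.arctan2(U, Q)
    return P, R
```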
As a further improvement of the above technical solution, step S3 specifically includes:
S31, calculating the common part Co of the image R', the total light intensity image I and the degree-of-polarization image P:

$$Co=R'\cap I\cap P=\min\{R',I,P\};$$

S32, calculating the unique parts R1, I1 and P1 of the image R', the total light intensity image I and the degree-of-polarization image P:

$$R_1=R'-Co,\qquad I_1=I-Co,\qquad P_1=P-Co$$
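Steps S31 and S32 reduce to two array operations; a sketch, assuming the three images are co-registered and on a common intensity scale:

```python
import numpy as np

def common_and_unique_parts(R_prime, I, P):
    # S31: common part is the pixel-wise minimum of the three images
    Co = np.minimum(np.minimum(R_prime, I), P)
    # S32: unique part of each image, with the common part removed
    return R_prime - Co, I - Co, P - Co
```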
As a further improvement of the above technical solution, step S4 specifically includes:
S41, mapping the images P1, I1 and R1 to the R, G and B channels of RGB space respectively to obtain an RGB image;
S42, converting the RGB image into a YUV image:

$$\begin{bmatrix}Y\\U\\V\end{bmatrix}=\begin{bmatrix}0.299&0.587&0.114\\-0.147&-0.289&0.436\\0.615&-0.515&-0.100\end{bmatrix}\begin{bmatrix}R\\G\\B\end{bmatrix}$$

S43, extracting the luminance component Y:

Y = 0.299R + 0.587G + 0.114B.
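A sketch of S41 to S43, assuming the three inputs are already scaled to a common range (the channel assignment follows S41 of the detailed description; the conversion coefficients are the standard YUV ones):

```python
import numpy as np

def map_and_extract_luminance(P1, I1, R1):
    # S41: P1 -> R channel, I1 -> G channel, R1 -> B channel
    R, G, B = P1, I1, R1
    rgb = np.stack([R, G, B], axis=-1)
    # S42/S43: standard RGB -> YUV conversion; Y is the luminance
    Y = 0.299 * R + 0.587 * G + 0.114 * B
    U = -0.147 * R - 0.289 * G + 0.436 * B
    V = 0.615 * R - 0.515 * G - 0.100 * B
    return rgb, Y, U, V
```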
As a further improvement of the above technical solution, step S5 specifically includes:
S51, performing multi-feature separation on the images I1 and P1 to obtain the bright feature image, dark feature image and detail feature image of image I1 and the bright feature image, dark feature image and detail feature image of image P1;
S52, fusing the bright feature image of image I1 with the bright feature image of image P1 to obtain a bright feature fusion result F_L;
S53, fusing the dark feature image of image I1 with the dark feature image of image P1 to obtain a dark feature fusion result F_D;
S54, fusing the detail feature image of image I1 with the detail feature image of image P1 to obtain a detail feature fusion result F_DIF;
S55, fusing F_L, F_D and F_DIF to obtain the fusion result F.
As a further improvement of the above technical solution, in step S51 a multi-feature separation method based on the dark primary color (dark channel) theory is used to separate the images I1 and P1, specifically comprising:

S511, obtaining the dark primary color images of I1 and P1:

$$I_1^{dark}(x)=\min_{y\in N(x)}\Big(\min_{C\in\{R,G,B\}}(I_1)^C(y)\Big)$$

$$P_1^{dark}(x)=\min_{y\in N(x)}\Big(\min_{C\in\{R,G,B\}}(P_1)^C(y)\Big)$$

where $I_1^{dark}$ is the dark primary color image of I1, $P_1^{dark}$ is the dark primary color image of P1, C is one of the three color channels R, G, B of image I1 or P1, N(x) is the window region centered on pixel point x, and $(I_1)^C(y)$ and $(P_1)^C(y)$ denote the color channel maps of I1 and P1 respectively;

S512, negating the images I1 and P1 respectively to obtain images $\bar{I}_1$ and $\bar{P}_1$, and fusing the dark primary color images $I_1^{dark}$ and $P_1^{dark}$ with the images $\bar{I}_1$ and $\bar{P}_1$ respectively, according to the rule of taking the value with the smaller absolute value, to obtain the dark feature image $D_{I_1}$ of image I1 and the dark feature image $D_{P_1}$ of image P1:

$$\bar{I}_1=1-I_1,\qquad \bar{P}_1=1-P_1$$

$$D_{I_1}(x)=\begin{cases}I_1^{dark}(x),&|I_1^{dark}(x)|\le|\bar{I}_1(x)|\\ \bar{I}_1(x),&\text{otherwise}\end{cases}\qquad D_{P_1}(x)=\begin{cases}P_1^{dark}(x),&|P_1^{dark}(x)|\le|\bar{P}_1(x)|\\ \bar{P}_1(x),&\text{otherwise}\end{cases}$$

(the negation is written for gray values normalized to [0, 1]);

S513, differencing the dark primary color images $I_1^{dark}$ and $P_1^{dark}$ with the corresponding dark feature images $D_{I_1}$ and $D_{P_1}$ to obtain the bright feature image $B_{I_1}$ of image I1 and the bright feature image $B_{P_1}$ of image P1:

$$B_{I_1}=I_1^{dark}-D_{I_1},\qquad B_{P_1}=P_1^{dark}-D_{P_1}$$

S514, differencing the images I1 and P1 with the corresponding dark primary color images $I_1^{dark}$ and $P_1^{dark}$ to obtain the detail feature image $P_{I_1}$ of image I1 and the detail feature image $P_{P_1}$ of image P1:

$$P_{I_1}=I_1-I_1^{dark},\qquad P_{P_1}=P_1-P_1^{dark}$$
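A sketch of the separation applied to one image (it would be called once for I1 and once for P1). The window size, the [0, 1] normalization and the reduction of the channel minimum for a single-channel gray image are our assumptions:

```python
import numpy as np
from scipy.ndimage import minimum_filter

def multi_feature_separation(img, window=7):
    """S511-S514 for one gray image normalized to [0, 1].

    For a gray image the channel minimum in S511 is the image itself,
    so the dark primary reduces to a local minimum filter over N(x).
    """
    dark_primary = minimum_filter(img, size=window)      # S511
    negated = 1.0 - img                                  # S512: inversion
    # S512: fuse by keeping the value with the smaller absolute value
    dark = np.where(np.abs(dark_primary) <= np.abs(negated),
                    dark_primary, negated)
    bright = dark_primary - dark                         # S513
    detail = img - dark_primary                          # S514
    return bright, dark, detail
```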
As a further improvement of the above technical solution, in step S52 a matching method based on local region energy features is used to fuse the bright feature image $B_{I_1}$ of image I1 with the bright feature image $B_{P_1}$ of image P1, specifically including:

S521, obtaining the Gaussian-weighted local energies of the bright feature images $B_{I_1}$ and $B_{P_1}$:

$$E_k(m,n)=\sum_{i=-t}^{t}\sum_{j=-t}^{t}w(i,j)\,[B_k(m+i,n+j)]^2$$

where k is I1 or P1, $E_k(m,n)$ denotes the Gaussian-weighted local energy of the bright feature image $B_{I_1}$ or $B_{P_1}$ centered at point (m, n), w(i, j) is a Gaussian filter matrix, N is the size of the region, and t = (N − 1)/2;

S522, obtaining the matching degree of the Gaussian-weighted local energies of the bright feature images $B_{I_1}$ and $B_{P_1}$:

$$M_E(m,n)=\frac{2\sum_{i=-t}^{t}\sum_{j=-t}^{t}w(i,j)\,B_{I_1}(m+i,n+j)\,B_{P_1}(m+i,n+j)}{E_{I_1}(m,n)+E_{P_1}(m,n)}$$

where $M_E(m,n)$ denotes the matching degree of the Gaussian-weighted local energies of $B_{I_1}$ and $B_{P_1}$, and $E_{I_1}(m,n)$ and $E_{P_1}(m,n)$ denote the Gaussian-weighted local energies of $B_{I_1}$ and $B_{P_1}$ centered at point (m, n);

S523, fusing the bright feature images $B_{I_1}$ and $B_{P_1}$ through the Gaussian-weighted local energies and their matching degree:

$$F_L(m,n)=\begin{cases}B_{I_1}(m,n),&M_E(m,n)<T_l\ \text{and}\ E_{I_1}(m,n)\ge E_{P_1}(m,n)\\ B_{P_1}(m,n),&M_E(m,n)<T_l\ \text{and}\ E_{I_1}(m,n)<E_{P_1}(m,n)\\ w_1\,B_{\max}(m,n)+w_2\,B_{\min}(m,n),&M_E(m,n)\ge T_l\end{cases}$$

$$w_1=\frac{1}{2}+\frac{1}{2}\cdot\frac{1-M_E(m,n)}{1-T_l},\qquad w_2=1-w_1$$

(weighted-average form reconstructed from the description; the original equation is given only as an image), where $F_L(m,n)$ is the fusion result of the bright feature images $B_{I_1}$ and $B_{P_1}$, $T_l$ is the threshold for judging similarity in bright feature fusion, and $B_{\max}$ and $B_{\min}$ denote the bright feature image with the larger and smaller Gaussian-weighted local energy at (m, n) respectively. If $M_E(m,n)<T_l$, the regions of $B_{I_1}$ and $B_{P_1}$ centered on point (m, n) are dissimilar, and the fusion result selects the image with the larger Gaussian-weighted region energy; otherwise the fusion result of $B_{I_1}$ and $B_{P_1}$ is a coefficient-weighted average.
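A sketch of S521 to S523. Tl, the kernel size and sigma are illustrative values within the 0 to 0.5 range given later in the description; the weighted-average coefficients follow the standard matching-degree scheme named above, since the original weight equation is only available as an image:

```python
import numpy as np
from scipy.ndimage import convolve

def gaussian_kernel(n=7, sigma=1.5):
    ax = np.arange(n) - (n - 1) / 2.0
    g = np.exp(-ax ** 2 / (2.0 * sigma ** 2))
    w = np.outer(g, g)
    return w / w.sum()  # normalized Gaussian filter matrix w(i, j)

def fuse_bright_features(Bi, Bp, Tl=0.4, n=7):
    w = gaussian_kernel(n)
    Ei = convolve(Bi ** 2, w)        # S521: Gaussian-weighted local energy
    Ep = convolve(Bp ** 2, w)
    Me = 2.0 * convolve(Bi * Bp, w) / (Ei + Ep + 1e-12)  # S522: matching
    # S523: pick the larger-energy image where dissimilar, otherwise a
    # coefficient-weighted average driven by the matching degree
    w1 = 0.5 + 0.5 * (1.0 - Me) / (1.0 - Tl)
    big = np.where(Ei >= Ep, Bi, Bp)
    small = np.where(Ei >= Ep, Bp, Bi)
    return np.where(Me < Tl, big, w1 * big + (1.0 - w1) * small)
```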
As a further improvement of the above technical solution, in step S53 a matching method based on local region weighted variance features is used to fuse the dark feature image $D_{I_1}$ of image I1 with the dark feature image $D_{P_1}$ of image P1, specifically including:

S531, obtaining the local region weighted variance energies of the dark feature images $D_{I_1}$ and $D_{P_1}$:

$$V_k(m,n)=\sum_{i=-t}^{t}\sum_{j=-t}^{t}w(i,j)\,[D_k(m+i,n+j)-\bar{D}_k(m,n)]^2$$

where k is I1 or P1, $V_k(m,n)$ denotes the local region weighted variance energy of the dark feature image $D_{I_1}$ or $D_{P_1}$ centered at point (m, n), w(i, j) is a Gaussian filter matrix, N is the size of the region, t = (N − 1)/2, and $\bar{D}_k(m,n)$ denotes the local region average centered at point (m, n);

S532, obtaining the matching degree of the local region weighted variance energies of the dark feature images $D_{I_1}$ and $D_{P_1}$:

$$M_V(m,n)=\frac{2\sum_{i=-t}^{t}\sum_{j=-t}^{t}w(i,j)\,[D_{I_1}(m+i,n+j)-\bar{D}_{I_1}(m,n)]\,[D_{P_1}(m+i,n+j)-\bar{D}_{P_1}(m,n)]}{V_{I_1}(m,n)+V_{P_1}(m,n)}$$

where $M_V(m,n)$ denotes the matching degree of the local region weighted variance energies of $D_{I_1}$ and $D_{P_1}$, and $V_{I_1}(m,n)$ and $V_{P_1}(m,n)$ denote the local region weighted variance energies of $D_{I_1}$ and $D_{P_1}$ centered at point (m, n);

S533, fusing the two dark feature images $D_{I_1}$ and $D_{P_1}$ through the local region weighted variance energies and their matching degree:

$$F_D(m,n)=\begin{cases}D_{I_1}(m,n),&M_V(m,n)<T_h\ \text{and}\ V_{I_1}(m,n)\ge V_{P_1}(m,n)\\ D_{P_1}(m,n),&M_V(m,n)<T_h\ \text{and}\ V_{I_1}(m,n)<V_{P_1}(m,n)\\ w_1\,D_{\max}(m,n)+w_2\,D_{\min}(m,n),&M_V(m,n)\ge T_h\end{cases}$$

$$w_1=\frac{1}{2}+\frac{1}{2}\cdot\frac{1-M_V(m,n)}{1-T_h},\qquad w_2=1-w_1$$

(weighted-average form reconstructed from the description; the original equation is given only as an image), where $F_D(m,n)$ is the fusion result of the dark feature images $D_{I_1}$ and $D_{P_1}$ and $T_h$ is the threshold for judging similarity in dark feature fusion. If $M_V(m,n)<T_h$, the regions of the two images centered on point (m, n) are dissimilar, and the image with the larger local region weighted variance energy is selected as the fusion result; otherwise the fusion result of the two images is a coefficient-weighted average.
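The variance-based analogue of the bright-feature sketch above (w is a normalized Gaussian kernel, e.g. from the gaussian_kernel helper sketched earlier; Th is taken from the 0.5 to 1 range given in the detailed description):

```python
import numpy as np
from scipy.ndimage import convolve

def fuse_dark_features(Di, Dp, w, Th=0.7):
    """S531-S533 sketch for the dark feature images."""
    mi, mp = convolve(Di, w), convolve(Dp, w)   # local weighted means
    # weighted variance via E[x^2] - E[x]^2 (valid since w sums to 1)
    Vi = convolve(Di ** 2, w) - mi ** 2         # S531
    Vp = convolve(Dp ** 2, w) - mp ** 2
    cov = convolve(Di * Dp, w) - mi * mp        # weighted covariance
    Mv = 2.0 * cov / (Vi + Vp + 1e-12)          # S532: matching degree
    w1 = 0.5 + 0.5 * (1.0 - Mv) / (1.0 - Th)    # standard weighting scheme
    big = np.where(Vi >= Vp, Di, Dp)
    small = np.where(Vi >= Vp, Dp, Di)
    return np.where(Mv < Th, big, w1 * big + (1.0 - w1) * small)
```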
As a further improvement of the above technical solution, in step S54 fuzzy logic and feature difference driving are used to fuse the detail feature image $P_{I_1}$ of image I1 with the detail feature image $P_{P_1}$ of image P1, specifically including:

S541, obtaining the local gradients of the detail feature images $P_{I_1}$ and $P_{P_1}$:

$$T_k(m,n)=\sqrt{[G_h^k(m,n)]^2+[G_v^k(m,n)]^2}$$

where k is I1 or P1, $T_k(m,n)$ denotes the local gradient of the detail feature image $P_{I_1}$ or $P_{P_1}$ at pixel (m, n), and $G_h^k$ and $G_v^k$ respectively denote the horizontal and vertical edge images obtained by convolving the horizontal and vertical templates of the Sobel operator with the detail feature image;

S542, obtaining the local region weighted variance energies of the detail feature images $P_{I_1}$ and $P_{P_1}$:

$$V_k^P(m,n)=\sum_{i=-t}^{t}\sum_{j=-t}^{t}w(i,j)\,[P_k(m+i,n+j)-\bar{P}_k(m,n)]^2$$

where k is I1 or P1, $V_k^P(m,n)$ denotes the local region weighted variance energy of the detail feature image $P_{I_1}$ or $P_{P_1}$ centered at point (m, n), w(i, j) is a Gaussian filter matrix, N is the size of the region, t = (N − 1)/2, and $\bar{P}_k(m,n)$ denotes the local region average centered at point (m, n);

S543, obtaining the local difference gradient ΔT(m, n), local difference variance ΔV(m, n), local gradient matching degree $M_T(m,n)$ and local weighted variance matching degree $M_{V1}(m,n)$ of the detail feature images $P_{I_1}$ and $P_{P_1}$:

$$\Delta T(m,n)=T_{I_1}(m,n)-T_{P_1}(m,n),\qquad \Delta V(m,n)=V_{I_1}^P(m,n)-V_{P_1}^P(m,n)$$

$$M_T(m,n)=\frac{2\,T_{I_1}(m,n)\,T_{P_1}(m,n)}{T_{I_1}(m,n)^2+T_{P_1}(m,n)^2},\qquad M_{V1}(m,n)=\frac{2\,V_{I_1}^P(m,n)\,V_{P_1}^P(m,n)}{V_{I_1}^P(m,n)^2+V_{P_1}^P(m,n)^2}$$

(difference and matching forms reconstructed from the description; the original equations are given as images), where $T_{I_1}(m,n)$ and $T_{P_1}(m,n)$ denote the local gradients of $P_{I_1}$ and $P_{P_1}$ at pixel (m, n), and $V_{I_1}^P(m,n)$ and $V_{P_1}^P(m,n)$ denote their local region weighted variance energies centered at point (m, n);

S544, obtaining a pixel-based decision map from the local difference gradient and local difference variance, and a feature difference degree decision map from the local gradient matching degree and local weighted variance matching degree. PDG(m, n) is the pixel-based decision map, whose cases g1 to g9 are decision maps in which the pixel positions of points (m, n) satisfying the corresponding condition on ΔT(m, n) and ΔV(m, n) are 1 and all other pixel positions are 0; DDG(m, n) is the feature difference degree decision map, whose cases d1 and d2 are decision maps in which the pixel positions of points (m, n) satisfying the corresponding condition on $M_T(m,n)$ and $M_{V1}(m,n)$ are 1 and all other pixel positions are 0 (the case-by-case defining equations are given in the original only as images and are not recoverable here);

S545, judging the determined region and the uncertain region of the detail feature images $P_{I_1}$ and $P_{P_1}$ according to the pixel-based decision map PDG(m, n) and the feature difference degree decision map DDG(m, n): g1, g2, g3, g4, g5, g6, g7 and g8 belong to the determined region, and g9 belongs to the uncertain region;

S546, fusing the determined region of the detail feature images $P_{I_1}$ and $P_{P_1}$ by feature difference driving, with

$$DIF(m,n)=\Delta T(m,n)\cdot\Delta V(m,n)$$

where $F_{cer}(m,n)$ denotes the fused image of the determined region of $P_{I_1}$ and $P_{P_1}$, DIF(m, n) denotes the fusion driving factor of the determined region, and "·" denotes the product of the values at corresponding pixel positions of the matrices (the selection equation for $F_{cer}(m,n)$ is given in the original only as an image and is not recoverable here);

S547, fusing the uncertain region of the detail feature images $P_{I_1}$ and $P_{P_1}$ by fuzzy logic theory:

$$\mu_T(P_k(m,n))=\frac{T_k(m,n)}{T_{I_1}(m,n)+T_{P_1}(m,n)},\qquad \mu_V(P_k(m,n))=\frac{V_k^P(m,n)}{V_{I_1}^P(m,n)+V_{P_1}^P(m,n)}$$

(normalized forms reconstructed from the description; the original defining equations are given as images),

$$\mu_{T\cap V}(P_k(m,n))=\min\big[\mu_T(P_k(m,n)),\ \mu_V(P_k(m,n))\big]$$

$$F_{unc}(m,n)=\frac{\mu_{T\cap V}(P_{I_1}(m,n))\cdot P_{I_1}(m,n)+\mu_{T\cap V}(P_{P_1}(m,n))\cdot P_{P_1}(m,n)}{\mu_{T\cap V}(P_{I_1}(m,n))+\mu_{T\cap V}(P_{P_1}(m,n))}$$

where $F_{unc}(m,n)$ denotes the fused image of the uncertain region of $P_{I_1}$ and $P_{P_1}$, "·" denotes the product and "/" the division of the values at corresponding pixel positions of the matrices, $\mu_{T\cap V}(P_{I_1}(m,n))$ and $\mu_{T\cap V}(P_{P_1}(m,n))$ are the membership functions of the pixel values of $P_{I_1}$ and $P_{P_1}$ at position (m, n) to the importance degree of the fused image of the uncertain region, $\mu_T(P_k(m,n))$ is the membership function of "the local gradient of the detail feature image $P_{I_1}$ or $P_{P_1}$ is large", $\mu_V(P_k(m,n))$ is the membership function of "the local weighted variance of the detail feature image $P_{I_1}$ or $P_{P_1}$ is large", and k is I1 or P1;

S548, fusing $F_{cer}(m,n)$ and $F_{unc}(m,n)$ to obtain the fusion result of the detail feature images $P_{I_1}$ and $P_{P_1}$:

$$F_{DIF}(m,n)=\begin{cases}F_{cer}(m,n),&(m,n)\ \text{in the determined region}\\ F_{unc}(m,n),&(m,n)\ \text{in the uncertain region}\end{cases}$$

where $F_{DIF}(m,n)$ denotes the fused image of the detail feature images $P_{I_1}$ and $P_{P_1}$;

S549, performing a consistency check on $F_{DIF}(m,n)$:
a window of size 3 × 3 is moved over the image $F_{DIF}(m,n)$, and the center pixel is verified against the pixels around it in the window; if the center pixel comes from one of the images $P_{I_1}$ and $P_{P_1}$ while s (4 < s < 8) of the pixels surrounding the center pixel come from the other image, the center pixel value is changed to the pixel value of the other image at that location; the window traverses the entire image $F_{DIF}(m,n)$ to obtain the corrected $F_{DIF}(m,n)$.
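A sketch of the S549 consistency check. The binary origin map and the exact neighbour-count comparison are our reading of the 4 < s < 8 rule:

```python
import numpy as np
from scipy.ndimage import convolve

def consistency_check(F, origin, Pi, Pp, s=5):
    """origin is 1 where F took its pixel from P_I1 and 0 where it
    came from P_P1; Pi, Pp are the two detail feature images."""
    ring = np.array([[1.0, 1.0, 1.0],
                     [1.0, 0.0, 1.0],
                     [1.0, 1.0, 1.0]])
    from_i = convolve(origin.astype(float), ring, mode='nearest')
    out = F.copy()
    # centre from P_I1 but more than s of the 8 neighbours from P_P1
    flip = (origin == 1) & ((8.0 - from_i) > s)
    out[flip] = Pp[flip]
    # centre from P_P1 but more than s of the 8 neighbours from P_I1
    flip = (origin == 0) & (from_i > s)
    out[flip] = Pi[flip]
    return out
```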
As a further improvement of the above technical solution, in step S55 the fusion result F is obtained by:

$$F=\alpha F_L+\beta F_D+\gamma F_{DIF}$$

where α, β and γ are fusion weight coefficients.
The beneficial technical effects of the invention are:
1. The method of the invention fuses several polarization quantities of the infrared polarization image, the polarization quantities comprising the image Q, the image U, the image V, the total light intensity image I, the degree-of-polarization image P and the polarization angle image R, so that the fused image scene is richer and camouflaged targets are easier to identify; meanwhile, by taking the unique part of each polarization image, excluding their common part, during fusion, the problem of information redundancy among the polarization quantities is effectively solved.
2. The method of the invention separates several features of the image and integrates several uncertain, randomly varying image features, so that multiple image features are taken into account during fusion, the edge-detail information of the image is enhanced and the contrast of the image is improved.
Drawings
FIG. 1 is a flowchart of an infrared polarization image fusion method based on multi-feature and feature difference driving according to the present embodiment;
fig. 2 is a flowchart of a fusion method based on multi-feature separation according to this embodiment.
Detailed Description
In order to facilitate the practice of the invention, further description is provided below with reference to specific examples.
As shown in fig. 1, an infrared polarization image fusion method based on multi-feature and feature difference driving specifically includes the following steps:
S1, representing the polarization of light by a Stokes vector, i.e., S = (I, Q, U, V), and calculating the degree-of-polarization image P and the polarization angle image R from the S vector:
in a practical polarization measurement a phase retarder is not needed; the Stokes parameters can be obtained by merely rotating a linear polarizer. The degree-of-polarization image P and the polarization angle image R of the polarized light can be expressed as:

$$P=\frac{\sqrt{Q^2+U^2}}{I},\qquad R=\frac{1}{2}\arctan\frac{U}{Q}.$$

S2, linearly weighting the polarization angle image R and the image U to obtain an image R':

$$R'=(R+U)/2.$$

S3, calculating the unique parts of the image R', the total light intensity image I and the degree-of-polarization image P, excluding their common part, recorded as R1, I1 and P1 respectively, which specifically includes:
S31, the image R', the total light intensity image I and the degree-of-polarization image P contain redundant as well as complementary information; the common part Co between them is calculated by:

$$Co=R'\cap I\cap P=\min\{R',I,P\};$$

S32, calculating the unique parts R1, I1 and P1 of the image R', the total light intensity image I and the degree-of-polarization image P:

$$R_1=R'-Co,\qquad I_1=I-Co,\qquad P_1=P-Co.$$
S4, mapping the images P1, I1 and R1 to the R, G and B channels of RGB space respectively to obtain an RGB image, converting the RGB image into a YUV image, and extracting the luminance component Y, where in the YUV image UV are the chrominance components and Y is the luminance component:
S41, mapping the image P1 to the R channel, the image I1 to the G channel and the image R1 to the B channel of RGB space to obtain an RGB image;
S42, converting the RGB image into a YUV image:

$$\begin{bmatrix}Y\\U\\V\end{bmatrix}=\begin{bmatrix}0.299&0.587&0.114\\-0.147&-0.289&0.436\\0.615&-0.515&-0.100\end{bmatrix}\begin{bmatrix}R\\G\\B\end{bmatrix}$$

S43, extracting the luminance component Y:

Y = 0.299R + 0.587G + 0.114B.

S5, referring to fig. 2, fusing the images I1 and P1 by a method based on multi-feature separation to obtain a fusion result F. The method based on multi-feature separation first performs multi-feature separation on the images I1 and P1, then fuses the same feature of the images I1 and P1, and then fuses the fusion results of the different features again, thus completing the fusion of the images I1 and P1. The method based on multi-feature separation in this embodiment separates the dark features, bright features and detail features of the image and integrates several uncertain, randomly varying image features, namely local region energy features, local region variance features and local region gradient features, and considers the relationship between image pixels, so that bright and dark features are both taken into account during fusion, the edge-detail information of the image is enhanced and the contrast of the image is improved. It specifically includes the following steps:
S51, performing multi-feature separation on the images I1 and P1 respectively by a multi-feature separation method based on the dark primary color theory, to obtain the bright feature image, dark feature image and detail feature image of image I1 and the bright feature image, dark feature image and detail feature image of image P1. The dark primary color (dark channel) was proposed by He et al. for estimating the transmissivity in the atmospheric scattering model and quickly defogging natural images. For a gray-level image, the dark primary color image covers the bright regions of the original image and embodies its low-frequency part, i.e., the regions of the original image where the gray level changes relatively smoothly are retained, making the difference between the bright and dark features more prominent, while the information of local regions with relatively violent gray-level changes and high contrast, especially edge-detail information, is lost. The obtaining process specifically includes:
S511, obtaining the dark primary color images of I1 and P1:

$$I_1^{dark}(x)=\min_{y\in N(x)}\Big(\min_{C\in\{R,G,B\}}(I_1)^C(y)\Big)$$

$$P_1^{dark}(x)=\min_{y\in N(x)}\Big(\min_{C\in\{R,G,B\}}(P_1)^C(y)\Big)$$

where $I_1^{dark}$ is the dark primary color image of I1, $P_1^{dark}$ is the dark primary color image of P1, C is one of the three color channels R, G, B of image I1 or P1, N(x) is the window region centered on pixel point x, and $(I_1)^C(y)$ and $(P_1)^C(y)$ denote the color channel maps of I1 and P1 respectively;

S512, negating the images I1 and P1 respectively to obtain images $\bar{I}_1$ and $\bar{P}_1$, and fusing the dark primary color images $I_1^{dark}$ and $P_1^{dark}$ with the images $\bar{I}_1$ and $\bar{P}_1$ respectively, according to the rule of taking the value with the smaller absolute value, to obtain the dark feature image $D_{I_1}$ of image I1 and the dark feature image $D_{P_1}$ of image P1:

$$\bar{I}_1=1-I_1,\qquad \bar{P}_1=1-P_1$$

$$D_{I_1}(x)=\begin{cases}I_1^{dark}(x),&|I_1^{dark}(x)|\le|\bar{I}_1(x)|\\ \bar{I}_1(x),&\text{otherwise}\end{cases}\qquad D_{P_1}(x)=\begin{cases}P_1^{dark}(x),&|P_1^{dark}(x)|\le|\bar{P}_1(x)|\\ \bar{P}_1(x),&\text{otherwise}\end{cases}$$

(the negation is written for gray values normalized to [0, 1]);

S513, differencing the dark primary color images $I_1^{dark}$ and $P_1^{dark}$ with the corresponding dark feature images $D_{I_1}$ and $D_{P_1}$ to obtain the bright feature image $B_{I_1}$ of image I1 and the bright feature image $B_{P_1}$ of image P1:

$$B_{I_1}=I_1^{dark}-D_{I_1},\qquad B_{P_1}=P_1^{dark}-D_{P_1}$$

S514, differencing the images I1 and P1 with the corresponding dark primary color images $I_1^{dark}$ and $P_1^{dark}$ to obtain the detail feature image $P_{I_1}$ of image I1 and the detail feature image $P_{P_1}$ of image P1:

$$P_{I_1}=I_1-I_1^{dark},\qquad P_{P_1}=P_1-P_1^{dark}$$
S52, fusing the bright feature image $B_{I_1}$ of image I1 with the bright feature image $B_{P_1}$ of image P1 by a matching method based on local region energy features to obtain the bright feature fusion result $F_L$. The bright feature information concentrates on the bright regions of the original image and reflects its low-frequency components. The calculation process specifically includes:

S521, obtaining the Gaussian-weighted local energies of the bright feature images $B_{I_1}$ and $B_{P_1}$:

$$E_k(m,n)=\sum_{i=-t}^{t}\sum_{j=-t}^{t}w(i,j)\,[B_k(m+i,n+j)]^2$$

where k is I1 or P1, $E_k(m,n)$ denotes the Gaussian-weighted local energy of the bright feature image $B_{I_1}$ or $B_{P_1}$ centered at point (m, n), w(i, j) is a Gaussian filter matrix, N is the size of the region, and t = (N − 1)/2;

S522, obtaining the matching degree of the Gaussian-weighted local energies of the bright feature images $B_{I_1}$ and $B_{P_1}$:

$$M_E(m,n)=\frac{2\sum_{i=-t}^{t}\sum_{j=-t}^{t}w(i,j)\,B_{I_1}(m+i,n+j)\,B_{P_1}(m+i,n+j)}{E_{I_1}(m,n)+E_{P_1}(m,n)}$$

where $M_E(m,n)$ denotes the matching degree of the Gaussian-weighted local energies of $B_{I_1}$ and $B_{P_1}$, and $E_{I_1}(m,n)$ and $E_{P_1}(m,n)$ denote the Gaussian-weighted local energies of $B_{I_1}$ and $B_{P_1}$ centered at point (m, n);

S523, fusing the bright feature images $B_{I_1}$ and $B_{P_1}$ through the Gaussian-weighted local energies and their matching degree:

$$F_L(m,n)=\begin{cases}B_{I_1}(m,n),&M_E(m,n)<T_l\ \text{and}\ E_{I_1}(m,n)\ge E_{P_1}(m,n)\\ B_{P_1}(m,n),&M_E(m,n)<T_l\ \text{and}\ E_{I_1}(m,n)<E_{P_1}(m,n)\\ w_1\,B_{\max}(m,n)+w_2\,B_{\min}(m,n),&M_E(m,n)\ge T_l\end{cases}$$

$$w_1=\frac{1}{2}+\frac{1}{2}\cdot\frac{1-M_E(m,n)}{1-T_l},\qquad w_2=1-w_1$$

(weighted-average form reconstructed from the description; the original equation is given only as an image), where $F_L(m,n)$ is the fusion result of the bright feature images $B_{I_1}$ and $B_{P_1}$, and $T_l$ is the threshold for judging similarity in bright feature fusion, taking a value of 0 to 0.5. If $M_E(m,n)<T_l$, the regions of $B_{I_1}$ and $B_{P_1}$ centered on point (m, n) are dissimilar, and the fusion result selects the image with the larger Gaussian-weighted region energy; otherwise the fusion result of $B_{I_1}$ and $B_{P_1}$ is a coefficient-weighted average.
S53, fusing the dark feature image $D_{I_1}$ of image I1 with the dark feature image $D_{P_1}$ of image P1 by a matching method based on local region weighted variance features to obtain the dark feature fusion result $F_D$. Dark feature images lack the bright regions of the source image but can still be regarded as an approximation of the source image; they contain the main energy of the image and embody its basic outline. The calculation process specifically includes:

S531, obtaining the local region weighted variance energies of the dark feature images $D_{I_1}$ and $D_{P_1}$:

$$V_k(m,n)=\sum_{i=-t}^{t}\sum_{j=-t}^{t}w(i,j)\,[D_k(m+i,n+j)-\bar{D}_k(m,n)]^2$$

where k is I1 or P1, $V_k(m,n)$ denotes the local region weighted variance energy of the dark feature image $D_{I_1}$ or $D_{P_1}$ centered at point (m, n), w(i, j) is a Gaussian filter matrix, N is the size of the region, t = (N − 1)/2, and $\bar{D}_k(m,n)$ denotes the local region average centered at point (m, n);

S532, obtaining the matching degree of the local region weighted variance energies of the dark feature images $D_{I_1}$ and $D_{P_1}$:

$$M_V(m,n)=\frac{2\sum_{i=-t}^{t}\sum_{j=-t}^{t}w(i,j)\,[D_{I_1}(m+i,n+j)-\bar{D}_{I_1}(m,n)]\,[D_{P_1}(m+i,n+j)-\bar{D}_{P_1}(m,n)]}{V_{I_1}(m,n)+V_{P_1}(m,n)}$$

where $M_V(m,n)$ denotes the matching degree of the local region weighted variance energies of $D_{I_1}$ and $D_{P_1}$, and $V_{I_1}(m,n)$ and $V_{P_1}(m,n)$ denote the local region weighted variance energies of $D_{I_1}$ and $D_{P_1}$ centered at point (m, n);

S533, fusing the two dark feature images $D_{I_1}$ and $D_{P_1}$ through the local region weighted variance energies and their matching degree:

$$F_D(m,n)=\begin{cases}D_{I_1}(m,n),&M_V(m,n)<T_h\ \text{and}\ V_{I_1}(m,n)\ge V_{P_1}(m,n)\\ D_{P_1}(m,n),&M_V(m,n)<T_h\ \text{and}\ V_{I_1}(m,n)<V_{P_1}(m,n)\\ w_1\,D_{\max}(m,n)+w_2\,D_{\min}(m,n),&M_V(m,n)\ge T_h\end{cases}$$

$$w_1=\frac{1}{2}+\frac{1}{2}\cdot\frac{1-M_V(m,n)}{1-T_h},\qquad w_2=1-w_1$$

(weighted-average form reconstructed from the description; the original equation is given only as an image), where $F_D(m,n)$ is the fusion result of the dark feature images $D_{I_1}$ and $D_{P_1}$, and $T_h$ is the threshold for judging the similarity of the dark features, selected from 0.5 to 1. If $M_V(m,n)<T_h$, the regions of the two images centered on point (m, n) are dissimilar, and the image with the larger local region weighted variance energy is selected as the fusion result; otherwise the fusion result of the two images is a coefficient-weighted average.
S54, the local gradient and the local variance can well reflect the detail information of an image and express its definition. In order to keep as much of the detail information of the detail feature images as possible and to improve definition, fuzzy logic and feature difference driving are used to fuse the detail feature image $P_{I_1}$ of image I1 with the detail feature image $P_{P_1}$ of image P1 to obtain the detail feature fusion result $F_{DIF}$. The calculation process specifically includes:

S541, obtaining the local gradients of the detail feature images $P_{I_1}$ and $P_{P_1}$:

$$T_k(m,n)=\sqrt{[G_h^k(m,n)]^2+[G_v^k(m,n)]^2}$$

where k is I1 or P1, $T_k(m,n)$ denotes the local gradient of the detail feature image $P_{I_1}$ or $P_{P_1}$ at pixel (m, n), and $G_h^k$ and $G_v^k$ respectively denote the horizontal and vertical edge images obtained by convolving the horizontal and vertical templates of the Sobel operator with the detail feature image;
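S541 in a few lines, using scipy's separable Sobel filters (the axis conventions are our choice):

```python
import numpy as np
from scipy.ndimage import sobel

def local_gradient(detail):
    # S541: horizontal and vertical edge images from the Sobel templates
    gh = sobel(detail, axis=1)  # horizontal template
    gv = sobel(detail, axis=0)  # vertical template
    return np.sqrt(gh ** 2 + gv ** 2)
```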
S542, obtaining the local region weighted variance energies of the detail feature images $P_{I_1}$ and $P_{P_1}$:

$$V_k^P(m,n)=\sum_{i=-t}^{t}\sum_{j=-t}^{t}w(i,j)\,[P_k(m+i,n+j)-\bar{P}_k(m,n)]^2$$

where k is I1 or P1, $V_k^P(m,n)$ denotes the local region weighted variance energy of the detail feature image $P_{I_1}$ or $P_{P_1}$ centered at point (m, n), w(i, j) is a Gaussian filter matrix, N is the size of the region, t = (N − 1)/2, and $\bar{P}_k(m,n)$ denotes the local region average centered at point (m, n);
S543, obtaining the local difference gradient ΔT(m, n), local difference variance ΔV(m, n), local gradient matching degree $M_T(m,n)$ and local weighted variance matching degree $M_{V1}(m,n)$ of the detail feature images $P_{I_1}$ and $P_{P_1}$:

$$\Delta T(m,n)=T_{I_1}(m,n)-T_{P_1}(m,n),\qquad \Delta V(m,n)=V_{I_1}^P(m,n)-V_{P_1}^P(m,n)$$

$$M_T(m,n)=\frac{2\,T_{I_1}(m,n)\,T_{P_1}(m,n)}{T_{I_1}(m,n)^2+T_{P_1}(m,n)^2},\qquad M_{V1}(m,n)=\frac{2\,V_{I_1}^P(m,n)\,V_{P_1}^P(m,n)}{V_{I_1}^P(m,n)^2+V_{P_1}^P(m,n)^2}$$

(difference and matching forms reconstructed from the description; the original equations are given as images), where $T_{I_1}(m,n)$ and $T_{P_1}(m,n)$ denote the local gradients of the detail feature images $P_{I_1}$ and $P_{P_1}$ at pixel (m, n), and $V_{I_1}^P(m,n)$ and $V_{P_1}^P(m,n)$ denote their local region weighted variance energies centered at point (m, n);
S544, obtaining a pixel-based decision map from the local difference gradient and local difference variance, and a feature difference degree decision map from the local gradient matching degree and local weighted variance matching degree. PDG(m, n) is the pixel-based decision map, whose cases g1 to g9 are decision maps in which the pixel positions of points (m, n) satisfying the corresponding condition on ΔT(m, n) and ΔV(m, n) are 1 and all other pixel positions are 0; DDG(m, n) is the feature difference degree decision map, whose cases d1 and d2 are decision maps in which the pixel positions of points (m, n) satisfying the corresponding condition on $M_T(m,n)$ and $M_{V1}(m,n)$ are 1 and all other pixel positions are 0 (the case-by-case defining equations are given in the original only as images and are not recoverable here);

S545, judging the determined region and the uncertain region of the detail feature images $P_{I_1}$ and $P_{P_1}$ according to the pixel-based decision map PDG(m, n) and the feature difference degree decision map DDG(m, n):
a determined region means that the PDG(m, n) or DDG(m, n) decision map can indicate whether the gray value of the pixel point is retained in the corresponding pixel of the fused image; an uncertain region means that neither decision map can indicate this. Specifically:
for the two cases g1 and g2, the local difference gradient ΔT(m, n) and the local difference variance ΔV(m, n) can both indicate whether the gray value of the corresponding pixel point is retained in the fused image, so g1 and g2 belong to the determined region;
for the two cases g3 and g4, the feature difference degree decision map DDG can be used to determine the difference degree of the local features of the two images, and the difference feature with the larger difference degree is then selected, which can indicate whether the gray value of the corresponding pixel point is retained in the fused image, so g3 and g4 belong to the determined region;
for the four cases g5, g6, g7 and g8, either one of the local difference gradient ΔT(m, n) and the local difference variance ΔV(m, n) can indicate whether the gray value of the corresponding pixel point is retained in the fused image, so g5, g6, g7 and g8 belong to the determined region;
for the case g9, neither of the two decision maps PDG and DDG can indicate whether the gray value of the corresponding pixel point is retained in the fused image, so g9 belongs to the uncertain region.
S546, fusing the determined region of the detail feature images $P_{I_1}$ and $P_{P_1}$ by feature difference driving, with

$$DIF(m,n)=\Delta T(m,n)\cdot\Delta V(m,n)$$

where $F_{cer}(m,n)$ denotes the fused image of the determined region of $P_{I_1}$ and $P_{P_1}$, DIF(m, n) denotes the fusion driving factor of the determined region, expressed as the product of the local difference gradient ΔT(m, n) and the local difference variance ΔV(m, n), and "·" denotes the product of the values at corresponding pixel positions of the matrices (the selection equation for $F_{cer}(m,n)$ is given in the original only as an image and is not recoverable here);
S547, fusing the uncertain region of the detail feature images $P_{I_1}$ and $P_{P_1}$ by fuzzy logic theory.
For the detail feature images $P_{I_1}$ and $P_{P_1}$, whether the local gradient of a detail feature image is large and whether its local weighted variance is large both need to be considered, and the membership functions of the detail feature images are constructed from this group of relations. Suppose the membership functions of "the local gradient of the detail feature image $P_{I_1}$ or $P_{P_1}$ is large" are $\mu_T(P_{I_1}(m,n))$ and $\mu_T(P_{P_1}(m,n))$, and the membership functions of "the local weighted variance of the detail feature image $P_{I_1}$ or $P_{P_1}$ is large" are $\mu_V(P_{I_1}(m,n))$ and $\mu_V(P_{P_1}(m,n))$; then

$$\mu_T(P_k(m,n))=\frac{T_k(m,n)}{T_{I_1}(m,n)+T_{P_1}(m,n)},\qquad \mu_V(P_k(m,n))=\frac{V_k^P(m,n)}{V_{I_1}^P(m,n)+V_{P_1}^P(m,n)}$$

where k is I1 or P1 (normalized forms reconstructed from the description; the original defining equations are given as images).
Using the intersection operation rule of fuzzy logic, the membership functions of the pixel values of the detail feature images $P_{I_1}$ and $P_{P_1}$ at position (m, n) to the importance degree of the fused image of the uncertain region, $\mu_{T\cap V}(P_{I_1}(m,n))$ and $\mu_{T\cap V}(P_{P_1}(m,n))$, can be calculated respectively as

$$\mu_{T\cap V}(P_k(m,n))=\min\big[\mu_T(P_k(m,n)),\ \mu_V(P_k(m,n))\big]$$

where k is I1 or P1.
The fused image of the uncertain region of the two detail feature images is:

$$F_{unc}(m,n)=\frac{\mu_{T\cap V}(P_{I_1}(m,n))\cdot P_{I_1}(m,n)+\mu_{T\cap V}(P_{P_1}(m,n))\cdot P_{P_1}(m,n)}{\mu_{T\cap V}(P_{I_1}(m,n))+\mu_{T\cap V}(P_{P_1}(m,n))}$$

where $F_{unc}(m,n)$ denotes the fused image of the uncertain region of $P_{I_1}$ and $P_{P_1}$, "·" denotes the product and "/" the division of the values at corresponding pixel positions of the matrices, and $\mu_{T\cap V}(P_{I_1}(m,n))$ and $\mu_{T\cap V}(P_{P_1}(m,n))$ are the membership functions of the pixel values of $P_{I_1}$ and $P_{P_1}$ at position (m, n) to the importance degree of the fused image of the uncertain region;
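A sketch of the S547 uncertain-region fusion; the normalized membership functions are our reconstruction of the text, and Ti/Tp and Vi/Vp are the local gradients and local weighted variances of the two detail feature images:

```python
import numpy as np

def fuse_uncertain_region(Pi, Pp, Ti, Tp, Vi, Vp, eps=1e-12):
    # memberships for "gradient is large" and "variance is large"
    mu_i = np.minimum(Ti / (Ti + Tp + eps), Vi / (Vi + Vp + eps))
    mu_p = np.minimum(Tp / (Ti + Tp + eps), Vp / (Vi + Vp + eps))
    # weighted combination normalized by the total membership
    return (mu_i * Pi + mu_p * Pp) / (mu_i + mu_p + eps)
```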
S548, fusing $F_{cer}(m,n)$ and $F_{unc}(m,n)$ to obtain the fusion result of the detail feature images $P_{I_1}$ and $P_{P_1}$:

$$F_{DIF}(m,n)=\begin{cases}F_{cer}(m,n),&(m,n)\ \text{in the determined region}\\ F_{unc}(m,n),&(m,n)\ \text{in the uncertain region}\end{cases}$$

where $F_{DIF}(m,n)$ denotes the fused image of the detail feature images $P_{I_1}$ and $P_{P_1}$;
S549, performing a consistency check on $F_{DIF}(m,n)$:
a window of size 3 × 3 is moved over the image $F_{DIF}(m,n)$, and the center pixel is verified against the pixels around it in the window. If the center pixel comes from one of the images $P_{I_1}$ and $P_{P_1}$ while s (4 < s < 8) of the pixels surrounding the center pixel come from the other image, the center pixel value is changed to the pixel value of the other image at that location; the window traverses the entire image $F_{DIF}(m,n)$ to obtain the corrected $F_{DIF}(m,n)$.
S55, fusing $F_L$, $F_D$ and $F_{DIF}$ to obtain the fusion result F:

$$F=\alpha F_L+\beta F_D+\gamma F_{DIF}$$

where α, β and γ are fusion weight coefficients with value range [0, 1]. In order to reduce the supersaturation of the fused image and improve the contrast, α takes the value 1, β takes the value 0.3 and γ takes the value 1.
S6, replacing the luminance component Y of step S4 with the fusion result F of step S5 to obtain a replaced YUV image, and performing an inverse transformation on the replaced YUV image to obtain an RGB image, namely the final polarization image fusion result:
the luminance contrast of the image is reduced during the RGB color mapping, so the luminance component needs gray-scale enhancement. This embodiment uses the gray-scale fused image in place of the luminance component to enhance the luminance, i.e., the fusion result F of step S5 replaces the luminance component Y, thereby obtaining the final polarization image fusion result.
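Putting S55 and S6 together as a sketch. The weights are the values given in this embodiment; the inverse-matrix coefficients are the standard YUV inverse consistent with the matrix in S42, and the clipping range assumes [0, 1] images:

```python
import numpy as np

def final_fusion(FL, FD, FDIF, U, V, alpha=1.0, beta=0.3, gamma=1.0):
    # S55: weighted recombination of the three feature-level results
    F = alpha * FL + beta * FD + gamma * FDIF
    # S6: replace luminance Y by F, then invert YUV -> RGB
    R = F + 1.140 * V
    G = F - 0.395 * U - 0.581 * V
    B = F + 2.032 * U
    return np.clip(np.stack([R, G, B], axis=-1), 0.0, 1.0)
```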
The foregoing description of preferred embodiments details the features of the invention and is not intended to limit the inventive concept to the particular forms of the embodiments described; other modifications and variations within the spirit of the inventive concept are also protected by this patent. The scope of the invention is defined by the claims rather than by the detailed description of the embodiments.

Claims (10)

1. An infrared polarization image fusion method based on multi-feature and feature difference driving, characterized by specifically comprising the following steps:
S1, representing the polarization of light by a Stokes vector, i.e., S = (I, Q, U, V), and calculating the degree-of-polarization image P and the polarization angle image R from the S vector;
S2, linearly weighting the polarization angle image R and the image U to obtain an image R';
S3, calculating the unique parts of the image R', the total light intensity image I and the degree-of-polarization image P, excluding their common part, and recording them as R1, I1 and P1 respectively;
S4, mapping the images P1, I1 and R1 to the R, G and B channels of RGB space respectively to obtain an RGB image, converting the RGB image into a YUV image, and extracting the luminance component Y;
S5, fusing the images I1 and P1 by a method based on multi-feature separation to obtain a fusion result F;
S6, replacing the luminance component Y of step S4 with the fusion result F of step S5 to obtain a replaced YUV image, and then inversely transforming the replaced YUV image to obtain an RGB image, namely the final polarization image fusion result.
2. The infrared polarization image fusion method based on multi-feature and feature difference driving according to claim 1, wherein step S1 specifically includes:
S11, calculating the degree-of-polarization image P:

$$P=\frac{\sqrt{Q^2+U^2}}{I}$$

where Q denotes the intensity difference between the horizontal and vertical polarization, U denotes the intensity difference between the 45° and 135° polarization directions, and I denotes the total light intensity image;
S12, calculating the polarization angle image R:

$$R=\frac{1}{2}\arctan\frac{U}{Q}.$$
3. The infrared polarization image fusion method based on multi-feature and feature difference driving according to claim 1, wherein step S3 specifically includes:
S31, calculating the common part Co of the image R', the total light intensity image I and the degree-of-polarization image P:

$$Co=R'\cap I\cap P=\min\{R',I,P\};$$

S32, calculating the unique parts R1, I1 and P1 of the image R', the total light intensity image I and the degree-of-polarization image P:

$$R_1=R'-Co,\qquad I_1=I-Co,\qquad P_1=P-Co.$$
4. The infrared polarization image fusion method based on multi-feature and feature difference driving according to claim 1, wherein step S4 specifically includes:
S41, mapping the images P1, I1 and R1 to the R, G and B channels of RGB space respectively to obtain an RGB image;
S42, converting the RGB image into a YUV image:

$$\begin{bmatrix}Y\\U\\V\end{bmatrix}=\begin{bmatrix}0.299&0.587&0.114\\-0.147&-0.289&0.436\\0.615&-0.515&-0.100\end{bmatrix}\begin{bmatrix}R\\G\\B\end{bmatrix}$$

S43, extracting the luminance component Y:

Y = 0.299R + 0.587G + 0.114B.
5. The infrared polarization image fusion method based on multi-feature and feature difference driving according to claim 1, wherein step S5 specifically includes:
S51, performing multi-feature separation on the images I1 and P1 to obtain the bright feature image, dark feature image and detail feature image of image I1 and the bright feature image, dark feature image and detail feature image of image P1;
S52, fusing the bright feature image of image I1 with the bright feature image of image P1 to obtain a bright feature fusion result F_L;
S53, fusing the dark feature image of image I1 with the dark feature image of image P1 to obtain a dark feature fusion result F_D;
S54, fusing the detail feature image of image I1 with the detail feature image of image P1 to obtain a detail feature fusion result F_DIF;
S55, fusing F_L, F_D and F_DIF to obtain the fusion result F.
6. The infrared polarization image fusion method based on multi-feature and feature difference driving according to claim 5, characterized in that in step S51 a multi-feature separation method based on the dark primary color theory is used to separate the images I1 and P1, specifically comprising:

S511, obtaining the dark primary color images of I1 and P1:

$$I_1^{dark}(x)=\min_{y\in N(x)}\Big(\min_{C\in\{R,G,B\}}(I_1)^C(y)\Big)$$

$$P_1^{dark}(x)=\min_{y\in N(x)}\Big(\min_{C\in\{R,G,B\}}(P_1)^C(y)\Big)$$

where $I_1^{dark}$ is the dark primary color image of I1, $P_1^{dark}$ is the dark primary color image of P1, C is one of the three color channels R, G, B of image I1 or P1, N(x) is the window region centered on pixel point x, and $(I_1)^C(y)$ and $(P_1)^C(y)$ denote the color channel maps of I1 and P1 respectively;

S512, negating the images I1 and P1 respectively to obtain images $\bar{I}_1$ and $\bar{P}_1$, and fusing the dark primary color images $I_1^{dark}$ and $P_1^{dark}$ with the images $\bar{I}_1$ and $\bar{P}_1$ respectively, according to the rule of taking the value with the smaller absolute value, to obtain the dark feature image $D_{I_1}$ of image I1 and the dark feature image $D_{P_1}$ of image P1:

$$\bar{I}_1=1-I_1,\qquad \bar{P}_1=1-P_1$$

$$D_{I_1}(x)=\begin{cases}I_1^{dark}(x),&|I_1^{dark}(x)|\le|\bar{I}_1(x)|\\ \bar{I}_1(x),&\text{otherwise}\end{cases}\qquad D_{P_1}(x)=\begin{cases}P_1^{dark}(x),&|P_1^{dark}(x)|\le|\bar{P}_1(x)|\\ \bar{P}_1(x),&\text{otherwise}\end{cases}$$

S513, differencing the dark primary color images $I_1^{dark}$ and $P_1^{dark}$ with the corresponding dark feature images $D_{I_1}$ and $D_{P_1}$ to obtain the bright feature image $B_{I_1}$ of image I1 and the bright feature image $B_{P_1}$ of image P1:

$$B_{I_1}=I_1^{dark}-D_{I_1},\qquad B_{P_1}=P_1^{dark}-D_{P_1}$$

S514, differencing the images I1 and P1 with the corresponding dark primary color images $I_1^{dark}$ and $P_1^{dark}$ to obtain the detail feature image $P_{I_1}$ of image I1 and the detail feature image $P_{P_1}$ of image P1:

$$P_{I_1}=I_1-I_1^{dark},\qquad P_{P_1}=P_1-P_1^{dark}$$
7. The multi-feature based on claim 5The infrared polarization image fusion method driven by feature difference is characterized in that in step S52, an image I is fused by adopting a matching method based on local region energy features1The bright feature image and the image P1The bright feature image specifically includes:
S521, obtaining the Gaussian weighted local energies of the bright feature images I_1^{L} and P_1^{L}:

$$E_k(m,n) = \sum_{i=-t}^{t} \sum_{j=-t}^{t} w(i,j)\,\big[ k^{L}(m+i,\,n+j) \big]^2$$

where k is I_1 or P_1, E_k(m,n) denotes the Gaussian weighted local energy of bright feature image I_1^{L} or P_1^{L} centered at point (m,n), w(i,j) is a Gaussian filter matrix, N is the size of the region, and t = (N-1)/2;
S522, obtaining the matching degree of the Gaussian weighted local energies of the bright feature images I_1^{L} and P_1^{L}:

$$M_E(m,n) = \frac{2\,E_{I_1}(m,n)\,E_{P_1}(m,n)}{E_{I_1}(m,n)^2 + E_{P_1}(m,n)^2}$$

where M_E(m,n) denotes the matching degree of the Gaussian weighted local energies of the bright feature images I_1^{L} and P_1^{L}, E_{I_1}(m,n) denotes the Gaussian weighted local energy of I_1^{L} centered at point (m,n), and E_{P_1}(m,n) denotes the Gaussian weighted local energy of P_1^{L} centered at point (m,n);
S523, fusing the bright feature images I_1^{L} and P_1^{L} through the Gaussian weighted local energies and their matching degree:

$$F_L(m,n) = \begin{cases} I_1^{L}(m,n), & M_E(m,n) < T_l \ \text{and}\ E_{I_1}(m,n) \ge E_{P_1}(m,n) \\ P_1^{L}(m,n), & M_E(m,n) < T_l \ \text{and}\ E_{I_1}(m,n) < E_{P_1}(m,n) \\ \omega_{\max}\,k_{\max}^{L}(m,n) + \omega_{\min}\,k_{\min}^{L}(m,n), & M_E(m,n) \ge T_l \end{cases}$$

$$\omega_{\max} = \frac{1}{2} + \frac{1}{2}\cdot\frac{1 - M_E(m,n)}{1 - T_l}, \qquad \omega_{\min} = 1 - \omega_{\max}$$

where F_L(m,n) is the fusion result of the bright feature images I_1^{L} and P_1^{L}, and T_l is the threshold for judging similarity in bright feature fusion: if M_E(m,n) < T_l, the regions of I_1^{L} and P_1^{L} centered on point (m,n) are dissimilar and the fusion result selects the image with the larger Gaussian weighted local energy, denoted k_max^{L}; otherwise the fusion result is a coefficient weighted average of the two.
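A sketch of claim 7 under the reconstruction above; the Gaussian window (sigma), the threshold value T_l and the adaptive-weight formula are assumptions of this sketch. gaussian_filter realises the normalised Gaussian filter matrix w(i,j) as a convolution.

    import numpy as np
    from scipy.ndimage import gaussian_filter

    def fuse_bright(A, B, sigma=1.0, T_l=0.8):
        # S521: Gaussian weighted local energy of each bright feature image
        E_A = gaussian_filter(A**2, sigma)
        E_B = gaussian_filter(B**2, sigma)
        # S522: matching degree of the two local energies (small epsilon
        # guards against division by zero in flat regions)
        M_E = 2 * E_A * E_B / (E_A**2 + E_B**2 + 1e-12)
        # S523: dissimilar regions (M_E < T_l) take the larger-energy
        # source; similar regions take an adaptively weighted average
        w_max = 0.5 + 0.5 * (1 - M_E) / (1 - T_l)
        big = np.where(E_A >= E_B, A, B)
        small = np.where(E_A >= E_B, B, A)
        return np.where(M_E < T_l, big, w_max * big + (1 - w_max) * small)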
8. The infrared polarization image fusion method based on multi-feature and feature difference driving according to claim 5, characterized in that in step S53 the dark feature image of image I_1 and the dark feature image of image P_1 are fused by a matching method based on local region weighted variance features, specifically comprising:

S531, obtaining the local region weighted variance energies of the dark feature images I_1^{D} and P_1^{D}:

$$V_k(m,n) = \sum_{i=-t}^{t} \sum_{j=-t}^{t} w(i,j)\,\big[ k^{D}(m+i,\,n+j) - \bar{k}^{D}(m,n) \big]^2$$

where k is I_1 or P_1, V_k(m,n) denotes the local region weighted variance energy of dark feature image I_1^{D} or P_1^{D} centered at point (m,n), w(i,j) is a Gaussian filter matrix, N is the size of the region, t = (N-1)/2, and \bar{k}^{D}(m,n) denotes the local region average centered at point (m,n);
S532, obtaining the matching degree of the local region weighted variance energies of the dark feature images I_1^{D} and P_1^{D}:

$$M_V(m,n) = \frac{2\,V_{I_1}(m,n)\,V_{P_1}(m,n)}{V_{I_1}(m,n)^2 + V_{P_1}(m,n)^2}$$

where M_V(m,n) denotes the matching degree of the local region weighted variance energies of the dark feature images I_1^{D} and P_1^{D}, V_{I_1}(m,n) denotes the local region weighted variance energy of I_1^{D} centered at point (m,n), and V_{P_1}(m,n) denotes the local region weighted variance energy of P_1^{D} centered at point (m,n);
S533, fusing the two dark feature images I_1^{D} and P_1^{D} through the local region weighted variance energies and their matching degree:

$$F_D(m,n) = \begin{cases} I_1^{D}(m,n), & M_V(m,n) < T_h \ \text{and}\ V_{I_1}(m,n) \ge V_{P_1}(m,n) \\ P_1^{D}(m,n), & M_V(m,n) < T_h \ \text{and}\ V_{I_1}(m,n) < V_{P_1}(m,n) \\ \omega_{\max}\,k_{\max}^{D}(m,n) + \omega_{\min}\,k_{\min}^{D}(m,n), & M_V(m,n) \ge T_h \end{cases}$$

$$\omega_{\max} = \frac{1}{2} + \frac{1}{2}\cdot\frac{1 - M_V(m,n)}{1 - T_h}, \qquad \omega_{\min} = 1 - \omega_{\max}$$

where F_D(m,n) is the fusion result of the dark feature images I_1^{D} and P_1^{D}, and T_h is the threshold for judging similarity in dark feature fusion: if M_V(m,n) < T_h, the regions of the two images centered on point (m,n) are dissimilar and the fusion result selects the image with the larger local region weighted variance energy; otherwise the fusion result is a coefficient weighted average.
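A sketch of claim 8 under the reconstruction above; the Gaussian window and the threshold T_h are assumptions. With normalised weights, the weighted variance of S531 equals E[X^2] - E[X]^2 under the same Gaussian window, which is what the helper computes.

    import numpy as np
    from scipy.ndimage import gaussian_filter

    def fuse_dark(A, B, sigma=1.0, T_h=0.8):
        # S531: local region weighted variance energy via E[X^2] - E[X]^2
        # under a Gaussian window (equivalent to the w(i,j)-weighted form)
        def wvar(X):
            m = gaussian_filter(X, sigma)
            return gaussian_filter(X**2, sigma) - m**2
        V_A, V_B = wvar(A), wvar(B)
        # S532: matching degree of the two variance energies
        M_V = 2 * V_A * V_B / (V_A**2 + V_B**2 + 1e-12)
        # S533: dissimilar -> larger-variance source; similar -> adaptively
        # weighted average of the two dark feature images
        w_max = 0.5 + 0.5 * (1 - M_V) / (1 - T_h)
        big = np.where(V_A >= V_B, A, B)
        small = np.where(V_A >= V_B, B, A)
        return np.where(M_V < T_h, big, w_max * big + (1 - w_max) * small)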
9. The infrared polarization image fusion method based on multi-feature and feature difference driving according to claim 5, characterized in that in step S54 the detail feature image of image I_1 and the detail feature image of image P_1 are fused driven by fuzzy logic and feature difference, specifically comprising:

S541, obtaining the local gradients of the detail feature images I_1^{DIF} and P_1^{DIF}:

$$T_k(m,n) = \sqrt{ \big[ G_x^{k}(m,n) \big]^2 + \big[ G_y^{k}(m,n) \big]^2 }$$

where k is I_1 or P_1, T_k(m,n) denotes the local gradient of detail feature image I_1^{DIF} or P_1^{DIF} at pixel (m,n), and G_x^{k} and G_y^{k} denote the horizontal and vertical edge images obtained by convolving the detail feature image with the horizontal and vertical templates of the Sobel operator;
S542, obtaining the local region weighted variance energies of the detail feature images I_1^{DIF} and P_1^{DIF}:

$$V_k(m,n) = \sum_{i=-t}^{t} \sum_{j=-t}^{t} w(i,j)\,\big[ k^{DIF}(m+i,\,n+j) - \bar{k}^{DIF}(m,n) \big]^2$$

where k is I_1 or P_1, V_k(m,n) denotes the local region weighted variance energy of detail feature image I_1^{DIF} or P_1^{DIF} centered at point (m,n), w(i,j) is a Gaussian filter matrix, N is the size of the region, t = (N-1)/2, and \bar{k}^{DIF}(m,n) denotes the local region average centered at point (m,n);
S543, obtaining the local difference gradient ΔT(m,n), local difference variance ΔV(m,n), local gradient matching degree M_T(m,n) and local weighted variance matching degree M_V1(m,n) of the detail feature images I_1^{DIF} and P_1^{DIF}:

$$\Delta T(m,n) = \big| T_{I_1}(m,n) - T_{P_1}(m,n) \big|, \qquad \Delta V(m,n) = \big| V_{I_1}(m,n) - V_{P_1}(m,n) \big|$$

$$M_T(m,n) = \frac{2\,T_{I_1}(m,n)\,T_{P_1}(m,n)}{T_{I_1}(m,n)^2 + T_{P_1}(m,n)^2}, \qquad M_{V1}(m,n) = \frac{2\,V_{I_1}(m,n)\,V_{P_1}(m,n)}{V_{I_1}(m,n)^2 + V_{P_1}(m,n)^2}$$

where T_{I_1}(m,n) and T_{P_1}(m,n) denote the local gradients of detail feature images I_1^{DIF} and P_1^{DIF} at pixel (m,n), and V_{I_1}(m,n) and V_{P_1}(m,n) denote their local region weighted variance energies centered at point (m,n);
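A sketch of steps S541 to S543 under the reconstruction above; the Gaussian window is an assumption, and the matching degrees follow the same normalised-product form used for the bright and dark features.

    import numpy as np
    from scipy.ndimage import sobel, gaussian_filter

    def detail_metrics(A, B, sigma=1.0):
        # S541: local gradient magnitude from Sobel horizontal/vertical edges
        def grad(X):
            return np.sqrt(sobel(X, axis=1)**2 + sobel(X, axis=0)**2)
        # S542: Gaussian-weighted local variance energy (as in fuse_dark)
        def wvar(X):
            m = gaussian_filter(X, sigma)
            return gaussian_filter(X**2, sigma) - m**2
        T_A, T_B = grad(A), grad(B)
        V_A, V_B = wvar(A), wvar(B)
        # S543: local difference maps and matching degrees
        dT, dV = np.abs(T_A - T_B), np.abs(V_A - V_B)
        M_T = 2 * T_A * T_B / (T_A**2 + T_B**2 + 1e-12)
        M_V1 = 2 * V_A * V_B / (V_A**2 + V_B**2 + 1e-12)
        return dT, dV, M_T, M_V1, T_A, T_B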
S544, obtaining a pixel-based decision graph from the local difference gradient and local difference variance, and a feature difference degree decision graph from the local gradient matching degree and the local weighted variance matching degree, where PDG(m,n) is the pixel-based decision graph, g_1~g_9 denote decision graphs whose value is 1 at pixel positions (m,n) satisfying the corresponding condition and 0 at other pixel positions, DDG(m,n) is the feature difference degree decision graph, and d_1 and d_2 are decision graphs whose value is 1 at pixel positions (m,n) satisfying the corresponding condition and 0 at other pixel positions;

S545, judging the determined regions and uncertain regions of the detail feature images I_1^{DIF} and P_1^{DIF} according to the pixel-based decision graph PDG(m,n) and the feature difference degree decision graph DDG(m,n): g_1, g_2, g_3, g_4, g_5, g_6, g_7 and g_8 belong to the determined region, and g_9 belongs to the uncertain region;
S546, fusing the determined regions of the detail feature images I_1^{DIF} and P_1^{DIF} driven by the feature difference:

$$DIF(m,n) = \Delta T(m,n) \cdot \Delta V(m,n)$$

where F_DIF^{d}(m,n) denotes the fused image of the determined regions of the detail feature images I_1^{DIF} and P_1^{DIF}, DIF(m,n) denotes the fusion driving factor of the determined region, and "·" denotes the product of the values at corresponding pixel positions in the matrices;
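A sketch of the S546 driving factor; the exact selection conditions g_1~g_9 are defined in claim figures that this text does not reproduce, so the region rule below (take the source with the larger local gradient) is an assumption of this sketch.

    import numpy as np

    def determined_region_fusion(A, B, T_A, T_B, dT, dV):
        # S546: feature-difference driving factor DIF(m,n) = dT(m,n)*dV(m,n),
        # the element-wise product of local difference gradient and variance
        DIF = dT * dV
        # Assumed selection for the determined region: keep the source whose
        # local gradient dominates at each pixel
        F_det = np.where(T_A >= T_B, A, B)
        return F_det, DIF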
S547, fusing the uncertain regions of the detail feature images I_1^{DIF} and P_1^{DIF} using fuzzy logic theory:

$$\mu_{T \cap V}(P_k(m,n)) = \min\big[ \mu_T(P_k(m,n)),\ \mu_V(P_k(m,n)) \big]$$

$$F_{DIF}^{u}(m,n) = \frac{ \mu_{T\cap V}(P_{I_1}(m,n)) \cdot I_1^{DIF}(m,n) + \mu_{T\cap V}(P_{P_1}(m,n)) \cdot P_1^{DIF}(m,n) }{ \mu_{T\cap V}(P_{I_1}(m,n)) + \mu_{T\cap V}(P_{P_1}(m,n)) }$$

where F_DIF^{u}(m,n) denotes the fused image of the uncertain regions of the detail feature images I_1^{DIF} and P_1^{DIF}, "·" denotes the product and "/" the division of the values at corresponding pixel positions in the matrices, μ_{T∩V}(P_{I_1}(m,n)) denotes the membership function of the pixel value of detail feature image I_1^{DIF} at position (m,n) to the importance degree of the fused image of the uncertain region, μ_{T∩V}(P_{P_1}(m,n)) denotes the corresponding membership function for detail feature image P_1^{DIF}, μ_T(P_k(m,n)) denotes the membership function of the case "the local gradient of detail feature image k is larger", μ_V(P_k(m,n)) denotes the membership function of the case "the local region weighted variance energy of detail feature image k is larger", and k is I_1 or P_1;
S548, fusing F_DIF^{d} and F_DIF^{u} to obtain the fusion result of the detail feature images I_1^{DIF} and P_1^{DIF}:

$$F_{DIF}(m,n) = \begin{cases} F_{DIF}^{d}(m,n), & (m,n) \ \text{in the determined region} \\ F_{DIF}^{u}(m,n), & (m,n) \ \text{in the uncertain region} \end{cases}$$

where F_DIF(m,n) denotes the fused image of the detail feature images I_1^{DIF} and P_1^{DIF};
S549, performing a consistency check on F_DIF(m,n):

A window of size 3 × 3 is moved over image F_DIF(m,n), and the center pixel is verified against the surrounding pixels of the window: if the center pixel comes from one of the images I_1^{DIF} and P_1^{DIF} while s of the surrounding pixels of the center pixel come from the other image, where 4 < s < 8, the center pixel value is changed to the pixel value of the other image at that location; the window traverses the entire image F_DIF(m,n) to obtain the corrected F_DIF(m,n).
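A sketch of the S547 uncertain-region blend (as reconstructed above) and the S549 consistency check. The boolean source map src is an assumed encoding of where each fused pixel came from; mu_A and mu_B are the μ_{T∩V} maps, i.e. the pixel-wise minimum of the μ_T and μ_V memberships for each source.

    import numpy as np

    def fuse_detail_uncertain(A, B, mu_A, mu_B):
        # S547: membership-weighted average of the two detail images over
        # the uncertain region (epsilon avoids division by zero)
        return (mu_A * A + mu_B * B) / (mu_A + mu_B + 1e-12)

    def consistency_check(F, src, A, B):
        # S549: 3x3-window consistency verification. src is True where the
        # fused pixel was taken from A (I1's detail image), False where it
        # came from B (P1's); an assumed encoding for this sketch.
        F = F.copy()
        H, W = F.shape
        for m in range(1, H - 1):
            for n in range(1, W - 1):
                # count the neighbours that come from the other source image
                s = np.sum(src[m-1:m+2, n-1:n+2] != src[m, n])
                if 4 < s < 8:   # the claim's condition on s
                    F[m, n] = B[m, n] if src[m, n] else A[m, n]
        return F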
10. The infrared polarization image fusion method based on multi-feature and feature difference driving according to claim 5, characterized in that in step S55 the fusion result F is obtained by:

$$F = \alpha F_L + \beta F_D + \gamma F_{DIF}$$

where α, β and γ are fusion weight coefficients.
CN201811180813.8A 2018-10-09 2018-10-09 Infrared polarization image fusion method based on multi-feature and feature difference driving Active CN109410160B (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN201811180813.8A CN109410160B (en) 2018-10-09 2018-10-09 Infrared polarization image fusion method based on multi-feature and feature difference driving


Publications (2)

Publication Number Publication Date
CN109410160A CN109410160A (en) 2019-03-01
CN109410160B true CN109410160B (en) 2020-09-22

Family

ID=65467599

Family Applications (1)

Application Number Title Priority Date Filing Date
CN201811180813.8A Active CN109410160B (en) 2018-10-09 2018-10-09 Infrared polarization image fusion method based on multi-feature and feature difference driving

Country Status (1)

Country Link
CN (1) CN109410160B (en)

Families Citing this family (3)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN111292279B (en) * 2020-01-17 2022-07-29 中国科学院上海技术物理研究所 Polarization image visualization method based on color image fusion
CN115035210B (en) * 2022-08-10 2022-11-11 天津恒宇医疗科技有限公司 PS-OCT visibility improving method and system based on polarization multi-parameter fusion
CN116091361B (en) * 2023-03-23 2023-07-21 长春理工大学 Multi-polarization parameter image fusion method, system and terrain exploration monitor

Citations (5)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN102682443A (en) * 2012-05-10 2012-09-19 合肥工业大学 Rapid defogging algorithm based on polarization image guide
CN104835113A (en) * 2015-04-30 2015-08-12 北京环境特性研究所 Polarization image fusion method based on super-resolution image reconstruction
CN104978724A (en) * 2015-04-02 2015-10-14 中国人民解放军63655部队 Infrared polarization fusion method based on multi-scale transformation and pulse coupled neural network
CN105139347A (en) * 2015-07-10 2015-12-09 中国科学院西安光学精密机械研究所 Polarized image defogging method combined with dark channel prior principle
CN105279747A (en) * 2015-11-25 2016-01-27 中北大学 Infrared polarization and light intensity image fusing method guided by multi-feature objective function

Family Cites Families (1)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US9008457B2 (en) * 2010-05-31 2015-04-14 Pesonify, Inc. Systems and methods for illumination correction of an image


Also Published As

Publication number Publication date
CN109410160A (en) 2019-03-01


Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination
GR01 Patent grant