CN109377468A - Pseudo-color fusion method for infrared radiation and polarization images based on multiple features - Google Patents

Pseudo-color fusion method for infrared radiation and polarization images based on multiple features

Info

Publication number
CN109377468A
CN109377468A
Authority
CN
China
Prior art keywords
image
local
images
feature
formula
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Pending
Application number
CN201811180835.4A
Other languages
Chinese (zh)
Inventor
冉骏
宋斌
陈蓉
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Hunan Source Letter Photoelectric Polytron Technologies Inc
Original Assignee
Hunan Source Letter Photoelectric Polytron Technologies Inc
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Hunan Source Letter Photoelectric Polytron Technologies Inc filed Critical Hunan Source Letter Photoelectric Polytron Technologies Inc
Priority to CN201811180835.4A priority Critical patent/CN109377468A/en
Publication of CN109377468A publication Critical patent/CN109377468A/en
Pending legal-status Critical Current


Classifications

    • G: PHYSICS
    • G06: COMPUTING; CALCULATING OR COUNTING
    • G06T: IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T5/00: Image enhancement or restoration
    • G06T5/50: Image enhancement or restoration using two or more images, e.g. averaging or subtraction
    • G06T7/00: Image analysis
    • G06T7/90: Determination of colour characteristics

Landscapes

  • Engineering & Computer Science (AREA)
  • Physics & Mathematics (AREA)
  • General Physics & Mathematics (AREA)
  • Theoretical Computer Science (AREA)
  • Computer Vision & Pattern Recognition (AREA)
  • Image Processing (AREA)

Abstract

Disclosed is a pseudo-color fusion method for infrared radiation and polarization images based on multiple features, belonging to the technical field of image processing. The method first performs RGB color mapping of the infrared radiation and polarization images; the RGB space is then converted to YUV space and the luminance component Y is extracted. Multi-feature separation is applied to the infrared radiation and infrared polarization images to obtain their respective bright, dark and detail features. The bright and dark feature images of the two images are fused by feature matching, and their detail features are fused using fuzzy logic and feature-difference driving. The fused bright, dark and detail results are then combined, the combined result replaces the luminance component Y, and an inverse YUV transform yields the final pseudo-color fusion result. The method effectively improves the quality of infrared images, raises the recognizability of targets, enhances image information, and improves image contrast and clarity.

Description

Pseudo-color fusion method based on multi-feature infrared radiation and polarization image
Technical Field
The invention belongs to the technical field of image processing, and particularly relates to a pseudo-color fusion method for infrared radiation and polarization images based on multiple features.
Background
At present, with the rapidly growing demands on infrared imaging detection in application fields such as military affairs, medical treatment, security and earth observation, traditional infrared detection technology struggles to keep up as precision requirements rise and environments and camouflage techniques become more complex.
The infrared radiation image is good at describing the characteristics of a hot target, but the contrast between target and background is low, the picture is generally dark, the signal-to-noise ratio is low, edges are blurred, detail texture is weak, and the gray distribution is concentrated. Infrared polarization, by contrast, characterizes a scene through a different physical quantity: objects with the same radiation intensity may have different degrees of polarization, and infrared polarization images are good at describing surface roughness, texture and edges. Redundancy and complementarity exist between the infrared polarization image and the infrared radiation image of the same scene. An image fusing the two not only contains target information specific to each image but also reinforces the information both share, so the scene description is more comprehensive and accurate, aiding an observer's understanding and recognition of the scene. However, when the contrast between background and target is low, the target remains difficult to recognize and may even mislead the observer; moreover, fusion algorithms that consider only a single difference feature cannot effectively describe all the uncertain and randomly varying features in an image, so valuable information is lost during fusion and both fusion and identification can fail.
Disclosure of Invention
The invention aims to overcome the defects of the prior art, provides a pseudo-color fusion method based on multi-feature infrared radiation and polarization images, realizes the fusion of the infrared radiation images and the polarization images, enables scene information to be more comprehensive, improves the quality of the infrared images and improves the recognition rate of targets.
In order to achieve this purpose, the technical scheme of the invention is as follows. A pseudo-color fusion method based on multiple features of infrared radiation and polarization images comprises the following steps:
S1. Obtain an RGB image based on RGB color mapping of the infrared radiation image and the infrared polarization image;
S2. Convert the RGB space of the image obtained in step S1 into YUV space and extract the luminance component Y;
S3. Perform multi-feature separation based on the dark primary color theory on the gray images of the original infrared radiation image and the original infrared polarization image, obtaining the bright, dark and detail features of each;
S4. Fuse the corresponding bright features of the two images obtained in step S3 by a matching method based on local-region energy features;
S5. Fuse the corresponding dark features of the two images obtained in step S3 by a matching method based on local-region weighted variance features;
S6. Fuse the detail features of the infrared radiation image and the infrared polarization image obtained in step S3, driven by fuzzy logic and feature differences;
S7. Fuse the results of steps S4, S5 and S6 to obtain a grayscale fusion result, replace the luminance component Y of step S2 with this fusion result, and perform the inverse YUV transform to obtain the final pseudo-color fusion result.
It should be particularly noted that steps S4, S5 and S6 are not ordered in time and could be described as a single step; they are numbered separately only for convenience of the following description. Steps S4, S5 and S6 may be performed simultaneously; any two may be performed simultaneously followed by the third; any one may be performed first followed by the other two simultaneously; or all three may be performed one after another.
Further, in step S2, the conversion from RGB space to YUV space is shown in formula (1):

    Y = 0.299R + 0.587G + 0.114B
    U = -0.147R - 0.289G + 0.436B      formula (1)
    V = 0.615R - 0.515G - 0.100B

and the formula extracting the luminance component Y is shown in formula (2):

    Y = 0.299R + 0.587G + 0.114B      formula (2)
Further, in step S3, let r be the gray image of the original infrared radiation image and p the gray image of the original infrared polarization image. The dark primary color map is solved as shown in formula (3):

    L_dark(x) = min_{y∈N(x)} ( min_{C∈{R,G,B}} L_C(y) )      formula (3)

In formula (3), C is one of the three color channels R, G, B of the image; N(x) is the window region centered on pixel point x; L_C(y) is a color channel map of the image; L_dark(x) is the dark channel image corresponding to L_C(y). For a grayscale image the inner minimum over channels is trivial, so L_dark is the local minimum over the window N(x).

Formula (3) yields the dark primary color image L_dark^r of the infrared radiation image and the dark primary color image L_dark^p of the infrared polarization image.
Multi-feature separation based on the dark primary color theory obtains the bright, dark and detail features of the two images as follows:

S3.1. Invert the infrared radiation image and the infrared polarization image using formulas (4) and (5), obtaining the inverted images r̄ and p̄. Then fuse the dark primary color images L_dark^r and L_dark^p with r̄ and p̄ respectively, keeping the smaller absolute value at each pixel, to obtain the dark feature image D_r of the infrared radiation image and the dark feature image D_p of the infrared polarization image, as shown in formulas (6) and (7).

S3.2. Subtract the corresponding dark feature images D_r and D_p from r̄ and p̄ respectively, obtaining the bright feature image L_r of the infrared radiation image and the bright feature image L_p of the infrared polarization image, as shown in formulas (8) and (9).

S3.3. Subtract the dark primary color images L_dark^r and L_dark^p from the infrared radiation image r and the infrared polarization image p respectively, obtaining the detail image P_r of the infrared radiation image and the detail image P_p of the infrared polarization image, as shown in formulas (10) and (11).
Further, the bright feature images L_r and L_p are fused in step S4 as follows:

S4.1. Compute the Gaussian-weighted local energy E_r(m,n) of the bright feature image L_r and the Gaussian-weighted local energy E_p(m,n) of the bright feature image L_p using formula (12):

    E_k(m,n) = Σ_{i=-t}^{t} Σ_{j=-t}^{t} w(i,j) · [ L_k(m+i, n+j) ]²      formula (12)

In formula (12), E_k(m,n) represents the Gaussian-weighted local energy centered at point (m,n); w(i,j) is a Gaussian filter matrix; N is the size of the region and t = (N-1)/2; k denotes the gray image r of the original infrared image or the gray image p of the original infrared polarization image; the subscript L indicates a bright feature image. Formula (12) yields the Gaussian-weighted local energies E_r(m,n) of L_r and E_p(m,n) of L_p.

S4.2. Compute the matching degree M_E(m,n) of the Gaussian-weighted local energies of the images L_r and L_p, solving with formula (13).

S4.3. Using the Gaussian-weighted local energies obtained in step S4.1 and the matching degree obtained in step S4.2, fuse the bright feature images L_r and L_p to obtain the bright-feature fusion result F_L. The fusion of the bright feature images of the infrared radiation image and the infrared polarization image is performed using formula (14), with the weights given by formula (15).

In formula (14), T_l is the threshold for judging the similarity of bright features, chosen in the range 0-0.5. If M_E(m,n) < T_l, the regions of L_r and L_p centered at point (m,n) are dissimilar, and the image with the larger Gaussian-weighted local energy is selected as the fusion result; otherwise the fusion result of L_r and L_p is a coefficient-weighted average.
Still further, the dark feature images D_r and D_p are fused in step S5 as follows:

S5.1. Compute the local-region weighted variance energy V_r(m,n) of the dark feature image D_r and the local-region weighted variance energy V_p(m,n) of the dark feature image D_p using formula (16):

    V_k(m,n) = Σ_{i=-t}^{t} Σ_{j=-t}^{t} w(i,j) · [ D_k(m+i, n+j) - D̄_k(m,n) ]²      formula (16)

In formula (16), V_k(m,n) represents the local-region weighted variance energy centered at point (m,n); w(i,j) is a Gaussian filter matrix; N is the size of the region and t = (N-1)/2; D̄_k(m,n) is the local-region average centered at point (m,n); k denotes the gray image r of the original infrared image or the gray image p of the original infrared polarization image; the subscript D indicates a dark feature image. Formula (16) yields the local-region weighted variance energies V_r(m,n) of D_r and V_p(m,n) of D_p.

S5.2. From the result of step S5.1, compute the matching degree M_V(m,n) of the local-region weighted variance energies of the two images D_r and D_p, using formula (17).

S5.3. Using the local-region weighted variance energies obtained in step S5.1 and the matching degree M_V(m,n) obtained in step S5.2, fuse the two dark feature images D_r and D_p to obtain the dark-feature fusion result F_D; the fusion is performed using formula (18), with the weights given by formula (19).

In formula (18), T_h is the threshold for judging the similarity of dark features, taking a value in the range 0.5-1. If M_V(m,n) < T_h, the regions of D_r and D_p centered at point (m,n) are dissimilar, and the image with the larger local-region weighted variance energy is selected as the fusion result; otherwise the fusion result of D_r and D_p is a coefficient-weighted average.
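The local-region weighted variance of formula (16) can be sketched as follows (pure NumPy; zero padding at the borders and a uniform weight matrix standing in for the Gaussian filter w(i,j) are assumptions of this sketch):

```python
import numpy as np

def local_weighted_variance(img, w):
    """Local weighted variance of formula (16): the weighted second moment
    about the local weighted mean, computed with zero padding at the borders."""
    t = w.shape[0] // 2
    pad = np.pad(img, t)
    mean = np.zeros_like(img, dtype=np.float64)
    for i in range(w.shape[0]):
        for j in range(w.shape[1]):
            mean += w[i, j] * pad[i:i + img.shape[0], j:j + img.shape[1]]
    var = np.zeros_like(img, dtype=np.float64)
    for i in range(w.shape[0]):
        for j in range(w.shape[1]):
            var += w[i, j] * (pad[i:i + img.shape[0], j:j + img.shape[1]] - mean) ** 2
    return var

w = np.full((3, 3), 1.0 / 9.0)   # uniform weights stand in for the Gaussian
flat = np.full((5, 5), 7.0)      # a flat region has zero interior variance
Vflat = local_weighted_variance(flat, w)
```

On a constant image the variance is zero away from the borders; near the borders the zero padding raises it, which is why the patent applies the measure to dark feature images whose informative content lies in the interior contours.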
Still further, step S6 comprises the following steps:

S6.1. Compute the local gradient T_r(m,n) of the detail feature image P_r of the infrared radiation image and the local gradient T_p(m,n) of the detail feature image P_p of the infrared polarization image, using formula (20):

    T_k(m,n) = sqrt( G_h(m,n)² + G_v(m,n)² )      formula (20)

In the formula, T_k(m,n) represents the local gradient at pixel point (m,n); k denotes the gray image r of the original infrared image or the gray image p of the original infrared polarization image; the subscript P indicates a detail feature image; G_h(m,n) and G_v(m,n) are the horizontal and vertical edge images obtained by convolving the detail feature image P_k with the horizontal and vertical Sobel operator templates respectively. Formula (20) yields the local gradients T_r(m,n) of P_r and T_p(m,n) of P_p.

S6.2. Compute the local weighted variance V_r(m,n) of the detail feature image P_r of the infrared radiation image and the local weighted variance V_p(m,n) of the detail feature image P_p of the infrared polarization image.

The local weighted variance is solved by the same method as formula (16); that is, the local weighted variance of a detail feature image is given by formula (21):

    V_k(m,n) = Σ_{i=-t}^{t} Σ_{j=-t}^{t} w(i,j) · [ P_k(m+i, n+j) - P̄_k(m,n) ]²      formula (21)

In the formula, V_k(m,n) represents the local-region weighted variance energy centered at point (m,n); w(i,j) is a Gaussian filter matrix; N is the size of the region and t = (N-1)/2; P̄_k(m,n) is the local-region average centered at point (m,n); k denotes the gray image r of the original infrared image or the gray image p of the original infrared polarization image; the subscript P indicates a detail feature image. Formula (21) yields the local weighted variances V_r(m,n) of P_r and V_p(m,n) of P_p.

S6.3. Compute the local gradient matching degree, the local weighted variance matching degree, the local difference gradient and the local difference variance of the two detail feature images, using formulas (22) to (25) respectively:

local gradient matching degree: M_T(m,n), formula (22);
local weighted variance matching degree: M_V(m,n), formula (23);
local difference gradient: ΔT(m,n) = T_r(m,n) - T_p(m,n), formula (24);
local difference variance: ΔV(m,n) = V_r(m,n) - V_p(m,n), formula (25).
S6.4. Obtain a pixel-based decision map PDG(m,n) from the local difference gradient ΔT(m,n) and the local difference variance ΔV(m,n), and obtain a feature-difference-degree decision map DDG from the local gradient matching degree M_T(m,n) and the local weighted variance matching degree M_V(m,n);
S6.5. Determine the definite regions and the uncertain regions according to the pixel-based decision map PDG(m,n) and the feature-difference-degree decision map DDG;
S6.6. Fuse the definite regions of the two detail feature images, driven by the feature differences;
Take the product of the local difference gradient ΔT(m,n) and the local difference variance ΔV(m,n) as the driving factor for definite-region fusion, denoted DIF(m,n), as shown in formula (27):

    DIF(m,n) = ΔT(m,n) · ΔV(m,n)      formula (27)
DIF(m,n) is then used to drive the fusion of the definite regions, giving the fused definite-region image as shown in formula (28), where '·' denotes the product of the values at corresponding pixel locations in the matrices;
S6.7. Fuse the uncertain regions of the two detail feature images using fuzzy logic theory;
Let the membership functions of "the local gradient of the detail feature image is large" for the infrared radiation image and the infrared polarization image be μ_T(P_r(m,n)) and μ_T(P_p(m,n)) respectively, as shown in formula (29); let the membership functions of "the local weighted variance of the detail feature image is large" be μ_V(P_r(m,n)) and μ_V(P_p(m,n)) respectively, as shown in formula (30).

Using the fuzzy-logic intersection operation, compute the membership functions describing how important the pixel values of the detail feature images at position (m,n) are to the fused image of the uncertain region, denoted μ_{T∩V}(P_r(m,n)) and μ_{T∩V}(P_p(m,n)) respectively, as shown in formula (31):

    μ_{T∩V}(P_k(m,n)) = min[ μ_T(P_k(m,n)), μ_V(P_k(m,n)) ]      formula (31)

In the formula, k denotes the gray image r of the original infrared image or the gray image p of the original infrared polarization image.

Then the fused image of the uncertain regions of the two detail feature images is given by formula (32), where '·' denotes the element-wise product and './' the element-wise division of the values at corresponding pixel locations in the matrices;
S6.8. Fuse the definite-region and uncertain-region results to obtain the final detail fusion result F_DIF(m,n), as shown in formula (33), and perform a consistency check on F_DIF(m,n);
The consistency check on F_DIF(m,n) uses a 3×3 window moved over the image F_DIF(m,n), verifying the center pixel against the surrounding pixels in the window: if the center pixel comes from one of the images P_r and P_p while s (4 < s < 8) of the surrounding pixels come from the other image, the center pixel value is replaced by the other image's pixel value at that location. The window traverses the entire image F_DIF(m,n) to obtain the corrected F_DIF(m,n).
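The consistency check of S6.8 can be sketched with a label map recording which source each detail pixel came from (`1` for P_r, `2` for P_p); the `s > 4` majority rule follows the text, while leaving border pixels unchanged is an assumption of this sketch:

```python
import numpy as np

def consistency_check(fused, labels, Pr, Pp):
    """3x3 majority correction: if a center pixel came from one source image
    but more than 4 of its 8 neighbors came from the other, replace it with
    the other source's value at that location."""
    out = fused.copy()
    h, w = labels.shape
    for m in range(1, h - 1):
        for n in range(1, w - 1):
            win = labels[m - 1:m + 2, n - 1:n + 2]
            other = 2 if labels[m, n] == 1 else 1
            s = np.count_nonzero(win == other)   # neighbors from the other image
            if s > 4:
                out[m, n] = (Pp if other == 2 else Pr)[m, n]
    return out

Pr = np.full((3, 3), 1.0)
Pp = np.full((3, 3), 9.0)
labels = np.full((3, 3), 2)
labels[1, 1] = 1                       # a lone P_r pixel surrounded by P_p
fused = np.where(labels == 1, Pr, Pp)
out = consistency_check(fused, labels, Pr, Pp)
```

The isolated center pixel is outvoted by its eight neighbors and replaced, which suppresses salt-and-pepper artifacts in the fused detail image.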
Still further, step S6.4 comprises the following steps:

S6.4.1. Obtain the pixel-based decision map PDG(m,n) from the local difference gradient ΔT(m,n) and the local difference variance ΔV(m,n): when ΔT(m,n) > 0 and ΔV(m,n) > 0, let PDG(m,n) = p_1; when ΔT(m,n) < 0 and ΔV(m,n) < 0, let PDG(m,n) = p_2; when ΔT(m,n) > 0 and ΔV(m,n) < 0, let PDG(m,n) = p_3; when ΔT(m,n) < 0 and ΔV(m,n) > 0, let PDG(m,n) = p_4; when ΔT(m,n) = 0 and ΔV(m,n) > 0, let PDG(m,n) = p_5; when ΔT(m,n) = 0 and ΔV(m,n) < 0, let PDG(m,n) = p_6; when ΔT(m,n) > 0 and ΔV(m,n) = 0, let PDG(m,n) = p_7; when ΔT(m,n) < 0 and ΔV(m,n) = 0, let PDG(m,n) = p_8; when ΔT(m,n) = 0 and ΔV(m,n) = 0, let PDG(m,n) = p_9. Here p_1 to p_9 denote decision maps whose value is 1 at pixel positions satisfying the corresponding condition and 0 at all other pixel positions.

S6.4.2. Obtain the feature-difference-degree decision map DDG from the local gradient matching degree M_T(m,n) and the local weighted variance matching degree M_V(m,n), using formula (26). In the formula, d_1 and d_2 take the value 1 at pixel positions satisfying formula (26) and 0 at all other pixel positions.
Further, the method for determining the definite regions and the uncertain regions in step S6.5 is as follows. The regions p_1, p_2, p_5, p_6, p_7 and p_8 satisfying the corresponding conditions of step S6.4.1 are judged definite regions by the PDG. For the two cases p_1 and p_2, both difference features agree on whether the gray value of the corresponding pixel should be retained in the fused image; for the four cases p_5, p_6, p_7 and p_8, a single difference feature suffices to decide whether the gray value of the corresponding pixel should be retained in the fused image. The regions p_3 and p_4 are judged definite regions by the PDG together with the DDG: the feature-difference-degree decision map DDG determines which of the two local features differs more between the two images, and the difference feature with the larger degree of difference then decides whether the gray value of the corresponding pixel is retained in the fused image. Region p_9 cannot be judged by either decision map, PDG or DDG, and therefore belongs to the uncertain region.
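The sign-based classification of S6.4.1 into the decision maps p_1 to p_9 can be sketched as:

```python
import numpy as np

def pixel_decision_maps(dT, dV):
    """Return the nine binary decision maps p1..p9 keyed by index, following
    the sign combinations of the local difference gradient dT and the local
    difference variance dV. Each pixel falls in exactly one map."""
    conds = {
        1: (dT > 0) & (dV > 0),
        2: (dT < 0) & (dV < 0),
        3: (dT > 0) & (dV < 0),
        4: (dT < 0) & (dV > 0),
        5: (dT == 0) & (dV > 0),
        6: (dT == 0) & (dV < 0),
        7: (dT > 0) & (dV == 0),
        8: (dT < 0) & (dV == 0),
        9: (dT == 0) & (dV == 0),
    }
    return {k: v.astype(np.uint8) for k, v in conds.items()}

dT = np.array([[1.0, -2.0], [0.0, 0.0]])
dV = np.array([[3.0, -1.0], [0.0, 2.0]])
p = pixel_decision_maps(dT, dV)
```

Because the nine conditions are mutually exclusive and exhaustive, the maps sum to 1 at every pixel; p_9 marks the uncertain region handled by the fuzzy-logic branch.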
Still further, fusing the results of steps S4, S5 and S6 in step S7 means fusing the bright-feature image F_L(m,n), the dark-feature image F_D(m,n) and the detail-feature image F_DIF(m,n) using formula (34):

    F = αF_L(m,n) + βF_D(m,n) + γF_DIF(m,n)      formula (34)

where α, β and γ are fusion weight coefficients with values in the range [0,1]; preferably α = 1, β = 0.3 and γ = 1, which reduces supersaturation of the fused image and improves contrast.
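Formula (34) and the subsequent replacement of the luminance component can be sketched as follows; the inverse YUV-to-RGB coefficients used here are the standard BT.601 inverse, an assumption consistent with formula (2):

```python
import numpy as np

def combine_and_colorize(FL, FD, FDIF, U, V, alpha=1.0, beta=0.3, gamma=1.0):
    """Formula (34): F = alpha*FL + beta*FD + gamma*FDIF; F then replaces the
    luminance Y, and YUV is inverted back to RGB (BT.601 inverse)."""
    F = alpha * FL + beta * FD + gamma * FDIF
    R = F + 1.13983 * V
    G = F - 0.39465 * U - 0.58060 * V
    B = F + 2.03211 * U
    return F, np.stack([R, G, B], axis=-1)

FL = np.full((2, 2), 100.0)    # fused bright features
FD = np.full((2, 2), 50.0)     # fused dark features
FDIF = np.full((2, 2), 10.0)   # fused detail features
U = np.zeros((2, 2))
V = np.zeros((2, 2))
F, rgb = combine_and_colorize(FL, FD, FDIF, U, V)
```

With zero chroma the inverse transform returns a gray image whose three channels equal the new luminance, which makes the weighting of formula (34) easy to verify in isolation.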
Compared with the prior art, the invention has the following beneficial effects:
1. The method of the invention fuses the bright, dark and detail feature information of infrared radiation and infrared polarization images of the same scene, so that the feature information of the two images is complemented (the two images share some information and each carries information the other lacks).
2. In the fusion process the method considers the relationships between adjacent pixels, local-region brightness differences, the intensity of local gray-level change and local gradient change; that is, the fusion combines multiple feature differences to effectively describe uncertain and randomly varying image features, so the fusion result retains the valuable information of the images, improves image clarity and target-background contrast, and enhances image detail.
Drawings
These and/or other aspects and advantages of the present invention will become more apparent and more readily appreciated from the following detailed description of the embodiments of the invention, taken in conjunction with the accompanying drawings of which:
FIG. 1 is a flow chart of a pseudo-color fusion method of infrared radiation and infrared polarization images according to an embodiment of the present invention;
FIG. 2 is a flowchart of a detail feature fusion method in step S6 in a pseudo-color fusion method of infrared radiation and infrared polarization images according to an embodiment of the present invention;
FIG. 3 is a graph of the fusion result of infrared radiation and infrared polarization images according to an embodiment of the present invention, wherein (a) is an infrared radiation image; (b) is an infrared polarization image; (c) is a pseudo color fusion result diagram of the embodiment of the invention.
Detailed Description
In order that those skilled in the art may better understand the present invention, the following detailed description is provided in conjunction with the accompanying drawings and specific embodiments.
Examples
A pseudo-color fusion method based on multiple features of infrared radiation and polarization images, whose flow is shown in FIG. 1, comprises the following steps S1 to S7:
S1. Obtain an RGB image based on RGB color mapping of the infrared radiation image and the infrared polarization image:
because the infrared polarization image and the infrared light intensity image are gray level images, RGB color mapping is needed to perform false color image fusion, and the gray level images are converted into RGB images. Because human eyes have different sensitivities to red, green and blue lights, red is warm, people feel strong and glaring, and about 65 percent of human eye cells are sensitive to the red light; in a three primary color spectrum with the same brightness, human subjective feeling considers that green is the brightest of the three primary colors, about 33% of human eye cells are sensitive to the green light, the contrast sensitivity of human eye vision to blue light change is higher, and about 2% of human eye cells are sensitive to the blue light.
The infrared polarization image is dark overall; it is strongly affected by the degree of polarization of the object and the detection angle, shows large differences between light and dark, reflects the edge information of scene targets and background well, and excels at imaging infrared camouflage and targets of low radiation intensity. The infrared radiation (intensity) image is strongly affected by the radiation intensity of the object: parts at high temperature radiate relatively strongly and parts at low temperature relatively weakly; the details of the image scene are comparatively clear and the definition comparatively high. Therefore the infrared polarization image is assigned to the R channel to highlight, in distinctive red, the bright targets characteristic of the polarization image; the infrared radiation image, which carries more information, is assigned to the G channel so that bright green highlights the detail information in the image; and the absolute-difference image of the two is assigned to the B channel so that blue distinguishes their difference features.
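The channel assignment described above can be sketched as follows (a minimal illustration; the array names `ir` and `pol` for the two grayscale source images are assumptions):

```python
import numpy as np

def rgb_color_mapping(ir, pol):
    """Map the two grayscale images to RGB as described above:
    R <- infrared polarization image, G <- infrared radiation image,
    B <- absolute difference of the two."""
    ir = ir.astype(np.float64)
    pol = pol.astype(np.float64)
    return np.stack([pol, ir, np.abs(ir - pol)], axis=-1)

# toy 2x2 example
ir = np.array([[10.0, 200.0], [50.0, 100.0]])
pol = np.array([[30.0, 180.0], [50.0, 90.0]])
rgb = rgb_color_mapping(ir, pol)
```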
S2, converting the RGB space of the image obtained in step S1 into YUV space, and extracting the luminance component Y:
the formula for converting the RGB space into the YUV space is shown in the following formula (1):
the formula in which the luminance component Y is extracted is shown in formula (2):
y is 0.299R +0.587G +0.114B formula (2)
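Formula (2) can be checked numerically; the sketch below uses the BT.601 coefficients implied by formula (2) (the U and V rows are the standard values, an assumption where the patent's formula image is not reproduced):

```python
import numpy as np

# BT.601 RGB -> YUV conversion matrix (rows: Y, U, V), matching formula (1)
M_YUV = np.array([
    [ 0.299,    0.587,    0.114  ],
    [-0.14713, -0.28886,  0.436  ],
    [ 0.615,   -0.51499, -0.10001],
])

def rgb_to_yuv(rgb):
    """Convert an (..., 3) RGB array to YUV; Y is channel 0."""
    return rgb @ M_YUV.T

rgb = np.array([[100.0, 150.0, 50.0]])
yuv = rgb_to_yuv(rgb)
y = yuv[..., 0]   # luminance component Y of formula (2)
```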
S3, respectively carrying out multi-feature separation based on a dark primary color theory on the gray level image of the original infrared radiation image and the gray level image of the original infrared polarization image, and respectively obtaining a bright feature, a dark feature and a detail feature of the two images:
assume that the original infrared radiation image is r and the original infrared polarization image is p. The dark primary color image is He and the like used for estimating the transmittance in the atmospheric scattering model, and the natural image is defogged quickly. The solution method of the dark channel map is shown in equation (3).
Where C is the three color channels R, G, B of the image; n (x) is a pixel in the window area with the pixel point x as the center; l isC(y) a color channel map of the image; l isdark(x) The dark channel image corresponding to the image L reflects the image fog degree, and the dark channel L of the fog-free imagedark(x) The value tends to 0; for foggy images, Ldark(x) The value is large; obtaining a dark primary color image of an infrared radiation image by equation (3) Obtaining a dark primary color image of the infrared polarization image by equation (3)
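For a single-channel grayscale image the inner minimum over color channels in formula (3) is trivial, so the dark primary color map reduces to a local minimum filter; a minimal sketch (the 3×3 window size is an assumption):

```python
import numpy as np

def dark_channel_gray(img, win=3):
    """Dark primary color map of a grayscale image: the minimum over the
    window N(x) centered on each pixel (edge pixels use the clipped window)."""
    h, w = img.shape
    t = win // 2
    out = np.empty_like(img, dtype=np.float64)
    for m in range(h):
        for n in range(w):
            out[m, n] = img[max(0, m - t):m + t + 1,
                            max(0, n - t):n + t + 1].min()
    return out

img = np.array([[5.0, 9.0, 7.0],
                [8.0, 6.0, 4.0],
                [3.0, 2.0, 1.0]])
dark = dark_channel_gray(img)
```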
In a natural image, the regions obviously affected by haze are generally the brightest pixels in the dark primary color, while haze-free regions have very low dark-primary values. For a grayscale image, therefore, the dark primary color image contains the bright regions of the original image and reflects its low-frequency part; that is, regions of the original image with relatively smooth gray-level change are retained, making the contrast of bright and dark features more prominent, while local regions of sharp gray-level change and high contrast, especially edge detail information, are lost.
Multi-feature separation based on the dark primary color theory obtains the bright, dark and detail features of the two images as follows:

S3.1. Invert the infrared radiation image and the infrared polarization image using formulas (4) and (5), obtaining the inverted images r̄ and p̄. Then fuse the dark primary color images L_dark^r and L_dark^p with r̄ and p̄ respectively, keeping the smaller absolute value at each pixel, to obtain the dark feature image D_r of the infrared radiation image and the dark feature image D_p of the infrared polarization image, as shown in formulas (6) and (7).

S3.2. Subtract the corresponding dark feature images D_r and D_p from r̄ and p̄ respectively, obtaining the bright feature image L_r of the infrared radiation image and the bright feature image L_p of the infrared polarization image, as shown in formulas (8) and (9).

S3.3. Subtract the dark primary color images L_dark^r and L_dark^p from the infrared radiation image r and the infrared polarization image p respectively, obtaining the detail image P_r of the infrared radiation image and the detail image P_p of the infrared polarization image, as shown in formulas (10) and (11).
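Steps S3.1 to S3.3 can be sketched as follows, under two assumptions made where the patent's formula images are not reproduced: inversion is taken as 255 minus the image for 8-bit data, and the smaller-absolute-value rule reduces to a pixelwise minimum for non-negative images:

```python
import numpy as np

def separate_features(img, dark_primary):
    """Split a grayscale image into dark, bright and detail feature images.
    img, dark_primary: float arrays in [0, 255]."""
    inv = 255.0 - img                      # formulas (4)/(5): inverted image
    dark = np.minimum(dark_primary, inv)   # formulas (6)/(7): smaller-value rule
    bright = inv - dark                    # formulas (8)/(9)
    detail = img - dark_primary            # formulas (10)/(11)
    return dark, bright, detail

img = np.array([[100.0, 220.0], [30.0, 60.0]])
dark_primary = np.array([[30.0, 60.0], [30.0, 30.0]])
d, b, det = separate_features(img, dark_primary)
```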
The following needs to be specifically explained: steps S4, S5 and S6 below are not ordered in time and could be described as a single step; they are numbered separately only for convenience of the description. Steps S4, S5 and S6 may be performed simultaneously; any two may be performed simultaneously followed by the third; any one may be performed first followed by the other two simultaneously; or all three may be performed one after another.
S4, fusing corresponding bright features of the two images by adopting a matching method based on local region energy features:
the bright feature information concentrates the bright area in the original image, reflects the low-frequency component in the original image, and fuses the bright feature image LrAnd bright feature image LpThe method comprises the following steps:
S4.1, respectively obtaining the Gaussian weighted local energy of the bright feature image L_r and the Gaussian weighted local energy of the bright feature image L_p, calculated using equation (12):
In formula (12), the energy term represents the Gaussian weighted local energy centered at point (m, n); w(i, j) is a Gaussian filter matrix; N is the size of the region; t = (N-1)/2; k represents the gray level image r of the original infrared image or the gray level image p of the original infrared polarization image; the L in the symbol denotes a bright feature image.
The Gaussian weighted local energies of the bright feature images L_r and L_p are thus obtained by formula (12).
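A sketch of the Gaussian weighted local energy of equation (12). The kernel parameters (3 × 3 window, σ = 1) and the normalization of the kernel are illustrative assumptions, since the formula itself is not reproduced in the text.

```python
import numpy as np

def gaussian_kernel(n=3, sigma=1.0):
    """Normalized n x n Gaussian filter matrix w(i, j)."""
    t = (n - 1) // 2
    y, x = np.mgrid[-t:t + 1, -t:t + 1]
    w = np.exp(-(x**2 + y**2) / (2 * sigma**2))
    return w / w.sum()

def local_energy(img, n=3, sigma=1.0):
    """E(m, n) = sum_{i,j} w(i, j) * img(m+i, n+j)^2 over an n x n window,
    with edge padding at the image border."""
    w = gaussian_kernel(n, sigma)
    t = (n - 1) // 2
    padded = np.pad(img.astype(float) ** 2, t, mode='edge')
    h, wd = img.shape
    out = np.empty((h, wd))
    for m in range(h):
        for k in range(wd):
            out[m, k] = (w * padded[m:m + n, k:k + n]).sum()
    return out
```

Because the kernel is normalized, a constant image of value c yields energy c² at every pixel, which is a convenient sanity check.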
S4.2, solving the Gaussian weighted local energy matching degree M_E(m, n) of the images L_r and L_p using equation (13);
S4.3, fusing the images L_r and L_p using the Gaussian weighted local energy and its matching degree, to obtain the fusion result F_L of the bright feature images.
The fusion of the bright feature images of the infrared radiation image and the infrared polarization image is performed using equation (14):
In formula (14), T_l is the threshold for judging the similarity of the bright features, selected in the range 0–0.5. If M_E(m, n) < T_l, the regions of the two images L_r and L_p centered at point (m, n) are not similar, and the one with the larger Gaussian weighted local energy is selected as the fusion result; otherwise, the fusion result of L_r and L_p is a coefficient-weighted average.
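The select-or-average rule of equation (14) can be sketched as follows. Equations (13) and (15) are not reproduced in the text, so two common forms are assumed here: the normalized cross-energy matching degree M_E = 2·E_r·E_p/(E_r² + E_p²), and the weight w_min = 0.5 − 0.5(1 − M_E)/(1 − T_l) often used in region-energy fusion.

```python
import numpy as np

def fuse_bright(Lr, Lp, Er, Ep, Tl=0.4, eps=1e-12):
    """Select-or-average fusion of two bright feature images, driven by
    their Gaussian weighted local energies Er and Ep."""
    # assumed matching-degree form (formula (13) not reproduced in text)
    ME = 2 * Er * Ep / (Er**2 + Ep**2 + eps)
    # assumed weighted-average coefficients used when regions are similar
    wmin = 0.5 - 0.5 * (1 - ME) / (1 - Tl)
    wmax = 1 - wmin
    big_r = Er >= Ep
    select = np.where(big_r, Lr, Lp)                 # M_E < Tl: take larger energy
    average = np.where(big_r, wmax * Lr + wmin * Lp, # otherwise: weighted average
                       wmin * Lr + wmax * Lp)
    return np.where(ME < Tl, select, average)
```

When the two local energies are very different the matching degree is small and the rule degenerates to pure selection; when they are equal the rule degenerates to a plain average.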
S5, fusing dark features corresponding to the two images by adopting a matching method based on the local area weighted variance features:
The dark feature image lacks the bright areas of the source image but can still be regarded as an approximation of it: it contains the main energy of the image and embodies its basic outline. Fusing the dark feature images D_r and D_p comprises the following steps:
S5.1, obtaining the local region weighted variance energy of the dark feature image D_r and the local region weighted variance energy of the dark feature image D_p, calculated using equation (16):
In formula (16), the variance term represents the local region weighted variance energy centered at point (m, n); w(i, j) is a Gaussian filter matrix; N is the size of the region; t = (N-1)/2; the mean term represents the local area average centered at point (m, n); k represents the gray level image r of the original infrared image or the gray level image p of the original infrared polarization image; the D in the symbol denotes a dark feature image.
The local region weighted variance energies of the dark feature images D_r and D_p are thus obtained by formula (16).
S5.2, solving the local region weighted variance energy matching degree M_V(m, n) of the two images D_r and D_p, calculated using equation (17):
S5.3, fusing the two dark feature images D_r and D_p using the local region weighted variance energy and its matching degree, to obtain the fusion result F_D of the dark feature images; the fusion is performed using equation (18):
In formula (18), T_h is the threshold for judging the similarity of the dark features, taking a value in the range 0.5–1. If M_V(m, n) < T_h, the regions of the two images D_r and D_p centered at point (m, n) are not similar, and the one with the larger local region weighted variance energy is selected as the fusion result; otherwise, the fusion result of D_r and D_p is a coefficient-weighted average.
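The local region weighted variance energy of equation (16) can be sketched as follows; the kernel parameters (3 × 3 window, σ = 1, normalized weights) are illustrative assumptions.

```python
import numpy as np

def local_weighted_variance(img, n=3, sigma=1.0):
    """V(m, n) = sum_{i,j} w(i, j) * (img(m+i, n+j) - mu(m, n))^2, where mu
    is the Gaussian-weighted local mean, as in formula (16)."""
    t = (n - 1) // 2
    y, x = np.mgrid[-t:t + 1, -t:t + 1]
    w = np.exp(-(x**2 + y**2) / (2 * sigma**2))
    w /= w.sum()
    padded = np.pad(img.astype(float), t, mode='edge')
    h, wd = img.shape
    out = np.empty((h, wd))
    for m in range(h):
        for k in range(wd):
            win = padded[m:m + n, k:k + n]
            mu = (w * win).sum()           # local area average at (m, n)
            out[m, k] = (w * (win - mu) ** 2).sum()
    return out
```

A constant image has zero variance energy everywhere, so flat regions never win the select branch of equation (18).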
S6, fusing the detail features of the infrared radiation image and the infrared polarization image, driven by fuzzy logic and feature differences: the local gradient and the local variance reflect the detail information of an image well and express its sharpness. In order to retain as much of the detail information of the detail feature images as possible and to improve sharpness, a feature-difference driving method based on the local gradient and the local variance is adopted to fuse the detail features of the two images. The processing flow, shown in Fig. 2, comprises the following steps:
S6.1, obtaining the local gradient of the detail feature image P_r of the infrared radiation image and the local gradient of the detail feature image P_p of the infrared polarization image, solved using equation (20):
In the formula, the gradient term represents the local gradient at pixel point (m, n); k represents the gray level image r of the original infrared image or the gray level image p of the original infrared polarization image; the P in the symbol denotes a detail feature image; the remaining terms represent the horizontal and vertical edge images obtained by convolving the horizontal and vertical templates of the Sobel operator with the detail feature image.
The local gradients of the detail feature images P_r and P_p are thus obtained using formula (20).
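The local gradient of equation (20) can be sketched with the standard Sobel templates; edge padding at the image border is an assumption, since the formula is not reproduced in the text.

```python
import numpy as np

def sobel_gradient(img):
    """Local gradient magnitude sqrt(Gx^2 + Gy^2) from the horizontal and
    vertical Sobel templates, in the spirit of formula (20)."""
    kx = np.array([[-1, 0, 1], [-2, 0, 2], [-1, 0, 1]], float)  # horizontal template
    ky = kx.T                                                   # vertical template
    padded = np.pad(img.astype(float), 1, mode='edge')
    h, w = img.shape
    gx = np.empty((h, w))
    gy = np.empty((h, w))
    for m in range(h):
        for n in range(w):
            win = padded[m:m + 3, n:n + 3]
            gx[m, n] = (kx * win).sum()
            gy[m, n] = (ky * win).sum()
    return np.hypot(gx, gy)
```

A flat image has zero gradient, while a vertical step edge produces a strong horizontal response along the edge columns.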
S6.2, obtaining the local weighted variance of the detail feature image P_r of the infrared radiation image and the local weighted variance of the detail feature image P_p of the infrared polarization image.
The local weighted variance is solved by the same method as formula (16); that is, the local weighted variance of the detail feature image is:
In the formula, the variance term represents the local region weighted variance energy centered at point (m, n); w(i, j) is a Gaussian filter matrix; N is the size of the region; t = (N-1)/2; the mean term represents the local area average centered at point (m, n); k represents the gray level image r of the original infrared image or the gray level image p of the original infrared polarization image; the P in the symbol denotes a detail feature image.
The local weighted variances of the detail feature images P_r and P_p are thus obtained using formula (21).
S6.3, solving the local gradient matching degree, the local weighted variance matching degree, the local difference gradient and the local difference variance of the two detail feature images, calculated using equations (22) to (25) respectively:
local gradient matching degree:
local weighted variance matching:
local differential gradient:
local variance of difference:
S6.4, obtaining a pixel-based decision graph PDG(m, n) from the local difference gradient ΔT(m, n) and the local difference variance ΔV(m, n), and obtaining a feature difference degree decision graph DDG from the local gradient matching degree M_T(m, n) and the local weighted variance matching degree M_V(m, n); the method comprises the following steps:
S6.4.1, obtaining the pixel-based decision graph PDG(m, n) from the local difference gradient ΔT(m, n) and the local difference variance ΔV(m, n): when ΔT(m, n) > 0 and ΔV(m, n) > 0, let PDG(m, n) = p_1; when ΔT(m, n) < 0 and ΔV(m, n) < 0, let PDG(m, n) = p_2; when ΔT(m, n) > 0 and ΔV(m, n) < 0, let PDG(m, n) = p_3; when ΔT(m, n) < 0 and ΔV(m, n) > 0, let PDG(m, n) = p_4; when ΔT(m, n) = 0 and ΔV(m, n) > 0, let PDG(m, n) = p_5; when ΔT(m, n) = 0 and ΔV(m, n) < 0, let PDG(m, n) = p_6; when ΔT(m, n) > 0 and ΔV(m, n) = 0, let PDG(m, n) = p_7; when ΔT(m, n) < 0 and ΔV(m, n) = 0, let PDG(m, n) = p_8; when ΔT(m, n) = 0 and ΔV(m, n) = 0, let PDG(m, n) = p_9; where p_1 to p_9 denote decision graphs in which a pixel position is 1 when the corresponding condition is satisfied and all other pixel positions are 0;
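The nine-way classification of step S6.4.1 is exactly a lookup on the sign pair of the two difference features, and can be sketched as:

```python
import numpy as np

def pixel_decision_graph(dT, dV):
    """Return the index map (1..9) of step S6.4.1: the sign pair of the
    local difference gradient dT and the local difference variance dV
    selects one of the nine decision graphs p1..p9."""
    sign_pairs = {(1, 1): 1, (-1, -1): 2, (1, -1): 3, (-1, 1): 4,
                  (0, 1): 5, (0, -1): 6, (1, 0): 7, (-1, 0): 8, (0, 0): 9}
    st = np.sign(dT).astype(int)
    sv = np.sign(dV).astype(int)
    out = np.empty(dT.shape, int)
    for (a, b), idx in sign_pairs.items():
        out[(st == a) & (sv == b)] = idx
    return out
```

Every pixel falls into exactly one of the nine cases, so the nine binary maps p_1 to p_9 partition the image.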
S6.4.2, obtaining the feature difference degree decision graph DDG from the local gradient matching degree M_T(m, n) and the local weighted variance matching degree M_V(m, n), using formula (26):
In the formula, d_1 and d_2 indicate that the corresponding pixel position satisfying formula (26) is 1 and all other pixel positions are 0;
S6.5, judging determined regions and uncertain regions according to the pixel-based decision graph PDG(m, n) and the feature difference degree decision graph DDG;
The regions p_1, p_2, p_5, p_6, p_7 and p_8 satisfying the corresponding conditions of step S6.4.1 can be judged as determined regions by the PDG. For the two cases p_1 and p_2, either of the two difference features can reflect whether the gray values of the pixel points corresponding to the two detail feature images are retained in the fused image; for the four cases p_5, p_6, p_7 and p_8, a single difference feature suffices to reflect whether the gray values of the corresponding pixel points are retained in the fused image.
The regions p_3 and p_4 can be judged as determined regions by the PDG together with the DDG: the DDG determines the degree of difference of the local features of the two images, the difference feature with the larger degree of difference is then selected, and this difference feature reflects whether the gray value of the corresponding pixel point is retained in the fused image.
For region p_9, however, no judgment can be made from the two decision graphs PDG and DDG; it therefore belongs to the uncertain region.
S6.6, fusing the determined regions of the two detail feature images, driven by the feature differences;
The product of the local difference gradient ΔT(m, n) and the local difference variance ΔV(m, n) is taken as the driving factor for determined-region fusion, denoted DIF(m, n), as shown in equation (27):
DIF(m, n) = ΔT(m, n) · ΔV(m, n)    equation (27)
DIF(m, n) is then used to drive the fusion of the determined regions, and the image after fusion of the determined regions is obtained as shown in equation (28):
where "·" denotes the product of the values at corresponding pixel locations in the matrix.
S6.7, fusing the uncertain regions of the two detail feature images using fuzzy logic theory;
When fusing the uncertain regions with fuzzy logic theory, for the detail feature images of the infrared radiation image and the infrared polarization image, the propositions "the local gradient of the detail feature image is large" and "the local weighted variance of the detail feature image is large" are considered, and membership functions of the detail feature images are constructed for this pair of relations. Let the membership functions of "the local gradient of the detail feature image is large" for the infrared radiation image and the infrared polarization image be μ_T(P_r(m, n)) and μ_T(P_p(m, n)) respectively, as shown in formula (29); let the membership functions of "the local weighted variance of the detail feature image is large" be μ_V(P_r(m, n)) and μ_V(P_p(m, n)) respectively, as shown in formula (30):
in the formula, k represents a grayscale image r of the original infrared image or a grayscale image p of the original infrared polarization image.
Using the fuzzy-logic intersection operation rule, the membership functions expressing the degree of importance of the pixel values of the detail feature images of the infrared radiation image and the infrared polarization image at position (m, n) to the fused image of the uncertain region are calculated as μ_{T∩V}(P_r(m, n)) and μ_{T∩V}(P_p(m, n)) respectively, as shown in formula (31):
μ_{T∩V}(P_k(m, n)) = min[μ_T(P_k(m, n)), μ_V(P_k(m, n))]    formula (31)
In the formula, k represents a grayscale image r of the original infrared image or a grayscale image p of the original infrared polarization image.
The fused image of the uncertain regions of the two detail feature images is then given by formula (32):
where "·" represents the element-wise product of the values at corresponding pixel locations in the matrix, and "./" represents the element-wise division of the values at corresponding pixel locations;
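The uncertain-region fusion of formulas (29)–(32) can be sketched as follows. The membership functions of formulas (29) and (30) are not reproduced in the text, so a simple normalized-ratio form μ_T = T_k/(T_r + T_p) is assumed here; the min of formula (31) and a normalized weighted combination (matching the "·" and "./" operations of formula (32)) are then applied.

```python
import numpy as np

def fuse_uncertain(Pr, Pp, Tr, Tp, Vr, Vp, eps=1e-12):
    """Fuzzy-logic fusion of the uncertain region of two detail feature
    images, driven by local gradients (Tr, Tp) and local weighted
    variances (Vr, Vp). Membership form is an assumption."""
    muT_r = Tr / (Tr + Tp + eps)          # assumed memberships, formula (29)
    muT_p = Tp / (Tr + Tp + eps)
    muV_r = Vr / (Vr + Vp + eps)          # assumed memberships, formula (30)
    muV_p = Vp / (Vr + Vp + eps)
    mu_r = np.minimum(muT_r, muV_r)       # intersection rule, formula (31)
    mu_p = np.minimum(muT_p, muV_p)
    # normalized weighted combination in the spirit of formula (32)
    return (mu_r * Pr + mu_p * Pp) / (mu_r + mu_p + eps)
```

A pixel whose gradient and variance both dominate receives the larger membership and therefore the larger weight in the combination.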
S6.8, fusing the determined-region and uncertain-region results to obtain the final fusion result F_DIF(m, n), as shown in formula (33), and carrying out a consistency check on F_DIF(m, n);
The consistency check moves a window of size 3 × 3 over the image F_DIF(m, n), verifying the center pixel against the surrounding pixels in the window: if the center pixel comes from one of the images P_r and P_p while s (4 < s < 8) of the pixels surrounding it come from the other image, the center pixel value is changed to the pixel value of the other image at that location; the window traverses the entire image F_DIF(m, n) to obtain the corrected F_DIF(m, n).
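The 3 × 3 consistency check of step S6.8 can be sketched as follows. The explicit source-label map and the skipping of border pixels are illustrative simplifications.

```python
import numpy as np

def consistency_check(fused, src_label, Pr, Pp):
    """3 x 3 consistency check. src_label[m, n] is 0 where the fused pixel
    was taken from Pr and 1 where it was taken from Pp; if more than 4 of
    the 8 neighbours come from the other image, the centre pixel is
    replaced by that image's value at the same location."""
    out = fused.copy()
    h, w = fused.shape
    for m in range(1, h - 1):
        for n in range(1, w - 1):
            win = src_label[m - 1:m + 2, n - 1:n + 2]
            # the centre equals itself, so this counts only the 8 neighbours
            others = np.count_nonzero(win != src_label[m, n])
            if others > 4:                 # s neighbours, 4 < s
                out[m, n] = Pp[m, n] if src_label[m, n] == 0 else Pr[m, n]
    return out
```

An isolated pixel surrounded entirely by pixels from the other source image is overwritten, removing salt-and-pepper artifacts from the decision maps.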
S7, fusing the results of steps S4, S5 and S6 to obtain a gray level fusion result, replacing the brightness component of step S2 with the gray fusion result, and then performing YUV inverse transformation to obtain the final pseudo-color fusion result.
Fusing the results of steps S4, S5 and S6 means fusing the bright feature image F_L(m, n), the dark feature image F_D(m, n) and the detail feature image F_DIF(m, n), using equation (34):
F = αF_L(m, n) + βF_D(m, n) + γF_DIF(m, n)    (34)
In the formula, α, β and γ are fusion weight coefficients with value range [0, 1]. To reduce oversaturation of the fused image and improve contrast, α = 1, β = 0.3 and γ = 1.
Since the brightness contrast of the image is reduced during the RGB color mapping process, the brightness component needs gray-level enhancement. The brightness is enhanced by replacing the brightness component with the fused image: the brightness component of step S2 is replaced by the fusion result F, and YUV inverse transformation is then performed to obtain the final fusion result of the infrared radiation and infrared polarization images.
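Step S7 can be sketched as follows. The weights α = 1, β = 0.3, γ = 1 are taken from the text; the exact inverse of the YUV transform of formula (1) is not reproduced, so the common BT.601 inverse coefficients are assumed here.

```python
import numpy as np

def pseudo_color_result(FL, FD, FDIF, U, V, alpha=1.0, beta=0.3, gamma=1.0):
    """Weighted gray fusion of equation (34) replaces the Y channel, then
    an assumed BT.601 YUV -> RGB inverse transform is applied."""
    Y = alpha * FL + beta * FD + gamma * FDIF      # equation (34)
    R = Y + 1.140 * V
    G = Y - 0.395 * U - 0.581 * V
    B = Y + 2.032 * U
    return np.stack([R, G, B], axis=-1)
```

With zero chroma channels the output is achromatic, so every RGB channel equals the fused gray value, which is a quick sanity check of the inverse transform.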
FIG. 3 shows a comparison of infrared radiation and infrared polarization images and their fusion results in accordance with an embodiment of the present invention: the method can effectively improve the quality of the infrared image and the recognition rate of the target, enhance the image information and improve the image contrast and definition.
Having described embodiments of the present invention, the foregoing description is intended to be exemplary, not exhaustive, and not limited to the embodiments disclosed. Many modifications and variations will be apparent to those of ordinary skill in the art without departing from the scope and spirit of the described embodiments. Therefore, the protection scope of the present invention shall be subject to the protection scope of the claims.

Claims (10)

1. A pseudo-color fusion method of infrared radiation and polarization images based on multiple features, characterized by comprising the following steps:
S1, obtaining an RGB image based on RGB color mapping of the infrared radiation image and the infrared polarization image;
S2, converting the RGB space of the image obtained in step S1 into YUV space, and extracting the brightness component Y;
S3, respectively carrying out multi-feature separation based on the dark primary color theory on the gray level images of the original infrared radiation image and the original infrared polarization image, to obtain the bright features, dark features and detail features of the two images;
S4, fusing the bright features corresponding to the two images obtained in step S3 by a matching method based on local region energy features, fusing the dark features corresponding to the two images obtained in step S3 by a matching method based on local region weighted variance features, and fusing the detail features of the infrared radiation image and the infrared polarization image obtained in step S3 driven by fuzzy logic and feature differences;
S5, fusing the bright feature, dark feature and detail feature results of step S4 to obtain a gray level fusion result, replacing the brightness component Y of step S2 with the fusion result, and then performing YUV inverse transformation to obtain the final pseudo-color fusion result.
2. The pseudo-color fusion method of infrared radiation and polarization images based on multiple features according to claim 1, wherein
in step S2, the formula for converting the RGB space into the YUV space is shown in formula (1):
The formula for extracting the luminance component Y is shown in formula (2):
Y = 0.299R + 0.587G + 0.114B    formula (2).
3. The pseudo-color fusion method of infrared radiation and polarization images based on multiple features according to claim 1, wherein
in step S3, let r be the gray image of the original infrared radiation image, p be the gray image of the original infrared polarization image, and the solution method of the dark primary color map is shown in formula (3):
In formula (3), C is one of the three color channels R, G, B of the image; N(x) is the set of pixels within the window neighborhood centered at pixel point x; L_C(y) is a color channel map of the image; L_dark(x) is the dark channel image corresponding to L_C(y);
The dark primary color image of the infrared radiation image and the dark primary color image of the infrared polarization image are both obtained by equation (3).
The method adopts multi-feature separation based on the dark primary color theory to respectively obtain the bright features, the dark features and the detail features of two images, and comprises the following steps:
S3.1, obtaining the inverted infrared radiation image and the inverted infrared polarization image using formulas (4) and (5) respectively; then fusing the dark primary color images of the two source images with the corresponding inverted images according to the minimum-absolute-value rule, to obtain the dark feature image D_r of the infrared radiation image and the dark feature image D_p of the infrared polarization image, as shown in formulas (6) and (7) respectively;
S3.2, subtracting the corresponding dark feature images D_r and D_p from the two inverted images respectively, to obtain the bright feature image L_r of the infrared radiation image and the bright feature image L_p of the infrared polarization image, as shown in formulas (8) and (9) respectively;
S3.3, differencing the infrared radiation image r and the infrared polarization image p with their corresponding dark primary color images, to obtain the detail image P_r of the infrared radiation image and the detail image P_p of the infrared polarization image, as shown in formulas (10) and (11) respectively.
4. The pseudo-color fusion method of infrared radiation and polarization images based on multiple features according to claim 3, wherein
in the step S4, a matching method based on local region energy features is adopted to fuse bright features corresponding to the two images obtained in the step S3, and the method includes the following steps:
S4.1, respectively obtaining the Gaussian weighted local energy of the bright feature image L_r and the Gaussian weighted local energy of the bright feature image L_p, calculated using equation (12):
In formula (12), the energy term represents the Gaussian weighted local energy centered at point (m, n); w(i, j) is a Gaussian filter matrix; N is the size of the region; t = (N-1)/2; k represents the gray level image r of the original infrared image or the gray level image p of the original infrared polarization image; the L in the symbol denotes a bright feature image;
The Gaussian weighted local energies of the bright feature images L_r and L_p are thus obtained by formula (12).
S4.2, solving the Gaussian weighted local energy matching degree M_E(m, n) of the images L_r and L_p using equation (13);
S4.3, fusing the bright feature images L_r and L_p using the Gaussian weighted local energy obtained in step S4.1 and the matching degree obtained in step S4.2, to obtain the fusion result F_L of the bright feature images;
The fusion of the bright feature images of the infrared radiation image and the infrared polarization image is performed using equation (14):
wherein,
In formula (14), T_l is the threshold for judging the similarity of the bright features, selected in the range 0–0.5. If M_E(m, n) < T_l, the regions of the two images L_r and L_p centered at point (m, n) are not similar, and the one with the larger Gaussian weighted local energy is selected as the fusion result; otherwise, the fusion result of L_r and L_p is a coefficient-weighted average.
5. The pseudo-color fusion method of infrared radiation and polarization images based on multiple features according to claim 4, wherein
the step of fusing the corresponding dark features of the two images obtained in the step S3 by adopting a matching method based on local area weighted variance features in the step S4 is as follows:
S4.4, obtaining the local region weighted variance energy of the dark feature image D_r and the local region weighted variance energy of the dark feature image D_p, calculated using equation (16):
In formula (16), the variance term represents the local region weighted variance energy centered at point (m, n); w(i, j) is a Gaussian filter matrix; N is the size of the region; t = (N-1)/2; the mean term represents the local area average centered at point (m, n); k represents the gray level image r of the original infrared image or the gray level image p of the original infrared polarization image; the D in the symbol denotes a dark feature image;
The local region weighted variance energies of the dark feature images D_r and D_p are respectively obtained using formula (16).
S4.5, solving the local region weighted variance energy matching degree M_V(m, n) of the two images D_r and D_p from the result of step S4.4, calculated using equation (17):
S4.6, fusing the two dark feature images D_r and D_p using the local region weighted variance energy obtained in step S4.4 and the matching degree M_V(m, n) obtained in step S4.5, to obtain the fusion result F_D of the dark feature images; the fusion is performed using equation (18):
wherein,
In formula (18), T_h is the threshold for judging the similarity of the dark features, taking a value in the range 0.5–1. If M_V(m, n) < T_h, the regions of the two images D_r and D_p centered at point (m, n) are not similar, and the one with the larger local region weighted variance energy is selected as the fusion result; otherwise, the fusion result of D_r and D_p is a coefficient-weighted average.
6. The pseudo-color fusion method of infrared radiation and polarization images based on multiple features according to claim 5, wherein the step S4 of fusing the detail features of the infrared radiation image and the infrared polarization image obtained in step S3, driven by fuzzy logic and feature differences, comprises the following steps:
S4.7, obtaining the local gradient of the detail feature image P_r of the infrared radiation image and the local gradient of the detail feature image P_p of the infrared polarization image, solved using equation (20):
In the formula, the gradient term represents the local gradient at pixel point (m, n); k represents the gray level image r of the original infrared image or the gray level image p of the original infrared polarization image; the P in the symbol denotes a detail feature image; the remaining terms represent the horizontal and vertical edge images obtained by convolving the horizontal and vertical templates of the Sobel operator with the detail feature image P_k;
The local gradients of the detail feature images P_r and P_p are thus obtained using formula (20).
S4.8, obtaining the local weighted variance of the detail feature image P_r of the infrared radiation image and the local weighted variance of the detail feature image P_p of the infrared polarization image.
The local weighted variance is solved by the same method as formula (16); that is, the local weighted variance of the detail feature image is:
In the formula, the variance term represents the local region weighted variance energy centered at point (m, n); w(i, j) is a Gaussian filter matrix; N is the size of the region; t = (N-1)/2; the mean term represents the local area average centered at point (m, n); k represents the gray level image r of the original infrared image or the gray level image p of the original infrared polarization image; the P in the symbol denotes a detail feature image;
The local weighted variances of the detail feature images P_r and P_p are thus obtained using formula (21).
S4.9, solving the local gradient matching degree, the local weighted variance matching degree, the local difference gradient and the local difference variance of the two detail feature images, calculated using equations (22) to (25) respectively:
local gradient matching degree:
local weighted variance matching:
local differential gradient:
local variance of difference:
S4.10, obtaining a pixel-based decision graph PDG(m, n) from the local difference gradient ΔT(m, n) and the local difference variance ΔV(m, n), and obtaining a feature difference degree decision graph DDG from the local gradient matching degree M_T(m, n) and the local weighted variance matching degree M_V(m, n);
S4.11, judging determined regions and uncertain regions according to the pixel-based decision graph PDG(m, n) and the feature difference degree decision graph DDG;
S4.12, fusing the determined regions of the two detail feature images, driven by the feature differences;
The product of the local difference gradient ΔT(m, n) and the local difference variance ΔV(m, n) is taken as the driving factor for determined-region fusion, denoted DIF(m, n), as shown in equation (27):
DIF(m, n) = ΔT(m, n) · ΔV(m, n)    equation (27)
DIF(m, n) is then used to drive the fusion of the determined regions, and the image after fusion of the determined regions is obtained as shown in equation (28):
where "·" represents the product of the values at corresponding pixel locations in the matrix;
S4.13, fusing the uncertain regions of the two detail feature images using fuzzy logic theory;
Let the membership functions of "the local gradient of the detail feature image is large" for the infrared radiation image and the infrared polarization image be μ_T(P_r(m, n)) and μ_T(P_p(m, n)) respectively, as shown in formula (29); let the membership functions of "the local weighted variance of the detail feature image is large" be μ_V(P_r(m, n)) and μ_V(P_p(m, n)) respectively, as shown in formula (30):
in the formula, k represents a gray level image r of an original infrared image or a gray level image p of an original infrared polarization image;
Using the fuzzy-logic intersection operation rule, the membership functions expressing the degree of importance of the pixel values of the detail feature images of the infrared radiation image and the infrared polarization image at position (m, n) to the fused image of the uncertain region are calculated as μ_{T∩V}(P_r(m, n)) and μ_{T∩V}(P_p(m, n)) respectively, as shown in formula (31):
μ_{T∩V}(P_k(m, n)) = min[μ_T(P_k(m, n)), μ_V(P_k(m, n))]    formula (31)
In the formula, k represents a gray level image r of an original infrared image or a gray level image p of an original infrared polarization image;
The fused image of the uncertain regions of the two detail feature images is then given by formula (32):
where "·" represents the element-wise product of the values at corresponding pixel locations in the matrix, and "./" represents the element-wise division of the values at corresponding pixel locations;
S4.14, fusing the determined-region and uncertain-region results to obtain the final fusion result F_DIF(m, n), as shown in formula (33), and carrying out a consistency check on F_DIF(m, n);
The consistency check moves a window of size 3 × 3 over the image F_DIF(m, n), verifying the center pixel against the surrounding pixels in the window: if the center pixel comes from one of the images P_r and P_p while 4 to 8 of the pixels surrounding it come from the other image, the center pixel value is changed to the pixel value of the other image at that location; the window traverses the entire image F_DIF(m, n) to obtain the corrected F_DIF(m, n).
7. The pseudo-color fusion method of infrared radiation and polarization images based on multiple features according to claim 6, wherein
said step S4.10 comprises the steps of:
S4.10.1, obtaining the pixel-based decision graph PDG(m, n) from the local difference gradient ΔT(m, n) and the local difference variance ΔV(m, n): when ΔT(m, n) > 0 and ΔV(m, n) > 0, let PDG(m, n) = p_1; when ΔT(m, n) < 0 and ΔV(m, n) < 0, let PDG(m, n) = p_2; when ΔT(m, n) > 0 and ΔV(m, n) < 0, let PDG(m, n) = p_3; when ΔT(m, n) < 0 and ΔV(m, n) > 0, let PDG(m, n) = p_4; when ΔT(m, n) = 0 and ΔV(m, n) > 0, let PDG(m, n) = p_5; when ΔT(m, n) = 0 and ΔV(m, n) < 0, let PDG(m, n) = p_6; when ΔT(m, n) > 0 and ΔV(m, n) = 0, let PDG(m, n) = p_7; when ΔT(m, n) < 0 and ΔV(m, n) = 0, let PDG(m, n) = p_8; when ΔT(m, n) = 0 and ΔV(m, n) = 0, let PDG(m, n) = p_9; where p_1 to p_9 denote decision graphs in which a pixel position is 1 when the corresponding condition is satisfied and all other pixel positions are 0;
S4.10.2, obtaining the feature difference degree decision graph DDG from the local gradient matching degree M_T(m, n) and the local weighted variance matching degree M_V(m, n), using formula (26):
In the formula, d_1 and d_2 indicate that the corresponding pixel position satisfying formula (26) is 1 and all other pixel positions are 0.
8. The pseudo-color fusion method of infrared radiation and polarization images based on multiple features according to claim 7, wherein
the method for determining the determined area and the uncertain area in step S4.11 is as follows:
The regions p_1, p_2, p_5, p_6, p_7 and p_8 satisfying the corresponding conditions of step S4.10.1 are judged as determined regions by the PDG. For the two cases p_1 and p_2, whether the gray value of the corresponding pixel point is retained in the fused image is judged according to either of the two difference features; for the four cases p_5, p_6, p_7 and p_8, it is judged according to the difference feature that is not zero;
The regions p_3 and p_4 are judged as determined regions by the PDG together with the DDG: the feature difference degree decision graph DDG determines the degree of difference of the local features of the two images, the difference feature with the larger degree of difference is then selected, and this difference feature is used to judge whether the gray value of the corresponding pixel point is retained in the fused image;
Region p_9 does not fall under any of the above cases and therefore belongs to the uncertain region.
9. The pseudo-color fusion method of infrared radiation and polarization images based on multiple features according to claim 8, wherein in step S5, fusing the bright feature, dark feature and detail feature results of step S4 means fusing the bright feature image F_L(m, n), the dark feature image F_D(m, n) and the detail feature image F_DIF(m, n), using equation (34):
F = αF_L(m, n) + βF_D(m, n) + γF_DIF(m, n)    (34)
In the formula, α, β and γ are fusion weight coefficients with value range [0, 1].
10. The pseudo-color fusion method of infrared radiation and polarization images based on multiple features according to claim 9, wherein, to reduce oversaturation of the fused image and improve contrast, α = 1, β = 0.3 and γ = 1.
CN201811180835.4A 2018-10-09 2018-10-09 The pseudo-colours fusion method of infra-red radiation and polarization image based on multiple features Pending CN109377468A (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN201811180835.4A CN109377468A (en) 2018-10-09 2018-10-09 The pseudo-colours fusion method of infra-red radiation and polarization image based on multiple features


Publications (1)

Publication Number Publication Date
CN109377468A true CN109377468A (en) 2019-02-22

Family

ID=65402869

Family Applications (1)

Application Number Title Priority Date Filing Date
CN201811180835.4A Pending CN109377468A (en) 2018-10-09 2018-10-09 The pseudo-colours fusion method of infra-red radiation and polarization image based on multiple features

Country Status (1)

Country Link
CN (1) CN109377468A (en)

Citations (6)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN105069769A (en) * 2015-08-26 2015-11-18 哈尔滨工业大学 Low-light and infrared night vision image fusion method
US20160093034A1 (en) * 2014-04-07 2016-03-31 Steven D. BECK Contrast Based Image Fusion
CN107103596A (en) * 2017-04-27 2017-08-29 湖南源信光电科技股份有限公司 A kind of color night vision image interfusion method based on yuv space
CN107909112A (en) * 2017-11-27 2018-04-13 中北大学 The fusion method that a kind of infrared light intensity is combined with polarization image multiclass argument
CN108419062A (en) * 2017-02-10 2018-08-17 杭州海康威视数字技术股份有限公司 Image co-registration equipment and image interfusion method
CN108492274A (en) * 2018-04-03 2018-09-04 中国人民解放军国防科技大学 Long-wave infrared polarization feature extraction and fusion image enhancement method

Non-Patent Citations (3)

* Cited by examiner, † Cited by third party
Title
AN Fu et al., "Infrared polarization image fusion model driven by fuzzy logic and feature difference", Infrared Technology *
LI Weiwei et al., "Research on pseudo-color fusion of infrared polarization and infrared intensity images", Infrared Technology *
GUO Zhe et al., "A dark-channel multi-feature separation and fusion method for infrared polarization and intensity images", Science Technology and Engineering *

Cited By (6)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN110084771A (en) * 2019-03-11 2019-08-02 North University of China Bimodal infrared image blocking multi-algorithm optimization fusion method based on set-valued mapping
CN110084771B (en) * 2019-03-11 2022-07-05 North University of China Bimodal infrared image blocking multi-algorithm optimization fusion method based on set-valued mapping
CN111445464A (en) * 2020-03-31 2020-07-24 North University of China Difference characteristic frequency distribution construction method based on nonparametric estimation
CN111445464B (en) * 2020-03-31 2022-04-19 North University of China Difference characteristic frequency distribution construction method based on nonparametric estimation
CN115631123A (en) * 2022-11-22 2023-01-20 Beihang University Bionic vision fusion severe environment imaging device and method
CN115631123B (en) * 2022-11-22 2023-03-03 Beihang University Bionic vision fusion severe environment imaging device and method

Similar Documents

Publication Publication Date Title
Ancuti et al. D-hazy: A dataset to evaluate quantitatively dehazing algorithms
Negru et al. Exponential contrast restoration in fog conditions for driving assistance
CN109754377B (en) Multi-exposure image fusion method
CN111292257B (en) Retinex-based image enhancement method in scotopic vision environment
Li et al. A multi-scale fusion scheme based on haze-relevant features for single image dehazing
CN110570360B (en) Retinex-based robust and comprehensive low-quality illumination image enhancement method
CN109410161B (en) Fusion method of infrared polarization images based on YUV and multi-feature separation
CN111968054A (en) Underwater image color enhancement method based on potential low-rank representation and image fusion
CN103914699A (en) Automatic lip gloss image enhancement method based on color space
CN107895357B (en) A kind of real-time water surface thick fog scene image Enhancement Method based on FPGA
Kotwal et al. Joint desmoking and denoising of laparoscopy images
CN105678318B (en) The matching process and device of traffic sign
CN109377468A (en) The pseudo-colours fusion method of infra-red radiation and polarization image based on multiple features
Yu et al. A false color image fusion method based on multi-resolution color transfer in normalization YCBCR space
CN105447825B (en) Image defogging method and its system
Chen et al. An advanced visibility restoration algorithm for single hazy images
CN110120028A (en) A kind of bionical rattle snake is infrared and twilight image Color Fusion and device
Bi et al. Haze removal for a single remote sensing image using low-rank and sparse prior
CN110827218A (en) Airborne image defogging method based on image HSV transmissivity weighted correction
CN106570839A (en) Red Channel prior based underwater image sharpening method
Halmaoui et al. Contrast restoration of road images taken in foggy weather
CN109410160B (en) Infrared polarization image fusion method based on multi-feature and feature difference driving
CN111915508A (en) Image texture detail enhancement method for dyschromatopsia
CN112991222A (en) Image haze removal processing method and system, computer equipment, terminal and application
CN115100240A (en) Method and device for tracking object in video, electronic equipment and storage medium

Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination
RJ01 Rejection of invention patent application after publication

Application publication date: 20190222