CN109410161A - Fusion method for infrared polarization images based on YUV and multi-feature separation - Google Patents


Info

Publication number
CN109410161A
Authority
CN
China
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Granted
Application number
CN201811180887.1A
Other languages
Chinese (zh)
Other versions
CN109410161B (en)
Inventor
宋斌 (Song Bin)
冉骏 (Ran Jun)
陈蓉 (Chen Rong)
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Hunan Source Letter Photoelectric Polytron Technologies Inc
Original Assignee
Hunan Source Letter Photoelectric Polytron Technologies Inc
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Hunan Source Letter Photoelectric Polytron Technologies Inc filed Critical Hunan Source Letter Photoelectric Polytron Technologies Inc
Priority to CN201811180887.1A priority Critical patent/CN109410161B/en
Publication of CN109410161A publication Critical patent/CN109410161A/en
Application granted granted Critical
Publication of CN109410161B publication Critical patent/CN109410161B/en
Active legal-status Critical Current
Anticipated expiration legal-status Critical


Classifications

    • G06T5/50 Image enhancement or restoration using two or more images, e.g. averaging or subtraction (G PHYSICS; G06 COMPUTING; G06T IMAGE DATA PROCESSING OR GENERATION, IN GENERAL; G06T5/00 Image enhancement or restoration)
    • G06T2207/10024 Color image (G06T2207/00 Indexing scheme for image analysis or image enhancement; G06T2207/10 Image acquisition modality)
    • G06T2207/10048 Infrared image (G06T2207/10 Image acquisition modality)
    • G06T2207/20221 Image fusion; Image merging (G06T2207/20 Special algorithmic details; G06T2207/20212 Image combination)

Landscapes

  • Physics & Mathematics (AREA)
  • General Physics & Mathematics (AREA)
  • Engineering & Computer Science (AREA)
  • Theoretical Computer Science (AREA)
  • Image Processing (AREA)
  • Image Analysis (AREA)

Abstract

The invention discloses a fusion method for infrared polarization images based on YUV and multi-feature separation. First, the polarization state of light is represented by the Stokes vector, and the degree-of-polarization image and angle-of-polarization image are computed. Next, the exclusive parts P' and R' of the degree-of-polarization and angle-of-polarization images are solved. Images P' and R' are then fused by a multi-feature separation method based on the dark-channel theory, and the result is fused with the total-intensity image I in YUV space to obtain the final fused infrared polarization image. Because fusion is performed in the YUV color space, which matches human visual characteristics, the method effectively improves the visual quality of the polarization image fusion result, enhances detail such as image edges, raises image contrast, and merges the complementary information among the polarization quantities, so that the fused scene is richer and camouflaged targets are easier to identify.

Description

Fusion method for infrared polarization images based on YUV and multi-feature separation
Technical field
The present invention relates generally to the technical field of image processing, and more particularly to a fusion method for infrared polarization images based on YUV and multi-feature separation.
Background technique
Traditional infrared imaging systems mainly image the infrared radiation intensity of a scene, which depends chiefly on the temperature, emissivity and similar properties of the objects in it. When a noise source at the same temperature is placed around a target, an existing thermal imager cannot identify the camouflaged target, so infrared imaging faces serious limitations and challenges. Compared with traditional infrared imaging, polarization imaging of light can reduce the degradation effects of complex scenes and can additionally recover structural and range information of the scene. Because polarization characterizes an object differently from its radiation, targets with identical radiance may have different degrees of polarization, so polarization measurements can pick a target out of a complex background.
The polarization state of light can be described by the Stokes vector, whose components are both redundant and complementary in expressing polarization information. Existing polarization image fusion methods often consider only a single difference feature, which cannot effectively describe image characteristics that are uncertain or vary randomly; valuable information is therefore lost during fusion, causing fusion and recognition to fail. Poor visual quality of the fused image likewise impairs the human eye's ability to resolve and recognize camouflaged targets.
Summary of the invention
The object of the present invention is to overcome the above deficiencies of the prior art by proposing a fusion method for infrared polarization images based on YUV and multi-feature separation. The method fuses the polarization components, merging the complementary information among the polarization quantities so that the fused scene is richer; it enhances detail such as image edges and improves image contrast; and because fusion is performed in the YUV color space, which matches human visual characteristics, it effectively improves the visual quality of the polarization image fusion result.
The technical scheme of the invention is a fusion method for infrared polarization images based on YUV and multi-feature separation, comprising the following steps:
S1. Represent the polarization state of light with the Stokes vector, and compute the degree-of-polarization image P and the angle-of-polarization image R from the Stokes vector;
S2. Solve the exclusive part P' of the degree-of-polarization image and the exclusive part R' of the angle-of-polarization image from their shared part Co;
S3. Fuse images P' and R' with the multi-feature separation method based on the dark-channel theory;
S4. Fuse the result of step S3 with the total-intensity image I in YUV space to obtain the final fusion result.
Further, in step S1 the Stokes vector S is expressed as in formula (1):
S = [I, Q, U, V]^T = [I1 + I3, I1 - I3, I2 - I4, I_L - I_R]^T  (1)
where I1, I2, I3 and I4 are the intensity images collected at polarization directions 0°, 45°, 90° and 135°; I is the total intensity of the light; Q is the intensity difference between horizontal and vertical polarization; U is the intensity difference between the 45° and 135° polarization directions; and V is the intensity difference between the left- and right-circularly polarized components of the light (I_L and I_R denote the left- and right-circular intensities);
The degree-of-polarization image P and the angle-of-polarization image R of the polarized light are expressed as:
P = sqrt(Q² + U²) / I  (2)
R = (1/2)·arctan(U / Q)  (3)
Further, in step S2, each pixel of images P and R is operated on by formula (4) to obtain the shared part Co of images P and R:
Co = min(P, R)  (4)
The exclusive part P' of the degree-of-polarization image and the exclusive part R' of the angle-of-polarization image are then obtained by formulas (5) and (6), respectively:
P'=P-Co (5)
R'=R-Co (6)
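As a concrete illustration of steps S1 and S2, the sketch below computes the Stokes parameters from four linear-polarizer images and derives P, R, Co, P' and R'. It is a minimal sketch, not the patent's implementation: the conventions I = I1 + I3, P = sqrt(Q² + U²)/I and R = (1/2)·arctan(U/Q) are the standard definitions, and the small eps guard is an added numerical-safety assumption.

```python
import numpy as np

def stokes_from_linear(i0, i45, i90, i135):
    """Stokes parameters from four linear-polarizer intensity images.
    I = I1 + I3 is one common convention; V is omitted because no
    circular-polarization measurement is available (cf. formula (1))."""
    I = i0 + i90          # total intensity
    Q = i0 - i90          # horizontal vs. vertical difference
    U = i45 - i135        # 45 deg vs. 135 deg difference
    return I, Q, U

def dop_aop(I, Q, U, eps=1e-9):
    """Degree of polarization P and angle of polarization R
    (formulas (2) and (3)); arctan2 handles the Q = 0 case."""
    P = np.sqrt(Q ** 2 + U ** 2) / (I + eps)
    R = 0.5 * np.arctan2(U, Q)
    return P, R

def exclusive_parts(P, R):
    """Shared part Co = min(P, R) (formula (4)) and the exclusive
    parts P' = P - Co and R' = R - Co (formulas (5) and (6))."""
    Co = np.minimum(P, R)
    return P - Co, R - Co
```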
Still further, step S3 comprises the following steps:
S3.1. Perform multi-feature separation on images P' and R' based on the dark-channel theory to obtain the dark-feature, bright-feature and detail-feature images of each;
S3.2. Fuse the bright-feature images of P' and R' with a matching method based on the local-area energy feature;
S3.3. Fuse the dark-feature images of P' and R' with a matching method based on the local-area weighted-variance feature;
S3.4. Fuse the detail-feature images of P' and R' using fuzzy logic and feature-difference driving;
S3.5. Fuse the results of steps S3.2, S3.3 and S3.4 to obtain the fusion result of images P' and R'.
It should be particularly noted that steps S3.2, S3.3 and S3.4 have no fixed order and are described as separate sub-steps only for clarity of the subsequent exposition: they may be performed simultaneously; any two may be performed together followed by the third; any one may be performed first followed by the other two together; or they may be performed one after another.
Still further, in step S3.1:
The dark channel was introduced by He et al. to estimate the transmission in the atmospheric scattering model and achieve fast dehazing of natural images. The dark-channel map is solved as in formula (7):
L_dark(x) = min_{y ∈ N(x)} ( min_{C ∈ {R,G,B}} L_C(y) )  (7)
where C ranges over the three color channels R, G and B of the image; N(x) is the window centered at pixel x; L_C(y) is one color-channel map of the image; and L_dark(x) is the dark-channel map, which reflects the degree of haze: for a haze-free image the dark-channel value L_dark(x) tends to 0, while for a hazy image it is large;
The dark-channel image of the exclusive part P' of the degree-of-polarization image is obtained by formula (7);
The dark-channel image of the exclusive part R' of the angle-of-polarization image is obtained by formula (7);
The steps for obtaining the bright, dark and detail features of the two images by the multi-feature separation based on the dark-channel theory are:
S3.1.1. Use formulas (8) and (9) to negate images P' and R', obtaining the negated images; then fuse each dark-channel image with the corresponding negated image by the take-the-smaller-absolute-value rule to obtain the dark-feature image D_P' of image P' and the dark-feature image D_R' of image R', as shown in formulas (10) and (11);
S3.1.2. Subtract the corresponding images D_P' and D_R' from the dark-channel images to obtain the bright-feature image L_P' of image P' and the bright-feature image L_R' of image R', as shown in formulas (12) and (13);
S3.1.3. Subtract the dark-channel images from images P' and R' respectively to obtain the detail image P_P' of image P' and the detail image P_R' of image R', as shown in formulas (14) and (15);
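The feature separation of steps S3.1.1 to S3.1.3 can be sketched as follows. This is an interpretation under stated assumptions: negation is taken as 255 - x (consistent with F' = 255 - F used later in step S4), the take-the-smaller-absolute-value rule is implemented as an element-wise minimum by absolute value, and a 3 × 3 dark-channel window is assumed.

```python
import numpy as np

def dark_channel(img, win=3):
    """Grayscale dark channel (cf. formula (7)): local minimum over a
    win x win window; for one channel the channel minimum is the pixel."""
    t = win // 2
    pad = np.pad(img.astype(float), t, mode='edge')
    h, w = img.shape
    out = np.empty((h, w))
    for m in range(h):
        for n in range(w):
            out[m, n] = pad[m:m + win, n:n + win].min()
    return out

def separate_features(x):
    """Dark, bright and detail feature images of one exclusive part."""
    dark = dark_channel(x)
    neg = 255.0 - x                                       # formulas (8)/(9), assumed form
    d = np.where(np.abs(dark) <= np.abs(neg), dark, neg)  # (10)/(11): take-small by |.|
    bright = dark - d                                     # (12)/(13)
    detail = x - dark                                     # (14)/(15)
    return d, bright, detail
```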
Still further, in step S3.2, fusing the bright-feature image L_P' with the bright-feature image L_R' comprises the following steps:
S3.2.1. Compute the Gaussian-weighted local energy of each bright-feature image with the Gaussian-weighted local-energy function, shown in formula (16):
E_k(m, n) = Σ_{i=-t}^{t} Σ_{j=-t}^{t} w(i, j)·L_k(m+i, n+j)²  (16)
where E_k(m, n) is the Gaussian-weighted local energy centered at point (m, n); w(i, j) is the Gaussian filter matrix; N is the window size; t = (N-1)/2; and k denotes P' or R';
Formula (16) yields the Gaussian-weighted local energy E_P' of the bright-feature image L_P' and the Gaussian-weighted local energy E_R' of the bright-feature image L_R'.
S3.2.2. Solve the Gaussian-weighted local-energy matching degree of images L_P' and L_R';
The Gaussian-weighted local-energy matching degree of images L_P' and L_R' is:
M_E(m, n) = 2·Σ_{i=-t}^{t} Σ_{j=-t}^{t} w(i, j)·L_P'(m+i, n+j)·L_R'(m+i, n+j) / [E_P'(m, n) + E_R'(m, n)]  (17)
S3.2.3. Fuse images L_P' and L_R' using the Gaussian-weighted local energy and the Gaussian-weighted local-energy matching degree to obtain the bright-feature fusion result F_L.
Image LP'And LR'Fusion rule are as follows:
Wherein,
In the formula, T_l is the threshold for judging bright-feature fusion similarity, with value in the range 0 to 0.5. If M_E(m, n) < T_l, the regions of the two images L_P' and L_R' centered at point (m, n) are dissimilar, and the fusion result takes the pixel whose Gaussian-weighted regional energy is larger; otherwise, the fusion result is a coefficient-weighted average of the two images.
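A sketch of the bright-feature fusion of steps S3.2.1 to S3.2.3 follows. The local-energy and matching-degree forms follow the text; the weights of the averaging branch are an assumed Burt-style rule (the patent's formulas (18) and (19) are not reproduced here), and a fixed 3 × 3 Gaussian window is assumed.

```python
import numpy as np

G = np.array([[1, 2, 1], [2, 4, 2], [1, 2, 1]], float) / 16.0  # 3x3 Gaussian weights

def local_energy(img):
    """Gaussian-weighted local energy (formula (16)): sum of w(i,j)*L^2."""
    p = np.pad(img.astype(float), 1, mode='edge')
    h, w = img.shape
    E = np.zeros((h, w))
    for i in range(3):
        for j in range(3):
            E += G[i, j] * p[i:i + h, j:j + w] ** 2
    return E

def fuse_bright(LP, LR, Tl=0.4):
    """Energy-matching fusion of the two bright-feature images."""
    EP, ER = local_energy(LP), local_energy(LR)
    p = np.pad(LP.astype(float), 1, mode='edge')
    r = np.pad(LR.astype(float), 1, mode='edge')
    h, w = LP.shape
    cross = np.zeros((h, w))
    for i in range(3):
        for j in range(3):
            cross += G[i, j] * p[i:i + h, j:j + w] * r[i:i + h, j:j + w]
    ME = 2.0 * cross / (EP + ER + 1e-9)           # matching degree (formula (17))
    wmax = 0.5 + 0.5 * (1.0 - ME) / (1.0 - Tl)    # assumed Burt-style weight
    pick_p = EP >= ER
    choose = np.where(pick_p, LP, LR)             # dissimilar: larger energy wins
    avg = np.where(pick_p, wmax * LP + (1 - wmax) * LR,
                           wmax * LR + (1 - wmax) * LP)
    return np.where(ME < Tl, choose, avg)
```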
Still further, in step S3.3, fusing the dark-feature image D_P' with the dark-feature image D_R' comprises the following steps:
S3.3.1. Compute the local weighted-variance energy of each dark-feature image with the local-area weighted-variance energy function, shown in formula (20):
V_k(m, n) = Σ_{i=-t}^{t} Σ_{j=-t}^{t} w(i, j)·[D_k(m+i, n+j) - μ_k(m, n)]²  (20)
where V_k(m, n) is the local weighted-variance energy centered at point (m, n); w(i, j) is the Gaussian filter matrix; N is the window size; t = (N-1)/2; μ_k(m, n) is the local mean centered at point (m, n); and k denotes P' or R';
Formula (20) yields the local weighted-variance energy V_P' of the dark-feature image D_P' and the local weighted-variance energy V_R' of the dark-feature image D_R'.
S3.3.2. Solve the local weighted-variance energy matching degree of the two images D_P' and D_R';
The matching degree of the local weighted-variance energies of images D_P' and D_R' is computed by formula (21):
S3.3.3. Fuse the two images D_P' and D_R' using the local weighted-variance energy and the local weighted-variance energy matching degree to obtain the dark-feature fusion result F_D.
The fusion formula for the two images D_P' and D_R' is:
Wherein,
In the formula, T_h is the threshold for judging dark-feature fusion similarity, with value in the range 0.5 to 1. If M_E(m, n) < T_h, the regions of the two images D_P' and D_R' centered at point (m, n) are dissimilar, and the fusion result takes the pixel whose local weighted-variance energy is larger; otherwise, the fusion result is a coefficient-weighted average of the two images.
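The dark-feature fusion of steps S3.3.1 to S3.3.3 can be sketched in the same pattern. The weighted variance uses the identity var = E[x²] - mean², valid because the window weights sum to 1; the matching-degree and averaging forms are assumptions, since formulas (21) to (23) are not reproduced in the text.

```python
import numpy as np

G = np.array([[1, 2, 1], [2, 4, 2], [1, 2, 1]], float) / 16.0  # 3x3 Gaussian weights

def _gauss_filter(img):
    """Gaussian-weighted local mean with edge padding."""
    p = np.pad(img.astype(float), 1, mode='edge')
    h, w = img.shape
    out = np.zeros((h, w))
    for i in range(3):
        for j in range(3):
            out += G[i, j] * p[i:i + h, j:j + w]
    return out

def local_weighted_variance(img):
    """Local weighted-variance energy (formula (20)) via E[x^2] - mean^2."""
    mean = _gauss_filter(img)
    return _gauss_filter(img.astype(float) ** 2) - mean ** 2

def fuse_dark(DP, DR, Th=0.7):
    """Variance-matching fusion of the dark-feature images (assumed forms)."""
    VP, VR = local_weighted_variance(DP), local_weighted_variance(DR)
    ME = 2.0 * np.sqrt(np.clip(VP * VR, 0.0, None)) / (VP + VR + 1e-9)
    wmax = 0.5 + 0.5 * (1.0 - ME) / (1.0 - Th)
    pick_p = VP >= VR
    choose = np.where(pick_p, DP, DR)     # dissimilar regions: larger variance wins
    avg = np.where(pick_p, wmax * DP + (1 - wmax) * DR,
                           wmax * DR + (1 - wmax) * DP)
    return np.where(ME < Th, choose, avg)
```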
Still further, in step S3.4 the detail features of the degree-of-polarization and angle-of-polarization images are fused with a feature-difference driving method based on local gradient and local variance, comprising the following steps:
S3.4.1. Compute the local gradients of the detail image P_P' of image P' and the detail image P_R' of image R';
The local gradient is solved by the formula:
T_k(m, n) = sqrt[ G_x,k(m, n)² + G_y,k(m, n)² ]  (24)
where T_k(m, n) is the local gradient at pixel (m, n); k denotes P' or R'; and G_x,k and G_y,k are the horizontal and vertical edge images obtained by convolving the detail image with the horizontal and vertical templates of the Sobel operator;
Formula (24) yields the local gradient T_P' of detail image P_P' and the local gradient T_R' of detail image P_R';
S3.4.2. Compute the local weighted variances of detail images P_P' and P_R';
The local weighted variance takes the same form as formula (20), i.e. as shown in formula (25):
where the left-hand side is the local weighted-variance energy centered at point (m, n); w(i, j) is the Gaussian filter matrix; N is the window size; t = (N-1)/2; μ_k(m, n) is the local mean centered at point (m, n); and k denotes P' or R';
Formula (25) yields the local weighted variance V_P' of detail image P_P' and the local weighted variance V_R' of detail image P_R';
S3.4.3. Compute the local-gradient matching degree, the local weighted-variance matching degree, the local difference gradient and the local difference variance of the two detail images:
Local-gradient matching degree M_T(m, n): formula (26);
Local weighted-variance matching degree M_V(m, n): formula (27);
Local difference gradient: ΔT(m, n) = T_P'(m, n) - T_R'(m, n)  (28)
Local difference variance: ΔV(m, n) = V_P'(m, n) - V_R'(m, n)  (29)
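The gradient-side quantities of steps S3.4.1 and S3.4.3 can be sketched as follows; the Sobel templates are the standard ones, and the subtraction order P' minus R' in the difference gradient is an assumption.

```python
import numpy as np

SX = np.array([[-1, 0, 1], [-2, 0, 2], [-1, 0, 1]], float)  # horizontal Sobel template
SY = SX.T                                                    # vertical Sobel template

def _filter3(img, k):
    """Apply a 3x3 template with edge padding (plain correlation)."""
    p = np.pad(img.astype(float), 1, mode='edge')
    h, w = img.shape
    out = np.zeros((h, w))
    for i in range(3):
        for j in range(3):
            out += k[i, j] * p[i:i + h, j:j + w]
    return out

def local_gradient(img):
    """Local gradient of formula (24): magnitude of the horizontal and
    vertical Sobel edge images of the detail image."""
    return np.sqrt(_filter3(img, SX) ** 2 + _filter3(img, SY) ** 2)

def difference_gradient(PP, PR):
    """Local difference gradient, Delta T(m,n) = T_P'(m,n) - T_R'(m,n)
    (formula (28)); the subtraction order is an assumption."""
    return local_gradient(PP) - local_gradient(PR)
```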
S3.4.4. Obtain the pixel-based decision map PDG(m, n) from the local difference gradient ΔT(m, n) and the local difference variance ΔV(m, n), and obtain the feature-difference-degree decision map DDG from the local-gradient matching degree M_T(m, n) and the local weighted-variance matching degree M_V(m, n), comprising the following steps:
(1) Obtain the pixel-based decision map PDG(m, n) from ΔT(m, n) and ΔV(m, n): when ΔT(m, n) > 0 and ΔV(m, n) > 0, let PDG(m, n) = p1; when ΔT(m, n) < 0 and ΔV(m, n) < 0, PDG(m, n) = p2; when ΔT(m, n) > 0 and ΔV(m, n) < 0, PDG(m, n) = p3; when ΔT(m, n) < 0 and ΔV(m, n) > 0, PDG(m, n) = p4; when ΔT(m, n) = 0 and ΔV(m, n) > 0, PDG(m, n) = p5; when ΔT(m, n) = 0 and ΔV(m, n) < 0, PDG(m, n) = p6; when ΔT(m, n) > 0 and ΔV(m, n) = 0, PDG(m, n) = p7; when ΔT(m, n) < 0 and ΔV(m, n) = 0, PDG(m, n) = p8; when ΔT(m, n) = 0 and ΔV(m, n) = 0, PDG(m, n) = p9. Here p1 to p9 denote decision maps that are 1 at the pixel positions satisfying the corresponding condition and 0 elsewhere;
(2) Obtain the feature-difference-degree decision map DDG from the local-gradient matching degree M_T(m, n) and the local weighted-variance matching degree M_V(m, n), as shown in formula (30):
where d1 and d2 are 1 at the pixel positions satisfying the corresponding condition of formula (30) and 0 elsewhere.
S3.4.5. Determine the determined regions and the uncertain region from the pixel-based decision map PDG(m, n) and the feature-difference-degree decision map DDG;
From PDG, the maps p1, p2, p5, p6, p7 and p8 satisfying the conditions of step (1) of S3.4.4 are determined regions: for the cases p1 and p2, both difference features agree on whether the gray value of the corresponding pixel should be retained in the fused image; for the cases p5, p6, p7 and p8, a single difference feature suffices to decide whether the gray value of the corresponding pixel is retained in the fused image;
From PDG and DDG together, p3 and p4 are also determined regions: the feature-difference-degree decision map DDG identifies which of the two local features differs more strongly, and the difference feature with the larger difference degree then decides whether the gray value of the corresponding pixel is retained in the fused image;
The region p9, however, cannot be decided from the two decision maps PDG and DDG, and belongs to the uncertain region;
S3.4.6. Fuse the determined regions of the two detail images using feature-difference driving;
The product of the local difference gradient ΔT(m, n) and the local difference variance ΔV(m, n) serves as the driving factor for fusing the determined regions, denoted DIF(m, n), as shown in the following formula:
DIF(m, n) = ΔT(m, n)·ΔV(m, n)  (31)
The determined regions are then fused driven by DIF(m, n); the fused image of the determined regions is:
where "*" denotes the element-wise product of values at corresponding pixel positions in the matrices;
S3.4.7. Fuse the uncertain region of the two detail images using fuzzy-logic theory;
Let the membership functions of "the local gradient of detail image P_P' (resp. P_R') is large" be μ_T(P_P'(m, n)) and μ_T(P_R'(m, n)), and the membership functions of "the local weighted variance of detail image P_P' (resp. P_R') is large" be μ_V(P_P'(m, n)) and μ_V(P_R'(m, n)), as shown in formulas (33) and (34):
where k denotes P' or R'.
Using the intersection operation rule of fuzzy logic, the importance of the pixel value of each detail image P_P' and P_R' at position (m, n) to the fused image of the uncertain region can be computed as the membership values μ_T∩V(P_P'(m, n)) and μ_T∩V(P_R'(m, n)), as shown in formula (35):
μT∩V(Pk(m, n))=min [μT(Pk(m,n)),μV(Pk(m,n))] (35)
In formula, k represents P' or R';
Then the fusion result of the uncertain region of the two detail images is:
where "*" denotes the element-wise product of values at corresponding pixel positions in the matrices, and "/" denotes the element-wise division;
S3.4.8. Fuse the determined-region and uncertain-region results to obtain the fusion result F_DIF(m, n) of the two detail images, and perform a consistency check on F_DIF(m, n);
The consistency check moves a 3 × 3 window over the image F_DIF(m, n) and uses the pixels surrounding the window's center to verify the center pixel.
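The fuzzy-intersection fusion of the uncertain region (step S3.4.7) can be sketched as below. The sigmoid membership shape and the membership-weighted average are assumptions standing in for formulas (33), (34) and (36); only the min-based intersection of formula (35) is taken directly from the text.

```python
import numpy as np

def membership_large(x, mid, slope=0.05):
    """Assumed sigmoid membership for 'feature value is large';
    illustrative only, not the patent's formulas (33)/(34)."""
    return 1.0 / (1.0 + np.exp(-slope * (x - mid)))

def fuse_uncertain(PP, PR, TP, TR, VP, VR):
    """Fuzzy-intersection fusion of the uncertain region: each pixel's
    weight is min(mu_T, mu_V) (formula (35)), and the fused value is an
    assumed membership-weighted average of the two detail images."""
    muP = np.minimum(membership_large(TP, TP.mean()),
                     membership_large(VP, VP.mean()))
    muR = np.minimum(membership_large(TR, TR.mean()),
                     membership_large(VR, VR.mean()))
    return (muP * PP + muR * PR) / (muP + muR + 1e-9)
```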
Still further, in step S3.5 the results of steps S3.2, S3.3 and S3.4, namely the bright-feature image F_L(m, n), the dark-feature image F_D(m, n) and the detail-feature image F_DIF(m, n), are fused; the fusion formula is shown in formula (38):
F=α FL(m,n)+βFD(m,n)+γFDIF(m,n) (38)
In the formula, α, β and γ are fusion weight coefficients with value range [0, 1]. More preferably, α = 1, β = 0.3 and γ = 1, which reduces oversaturation of the fused image and improves contrast.
Still further, in step S4 the fusion result of step S3 and the total-intensity image I are fused in YUV space as follows: input the total-intensity image I to the Y channel of YUV space; negate the fusion result of step S3 to obtain image F', i.e. F' = 255 - F, and input F' to the U channel; finally input the fusion result F of step S3 to the V channel, obtaining the final fused image.
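Steps S3.5 and S4 reduce to a weighted sum and a channel assignment, sketched here with the preferred weights α = 1, β = 0.3, γ = 1; the YUV image is assembled as a plain (H, W, 3) array, and any conversion to RGB for display is left out.

```python
import numpy as np

def fuse_final(FL, FD, FDIF, alpha=1.0, beta=0.3, gamma=1.0):
    """Weighted combination of the three feature fusion results (formula (38))."""
    return alpha * FL + beta * FD + gamma * FDIF

def assemble_yuv(I, F):
    """Channel assignment of step S4: total intensity I to Y, the negated
    fusion result F' = 255 - F to U, and F itself to V."""
    return np.stack([I, 255.0 - F, F], axis=-1)  # (H, W, 3) in Y, U, V order
```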
Compared with the prior art, the invention has the following advantages:
1. The information among the polarization quantities of a polarization image is both redundant and complementary. The method fuses each polarization quantity of the polarization image, effectively resolving the information redundancy among the polarization quantities during fusion while merging their complementary information, so that the fused scene is richer and camouflaged targets are easier to identify.
2. During image fusion the method combines the local-area energy feature, which characterizes the correlation between adjacent pixels and local luminance differences and helps improve image contrast; the local-area variance feature, which reflects local gray-level variation and helps improve image clarity; and the local-area gradient, which reflects the detail variation of image pixels and helps enhance image detail. Together these improve the clarity and contrast of the image and enhance details such as edges.
3. The method performs fusion in the YUV color space, which matches human visual characteristics, effectively improving the visual quality of the polarization image fusion result.
Detailed description of the invention
These and/or other aspects and advantages of the invention will become clearer and easier to understand from the following detailed description of embodiments of the invention with reference to the accompanying drawings, in which:
Fig. 1 is a flowchart of the fusion method for infrared polarization images based on YUV and multi-feature separation according to an embodiment of the present invention;
Fig. 2 is a flowchart of the method for fusing the detail-feature images of P' and R' using fuzzy logic and feature-difference driving within the fusion method of the embodiment.
Specific embodiment
To help those skilled in the art better understand the present invention, the invention is described in further detail below with reference to the accompanying drawings and specific embodiments.
Embodiment 1
A fusion method for infrared polarization images based on YUV and multi-feature separation, whose overall flow is shown in Fig. 1, comprises the following steps S1-S4.
S1. Represent the polarization state of light with the Stokes vector, and compute the degree-of-polarization image P and the angle-of-polarization image R from the Stokes vector;
Infrared intensity imaging depends mainly on the temperature, emissivity and similar properties of the scene. When a noise source at the same temperature is placed around a target, an existing thermal imager cannot identify the target, so infrared imaging faces serious limitations and challenges. Compared with traditional infrared imaging, polarization imaging of light can reduce the degradation effects of complex scenes and is advantageous for recognizing camouflaged targets.
Polarization, as a basic property of light, cannot be observed directly by the human eye, so polarization information must be presented in a form that the eye can perceive or a computer can readily process. The polarization state of light is represented by the Stokes vector, which describes the polarization state and intensity of light with four Stokes parameters; these are time averages of the light intensity, carry the dimension of intensity, and can be measured directly by a detector. The Stokes vector S is expressed as:
where I1, I2, I3 and I4 are the intensity images collected at polarization directions 0°, 45°, 90° and 135°; I is the total intensity of the light; Q is the intensity difference between horizontal and vertical polarization; U is the intensity difference between the 45° and 135° polarization directions; and V is the intensity difference between the left- and right-circularly polarized components of the light.
In practice a phase retarder is often omitted, and the Stokes parameters can be obtained by rotating a linear polarizer alone. The degree-of-polarization image P and angle-of-polarization image R of the polarized light can therefore be expressed as:
S2. Solve the exclusive part P' of the degree-of-polarization image and the exclusive part R' of the angle-of-polarization image from the shared part Co of the two images;
The degree-of-polarization and angle-of-polarization images contain both redundant and mutually complementary information. Applying the operation of formula (4) to each pixel of images P and R yields their shared part Co.
Co = min(P, R)  (4)
The exclusive parts of the degree-of-polarization and angle-of-polarization images are obtained by formulas (5) and (6), respectively.
P'=P-Co (5)
R'=R-Co (6)
S3. Fuse images P' and R' with the multi-feature separation method based on the dark-channel theory;
S3.1. Perform multi-feature separation on images P' and R' based on the dark-channel theory to obtain the dark-feature, bright-feature and detail-feature images of each.
The dark channel was introduced by He et al. to estimate the transmission in the atmospheric scattering model and achieve fast dehazing of natural images. The dark-channel map is solved as in formula (7).
where C ranges over the three color channels R, G and B of the image; N(x) is the window centered at pixel x; L_C(y) is one color-channel map of the image; and L_dark(x) is the dark-channel map, which reflects the degree of haze: for a haze-free image the dark-channel value L_dark(x) tends to 0, while for a hazy image it is large. In a natural image, the regions visibly affected by haze are usually the brightest pixels in the dark channel, while haze-free regions have very low dark-channel values. For a grayscale image, therefore, the dark-channel map contains the bright regions of the original image and embodies its low-frequency part, i.e. it retains the regions of the original image where gray levels vary gently, making the bright/dark feature difference more prominent, while losing the local regions where gray values vary sharply and contrast is high, especially edge-detail information.
The dark-channel image of the exclusive part P' of the degree-of-polarization image is obtained by formula (7);
The dark-channel image of the exclusive part R' of the angle-of-polarization image is obtained by formula (7).
The steps for obtaining the bright, dark and detail features of the two images by the multi-feature separation based on the dark-channel theory are:
S3.1.1. Use formulas (8) and (9) to negate images P' and R', obtaining the negated images; then fuse each dark-channel image with the corresponding negated image by the take-the-smaller-absolute-value rule to obtain the dark-feature image D_P' of image P' and the dark-feature image D_R' of image R', as shown in formulas (10) and (11);
S3.1.2. Subtract the corresponding images D_P' and D_R' from the dark-channel images to obtain the bright-feature image L_P' of image P' and the bright-feature image L_R' of image R', as shown in formulas (12) and (13);
S3.1.3. Subtract the dark-channel images from images P' and R' respectively to obtain the detail image P_P' of image P' and the detail image P_R' of image R', as shown in formulas (14) and (15);
Again, it should be particularly noted that the following steps S3.2, S3.3 and S3.4 have no fixed order and are numbered as separate steps only for convenience of description: they may be performed simultaneously; any two may be performed together followed by the third; any one may be performed first followed by the other two together; or they may be performed one after another.
S3.2 uses the bright characteristic image of matching process blending image P' and R' based on energy of local area feature;
Bright characteristic information has concentrated the bright areas in original image, embodies the low frequency component in original image, merges bright spy Levy image LP'With bright characteristic image LR'The step of are as follows:
S3.2.1 The Gaussian-weighted local energy of each bright feature image is obtained with the Gaussian-weighted local energy function;
The Gaussian-weighted local energy function is shown in formula (16):
In the formula, E_k(m, n) represents the Gaussian-weighted local energy centered on point (m, n); W(i, j) is the Gaussian filter matrix; N is the size of the region; t = (N-1)/2; k represents P' or R'. Formula (16) then yields the Gaussian-weighted local energy of the bright feature image L_P' and that of the bright feature image L_R' respectively.
S3.2.2 The Gaussian-weighted local energy matching degree of images L_P' and L_R' is solved;
The Gaussian-weighted local energy matching degree of images L_P' and L_R' is:
S3.2.3 Images L_P' and L_R' are fused using the Gaussian-weighted local energy and its matching degree, giving the bright feature fusion result F_L;
The fusion rule for images L_P' and L_R' is:
Wherein,
In the formula, T_l is the threshold for judging similarity in the bright feature fusion, with a value of 0~0.5. If M_E(m, n) < T_l, the regions of images L_P' and L_R' centered on point (m, n) are dissimilar, and the fusion result takes the source with the larger Gaussian-weighted region energy; otherwise, the fusion result is a coefficient-weighted average of the two images.
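The energy-and-matching scheme of S3.2.1–S3.2.3 might look as follows in Python. The matching-degree and adaptive-weight expressions stand in for formulas (17)–(19), whose exact forms are not reproduced in the text, so they are assumptions written in a commonly used shape:

```python
import numpy as np
from scipy.ndimage import gaussian_filter

def fuse_by_local_energy(A, B, sigma=1.5, Tl=0.4):
    """Bright-feature fusion of S3.2 (sketch with assumed weight forms)."""
    EA = gaussian_filter(A * A, sigma)       # Gaussian-weighted local energy, formula (16)
    EB = gaussian_filter(B * B, sigma)
    # Assumed matching degree: cross-energy over total energy, in [0, 1]
    M = 2.0 * gaussian_filter(A * B, sigma) / (EA + EB + 1e-12)
    # Assumed adaptive weight for the "similar" branch
    w = np.clip(0.5 + 0.5 * (1.0 - M) / (1.0 - Tl), 0.0, 1.0)
    pick_max = np.where(EA >= EB, A, B)      # dissimilar regions: larger energy wins
    blend = np.where(EA >= EB, w * A + (1 - w) * B,
                               w * B + (1 - w) * A)
    return np.where(M < Tl, pick_max, blend)
```

The same selection-versus-weighted-average structure recurs in S3.3 with variance energy in place of plain energy.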
S3.3 The dark feature images of P' and R' are fused with a matching method based on local-area weighted variance features;
The dark feature image lacks the bright areas of the source image but can still be regarded as an approximation of it: it contains the main energy of the image and embodies its basic contours. The steps for fusing the dark feature images D_P' and D_R' are:
S3.3.1 The local-area weighted variance energy of each dark feature image is obtained with the local-area weighted variance energy function;
The local-area weighted variance energy function is shown in formula (20):
In the formula, V_k(m, n) represents the local-area weighted variance energy centered on point (m, n); W(i, j) is the Gaussian filter matrix; N is the size of the region; t = (N-1)/2; the local mean term represents the local-area average value centered on point (m, n); k represents P' or R'.
Formula (20) then yields the local-area weighted variance energy of the dark feature image D_P' and that of the dark feature image D_R' respectively.
S3.3.2 The local-area weighted variance energy matching degree of images D_P' and D_R' is solved;
The local-area weighted variance energy matching degree of images D_P' and D_R' is:
S3.3.3 Images D_P' and D_R' are fused using the local-area weighted variance energy and its matching degree, giving the dark feature fusion result F_D;
The fusion rule for images D_P' and D_R' is:
Wherein,
In the formula, T_h is the threshold for judging similarity in the dark feature fusion, with a value of 0.5~1.
If M_E(m, n) < T_h, the regions of images D_P' and D_R' centered on point (m, n) are dissimilar, and the fusion result takes the source with the larger local-area weighted variance energy; otherwise, the fusion result is a coefficient-weighted average of the two images.
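The local-area weighted variance energy of formula (20) can be sketched with a Gaussian filter (illustrative; a scalar σ replaces the patent's explicit window matrix W and region size N):

```python
import numpy as np
from scipy.ndimage import gaussian_filter

def weighted_variance_energy(img, sigma=1.5):
    # Formula (20): Gaussian-weighted variance of the neighbourhood
    # about its Gaussian-weighted local mean
    mu = gaussian_filter(img, sigma)
    return gaussian_filter((img - mu) ** 2, sigma)
```

A flat region has zero variance energy, which is why this measure favours the image carrying contour information at each location.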
S3.4 The detail feature images of P' and R' are fused using fuzzy logic and feature-difference driving.
The local gradient and local variance reflect the detail information of an image well and express its clarity. To retain as much of the detail information as possible and to promote clarity, the detail features of the polarization-degree and polarization-angle images are fused with a feature-difference driving method based on the local gradient and local variance, as shown in Figure 2. The specific steps are as follows:
S3.4.1 The local gradients of the detail feature image P_P' of image P' and the detail feature image P_R' of image R' are obtained;
The formula for solving the local gradient is:
In the formula, T_k(m, n) represents the local gradient at pixel (m, n); k represents P' or R'; the two edge terms respectively represent the horizontal and vertical edge images obtained by convolving the detail feature image with the horizontal and vertical templates of the Sobel operator.
Formula (24) yields the local gradient of the detail feature image P_P' and that of the detail feature image P_R'.
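The Sobel-based local gradient of formula (24) reduces to a few lines (illustrative; the combination of the two edge images into one magnitude is assumed to be the Euclidean norm):

```python
import numpy as np
from scipy.ndimage import sobel

def local_gradient(img):
    # Formula (24): magnitude of horizontal and vertical Sobel edge images
    gx = sobel(img.astype(np.float64), axis=1)  # horizontal template
    gy = sobel(img.astype(np.float64), axis=0)  # vertical template
    return np.hypot(gx, gy)
```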
S3.4.2 The local weighted variances of the detail feature images P_P' and P_R' are obtained;
The local weighted variance uses the same method as formula (20), i.e.:
In the formula, V_k(m, n) represents the local-area weighted variance energy centered on point (m, n); W(i, j) is the Gaussian filter matrix; N is the size of the region; t = (N-1)/2; the local mean term represents the local-area average value centered on point (m, n); k represents P' or R'.
Formula (25) yields the local weighted variance of the detail feature image P_P' and that of the detail feature image P_R'.
S3.4.3 The local difference gradient and local difference variance, together with the matching degrees of the local gradient feature and the local weighted variance feature of the two detail feature images, are obtained;
Partial gradient matching degree:
Local weighted variance matching degree:
Local difference gradient:
Local difference variance:
S3.4.4 A pixel-based decision map is obtained from the feature differences, and a feature-difference-degree decision map is obtained from the feature matching degrees;
(1) A pixel-based decision map PDG(m, n) is obtained from the local difference gradient ΔT(m, n) and the local difference variance ΔV(m, n): when ΔT(m, n) > 0 and ΔV(m, n) > 0, let PDG(m, n) = p1; when ΔT(m, n) < 0 and ΔV(m, n) < 0, let PDG(m, n) = p2; when ΔT(m, n) > 0 and ΔV(m, n) < 0, let PDG(m, n) = p3; when ΔT(m, n) < 0 and ΔV(m, n) > 0, let PDG(m, n) = p4; when ΔT(m, n) = 0 and ΔV(m, n) > 0, let PDG(m, n) = p5; when ΔT(m, n) = 0 and ΔV(m, n) < 0, let PDG(m, n) = p6; when ΔT(m, n) > 0 and ΔV(m, n) = 0, let PDG(m, n) = p7; when ΔT(m, n) < 0 and ΔV(m, n) = 0, let PDG(m, n) = p8; when ΔT(m, n) = 0 and ΔV(m, n) = 0, let PDG(m, n) = p9. Here p1~p9 denote decision maps that are 1 at the pixel positions meeting the corresponding condition and 0 at all other pixel positions;
(2) The feature-difference-degree decision map DDG is obtained from the local gradient matching degree M_T(m, n) and the local weighted variance matching degree M_V(m, n), as shown in formula (30):
In the formula, d1 and d2 denote maps that are 1 at the pixel positions meeting the corresponding condition of formula (30) and 0 at all other pixel positions.
S3.4.5 Determined and undetermined regions are identified from the pixel-based decision map and the feature-difference-degree decision map;
From PDG, the cases p1, p2, p5, p6, p7 and p8 of step S3.4.4(1) can be judged to be determined regions: for p1 and p2, both difference features agree on whether the gray value of the corresponding pixel should be retained in the fused image; for p5, p6, p7 and p8, one of the two difference features alone suffices to decide whether the gray value of the corresponding pixel is retained in the fused image.
From PDG together with DDG, p3 and p4 can also be judged to be determined regions: the feature-difference-degree decision map DDG establishes which of the two local features differs more strongly, and the difference feature with the larger difference degree is then chosen to decide whether the gray value of the corresponding pixel is retained in the fused image.
The region p9, however, cannot be decided from either decision map PDG or DDG, and therefore belongs to the undetermined region.
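The sign-pattern classification of S3.4.4(1) and S3.4.5 can be expressed compactly (a sketch; the p3/p4 resolution through DDG is only flagged here, not implemented):

```python
import numpy as np

def classify_pixels(dT, dV):
    """Every pixel whose gradient difference dT and variance difference dV
    are not both zero is 'determined' (cases p1-p8); the p9 case (both
    zero) stays undetermined. Conflicting signs mark cases p3/p4, which
    the patent resolves via the DDG map."""
    sT, sV = np.sign(dT), np.sign(dV)
    undetermined = (sT == 0) & (sV == 0)   # case p9
    conflicting = (sT * sV) < 0            # cases p3/p4
    return ~undetermined, conflicting
```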
S3.4.6 The determined regions of the two detail feature images are fused using feature-difference driving;
The product of the local difference gradient ΔT(m, n) and the local difference variance ΔV(m, n) is taken as the driving factor for fusing the determined region, denoted DIF(m, n), as shown in the following formula:
DIF(m, n) = ΔT(m, n)·ΔV(m, n) (31)
The determined region is then fused under the driving of DIF(m, n), and the fused image of the determined region is:
In the formula, "*" represents the product of values at corresponding pixel positions of the matrices.
S3.4.7 The undetermined regions of the two detail feature images are fused using fuzzy logic theory;
The undetermined region is fused with fuzzy logic theory. For the detail feature images P_P' and P_R', one must consider whether the local gradient of a detail feature image is "big" or whether its local weighted variance is "big"; membership functions of the detail feature images are constructed for this group of relations. Assume the membership functions of "the local gradient of detail feature image P_P' (resp. P_R') is big" are μ_T(P_P'(m, n)) and μ_T(P_R'(m, n)), and the membership functions of "the local weighted variance of detail feature image P_P' (resp. P_R') is big" are μ_V(P_P'(m, n)) and μ_V(P_R'(m, n)), as shown in formulas (33) and (34):
In the formulas, k represents P' or R'.
Using the intersection operation rule of fuzzy logic, the membership functions describing how important the pixel values of the two detail feature images P_P' and P_R' at position (m, n) are to the fused image of the undetermined region can be computed separately; they are denoted μ_{T∩V}(P_P'(m, n)) and μ_{T∩V}(P_R'(m, n)), as shown in formula (35):
μT∩V(Pk(m, n)) = min[μT(Pk(m, n)), μV(Pk(m, n))] (35)
In the formula, k represents P' or R'.
The fusion result of the undetermined region of the two detail feature images is then:
In the formula, "*" represents the product of values at corresponding pixel positions of the matrices, and "/" represents the division of values at corresponding pixel positions.
S3.4.8 The two partial results are merged to give the fusion result F_DIF(m, n) of the two detail feature images, and a consistency check is performed on F_DIF(m, n);
For the consistency check on F_DIF(m, n), a window of size 3×3 is moved over the image F_DIF(m, n), and the pixels around the window verify the center pixel: if the center pixel comes from one of the images P_P' and P_R' while s (4 < s < 8) of its surrounding pixels come from the other image, the center pixel value is changed to the value of the other image at that position. The window traverses the whole image F_DIF(m, n), yielding the corrected F_DIF(m, n).
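The 3×3 consistency check of S3.4.8 can be sketched over a map recording which source each fused pixel came from (the map name and the uniform-filter counting trick are illustrative, not from the patent):

```python
import numpy as np
from scipy.ndimage import uniform_filter

def consistency_check(choice, s=5):
    """'choice' marks, per pixel, which source the fused detail value came
    from (1 = P_P', 0 = P_R'). If at least s of the 8 neighbours
    (4 < s < 8 in the patent) come from the other image, the centre
    pixel is flipped to that image."""
    c = choice.astype(np.float64)
    # 3x3 window sum minus the centre = number of '1'-labelled neighbours
    ones = uniform_filter(c, 3, mode='constant') * 9.0 - c
    out = choice.copy()
    out[(choice == 1) & ((8.0 - ones) >= s)] = 0   # surrounded by zeros
    out[(choice == 0) & (ones >= s)] = 1           # surrounded by ones
    return out
```

After flipping, the fused value at that pixel is re-read from the other source image.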
S3.5 The results of steps S3.2, S3.3 and S3.4 are fused to obtain the fusion result of images P' and R'.
The fusion combines the bright feature image F_L(m, n), the dark feature image F_D(m, n) and the detail feature image F_DIF(m, n); the fusion formula is as follows:
F = αF_L(m, n) + βF_D(m, n) + γF_DIF(m, n) (38)
In the formula, α, β and γ are fusion weight coefficients with values in the range [0, 1]. To reduce oversaturation of the fused image while improving contrast, α is set to 1, β to 0.3 and γ to 1.
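Formula (38) with the patent's chosen weights is simply:

```python
import numpy as np

def combine(FL, FD, FDIF, alpha=1.0, beta=0.3, gamma=1.0):
    # Formula (38); defaults are the patent's values alpha=1, beta=0.3, gamma=1
    return alpha * FL + beta * FD + gamma * FDIF
```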
S4 The fusion result of step S3 and the total intensity image I are fused in YUV space to obtain the final fusion result;
The total intensity image I is fed into the Y channel of YUV space; the fusion result of step S3 is negated to give the image F', i.e. F' = 255 - F, and F' is fed into the U channel; finally the fusion result F of step S3 is fed into the V channel, yielding the final fused image.
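Step S4's channel assignment can be sketched as follows (illustrative; conversion of the resulting YUV stack to RGB for display is omitted):

```python
import numpy as np

def assemble_yuv(I, F):
    """Step S4: total intensity I -> Y channel, negated fusion result
    F' = 255 - F -> U channel, fusion result F -> V channel."""
    Fp = 255.0 - F
    return np.stack([I, Fp, F], axis=-1)
```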
The various embodiments of the present invention have been described above. The above description is exemplary rather than exhaustive, and is not limited to the disclosed embodiments. Many modifications and variations will be obvious to those of ordinary skill in the art without departing from the scope and spirit of the illustrated embodiments. The protection scope of the present invention shall therefore be subject to the protection scope of the claims.

Claims (10)

1. A fusion method for infrared polarization images based on YUV and multi-feature separation, characterized by comprising the following steps:
S1, representing the polarization state of light with the Stokes vector, then calculating the polarization-degree image P and the polarization-angle image R from the Stokes vector;
S2, solving the exclusive part P' of the polarization-degree image and the exclusive part R' of the polarization-angle image from the shared part Co of the polarization-degree and polarization-angle images;
S3, fusing images P' and R' with the multi-feature separation method based on the dark channel theory;
S4, fusing the fusion result of step S3 with the total intensity image I in YUV space to obtain the final fusion result.
2. The fusion method for infrared polarization images based on YUV and multi-feature separation as described in claim 1, characterized in that in step S1 the representation of the Stokes vector S is shown in formula (1):
In the formula, I1, I2, I3 and I4 respectively represent the collected intensity images with polarization directions of 0°, 45°, 90° and 135°; I represents the total intensity of the light; Q represents the intensity difference between horizontal and vertical polarization; U represents the intensity difference between the 45° and 135° polarization directions; V represents the intensity difference between the left-handed and right-handed circular polarization components of the light;
The polarization-degree image P and polarization-angle image R of the polarized light are expressed as:
3. The fusion method for infrared polarization images based on YUV and multi-feature separation as described in claim 2, characterized in that in step S2,
the shared part Co of images P and R is obtained by applying formula (4) to each pixel of images P and R:
Co = min(P, R) (4)
Then the exclusive part P' of the polarization-degree image and the exclusive part R' of the polarization-angle image are obtained using formulas (5) and (6) respectively:
P' = P - Co (5)
R' = R - Co (6).
4. The fusion method for infrared polarization images based on YUV and multi-feature separation as described in claim 3, characterized in that step S3 comprises the following steps:
S3.1, carrying out multi-feature separation on images P' and R' based on the dark channel theory, obtaining the respective dark feature, bright feature and detail feature images of P' and R';
S3.2, fusing the bright feature images of P' and R' with a matching method based on local-area energy features; fusing the dark feature images of P' and R' with a matching method based on local-area weighted variance features; fusing the detail feature images of P' and R' using fuzzy logic and feature-difference driving;
S3.3, fusing the bright feature, dark feature and detail feature results of step S3.2 to obtain the fusion result of images P' and R'.
5. The fusion method for infrared polarization images based on YUV and multi-feature separation as described in claim 4, characterized in that in step S3.1,
the dark channel is what He et al. used to estimate the transmissivity in the atmospheric scattering model and realize fast dehazing of natural images; the solving method of the dark channel map is shown in formula (7):
In the formula, C ranges over the three color channels R, G, B of the image; N(x) is the set of pixels in the local window centered on pixel x; L_C(y) is one color channel map of the image; L_dark(x) is the dark channel map and reflects the degree of haze in the image: for a haze-free image, the dark channel value L_dark(x) tends to 0, while for a hazy image L_dark(x) is larger;
The dark channel image of the exclusive part P' of the polarization-degree image is obtained by formula (7);
The dark channel image of the exclusive part R' of the polarization-angle image is obtained by formula (7);
Using the multi-feature separation based on the dark channel theory, the bright feature, dark feature and detail feature of the two images are obtained by the following steps:
S3.1.1, obtaining the negated images of P' and R' by formulas (8) and (9), then fusing the dark channel images of P' and R' with the corresponding negated images under the absolute-value-minimum rule, obtaining the dark feature image D_P' of image P' and the dark feature image D_R' of image R', as shown in formulas (10) and (11);
S3.1.2, subtracting D_P' and D_R' from the corresponding dark channel images, obtaining the bright feature image L_P' of image P' and the bright feature image L_R' of image R', as shown in formulas (12) and (13);
S3.1.3, subtracting the dark channel images from images P' and R' respectively, obtaining the detail image P_P' of image P' and the detail image P_R' of image R', as shown in formulas (14) and (15);
6. The fusion method for infrared polarization images based on YUV and multi-feature separation as described in claim 5, characterized in that fusing the bright feature images of P' and R' in step S3.2 with the matching method based on local-area energy features comprises the following steps:
S3.2.1, obtaining the Gaussian-weighted local energy of each bright feature image with the Gaussian-weighted local energy function shown in formula (16):
In the formula, E_k(m, n) represents the Gaussian-weighted local energy centered on point (m, n); W(i, j) is the Gaussian filter matrix; N is the size of the region; t = (N-1)/2; k represents P' or R';
Formula (16) yields the Gaussian-weighted local energy of the bright feature image L_P' and that of the bright feature image L_R' respectively;
S3.2.2, solving the Gaussian-weighted local energy matching degree of images L_P' and L_R';
The Gaussian-weighted local energy matching degree of images L_P' and L_R' is:
S3.2.3, fusing images L_P' and L_R' using the Gaussian-weighted local energy and its matching degree, obtaining the bright feature fusion result F_L;
The fusion rule for images L_P' and L_R' is:
Wherein,
In the formula, T_l is the threshold for judging similarity in the bright feature fusion, with a value of 0~0.5. If M_E(m, n) < T_l, the regions of images L_P' and L_R' centered on point (m, n) are dissimilar, and the fusion result takes the source with the larger Gaussian-weighted region energy; otherwise, the fusion result is a coefficient-weighted average of the two images.
7. The fusion method for infrared polarization images based on YUV and multi-feature separation as described in claim 6, characterized in that fusing the dark feature images of P' and R' in step S3.2 with the matching method based on local-area weighted variance features comprises the following steps:
S3.2.4, obtaining the local-area weighted variance energy of each dark feature image with the local-area weighted variance energy function shown in formula (20):
In the formula, V_k(m, n) represents the local-area weighted variance energy centered on point (m, n); W(i, j) is the Gaussian filter matrix; N is the size of the region; t = (N-1)/2; the local mean term represents the local-area average value centered on point (m, n); k represents P' or R';
Formula (20) yields the local-area weighted variance energy of the dark feature image D_P' and that of the dark feature image D_R' respectively;
S3.2.5, solving the local-area weighted variance energy matching degree of images D_P' and D_R';
The matching degree of the local-area weighted variance energy of images D_P' and D_R' is obtained using formula (21):
S3.2.6, fusing images D_P' and D_R' using the local-area weighted variance energy and its matching degree, obtaining the dark feature fusion result F_D;
The fusion formula for images D_P' and D_R' is:
Wherein,
In the formula, T_h is the threshold for judging similarity in the dark feature fusion, with a value of 0.5~1. If M_E(m, n) < T_h, the regions of images D_P' and D_R' centered on point (m, n) are dissimilar, and the fusion result takes the source with the larger local-area weighted variance energy; otherwise, the fusion result is a coefficient-weighted average of the two images.
8. The fusion method for infrared polarization images based on YUV and multi-feature separation as described in claim 7, characterized in that fusing the detail feature images of P' and R' in step S3.2 using fuzzy logic and feature-difference driving comprises the following steps:
S3.2.7, obtaining the local gradients of the detail feature image P_P' of image P' and the detail feature image P_R' of image R';
The formula for solving the local gradient is:
In the formula, T_k(m, n) represents the local gradient at pixel (m, n); k represents P' or R'; the two edge terms respectively represent the horizontal and vertical edge images obtained by convolving the detail feature image with the horizontal and vertical templates of the Sobel operator;
Formula (24) yields the local gradient of the detail feature image P_P' and that of the detail feature image P_R';
S3.2.8, obtaining the local weighted variances of the detail feature images P_P' and P_R';
The local weighted variance uses the same method as formula (20), i.e. as shown in formula (25):
In the formula, V_k(m, n) represents the local-area weighted variance energy centered on point (m, n); W(i, j) is the Gaussian filter matrix; N is the size of the region; t = (N-1)/2; the local mean term represents the local-area average value centered on point (m, n); k represents P' or R';
Formula (25) yields the local weighted variance of the detail feature image P_P' and that of the detail feature image P_R';
S3.2.9, obtaining the local gradient matching degree, the local weighted variance matching degree, the local difference gradient and the local difference variance of the two detail feature images:
Local gradient matching degree:
Local weighted variance matching degree:
Local difference gradient:
Local difference variance:
S3.2.10, obtaining the pixel-based decision map PDG(m, n) from the local difference gradient ΔT(m, n) and the local difference variance ΔV(m, n), and obtaining the feature-difference-degree decision map DDG from the local gradient matching degree M_T(m, n) and the local weighted variance matching degree M_V(m, n), comprising the following steps:
(1) The pixel-based decision map PDG(m, n) is obtained from the local difference gradient ΔT(m, n) and the local difference variance ΔV(m, n): when ΔT(m, n) > 0 and ΔV(m, n) > 0, let PDG(m, n) = p1; when ΔT(m, n) < 0 and ΔV(m, n) < 0, let PDG(m, n) = p2; when ΔT(m, n) > 0 and ΔV(m, n) < 0, let PDG(m, n) = p3; when ΔT(m, n) < 0 and ΔV(m, n) > 0, let PDG(m, n) = p4; when ΔT(m, n) = 0 and ΔV(m, n) > 0, let PDG(m, n) = p5; when ΔT(m, n) = 0 and ΔV(m, n) < 0, let PDG(m, n) = p6; when ΔT(m, n) > 0 and ΔV(m, n) = 0, let PDG(m, n) = p7; when ΔT(m, n) < 0 and ΔV(m, n) = 0, let PDG(m, n) = p8; when ΔT(m, n) = 0 and ΔV(m, n) = 0, let PDG(m, n) = p9; here p1~p9 denote decision maps that are 1 at the pixel positions meeting the corresponding condition and 0 at all other pixel positions;
(2) The feature-difference-degree decision map DDG is obtained from the local gradient matching degree M_T(m, n) and the local weighted variance matching degree M_V(m, n), as shown in formula (30):
In the formula, d1 and d2 denote maps that are 1 at the pixel positions meeting the corresponding condition of formula (30) and 0 at all other pixel positions;
S3.2.11, identifying determined and undetermined regions from the pixel-based decision map PDG(m, n) and the feature-difference-degree decision map DDG;
S3.2.12, fusing the determined regions of the two detail feature images using feature-difference driving;
The product of the local difference gradient ΔT(m, n) and the local difference variance ΔV(m, n) is taken as the driving factor for fusing the determined region, denoted DIF(m, n), as shown in the following formula:
DIF(m, n) = ΔT(m, n)·ΔV(m, n) (31)
The determined region is then fused under the driving of DIF(m, n), and the fused image of the determined region is:
In the formula, "*" represents the product of values at corresponding pixel positions of the matrices;
S3.2.13, fusing the undetermined regions of the two detail feature images using fuzzy logic theory;
Assume the membership functions of "the local gradient of detail feature image P_P' (resp. P_R') is big" are μ_T(P_P'(m, n)) and μ_T(P_R'(m, n)), and the membership functions of "the local weighted variance of detail feature image P_P' (resp. P_R') is big" are μ_V(P_P'(m, n)) and μ_V(P_R'(m, n)), as shown in formulas (33) and (34):
In the formulas, k represents P' or R';
Using the intersection operation rule of fuzzy logic, the membership functions describing how important the pixel values of the two detail feature images P_P' and P_R' at position (m, n) are to the fused image of the undetermined region can be computed separately; they are denoted μ_{T∩V}(P_P'(m, n)) and μ_{T∩V}(P_R'(m, n)), as shown in formula (35):
μT∩V(Pk(m, n)) = min[μT(Pk(m, n)), μV(Pk(m, n))] (35)
In the formula, k represents P' or R';
The fusion result of the undetermined region of the two detail feature images is then:
In the formula, "*" represents the product of values at corresponding pixel positions of the matrices, and "/" represents the division of values at corresponding pixel positions;
S3.2.14, merging the two partial results to obtain the fusion result F_DIF(m, n) of the two detail feature images, and performing a consistency check on F_DIF(m, n);
For the consistency check on F_DIF(m, n), a window of size 3×3 is moved over the image F_DIF(m, n), and the pixels around the window verify the center pixel.
9. The fusion method for infrared polarization images based on YUV and multi-feature separation as described in claim 8, characterized in that fusing the bright feature, dark feature and detail feature results of step S3.2 in step S3.3 combines the bright feature image F_L(m, n), the dark feature image F_D(m, n) and the detail feature image F_DIF(m, n); the fusion formula is shown in formula (38):
F = αF_L(m, n) + βF_D(m, n) + γF_DIF(m, n) (38)
In the formula, α, β and γ are fusion weight coefficients; α is set to 1, β to 0.3 and γ to 1 so as to reduce oversaturation of the fused image while improving contrast.
10. The fusion method for infrared polarization images based on YUV and multi-feature separation as described in claim 9, characterized in that the method of fusing the fusion result of step S3 with the total intensity image I in YUV space in step S4 is:
feeding the total intensity image I into the Y channel of YUV space; negating the fusion result of step S3 to obtain the image F', i.e. F' = 255 - F, and feeding F' into the U channel; finally feeding the fusion result F of step S3 into the V channel, obtaining the final fused image.
CN201811180887.1A 2018-10-09 2018-10-09 Fusion method of infrared polarization images based on YUV and multi-feature separation Active CN109410161B (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN201811180887.1A CN109410161B (en) 2018-10-09 2018-10-09 Fusion method of infrared polarization images based on YUV and multi-feature separation


Publications (2)

Publication Number Publication Date
CN109410161A true CN109410161A (en) 2019-03-01
CN109410161B CN109410161B (en) 2020-11-13

Family

ID=65467525

Family Applications (1)

Application Number Title Priority Date Filing Date
CN201811180887.1A Active CN109410161B (en) 2018-10-09 2018-10-09 Fusion method of infrared polarization images based on YUV and multi-feature separation

Country Status (1)

Country Link
CN (1) CN109410161B (en)


Citations (2)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN107253485A (en) * 2017-05-16 2017-10-17 北京交通大学 Foreign matter invades detection method and foreign matter intrusion detection means
CN108389158A (en) * 2018-02-12 2018-08-10 河北大学 A kind of infrared and visible light image interfusion method


Non-Patent Citations (4)

Title
ZHOU PU-CHENG 等: "Camouflaged Target Separation by Spectral-polarimetric Imagery Fusion with Shearlet Transform and Clustering Segmentation", 《INTERNATIONAL SYMPOSIUM ON PHOTOELECTRONIC DETECTION AND IMAGING 2013:IMAGING SENSORS AND APPLICATIONS》 *
AN Fu et al.: "Infrared polarization image fusion model driven by fuzzy logic and feature difference", Infrared Technology *
LI Weiwei et al.: "Research on pseudo-color fusion of infrared polarization and infrared intensity images", Infrared Technology *
GUO Zhe et al.: "A dark-channel multi-feature separation and fusion method for infrared polarization and intensity images", Science Technology and Engineering *

Cited By (7)

Publication number Priority date Publication date Assignee Title
CN111307065A (en) * 2020-03-05 2020-06-19 中国铁道科学研究院集团有限公司基础设施检测研究所 Steel rail profile detection method, system and device based on polarization beam splitting
CN111383242A (en) * 2020-05-29 2020-07-07 浙江大华技术股份有限公司 Image fog penetration processing method and device
CN111383242B (en) * 2020-05-29 2020-09-29 浙江大华技术股份有限公司 Image fog penetration processing method and device
CN112164017A (en) * 2020-09-27 2021-01-01 中国兵器工业集团第二一四研究所苏州研发中心 Deep learning-based polarization colorization method
CN112164017B (en) * 2020-09-27 2023-11-17 中国兵器工业集团第二一四研究所苏州研发中心 Polarization colorization method based on deep learning
CN113421205A (en) * 2021-07-16 2021-09-21 合肥工业大学 Small target detection method combined with infrared polarization imaging
CN113421205B (en) * 2021-07-16 2022-11-15 合肥工业大学 Small target detection method combined with infrared polarization imaging

Also Published As

Publication number Publication date
CN109410161B (en) 2020-11-13

Similar Documents

Publication Publication Date Title
AU2018247216B2 (en) Systems and methods for liveness analysis
Zhang et al. Fast haze removal for nighttime image using maximum reflectance prior
Silverman et al. Display and enhancement of infrared images
CN109410161A (en) A kind of fusion method of the infrared polarization image separated based on YUV and multiple features
EP0932114B1 (en) A method of and apparatus for detecting a face-like region
KR100796707B1 (en) Image fusion system and method
CN107977940A (en) background blurring processing method, device and equipment
CN112288663A (en) Infrared and visible light image fusion method and system
JP2002501265A (en) Method and apparatus for removing bright or dark spots by fusing multiple images
Zhang et al. A retina inspired model for enhancing visibility of hazy images
Yu et al. A false color image fusion method based on multi-resolution color transfer in normalization YCBCR space
CN106815826A (en) Night vision image Color Fusion based on scene Recognition
CN105869115B (en) A kind of depth image super-resolution method based on kinect2.0
CN110210541A (en) Image interfusion method and equipment, storage device
Stone et al. Forward looking anomaly detection via fusion of infrared and color imagery
CN110490187A (en) Car license recognition equipment and method
Asmare et al. Image enhancement by fusion in contourlet transform
US10721448B2 (en) Method and apparatus for adaptive exposure bracketing, segmentation and scene organization
CN109410160A (en) The infrared polarization image interfusion method driven based on multiple features and feature difference
CN109377468A (en) The pseudo-colours fusion method of infra-red radiation and polarization image based on multiple features
US11455710B2 (en) Device and method of object detection
Qian et al. Fast color contrast enhancement method for color night vision
Tao et al. Intelligent colorization for thermal infrared image based on CNN
Petrovic Multilevel image fusion
Qian et al. Effective contrast enhancement method for color night vision

Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination
GR01 Patent grant