CN107240096A - A kind of infrared and visual image fusion quality evaluating method - Google Patents

A kind of infrared and visual image fusion quality evaluating method

Info

Publication number
CN107240096A
CN107240096A (application CN201710405190.9A)
Authority
CN
China
Prior art keywords
local
fused images
infrared
images
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Pending
Application number
CN201710405190.9A
Other languages
Chinese (zh)
Inventor
朱亚辉
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Shaanxi Xueqian Normal University
Original Assignee
Shaanxi Xueqian Normal University
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Shaanxi Xueqian Normal University filed Critical Shaanxi Xueqian Normal University
Priority to CN201710405190.9A priority Critical patent/CN107240096A/en
Publication of CN107240096A publication Critical patent/CN107240096A/en
Pending legal-status Critical Current

Classifications

    • G — PHYSICS; G06 — COMPUTING; CALCULATING OR COUNTING; G06T — IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T 7/0002 — Image analysis; inspection of images, e.g. flaw detection
    • G06T 7/11 — Region-based segmentation
    • G06T 7/136 — Segmentation; edge detection involving thresholding
    • G06T 7/194 — Segmentation involving foreground-background segmentation
    • G06T 2207/10048 — Image acquisition modality: infrared image
    • G06T 2207/10052 — Image acquisition modality: images from lightfield camera
    • G06T 2207/30168 — Subject of image: image quality inspection

Abstract

The invention discloses an infrared and visible image fusion quality evaluation method comprising the following steps: obtain the infrared source image, the visible source image, and the fused image; compute the global detail index of the fused image from the information entropy of its gradient amplitude; divide the infrared source image into a target region and a background region with a frequency-domain salient-region extraction method, and map the division onto the fused image; compute the edge-structure similarity between the infrared target region and the fused-image target region and the contrast between the fused image's target and background, and sum them to obtain the local target index of the fused image; finally, take a weighted sum of the global detail index and the local target index to obtain the infrared and visible image fusion quality index. The method evaluates both the global detail and the local target features of the fused image as well as its overall quality, balances the generality and specificity demands of practical evaluation, and agrees closely with subjective assessment.

Description

A kind of infrared and visual image fusion quality evaluating method
Technical field
The present invention relates to the field of image fusion quality assessment, and in particular to an infrared and visible image fusion quality evaluation method.
Background technology
Infrared and visible image fusion is an important branch of image fusion and a focus of current image fusion research. Infrared sensors image by thermal radiation, which helps highlight targets in a scene but characterizes scene detail poorly; visible-light sensors image by reflected light and provide detailed information about the scene around the target. A good infrared and visible fusion therefore preserves both the target features of the infrared image and the detailed information of the visible image. Compared with the rapidly developing fusion techniques themselves, fused-image quality assessment has progressed slowly. Fusion quality assessment not only compares the performance of fusion methods but also serves as a basis for improving them, so an infrared and visible image fusion quality evaluation method is essential to downstream applications of fused images.
A search of the prior art found Chinese patent CN103049893A (published 2013-04-17), which discloses a method and device for image fusion quality assessment comprising the following steps: step 1) obtain the source images and their fused image; step 2) segment each source image by fuzzy clustering and merge the segmentations into one overall segmentation map; step 3) obtain the visual-variance saliency map of each source image, compute a weight map from it, and compute a saliency coefficient for each region of the source and fused images from the saliency maps and the overall segmentation; step 4) from the overall segmentation, weight map, and saliency coefficients, compute the weighted structural similarity between the fused image and the source images over each region; step 5) sum the weighted structural similarities of all regions to obtain the fused-image quality index.
Chinese patent CN104008543A (published 2014-08-27) discloses an image fusion quality evaluation method comprising the following steps: step 1) obtain source images that are synchronized in time and cover the same area in space; step 2) preprocess the images with quadratic-polynomial registration and nearest-neighbour interpolation resampling; step 3) fuse the source images with a combination of multiscale analysis and component substitution to obtain fused images under different fusion methods; step 4) evaluate fusion quality by computing the cross entropy and structural similarity between the fused image and the source images, building a weighted function model of the two, and computing the overall quality value under preset weights.
Although both techniques can evaluate fused-image quality, they perform poorly on infrared and visible image fusion, for the following reasons:
1. Both techniques evaluate quality through the similarity between the source images and the fused image and ignore the detail features of the fused image itself;
2. A main purpose of infrared and visible fusion is to retain the target features of the infrared source image, but neither technique evaluates the target features of the fused image;
3. The method and device of CN103049893A segments the infrared image by fuzzy clustering, which cannot correctly segment images with complex background interference or low signal-to-noise ratio, degrading the subsequent evaluation;
4. Both CN104008543A and CN103049893A evaluate fused-image quality from the structural similarity between the fused image and the source images; when the two images are blurred, the results obtained with either technique agree poorly with subjective assessment.
A method that reflects both the global detail and the local target features of the fused image while agreeing closely with subjective assessment is therefore vital to evaluating infrared and visible image fusion quality.
Summary of the invention
To address the above deficiencies of the prior art, the present invention proposes an infrared and visible image fusion quality evaluation method.
The present invention adopts the following technical scheme to solve the above technical problem:
An infrared and visible image fusion quality evaluation method comprises the following steps:
Step 1: obtain the infrared source image A, the visible source image B, and the fused image F;
Step 2: compute the global detail index of the fused image;
Step 3: compute the local target index of the fused image;
Step 4: take a weighted sum of the global detail index and the local target index of the fused image to obtain the fused-image quality index.
As a further refinement of the infrared and visible image fusion quality evaluation method of the present invention, the global detail index of the fused image in Step 2 is computed as follows:
Step 2.1: compute the gradient amplitude of the fused image F.
For the fused image F, compute its horizontal gradient ∇f_H(m, n) = F(m + 1, n) − F(m, n) and its vertical gradient ∇f_V(m, n) = F(m, n + 1) − F(m, n).
The sum of their magnitudes is taken as the gradient: ∇f(m, n) = |∇f_H(m, n)| + |∇f_V(m, n)|.
Step 2.2: compute the information entropy of the gradient amplitude of the fused image and take it as the global detail index, R_global = −Σ_{i=0}^{L−1} p_i log2 p_i,
where p_i is the probability that a pixel in the gradient map of the fused image has gray value i and L is the number of gray levels of the image.
As a further refinement of the infrared and visible image fusion quality evaluation method of the present invention, the local target index of the fused image in Step 3 is computed as follows:
Step 3.1: region division of the infrared source image and the fused image.
The infrared source image is divided into a target region and a background region with a frequency-domain salient-region extraction method, as follows.
The salient features of the infrared source image are extracted with a Gaussian band-pass filter (a difference of Gaussians) with standard deviations σ1 and σ2 (σ1 > σ2).
To retain as many frequency values of the low-frequency range as possible, σ1 is set to infinity; to remove the high-frequency noise and texture of the image, the discrete Gaussian values are first fitted with a small-kernel Gaussian filter.
The saliency map S of the image is then computed as S(x, y) = |A_μ − A_whc(x, y)|,
where A_μ is the mean gray value of the infrared source image, A_whc(x, y) is the infrared source image after Gaussian filtering, and |·| is the L1 norm.
The target and background of the infrared source image are separated by region growing: 1) choose the maximum-gray-value point of the saliency map as the seed; 2) centered on the seed, consider its 4-neighbourhood and merge any neighbour that satisfies the growing strategy, using the difference between a neighbour and the mean gray value of the grown region as the similarity measure and merging the neighbour with the smallest difference into the region; 3) stop growing when the similarity measure exceeds the split threshold.
Step 3.2: compute the edge-structure similarity ESSIM(A_local, F_local) between the fused-image target region and the infrared-source target region,
ESSIM(A_local, F_local) = [l(A_local, F_local)]^α · [c(A_local, F_local)]^β · [s(A_local, F_local)]^γ · [e(A_local, F_local)]^η,
where A_local and F_local are the target regions of the infrared source image and of the fused image; l(·), c(·), s(·), e(·) are the luminance, contrast, structure, and edge comparison components of the two regions; the parameters α, β, γ, η are their weights, usually α = β = γ = η = 1. The components are built from the pixel means and variances of A_local and F_local, their covariance, the gray variances of their edge images, and the gray covariance of the two edge images, the edge images being obtained with the Sobel edge detector; c1, c2, c3, c4 are constants introduced to avoid instability when a denominator is close to 0.
Step 3.3: compute the contrast between the target and background of the fused image.
First, compute the mean-subtracted contrast-normalized (MSCN) coefficients of the fused image,
F̄(x, y) = (F(x, y) − μ(x, y)) / (σ(x, y) + C),
where the constant C avoids instability where the denominator tends to zero in flat regions, and μ(x, y) and σ(x, y) are the local mean and deviation computed with the two-dimensional circularly symmetric Gaussian weighting function ω = {ω_{k,l} | k = −K, …, K; l = −L, …, L}, here with K = L = 3.
Next, compute the Weber contrast between target and background, C_w = |L_t − L_b| / L_b,
where L_t and L_b are the mean MSCN coefficients of the pixels in the target region and in the neighbouring background region.
Step 3.4: compute the local target index of the fused image.
Add the edge-structure similarity ESSIM(A_local, F_local) between the fused-image and infrared-source target regions to the target-background contrast C_w of the fused image to obtain the local target index R_local, i.e.:
R_local = ESSIM(A_local, F_local) + C_w
Compared with the prior art, the above technical scheme has the following technical effects:
1. The present invention considers both the global detail of the infrared and visible fusion and the target features of the fused image, reflecting the purpose of infrared and visible image fusion;
2. The present invention divides the infrared source image into target and background regions with a frequency-domain salient-region extraction method, which correctly segments images with complex background interference and low signal-to-noise ratio;
3. The present invention first computes the MSCN coefficients of the fused image to model the contrast-gain masking process of human vision, then computes the Weber contrast between target and background, giving an effective objective evaluation of the perceived target-background contrast;
4. The present invention measures the correlation between the fused-image and infrared-source target regions with the edge-structure similarity, which, to a degree, effectively evaluates blurred infrared and visible fusions;
5. Simulation experiments verify that the infrared and visible image fusion quality evaluation method of the present invention agrees closely with subjective assessment and reflects infrared and visible image fusion quality well.
Brief description of the drawings
Fig. 1 is a flow chart of the infrared and visible image fusion quality evaluation method provided by the present invention;
Fig. 2 shows a fused image with its gradient-amplitude map and gradient-amplitude histogram;
Fig. 3 shows the segmentation of the infrared source image and of the fused images;
Fig. 4 shows the first group of source images and fused images;
Fig. 5 shows the second group of source images and fused images;
Fig. 6 shows the third group of source images and fused images;
Fig. 7 shows the fourth group of source images and fused images.
Detailed description of the embodiments
To make the purpose, technical scheme, and advantages of the present invention clearer, the invention is described in detail below with reference to the accompanying drawings. It should be understood that the specific embodiments described here merely illustrate the present invention and do not limit it.
With reference to Fig. 1, the invention discloses an infrared and visible image fusion quality evaluation method comprising the following steps:
Step 1: obtain the infrared source image A, the visible source image B, and the fused image F;
Step 2: compute the global detail index R_global of the fused image.
The gradient information of an image is content to which the human eye is highly sensitive, and can serve as a feature characterizing image detail.
For the fused image F, compute its horizontal gradient ∇f_H(m, n) = F(m + 1, n) − F(m, n) and its vertical gradient ∇f_V(m, n) = F(m, n + 1) − F(m, n), and take the sum of their magnitudes as the gradient of the fused image: ∇f(m, n) = |∇f_H(m, n)| + |∇f_V(m, n)|.
Fig. 2 shows fused images with their gradient-amplitude maps and histograms: Fig. 2a is a fusion based on the averaging method (AVE), with its gradient-amplitude map and histogram in Figs. 2b and 2c; Fig. 2d is a fusion based on the Laplacian pyramid (LAP), with its gradient-amplitude map and histogram in Figs. 2e and 2f.
Analysis of Fig. 2 shows that Fig. 2d has rich texture, high resolution, and pronounced contrast, and its quality is better than that of Fig. 2a. Correspondingly, although both gradient histograms are narrow and unimodal with the peak near zero gradient, their widths and gray-level dynamic ranges differ markedly: 1) the peak of Fig. 2f is flatter than that of Fig. 2c; 2) the maximum peaks differ, about 14000 in Fig. 2c but below 8000 in Fig. 2f; 3) the gray-level dynamic range of Fig. 2f is wider than that of Fig. 2c, so it contains richer information.
The gradient histogram can thus quantitatively evaluate how rich the detail of the fused image is. The histogram studies the image gradient from the viewpoint of probability, and the richness of detail is quantified by the information entropy of the fused image's gradient. The global detail index of the fused image is
R_global = −Σ_{i=0}^{L−1} p_i log2 p_i,
where p_i is the probability that a pixel in the gradient map of the fused image has gray value i and L is the number of gray levels of the image.
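The gradient-entropy computation of Step 2 can be sketched in NumPy as follows; the function name and the assumption of an 8-bit image (L = 256 gray levels) are illustrative, not fixed by the patent.

```python
import numpy as np

def global_detail_index(F, L=256):
    """Global detail index R_global: entropy of the gradient-amplitude map."""
    F = F.astype(np.float64)
    gh = np.abs(np.diff(F, axis=0))   # |F(m+1, n) - F(m, n)|
    gv = np.abs(np.diff(F, axis=1))   # |F(m, n+1) - F(m, n)|
    g = gh[:, :-1] + gv[:-1, :]       # gradient amplitude on the common grid
    g = np.clip(np.round(g), 0, L - 1).astype(int)
    p = np.bincount(g.ravel(), minlength=L) / g.size
    p = p[p > 0]                      # 0 * log 0 is taken as 0
    return float(-(p * np.log2(p)).sum())
```

A ramp image with constant gradient gives entropy 0, while a textured image gives a larger value, matching the observation that a richer gradient histogram signals richer detail.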
Step 3: compute the local target index R_local of the fused image, as follows.
Step 3.1: region division of the infrared source image and the fused image.
The infrared source image is divided into a target region and a background region with a frequency-domain salient-region extraction method, as follows.
The salient features of the infrared source image are extracted with a Gaussian band-pass filter (a difference of Gaussians) with standard deviations σ1 and σ2 (σ1 > σ2). To retain as many frequency values of the low-frequency range as possible, σ1 is set to infinity; to remove the high-frequency noise and texture of the image, the discrete Gaussian values are first fitted with a small-kernel Gaussian filter.
The saliency map S of the image is then computed as S(x, y) = |A_μ − A_whc(x, y)|,
where A_μ is the mean gray value of the infrared source image, A_whc(x, y) is the infrared source image after Gaussian filtering, and |·| is the L1 norm.
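In the σ1 → ∞ limit the band-pass output is just the global mean minus a small-Gaussian-blurred image, so the saliency map can be sketched as below; the kernel radius and σ value are assumed, since the patent does not fix them.

```python
import numpy as np

def saliency_map(A, sigma=1.0, radius=2):
    """Frequency-tuned saliency S(x, y) = |A_mu - A_whc(x, y)| (Step 3.1 sketch)."""
    A = A.astype(np.float64)
    ax = np.arange(-radius, radius + 1, dtype=float)
    k = np.exp(-ax**2 / (2.0 * sigma**2))
    k /= k.sum()                      # normalized 1-D Gaussian kernel
    pad = np.pad(A, radius, mode="edge")
    # separable Gaussian blur: filter rows, then columns
    rows = sum(k[i] * pad[i:i + A.shape[0], :] for i in range(2 * radius + 1))
    A_whc = sum(k[j] * rows[:, j:j + A.shape[1]] for j in range(2 * radius + 1))
    return np.abs(A.mean() - A_whc)
```

A flat image yields zero saliency everywhere; a hot spot on a cold background yields high saliency at the spot, which is what makes the subsequent seed selection work.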
The target and background of the infrared source image are separated by region growing: 1) choose the maximum-gray-value point of the saliency map as the seed; 2) centered on the seed, consider its 4-neighbourhood and merge any neighbour that satisfies the growing strategy, using the difference between a neighbour and the mean gray value of the grown region as the similarity measure and merging the neighbour with the smallest difference into the region; 3) stop growing when the similarity measure exceeds the split threshold.
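The region-growing steps above can be sketched as follows; the function name and the stopping threshold are illustrative, as the patent does not specify how the split threshold is chosen.

```python
import numpy as np

def region_grow(S, threshold):
    """Grow the target region from the maximum of the saliency map S.

    At each step the 4-neighbour closest to the region's mean gray value is
    merged; growth stops when that difference exceeds `threshold`.
    """
    S = S.astype(np.float64)
    mask = np.zeros(S.shape, dtype=bool)
    seed = np.unravel_index(np.argmax(S), S.shape)   # seed: maximum saliency
    mask[seed] = True
    total, count = S[seed], 1
    while True:
        # 4-neighbourhood frontier of the current region
        frontier = np.zeros_like(mask)
        frontier[1:, :] |= mask[:-1, :]
        frontier[:-1, :] |= mask[1:, :]
        frontier[:, 1:] |= mask[:, :-1]
        frontier[:, :-1] |= mask[:, 1:]
        frontier &= ~mask
        if not frontier.any():
            break
        ij = np.argwhere(frontier)
        diffs = np.abs(S[tuple(ij.T)] - total / count)   # similarity measure
        k = int(np.argmin(diffs))
        if diffs[k] > threshold:                          # stop growing
            break
        mask[tuple(ij[k])] = True
        total += S[tuple(ij[k])]
        count += 1
    return mask
```

The returned mask is the target region; its complement is the background region, and the same division is mapped onto the fused image.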
Fig. 3 shows the segmentation of the infrared source image and of the fused images: Figs. 3a to 3c are the infrared source image, the AVE fused image (obtained by averaging), and the LAP fused image (obtained by Laplacian pyramid), and Figs. 3d to 3f are their respective region segmentations.
Fig. 3 shows that this method effectively extracts the target region of the infrared image, as in Fig. 3d. Mapping the target-region division onto the AVE and LAP fused images gives the target segmentations of the two fused images in Figs. 3e and 3f. By eye, although both the AVE and the LAP fused images detect the thermal target, the LAP fusion better preserves the target features of the infrared image, so its fusion quality is better than that of the AVE method.
Step 3.2: compute the edge-structure similarity ESSIM(A_local, F_local) between the fused-image target region and the infrared-source target region,
ESSIM(A_local, F_local) = [l(A_local, F_local)]^α · [c(A_local, F_local)]^β · [s(A_local, F_local)]^γ · [e(A_local, F_local)]^η,
where A_local and F_local are the target regions of the infrared source image and of the fused image; l(·), c(·), s(·), e(·) are the luminance, contrast, structure, and edge comparison components of the two regions; the parameters α, β, γ, η are their weights, usually α = β = γ = η = 1. The components are built from the pixel means and variances of A_local and F_local, their covariance, the gray variances of their edge images, and the gray covariance of the two edge images, the edge images being obtained with the Sobel edge detector; c1, c2, c3, c4 are constants introduced to avoid instability when a denominator is close to 0.
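The text does not write out the four comparison components explicitly, so the sketch below assumes the standard SSIM forms for l, c, s and applies the structure form to Sobel edge maps for e; a single small stabilizer `c` stands in for the constants c1 to c4. This is an assumed reading, not the patent's verbatim definition.

```python
import numpy as np

def essim(A, Fi, alpha=1, beta=1, gamma=1, eta=1, c=1e-4):
    """Edge-structure similarity of two target regions (Step 3.2 sketch)."""
    A = A.astype(np.float64); Fi = Fi.astype(np.float64)

    def sobel_mag(X):
        # Sobel gradient magnitude with edge padding
        kx = np.array([[-1, 0, 1], [-2, 0, 2], [-1, 0, 1]], dtype=float)
        pad = np.pad(X, 1, mode="edge")
        gx = sum(kx[i, j] * pad[i:i + X.shape[0], j:j + X.shape[1]]
                 for i in range(3) for j in range(3))
        gy = sum(kx.T[i, j] * pad[i:i + X.shape[0], j:j + X.shape[1]]
                 for i in range(3) for j in range(3))
        return np.hypot(gx, gy)

    mu_a, mu_f = A.mean(), Fi.mean()
    va, vf = A.var(), Fi.var()
    cov = ((A - mu_a) * (Fi - mu_f)).mean()
    ea, ef = sobel_mag(A), sobel_mag(Fi)
    cov_e = ((ea - ea.mean()) * (ef - ef.mean())).mean()

    l = (2 * mu_a * mu_f + c) / (mu_a**2 + mu_f**2 + c)   # luminance
    co = (2 * np.sqrt(va * vf) + c) / (va + vf + c)        # contrast
    s = (cov + c) / (np.sqrt(va * vf) + c)                 # structure
    e = (2 * cov_e + c) / (ea.var() + ef.var() + c)        # edge structure
    return l**alpha * co**beta * s**gamma * e**eta
```

For identical regions the index is 1; it decreases as the regions' luminance, contrast, structure, or edge maps diverge.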
Step 3.3: compute the contrast between the target and background of the fused image.
First, compute the mean-subtracted contrast-normalized (MSCN) coefficients of the fused image,
F̄(x, y) = (F(x, y) − μ(x, y)) / (σ(x, y) + C),
where the constant C avoids instability where the denominator tends to zero in flat regions, and μ(x, y) and σ(x, y) are the local mean and deviation computed with the two-dimensional circularly symmetric Gaussian weighting function ω = {ω_{k,l} | k = −K, …, K; l = −L, …, L}, here with K = L = 3.
Next, compute the Weber contrast between target and background, C_w = |L_t − L_b| / L_b, where L_t and L_b are the mean MSCN coefficients of the pixels in the target region and in the neighbouring background region.
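The MSCN and Weber-contrast computations of Step 3.3 can be sketched as below. The Gaussian window uses K = L = 3 as stated; the constant C = 1 is an assumed value, and for simplicity the background is taken as everything outside the target mask rather than only its neighbourhood.

```python
import numpy as np

def mscn(F, K=3, C=1.0):
    """Mean-subtracted contrast-normalized coefficients of an image."""
    F = F.astype(np.float64)
    ax = np.arange(-K, K + 1, dtype=float)
    w = np.exp(-(ax[:, None]**2 + ax[None, :]**2) / 2.0)
    w /= w.sum()                      # circularly symmetric Gaussian window

    def local_avg(X):
        pad = np.pad(X, K, mode="edge")
        out = np.zeros_like(X)
        for i in range(2 * K + 1):
            for j in range(2 * K + 1):
                out += w[i, j] * pad[i:i + X.shape[0], j:j + X.shape[1]]
        return out

    mu = local_avg(F)
    sigma = np.sqrt(np.maximum(local_avg(F * F) - mu * mu, 0.0))
    return (F - mu) / (sigma + C)

def weber_contrast(coeffs, target_mask):
    """C_w = |L_t - L_b| / L_b over MSCN coefficients."""
    Lt = coeffs[target_mask].mean()
    Lb = coeffs[~target_mask].mean()
    return abs(Lt - Lb) / abs(Lb)
```

A flat image has MSCN coefficients of zero everywhere; for an image with genuine target-background structure the Weber contrast is positive and grows with perceived target salience.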
Step 3.4: compute the local target index R_local of the fused image.
Add the edge-structure similarity ESSIM(A_local, F_local) of the target regions to the target-background contrast C_w of the fused image to obtain the local target index: R_local = ESSIM(A_local, F_local) + C_w.
Step 4: take a weighted sum of the global detail index and the local target index of the fused image to obtain the fused-image quality index, R = w1·R_global + w2·R_local,
where w1 and w2 are the weights of the global detail index R_global and the local target index R_local, usually w1 = 0.6 and w2 = 0.4.
Evaluation criterion: the larger the value of R, the better the infrared and visible image fusion quality; conversely, the smaller the value, the worse the fusion quality.
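The final combination of Step 4 is a plain weighted sum; a minimal sketch with the default weights from the description:

```python
def fusion_quality(r_global, r_local, w1=0.6, w2=0.4):
    """R = w1 * R_global + w2 * R_local (Step 4); larger is better."""
    return w1 * r_global + w2 * r_local
```

With the default weights, a fusion that scores R_global = 5.0 and R_local = 2.0 receives R = 3.8, and fusion methods are ranked by this value as in Table 1.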
The following simulation results, obtained under given simulation conditions, illustrate the beneficial effects of the technical scheme of the present invention.
Four groups of infrared and visible source images were first fused with the averaging method (AVE), principal component analysis (PCA), the Laplacian pyramid (LAP), and the discrete wavelet transform (DWT); the fused images were then evaluated with the infrared and visible image fusion quality evaluation method of the invention.
The four representative groups of infrared and visible source images and their fused images are shown in Figs. 4 to 7; in each figure, a and b are the rigidly registered infrared and visible source images, and c to f are the fused images obtained with AVE, PCA, LAP, and DWT respectively.
Table 1 gives the fusion evaluation values of the above four groups.
Table 1. Infrared and visible image fusion quality evaluation values for the four groups
Table 1 shows: 1) in Fig. 4, the LAP fusion has the best quality, followed by the DWT fusion; 2) in Fig. 5, the DWT fusion is best, followed by the LAP fusion; although the fused image obtained by the PCA method is rich in detail, its local target quality is poor, so its overall quality is poor; 3) in Figs. 6 and 7, the DWT fusion is best, followed by the LAP fusion.
Overall, the DWT and LAP fusions yield the best quality. On one hand, because wavelet analysis accounts for multiresolution characteristics, these methods are superior to the AVE and PCA fusions both in algorithmic soundness and under subjective evaluation by eye; on the other hand, Figs. 4 to 7 show that the DWT and LAP fused images are superior to the AVE and PCA fused images. The method of the present invention is therefore effective, and its results agree well with subjective evaluation.
Table 1 also shows that the present invention evaluates both the global detail and the local target features of infrared and visible fused images as well as their overall quality, balancing the generality and specificity demands of practical evaluation, and offers guidance for further improving infrared and visible image fusion methods.

Claims (3)

1. An infrared and visible image fusion quality evaluation method, characterized by comprising the steps of:
step 1) obtaining the infrared source image A, the visible source image B, and the fused image F;
step 2) computing the global detail index of the fused image;
step 3) computing the local target index of the fused image;
step 4) taking a weighted sum of the global detail index and the local target index of the fused image to obtain the infrared and visible image fusion quality index.
2. The method according to claim 1, characterized in that the global detail index of the fused image in step 2) is computed as follows:
first compute the horizontal gradient and the vertical gradient of the fused image F,
∇f_H(m, n) = F(m + 1, n) − F(m, n)
∇f_V(m, n) = F(m, n + 1) − F(m, n)
and take the sum of their magnitudes as the gradient at the point,
∇f(m, n) = |∇f_H(m, n)| + |∇f_V(m, n)|
then compute the information entropy of the gradient amplitude of the fused image and take it as the global detail index R_global, whose expression is
R_global = −Σ_{i=0}^{L−1} p_i log2 p_i
where p_i is the probability that a pixel in the gradient map of the fused image has gray value i and L is the number of gray levels of the image.
3. one kind according to claim 1 is infrared with visual image fusion quality evaluating method, it is characterised in that step 3) detailed process of the localized target characteristic index of the calculating fused images described in is:
First, infrared source images are divided into by target area and background area using the marking area extracting method based on frequency domain Domain, and division result is mapped in fused images;
Secondly, the marginal texture similarity ESSIM between fused images target area and infrared radiation source image target area is calculated (Alocal,Flocal), expression formula is shown in shown in formula (1):
ESSIM(Alocal,Flocal)=[l (Alocal,Flocal)]α[c(Alocal,Flocal)]β[s(Alocal,Flocal)]γ[e(Alocal, Flocal)]η (1)
In formula, Alocal,FlocalThe target area of respectively infrared source images and the target area of fused images;l(Alocal, Flocal), c (Alocal,Flocal), s (Alocal,Flocal), e (Alocal,Flocal) be respectively infrared source images target area Alocal With fused images target area FlocalBrightness ratio compare that component, structure compare component and edge compares point compared with component, contrast Amount;Parameter alpha, beta, gamma, η is respectively their weight, generally takes α=β=γ=η=1.
Then, the contrast C_W between the target and the background of the fused image is calculated; its expression is given in formula (2):

C_W = |L_t − L_b| / L_b   (2)
where L_t and L_b are the mean values of the MSCN coefficients of the pixels in the target region and in the neighbouring background region, respectively. The MSCN coefficients of the fused image are computed as:
F̄(x,y) = (F(x,y) − μ(x,y)) / (σ(x,y) + C)
where the constant C prevents instability when the denominator tends to zero in flat regions of the image; μ(x,y) and σ(x,y) are the local weighted mean and standard deviation,

μ(x,y) = Σ_{k=−K}^{K} Σ_{l=−L}^{L} ω_{k,l} F(x+k, y+l)
σ(x,y) = [Σ_{k=−K}^{K} Σ_{l=−L}^{L} ω_{k,l} (F(x+k, y+l) − μ(x,y))²]^{1/2}

and ω = {ω_{k,l} | k = −K, …, K; l = −L, …, L} is a two-dimensional circularly symmetric Gaussian weighting function; here K = L = 3 is taken.
Finally, the edge structural similarity of the target region, ESSIM(A_local, F_local), and the target-background contrast C_W are summed to obtain the local target feature evaluation index of the fused image, R_local, i.e. R_local = ESSIM(A_local, F_local) + C_W.
CN201710405190.9A 2017-06-01 2017-06-01 A kind of infrared and visual image fusion quality evaluating method Pending CN107240096A (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN201710405190.9A CN107240096A (en) 2017-06-01 2017-06-01 A kind of infrared and visual image fusion quality evaluating method


Publications (1)

Publication Number Publication Date
CN107240096A true CN107240096A (en) 2017-10-10

Family

ID=59985675

Family Applications (1)

Application Number Title Priority Date Filing Date
CN201710405190.9A Pending CN107240096A (en) 2017-06-01 2017-06-01 A kind of infrared and visual image fusion quality evaluating method

Country Status (1)

Country Link
CN (1) CN107240096A (en)


Citations (2)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN102881010A (en) * 2012-08-28 2013-01-16 北京理工大学 Method for evaluating perception sharpness of fused image based on human visual characteristics
CN106709916A (en) * 2017-01-19 2017-05-24 泰康保险集团股份有限公司 Image quality assessment method and device


Non-Patent Citations (4)

* Cited by examiner, † Cited by third party
Title
ZHANG Xiaofeng: "Research on Quality Assessment of Visible and Infrared Image Fusion", China Master's Theses Full-text Database, Information Science and Technology Series *
ZHANG Yuzhen et al.: "Structural Similarity Image Fusion Quality Assessment Method Based on Phase Congruency", Acta Armamentarii *
SHEN Junmin et al.: "No-Reference Image Quality Assessment Combining Structural Information and Luminance Statistics", Acta Electronica Sinica *
ZHENG Jiasu: "Research on No-Reference Image Quality Assessment Algorithm Based on Image Information Entropy", China Master's Theses Full-text Database, Information Science and Technology Series *

Cited By (27)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN108256519A (en) * 2017-12-13 2018-07-06 苏州长风航空电子有限公司 A kind of notable detection method of infrared image based on global and local interaction
CN108256519B (en) * 2017-12-13 2022-06-17 苏州长风航空电子有限公司 Infrared image significance detection method based on global and local interaction
CN108062757A (en) * 2018-01-05 2018-05-22 北京航空航天大学 It is a kind of to utilize the method for improving Intuitionistic Fuzzy Clustering algorithm extraction infrared target
CN108062757B (en) * 2018-01-05 2021-04-30 北京航空航天大学 Method for extracting infrared target by using improved intuitionistic fuzzy clustering algorithm
CN108447058B (en) * 2018-03-30 2020-07-14 北京理工大学 Image quality evaluation method and system
CN108447058A (en) * 2018-03-30 2018-08-24 北京理工大学 A kind of image quality evaluating method and system
CN108549874A (en) * 2018-04-19 2018-09-18 广州广电运通金融电子股份有限公司 A kind of object detection method, equipment and computer readable storage medium
CN108830847A (en) * 2018-06-19 2018-11-16 中国石油大学(华东) Visible light is objectively evaluated with infrared grayscale fusion image perceptual contrast
CN109118493A (en) * 2018-07-11 2019-01-01 南京理工大学 A kind of salient region detecting method in depth image
CN109118493B (en) * 2018-07-11 2021-09-10 南京理工大学 Method for detecting salient region in depth image
CN113077482A (en) * 2018-09-29 2021-07-06 西安工业大学 Quality evaluation method for fused image
CN113077482B (en) * 2018-09-29 2024-01-19 西安工业大学 Quality evaluation method of fusion image
CN109166131A (en) * 2018-09-29 2019-01-08 西安工业大学 The infrared vehicle night vision anti-blooming light image segmentation and evaluation method merged with visible light
CN109166131B (en) * 2018-09-29 2021-06-29 西安工业大学 Infrared and visible light fused automobile night vision anti-blooming image segmentation and evaluation method
CN109447980A (en) * 2018-11-12 2019-03-08 公安部第三研究所 Realize method, computer readable storage medium and the processor of image quality evaluation control
CN109447980B (en) * 2018-11-12 2021-12-10 公安部第三研究所 Method for implementing image quality evaluation control, computer readable storage medium and processor
CN110555823A (en) * 2019-04-11 2019-12-10 江南大学 Image fusion quality evaluation method based on TVL structure texture decomposition
CN110555823B (en) * 2019-04-11 2023-04-07 江南大学 Image fusion quality evaluation method based on TVL structure texture decomposition
CN110428455B (en) * 2019-04-19 2022-11-04 中国航空无线电电子研究所 Target registration method for visible light image and far infrared image
CN110428455A (en) * 2019-04-19 2019-11-08 中国航空无线电电子研究所 A kind of visible images and far infrared image object method for registering
CN110322423A (en) * 2019-04-29 2019-10-11 天津大学 A kind of multi-modality images object detection method based on image co-registration
CN110111292A (en) * 2019-04-30 2019-08-09 淮阴师范学院 A kind of infrared and visible light image fusion method
CN110111292B (en) * 2019-04-30 2023-07-21 淮阴师范学院 Infrared and visible light image fusion method
CN110035239A (en) * 2019-05-21 2019-07-19 北京理工大学 One kind being based on the more time of integration infrared image fusion methods of gray scale-gradient optimizing
CN110889817A (en) * 2019-11-19 2020-03-17 中国人民解放军海军工程大学 Image fusion quality evaluation method and device
CN111583315A (en) * 2020-04-23 2020-08-25 武汉卓目科技有限公司 Novel visible light image and infrared image registration method and device
CN114331937A (en) * 2021-12-27 2022-04-12 哈尔滨工业大学 Multi-source image fusion method based on feedback iterative adjustment under low illumination condition

Similar Documents

Publication Publication Date Title
CN107240096A (en) A kind of infrared and visual image fusion quality evaluating method
CN102842136B (en) A kind of optic disk projective iteration method of comprehensive vascular distribution and optic disk appearance characteristics
CN103606132B (en) Based on the multiframe Digital Image Noise method of spatial domain and time domain combined filtering
CN102609939B (en) TFDS (Train Coach Machine Vision Detection System) image quality evaluation method and system
CN103048329B (en) A kind of road surface crack detection method based on active contour model
CN101976444B (en) Pixel type based objective assessment method of image quality by utilizing structural similarity
CN106651888B (en) Colour eye fundus image optic cup dividing method based on multi-feature fusion
CN109242888A (en) A kind of infrared and visible light image fusion method of combination saliency and non-down sampling contourlet transform
CN103295191A (en) Multi-scale vision self-adaptation image enhancing method and evaluating method
CN104361582A (en) Method of detecting flood disaster changes through object-level high-resolution SAR (synthetic aperture radar) images
CN108171695A (en) A kind of express highway pavement detection method based on image procossing
CN104574381B (en) A kind of full reference image quality appraisement method based on local binary patterns
CN108550145A (en) A kind of SAR image method for evaluating quality and device
CN106815583A (en) A kind of vehicle at night license plate locating method being combined based on MSER and SWT
CN109190624A (en) Kitchen fume concentration detection method based on image procossing
CN101650439A (en) Method for detecting change of remote sensing image based on difference edge and joint probability consistency
CN107341790A (en) A kind of image processing method of environment cleanliness detection
CN109993797A (en) Door and window method for detecting position and device
CN116205823A (en) Ultrasonic image denoising method based on spatial domain filtering
CN108537787A (en) A kind of quality judging method of facial image
CN101968885A (en) Method for detecting remote sensing image change based on edge and grayscale
CN102842120A (en) Image blurring degree detection method based on supercomplex wavelet phase measurement
CN106887002B (en) A kind of infrared image sequence conspicuousness detection method
CN109886195A (en) Skin identification method based on depth camera near-infrared single color gradation figure
CN105787912A (en) Classification-based step type edge sub pixel localization method

Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination
WD01 Invention patent application deemed withdrawn after publication

Application publication date: 20171010