CN105005976A - Fusion based infrared image enhancement method - Google Patents

Info

Publication number
CN105005976A
CN105005976A (application CN201510400962.0A)
Authority
CN
China
Prior art keywords
component
image
psi
light
level
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Granted
Application number
CN201510400962.0A
Other languages
Chinese (zh)
Other versions
CN105005976B (en)
Inventor
邱长军
薛晓利
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Chengdu Zhong Haoyingfu Science And Technology Ltd
Original Assignee
Chengdu Zhong Haoyingfu Science And Technology Ltd
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Chengdu Zhong Haoyingfu Science And Technology Ltd filed Critical Chengdu Zhong Haoyingfu Science And Technology Ltd
Priority to CN201510400962.0A
Publication of CN105005976A
Application granted
Publication of CN105005976B
Legal status: Active
Anticipated expiration

Landscapes

  • Image Processing (AREA)

Abstract

The invention provides a fusion-based infrared image enhancement method. The method comprises: S1, receiving an infrared image P output by an infrared detector; S2, segmenting the received infrared image P into a low-illumination component P1, a medium-illumination component P2 and a high-illumination component P3; S3, separately performing image enhancement on P1, P2 and P3 to obtain the corresponding low-illumination enhanced component P1', medium-illumination enhanced component P2' and high-illumination enhanced component P3'; S4, performing image fusion on the enhanced components P1', P2' and P3' of step S3; and S5, outputting the result P4. The method compresses the dynamic range of the image and improves its contrast and sharpness while preserving key detail, finally yielding a wide-dynamic-range output that renders both the bright and the dark regions of the image.

Description

Fusion-based infrared image enhancement method
Technical field
The invention belongs to the field of image processing and specifically relates to a fusion-based infrared image enhancement method.
Background technology
Nowadays, infrared thermal imaging is applied more and more widely in fields such as public security, fire fighting, the military, medicine, electric power and industry. It forms images from the heat radiated at specific electromagnetic wavelengths, so an infrared image can also be called a temperature image. However, infrared imaging systems are often used in scenes with a very large temperature span, for example ground and sky, or ambient temperature and flame, while the temperature difference between the object the user actually cares about and its background is small. In practical applications an infrared imaging system therefore needs not only high spatial and temperature resolution but also a large signal dynamic range.
High-performance infrared imaging systems usually sample and quantize the infrared detector output with signal-acquisition units of 14 bits or more. In a typical, stable scene the grey levels of the thermal image may then be concentrated in a small quantization range, whereas in certain large-dynamic-range scenes they may spread over a much larger range. To give the processed image suitable brightness and contrast while keeping the data volume acceptable for the display device or for subsequent fast processing, the 14-bit high-precision data usually has to be compressed to an 8-bit width. If the compression is done improperly, the image information carried by the large dynamic range is not preserved; that is, heavy dynamic-range compression can make it impossible to recover, from the 8-bit image, the detail lost from the original image.
At present, depending on where in the imaging chain the detail-enhancement step is inserted, enhancement can be applied to the 8-bit grey-level image output by the infrared system after compression. The amount of data to be processed is then small (8 bits) and processing is relatively fast, but the detail in the image is often already lost during compression and can hardly be recovered by subsequent enhancement.
Summary of the invention
The object of the invention is to overcome the deficiencies of the prior art and to provide a fusion-based infrared image enhancement method that renders both the bright and the dark regions of the image and yields an output with a wide dynamic range.
To achieve the above object, the invention adopts the following technical scheme. A fusion-based infrared image enhancement method comprises the following steps:
S1, receiving an infrared image P output by an infrared detector;
S2, segmenting the received infrared image P into a low-illumination component P1, a medium-illumination component P2 and a high-illumination component P3;
S3, separately performing image enhancement on the low-illumination component P1, the medium-illumination component P2 and the high-illumination component P3 to obtain the corresponding low-illumination enhanced component P1', medium-illumination enhanced component P2' and high-illumination enhanced component P3';
S4, performing image fusion on the enhanced components P1', P2' and P3' obtained in step S3;
S5, outputting the result P4.
Further, the infrared image P received in step S1 is a 14-bit image.
Further, in step S3 a power transform is applied to the low-illumination component P1, the medium-illumination component P2 and the high-illumination component P3 respectively to obtain the low-illumination enhanced component P1', the medium-illumination enhanced component P2' and the high-illumination enhanced component P3', according to the formulas:
P_1'(x, y) = C_1 \times P_1(x, y)^{\gamma_1} + b_1
P_2'(x, y) = C_2 \times P_2(x, y)^{\gamma_2} + b_2
P_3'(x, y) = C_3 \times P_3(x, y)^{\gamma_3} + b_3
Here C1, C2, C3 are the coefficient factors of the power transform, b1, b2, b3 are its offsets, and γ1, γ2, γ3 are its exponents.
C1, C2, C3 and γ1, γ2, γ3 are all positive, with γ1 < 1, γ2 ≈ 1 and γ3 > 1.
When γ = 1 the power transform reduces to a linear transform.
When γ < 1 the transform curve lies above the proportional (identity) line: low grey levels are expanded and high grey levels are compressed, so the image becomes brighter.
When γ > 1 the transform curve lies below the proportional line: high grey levels are expanded and low grey levels are compressed, so the image becomes darker.
Further, in step S4 the low-illumination enhanced component P1', the medium-illumination enhanced component P2' and the high-illumination enhanced component P3' obtained in step S3 are fused, specifically through the following steps. S41, a forward wavelet transform is applied to P1', P2' and P3' respectively, as follows:
First, for continuous signals the continuous wavelet transform (CWT) is used:
W(a, \tau) = \langle x(t), \Psi_{a,\tau}(t) \rangle = \frac{1}{\sqrt{a}} \int_R x(t) \, \overline{\Psi\left(\frac{t - \tau}{a}\right)} \, dt
where a is the scale factor, τ is the time shift, Ψ(t) is the wavelet (mother wavelet), and \overline{\Psi(t)} denotes the complex conjugate of Ψ(t).
Second, for discrete data the discrete wavelet transform (DWT) is used:
W_x(j, k) = \int_R x(t) \, \overline{\Psi_{j,k}(t)} \, dt, \quad where \quad \Psi_{j,k}(t) = \frac{1}{\sqrt{2^j}} \, \Psi\left(\frac{t}{2^j} - k\right); the remaining parameters are as above.
S42, the wavelet coefficients are fused. The concrete steps are:
Step 1: in each coefficient image c_Δ(m, n) (Δ = A, B), compute the energy (or variance) in a window surrounding the point (m, n) as a measure of the local detail strength:
S_\Delta(m, n) = \sum_{u \in U} \sum_{v \in V} \omega(u, v) \, [c_\Delta(m + u, n + v)]^2
where ω(u, v) is the template window centred on (0, 0), and U and V are the sets of row and column offsets of the template window;
Step 2: compute the local normalized cross-correlation coefficient between c_A and c_B:
M_{AB}(m, n) = \frac{2 \sum_{u \in U} \sum_{v \in V} \omega(u, v) \, c_A(m + u, n + v) \, c_B(m + u, n + v)}{S_A(m, n) + S_B(m, n)}
Step 3: choose the fusion mode according to the size of the cross-correlation coefficient. When M_AB(m, n) ≤ α, with α = 0.85, the coefficient with the larger local energy S_Δ(m, n) is selected as the fused coefficient c_F(m, n).
When M_AB(m, n) > α, a weighted average is used:
c_F(m, n) = W(m, n) \, c_A(m, n) + [I(m, n) - W(m, n)] \, c_B(m, n);
where I(m, n) is the unit matrix and the weight coefficient W(m, n) is determined by a separate weighting formula.
S43, inverse wavelet transform.
If the discrete wavelet sequence {ψ_{j,k}(t)}, j, k ∈ Z, forms a frame with lower and upper frame bounds A and B respectively, then when A = B the inverse of the discrete wavelet transform follows from frame theory as:
f(t) = A^{-1} \sum_{j,k} \langle f, \psi_{j,k}(t) \rangle \, \tilde{\psi}_{j,k}(t) = \frac{1}{A} \sum_{j,k} WT_f(j, k) \, \psi_{j,k}(t)
When A ≠ B but A and B are relatively close, a first-order approximation can be taken, and the inverse wavelet transform is then approximately:
f(t) = \sum_{j,k} \langle f, \psi_{j,k}(t) \rangle \, \tilde{\psi}_{j,k}(t) \approx \frac{2}{A + B} \sum_{j,k} WT_f(j, k) \, \psi_{j,k}(t).
Further, in step S5 the image enhanced in step S4 is converted to 8-bit image data for display and further processing, using the linear grey-level transform:
g(x, y) = \frac{f(x, y) - f_{\min}}{f_{\max} - f_{\min}} \, (g_{\max} - g_{\min}) + g_{\min}
where f(x, y) is the pixel value of the input image at coordinate (x, y); f_min and f_max are the minimum and maximum values of the input image; g_min and g_max are the minimum and maximum values of the output image; and g(x, y) is the value of the output image at coordinate (x, y).
With the above technical scheme, the 14-bit input image is segmented into a low-illumination component, a medium-illumination component and a high-illumination component, image enhancement is applied to each of the three components separately, and the three enhancement results are then fused, finally yielding a wide-dynamic-range output that renders both the bright and the dark regions of the image. Most existing image-enhancement methods operate on the image as a whole and can usually only raise the overall brightness or suppress the dark regions, not both. The present invention compresses the dynamic range of the image and improves its contrast and sharpness while preserving the key detail.
Brief description of the drawings
Fig. 1 is a flow chart of the fusion-based infrared image enhancement method of the invention;
Fig. 2 shows the power-function curves corresponding to different exponents of the power transform used in the method;
Fig. 3 is a schematic diagram of the wavelet-transform image fusion used in the method.
Detailed description of the embodiments
The embodiments of the invention are further described below with reference to the accompanying drawings.
As shown in Fig. 1, the invention provides a fusion-based infrared image enhancement method comprising the following steps:
S1, receiving an infrared image P output by an infrared detector;
S2, segmenting the received infrared image P into a low-illumination component P1, a medium-illumination component P2 and a high-illumination component P3;
S3, separately performing image enhancement on the low-illumination component P1, the medium-illumination component P2 and the high-illumination component P3 to obtain the corresponding low-illumination enhanced component P1', medium-illumination enhanced component P2' and high-illumination enhanced component P3';
S4, performing image fusion on the enhanced components P1', P2' and P3' obtained in step S3;
S5, outputting the result P4.
In step S2 of the present embodiment, the image is segmented into its low-, medium- and high-illumination components on the basis of bit planes.
For an image with 2^n grey levels, each pixel value can be written in the form
a_{n-1} 2^{n-1} + a_{n-2} 2^{n-2} + \dots + a_1 2^1 + a_0 2^0.
Extracting the coefficients a_i of every pixel according to this expansion turns an n-bit multi-valued image into n binary images, where the i-th binary image is formed from the i-th binary digit of all pixels; each such binary image is called a bit plane.
In the present embodiment the infrared detector output is typically 14 bits. The image formed by binary digits 0-5 can be taken as the low-illumination component, the image formed by binary digits 4-9 as the medium-illumination component, and the image formed by binary digits 8-13 as the high-illumination component.
It should be noted that the user can choose the bit-plane ranges assigned to the low, medium and high illumination components according to the practical application. For example, the user may define the image formed by binary digits 0-2 of the 14-bit input as the low-illumination bit-plane image and the image formed by binary digits 11-13 as the high-illumination bit-plane image, with the remainder forming the medium-illumination image.
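As an illustration of this bit-plane segmentation, the following sketch (in Python with NumPy) splits a 14-bit frame into the three components using the example bit ranges above (binary digits 0-5, 4-9 and 8-13). The random test frame and the helper name split_bit_planes are assumptions made for illustration only and are not part of the patent.

    import numpy as np

    def split_bit_planes(p, low=(0, 5), mid=(4, 9), high=(8, 13)):
        # Keep only the bit planes in the inclusive range (lo, hi) of each component.
        def keep_bits(lo, hi):
            mask = 0
            for b in range(lo, hi + 1):
                mask |= 1 << b
            return p & mask
        return keep_bits(*low), keep_bits(*mid), keep_bits(*high)

    # Hypothetical 14-bit detector frame (random data stands in for the real image P)
    p = np.random.randint(0, 2 ** 14, size=(256, 256), dtype=np.uint16)
    p1, p2, p3 = split_bit_planes(p)   # low-, medium-, high-illumination components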
As shown in Fig. 2, in step S3 a power transform is applied to the low-illumination component P1, the medium-illumination component P2 and the high-illumination component P3 respectively to obtain the low-illumination enhanced component P1', the medium-illumination enhanced component P2' and the high-illumination enhanced component P3', according to the formulas:
P_1'(x, y) = C_1 \times P_1(x, y)^{\gamma_1} + b_1
P_2'(x, y) = C_2 \times P_2(x, y)^{\gamma_2} + b_2
P_3'(x, y) = C_3 \times P_3(x, y)^{\gamma_3} + b_3
Here C1, C2, C3 are the coefficient factors of the power transform, b1, b2, b3 are its offsets, and γ1, γ2, γ3 are its exponents.
C1, C2, C3 and γ1, γ2, γ3 are all positive, with γ1 < 1, γ2 ≈ 1 and γ3 > 1.
When γ = 1 the power transform reduces to a linear transform.
When γ < 1 the transform curve lies above the proportional (identity) line: low grey levels are expanded and high grey levels are compressed, so the image becomes brighter.
When γ > 1 the transform curve lies below the proportional line: high grey levels are expanded and low grey levels are compressed, so the image becomes darker.
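A minimal sketch of this power (gamma) transform is given below. The normalisation of the data to [0, 1] and the example parameter values (C = 1, b = 0, γ = 0.5 / 1.0 / 2.0) are illustrative assumptions chosen to satisfy γ1 < 1, γ2 ≈ 1 and γ3 > 1; they are not values prescribed by the patent.

    import numpy as np

    def power_enhance(p, c=1.0, gamma=1.0, b=0.0):
        # P'(x, y) = C * P(x, y)**gamma + b, applied to data normalised to [0, 1]
        x = p.astype(np.float64) / (p.max() + 1e-12)
        return c * np.power(x, gamma) + b

    # Placeholder components (stand-ins for P1, P2, P3 from the bit-plane split)
    p1 = np.random.randint(0, 2 ** 6,  size=(256, 256)).astype(np.float64)
    p2 = np.random.randint(0, 2 ** 10, size=(256, 256)).astype(np.float64)
    p3 = np.random.randint(0, 2 ** 14, size=(256, 256)).astype(np.float64)

    p1e = power_enhance(p1, gamma=0.5)   # gamma < 1: brighten the low-illumination part
    p2e = power_enhance(p2, gamma=1.0)   # gamma ~ 1: roughly linear
    p3e = power_enhance(p3, gamma=2.0)   # gamma > 1: compress low greys, darken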
As shown in Fig. 3, in the present embodiment the enhanced components obtained above are fused by means of the wavelet transform. The method first applies a wavelet transform to the source images, then merges the wavelet coefficients on the different decomposition levels according to suitable fusion rules to form a new wavelet pyramid, and finally applies the inverse wavelet transform to obtain the fused image. The fusion specifically comprises the following steps:
S41, forward wavelet transform, specifically:
First, any function x(t) in the space L²(R) can be expanded on a wavelet basis; this expansion is called the continuous wavelet transform (CWT) of x(t):
W(a, \tau) = \langle x(t), \Psi_{a,\tau}(t) \rangle = \frac{1}{\sqrt{a}} \int_R x(t) \, \overline{\Psi\left(\frac{t - \tau}{a}\right)} \, dt
where a is the scale factor and τ is the time shift.
Second, digital image processing normally uses the discrete wavelet transform. For x(t) in L²(R) the discrete wavelet transform (DWT) is:
W_x(j, k) = \int_R x(t) \, \overline{\Psi_{j,k}(t)} \, dt, \quad where \quad \Psi_{j,k}(t) = \frac{1}{\sqrt{2^j}} \, \Psi\left(\frac{t}{2^j} - k\right)
It should be noted that the discretization here applies to the continuous scale parameter and the continuous translation parameter, not to the time variable t.
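In practice the 2-D discrete wavelet transform can be computed with an off-the-shelf library. The sketch below uses PyWavelets on a random stand-in for one enhanced component; the 'haar' wavelet and the single decomposition level are assumptions made for illustration, not choices fixed by the patent.

    import numpy as np
    import pywt  # PyWavelets

    # Random stand-in for one enhanced component, e.g. P1'
    img = np.random.rand(256, 256)

    # One-level 2-D DWT: an approximation band (LL) plus three detail bands (LH, HL, HH)
    cA, (cH, cV, cD) = pywt.dwt2(img, 'haar')
    print(cA.shape, cH.shape)  # each subband is half the input size in each dimension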
S42, the wavelet coefficients are fused. In the present embodiment the fusion rule is a window-based weighted-average rule: the coefficient images are filtered with a window of fixed size, and the filtered pixel value serves as the measure of the local detail strength. The concrete steps are:
Step 1: in each coefficient image c_Δ(m, n) (Δ = A, B), compute the energy (or variance) in a window surrounding the point (m, n) as a measure of the local detail strength:
S_\Delta(m, n) = \sum_{u \in U} \sum_{v \in V} \omega(u, v) \, [c_\Delta(m + u, n + v)]^2
where ω(u, v) is the template window centred on (0, 0), and U and V are the sets of row and column offsets of the template window;
Step 2: compute the local normalized cross-correlation coefficient between c_A and c_B:
M_{AB}(m, n) = \frac{2 \sum_{u \in U} \sum_{v \in V} \omega(u, v) \, c_A(m + u, n + v) \, c_B(m + u, n + v)}{S_A(m, n) + S_B(m, n)}
Step 3: choose the fusion mode according to the size of the cross-correlation coefficient. When M_AB(m, n) ≤ α, where α is generally taken as 0.85, the correlation between the source-image coefficients is low, and it is more reasonable for the fused coefficient c_F(m, n) to take the coefficient with the larger local variance (energy) S_Δ(m, n).
When M_AB(m, n) > α, the coefficients are strongly correlated and a weighted average is more reasonable:
c_F(m, n) = W(m, n) \, c_A(m, n) + [I(m, n) - W(m, n)] \, c_B(m, n);
where I(m, n) is the unit matrix and the weight coefficient W(m, n) is determined by a separate weighting formula.
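The sketch below implements this window-based fusion rule for one pair of coefficient sub-bands using a uniform 3x3 window. Because the patent's expression for the weight W(m, n) is not reproduced in this text, a commonly used Burt-Kolczynski-style weight is substituted and marked as an assumption; the function name fuse_band is likewise hypothetical.

    import numpy as np
    from scipy.ndimage import uniform_filter

    def fuse_band(cA, cB, win=3, alpha=0.85):
        # Local energy S_A, S_B: windowed average of squared coefficients (uniform weights omega)
        sA = uniform_filter(cA * cA, size=win)
        sB = uniform_filter(cB * cB, size=win)
        # Local normalised cross-correlation M_AB
        m = 2.0 * uniform_filter(cA * cB, size=win) / (sA + sB + 1e-12)

        # Low correlation (M_AB <= alpha): select the coefficient with the larger local energy
        selected = np.where(sA >= sB, cA, cB)

        # High correlation (M_AB > alpha): weighted average.  The weight below is an
        # assumed Burt-Kolczynski-style choice, NOT the patent's (unreproduced) formula.
        w_min = 0.5 - 0.5 * (1.0 - m) / (1.0 - alpha)
        w_big = 1.0 - w_min
        wA = np.where(sA >= sB, w_big, w_min)
        averaged = wA * cA + (1.0 - wA) * cB

        return np.where(m <= alpha, selected, averaged)

    # Example: fuse two random detail sub-bands
    cF = fuse_band(np.random.randn(128, 128), np.random.randn(128, 128))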
S43, inverse wavelet transform.
For an arbitrary function f(t) ∈ L²(R) the inverse continuous wavelet transform is:
f(t) = \frac{1}{C_\psi} \int_R \int_R \frac{1}{a^2} \, W_f(a, b) \, \psi\left(\frac{t - b}{a}\right) da \, db
For digital images the inverse discrete wavelet transform is used. If the discrete wavelet sequence {ψ_{j,k}(t)}, j, k ∈ Z, forms a frame with lower and upper frame bounds A and B respectively, then when A = B the inverse of the discrete wavelet transform follows from frame theory as:
f(t) = A^{-1} \sum_{j,k} \langle f, \psi_{j,k}(t) \rangle \, \tilde{\psi}_{j,k}(t) = \frac{1}{A} \sum_{j,k} WT_f(j, k) \, \psi_{j,k}(t)
When A ≠ B but A and B are relatively close, a first-order approximation can be taken, and the inverse wavelet transform is then approximately:
f(t) = \sum_{j,k} \langle f, \psi_{j,k}(t) \rangle \, \tilde{\psi}_{j,k}(t) \approx \frac{2}{A + B} \sum_{j,k} WT_f(j, k) \, \psi_{j,k}(t).
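For digital images the inverse step reduces to the library's inverse DWT. The short sketch below simply checks that a forward/inverse pair reconstructs its input; the 'haar' wavelet and the random image are assumptions made for illustration.

    import numpy as np
    import pywt

    img = np.random.rand(128, 128)
    coeffs = pywt.dwt2(img, 'haar')    # forward 2-D DWT
    rec = pywt.idwt2(coeffs, 'haar')   # inverse 2-D DWT
    print(np.allclose(img, rec))       # True: reconstruction is exact up to rounding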
Step S4 above operates on 14-bit image data, whereas mainstream display devices usually display and process 8-bit image data. It is therefore necessary to convert the enhanced result to 8-bit form for subsequent display and storage. The linear grey-level transform used is:
g(x, y) = \frac{f(x, y) - f_{\min}}{f_{\max} - f_{\min}} \, (g_{\max} - g_{\min}) + g_{\min}
where f(x, y) is the pixel value of the input image at coordinate (x, y); f_min and f_max are the minimum and maximum values of the input image; g_min and g_max are the minimum and maximum values of the output image; and g(x, y) is the value of the output image at coordinate (x, y).
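A minimal sketch of this linear stretch to the 8-bit range follows; the helper name to_uint8 and the random stand-in for the fused result P4 are assumptions made for illustration.

    import numpy as np

    def to_uint8(f, g_min=0.0, g_max=255.0):
        # g = (f - f_min) / (f_max - f_min) * (g_max - g_min) + g_min  (assumes f_max > f_min)
        f = f.astype(np.float64)
        f_min, f_max = f.min(), f.max()
        g = (f - f_min) / (f_max - f_min) * (g_max - g_min) + g_min
        return np.rint(g).astype(np.uint8)

    p4 = np.random.randint(0, 2 ** 14, size=(256, 256))   # stand-in for the fused result P4
    p4_8bit = to_uint8(p4)
    print(p4_8bit.dtype, p4_8bit.min(), p4_8bit.max())    # uint8 0 255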
The invention is not limited to the preferred embodiment described above. Any product in any other form derived by anyone under the teaching of the invention, whatever changes are made to its shape or structure, falls within the scope of protection of the invention as long as its technical scheme is identical or equivalent to that of the present application.

Claims (5)

1. A fusion-based infrared image enhancement method, characterized in that the method comprises the following steps:
S1, receiving an infrared image P output by an infrared detector;
S2, segmenting the received infrared image P into a low-illumination component P1, a medium-illumination component P2 and a high-illumination component P3;
S3, separately performing image enhancement on the low-illumination component P1, the medium-illumination component P2 and the high-illumination component P3 to obtain the corresponding low-illumination enhanced component P1', medium-illumination enhanced component P2' and high-illumination enhanced component P3';
S4, performing image fusion on the enhanced components P1', P2' and P3' obtained in step S3;
S5, outputting the result P4.
2. The fusion-based infrared image enhancement method according to claim 1, characterized in that the infrared image P received in step S1 is a 14-bit image.
3. The fusion-based infrared image enhancement method according to claim 1, characterized in that in step S3 a power transform is applied to the low-illumination component P1, the medium-illumination component P2 and the high-illumination component P3 respectively to obtain the low-illumination enhanced component P1', the medium-illumination enhanced component P2' and the high-illumination enhanced component P3', according to the formulas:
P_1'(x, y) = C_1 \times P_1(x, y)^{\gamma_1} + b_1
P_2'(x, y) = C_2 \times P_2(x, y)^{\gamma_2} + b_2
P_3'(x, y) = C_3 \times P_3(x, y)^{\gamma_3} + b_3
where C1, C2, C3 are the coefficient factors of the power transform, b1, b2, b3 are its offsets, and γ1, γ2, γ3 are its exponents;
C1, C2, C3 and γ1, γ2, γ3 are all positive, with γ1 < 1, γ2 ≈ 1 and γ3 > 1;
when γ = 1 the power transform reduces to a linear transform;
when γ < 1 the transform curve lies above the proportional (identity) line: low grey levels are expanded and high grey levels are compressed, so the image becomes brighter;
when γ > 1 the transform curve lies below the proportional line: high grey levels are expanded and low grey levels are compressed, so the image becomes darker.
4. The fusion-based infrared image enhancement method according to claim 1, characterized in that in step S4 the low-illumination enhanced component P1', the medium-illumination enhanced component P2' and the high-illumination enhanced component P3' obtained in step S3 are fused, specifically comprising the following steps:
S41, applying a forward wavelet transform to P1', P2' and P3' respectively, specifically:
for continuous signals, the continuous wavelet transform (CWT) is normally used:
W(a, \tau) = \langle x(t), \Psi_{a,\tau}(t) \rangle = \frac{1}{\sqrt{a}} \int_R x(t) \, \overline{\Psi\left(\frac{t - \tau}{a}\right)} \, dt;
where a is the scale factor, τ is the time shift, Ψ(t) is the wavelet (mother wavelet), and \overline{\Psi(t)} denotes the complex conjugate of Ψ(t);
for images, which are discrete signals, the discrete wavelet transform (DWT) is normally used: W_x(j, k) = \int_R x(t) \, \overline{\Psi_{j,k}(t)} \, dt, where \Psi_{j,k}(t) = \frac{1}{\sqrt{2^j}} \, \Psi\left(\frac{t}{2^j} - k\right);
S42, fusing the wavelet coefficients, with the concrete steps of:
step 1: in each coefficient image C_Δ(m, n) (Δ = A, B), computing the energy (or variance) in a window surrounding the point (m, n) as a measure of the local detail strength:
S_\Delta(m, n) = \sum_{u \in U} \sum_{v \in V} \omega(u, v) \, [C_\Delta(m + u, n + v)]^2
where ω(u, v) is the template window centred on (0, 0), and U and V are the sets of row and column offsets of the template window;
step 2: computing the local normalized cross-correlation coefficient between the two coefficient images C_A and C_B:
M_{AB}(m, n) = \frac{2 \sum_{u \in U} \sum_{v \in V} \omega(u, v) \, C_A(m + u, n + v) \, C_B(m + u, n + v)}{S_A(m, n) + S_B(m, n)}
step 3: choosing the fusion mode according to the size of the cross-correlation coefficient: when M_AB(m, n) ≤ α, where α = 0.85, selecting the coefficient with the larger local energy S_Δ(m, n) as the fused coefficient;
when M_AB(m, n) > α, using the weighted average
C_F(m, n) = W(m, n) \, C_A(m, n) + [I(m, n) - W(m, n)] \, C_B(m, n);
wherein I(m, n) is the unit matrix and the weight coefficient W(m, n) is determined by a separate weighting formula;
S43, inverse wavelet transform:
if the discrete wavelet sequence {ψ_{j,k}(t)}, j, k ∈ Z, forms a frame with lower and upper frame bounds A and B respectively, then when A = B the inverse of the discrete wavelet transform follows from frame theory as:
f(t) = A^{-1} \sum_{j,k} \langle f, \psi_{j,k}(t) \rangle \, \tilde{\psi}_{j,k}(t) = \frac{1}{A} \sum_{j,k} WT_f(j, k) \, \psi_{j,k}(t)
when A ≠ B but A and B are relatively close, a first-order approximation can be taken, and the inverse wavelet transform is approximately:
f(t) = \sum_{j,k} \langle f, \psi_{j,k}(t) \rangle \, \tilde{\psi}_{j,k}(t) \approx \frac{2}{A + B} \sum_{j,k} WT_f(j, k) \, \psi_{j,k}(t).
5. The fusion-based infrared image enhancement method according to claim 1, characterized in that in step S5 the image enhanced in step S4 is converted to 8-bit image data for display and processing, using the linear grey-level transform:
g(x, y) = \frac{f(x, y) - f_{\min}}{f_{\max} - f_{\min}} \, (g_{\max} - g_{\min}) + g_{\min}
where f(x, y) is the pixel value of the input image at coordinate (x, y); f_min and f_max are the minimum and maximum values of the input image; g_min and g_max are the minimum and maximum values of the output image; and g(x, y) is the value of the output image at coordinate (x, y).
CN201510400962.0A 2015-03-30 2015-07-09 Fusion-based infrared image enhancement method Active CN105005976B (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN201510400962.0A CN105005976B (en) 2015-03-30 2015-07-09 Fusion-based infrared image enhancement method

Applications Claiming Priority (3)

Application Number Priority Date Filing Date Title
CN201510143994 2015-03-30
CN2015101439947 2015-03-30
CN201510400962.0A CN105005976B (en) 2015-03-30 2015-07-09 Fusion-based infrared image enhancement method

Publications (2)

Publication Number Publication Date
CN105005976A (en) 2015-10-28
CN105005976B CN105005976B (en) 2019-03-05

Family

ID=54378635

Family Applications (1)

Application Number Title Priority Date Filing Date
CN201510400962.0A Active CN105005976B (en) Fusion-based infrared image enhancement method

Country Status (1)

Country Link
CN (1) CN105005976B (en)

Cited By (2)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN109919861A (en) * 2019-01-29 2019-06-21 浙江数链科技有限公司 Infrared image enhancing method, device, computer equipment and storage medium
CN112150399A (en) * 2020-09-27 2020-12-29 安谋科技(中国)有限公司 Image enhancement method based on wide dynamic range and electronic equipment

Patent Citations (3)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US5995644A (en) * 1997-06-30 1999-11-30 Siemens Corporate Research, Inc. Robust and automatic adjustment of display window width and center for MR images
CN102510502A (en) * 2011-09-30 2012-06-20 苏州佳世达电通有限公司 Method and system for generating high-dynamic-range image
CN103034986A (en) * 2012-11-29 2013-04-10 奇瑞汽车股份有限公司 Night vision image enhancement method based on exposure fusion

Non-Patent Citations (5)

* Cited by examiner, † Cited by third party
Title
刘贵喜 et al.: "Image fusion method based on wavelet decomposition and its performance evaluation", 《自动化学报》 (Acta Automatica Sinica) *
张亚飞: "Enhancement of unevenly illuminated images based on power transform and MSR", 《电脑知识与技术》 (Computer Knowledge and Technology) *
李骜 et al.: "A new multi-scale image enhancement algorithm based on illumination division", 《吉林大学学报(工学版)》 (Journal of Jilin University, Engineering and Technology Edition) *
汪荣贵 et al.: "Local multi-scale Retinex algorithm based on illumination segmentation", 《电子学报》 (Acta Electronica Sinica) *
马士友 et al.: "Grayscale image enhancement algorithm based on histogram equalization and power transform", 《计算机应用与软件》 (Computer Applications and Software) *

Also Published As

Publication number Publication date
CN105005976B (en) 2019-03-05

Similar Documents

Publication Publication Date Title
Wang et al. Dehazing for images with large sky region
CN108269244B (en) Image defogging system based on deep learning and prior constraint
CN105825472A (en) Rapid tone mapping system and method based on multi-scale Gauss filters
CN107274365A Mine image enhancement method based on unsharp masking and NSCT algorithms
Fu et al. Improved single image dehazing using dark channel prior
CN107292830B (en) Low-illumination image enhancement and evaluation method
CN103049888A Image/video dehazing method based on the dark channel of atmospheric scattered light
CN107220957B Remote sensing image fusion method using rolling guidance filtering
CN102496135B DWT (discrete wavelet transform) domain-based digital watermark method and system
CN104050637A Fast image defogging method based on two-pass guided filtering
CN105096278A (en) Image enhancement method based on illumination adjustment and equipment thereof
CN105427257A (en) Image enhancement method and apparatus
CN104463804A (en) Image enhancement method based on intuitional fuzzy set
CN112001843A (en) Infrared image super-resolution reconstruction method based on deep learning
CN107833182A Infrared image super-resolution reconstruction method based on feature extraction
Zhang et al. Single image dehazing based on fast wavelet transform with weighted image fusion
CN104299193A (en) Image super-resolution reconstruction method based on high-frequency information and medium-frequency information
CN106971377A Single-image rain removal method based on sparse and low-rank matrix decomposition
CN107481211A Night traffic surveillance enhancement method based on gradient-domain fusion
CN105005976A (en) Fusion based infrared image enhancement method
CN105931198A (en) Icing insulator image enhancement method based on wavelet transformation
Gao et al. Single image haze removal algorithm using pixel-based airlight constraints
CN115953311A (en) Image defogging method based on multi-scale feature representation of Transformer
CN103927753B Absolute blur estimation method for images based on multi-scale reorganized DCT coefficients
Menon et al. An enhanced digital image processing based dehazing techniques for haze removal

Legal Events

Date Code Title Description
C06 Publication
PB01 Publication
C10 Entry into substantive examination
SE01 Entry into force of request for substantive examination
GR01 Patent grant