CN102722864A - Image enhancement method - Google Patents

Image enhancement method

Info

Publication number
CN102722864A
CN102722864A (application CN201210157662.0A; granted publication CN102722864B)
Authority
CN
China
Prior art keywords
visible images
image
brightness
masking
information
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Granted
Application number
CN2012101576620A
Other languages
Chinese (zh)
Other versions
CN102722864B (en)
Inventor
戴琼海
付莹
刘烨斌
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Tsinghua University
Original Assignee
Tsinghua University
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Tsinghua University filed Critical Tsinghua University
Priority to CN201210157662.0A priority Critical patent/CN102722864B/en
Publication of CN102722864A publication Critical patent/CN102722864A/en
Application granted granted Critical
Publication of CN102722864B publication Critical patent/CN102722864B/en
Active legal-status Critical Current
Anticipated expiration legal-status Critical

Abstract

The invention provides an image enhancement method comprising the steps of: collecting a visible-light image and an infrared image; performing a reversible transform on the luminance dimension of the visible-light image and of the infrared image to obtain contrast information and texture information of each image; computing a mask from the saturation and luminance of the visible-light image, the mask being used to fuse the visible-light image and the infrared image; computing enhanced contrast information for the visible-light image from the contrast information of the two images and the mask; computing enhanced texture information for the visible-light image from the texture information of the two images and the mask; performing the inverse reversible transform to obtain the luminance of the enhanced visible-light image; and obtaining the enhanced visible-light image by combining the enhanced luminance with the saturation and hue of the original visible-light image. The image enhancement method of the embodiments of the invention can automatically recover the information of high-dynamic-range scenes and achieve high-quality visible-light image enhancement in terms of both contrast and texture.

Description

Image enhancement method
Technical field
The present invention relates to the field of computer vision, and in particular to an image enhancement method.
Background art
Because of the limited dynamic range of visible-light cameras, images captured of scenes with strong light-dark contrast may contain regions that are overexposed and other regions that are underexposed. To obtain a high-dynamic-range (HDR) image, multiple images with different exposures are generally collected and then tone-mapped, that is, the HDR luminance map is converted into a corresponding low-dynamic-range image; since images with different exposure times are required, this approach is generally only applicable to static scenes. Because the original RAW format retains more of the scene's dynamic-range information than the commonly used JPEG format, another common method is to manually adjust images collected in RAW format; however, this method requires substantial manual work, the dynamic range captured in RAW format is still limited, and the full high-dynamic information of the real scene cannot be recovered.
Summary of the invention
The present invention aims to solve at least one of the technical problems described above.
To this end, an object of the present invention is to provide an image enhancement method that uses an infrared image to recover the real scene in terms of both contrast and texture.
The image enhancement method according to an embodiment of the invention comprises the steps of: A. collecting a visible-light image and an infrared image, wherein the optical centers of the visible-light image and the infrared image are aligned; B. performing a reversible transform on the luminance dimension of the visible-light image to obtain visible-light contrast information V_L and visible-light texture information V_D, and performing the reversible transform on the luminance dimension of the infrared image to obtain infrared contrast information N_L and infrared texture information N_D; C. computing a mask W from the saturation S and the luminance V of the visible-light image, the mask W being used to fuse the visible-light image and the infrared image; D. computing enhanced contrast information V'_L of the visible-light image from the visible-light contrast information V_L, the infrared contrast information N_L and the mask W; E. computing enhanced texture information V'_D of the visible-light image from the visible-light texture information V_D, the infrared texture information N_D and the mask W; F. performing the inverse reversible transform on the enhanced contrast information V'_L and the enhanced texture information V'_D to obtain the luminance V' of the enhanced visible-light image; and G. combining the luminance V' of the enhanced visible-light image with the saturation S and the hue H of the visible-light image to obtain the enhanced visible-light image.
The method of the present invention recovers the high dynamic range of the real scene from an infrared image in terms of both contrast and texture, thereby enhancing the visible-light image; the method can automatically recover the information of a high-dynamic-range scene without manual intervention. An advantage of the present invention is that, by combining computational photography with an infrared image, high-quality visible-light image enhancement is achieved in terms of both contrast and texture.
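For orientation only, the following is a drastically simplified, self-contained sketch that follows the A-G structure of the method: it substitutes a Gaussian low/high-frequency split for the wavelet transform, uses an unoptimized product mask built from W_s and W_v, and blends the frequency bands directly instead of performing the gradient matching and Poisson solve of step D. All function names, parameter values (alpha, beta, sigma) and the choice of OpenCV's HSV representation are illustrative assumptions, not part of the disclosure; per-step sketches appear in the detailed description below.

```python
import cv2
import numpy as np

def enhance_with_infrared(visible_bgr, infrared_gray, alpha=2.0, beta=4.0, sigma=5):
    """Simplified end-to-end illustration of steps A-G (not the disclosed method)."""
    # A. Inputs: an optical-centre-aligned visible-light image (BGR, uint8) and infrared image (uint8).
    hsv = cv2.cvtColor(visible_bgr, cv2.COLOR_BGR2HSV_FULL).astype(np.float32)
    hue, sat, val = hsv[..., 0], hsv[..., 1] / 255.0, hsv[..., 2] / 255.0
    nir = infrared_gray.astype(np.float32) / 255.0

    # B. Low/high-frequency split (Gaussian stand-in for the wavelet transform).
    v_low = cv2.GaussianBlur(val, (0, 0), sigma); v_high = val - v_low
    n_low = cv2.GaussianBlur(nir, (0, 0), sigma); n_high = nir - n_low

    # C. Fusion mask: small where saturation is very low or luminance is extreme.
    w = np.exp(-alpha * np.abs(sat - 1.0)) * np.exp(-beta * np.abs(val - 0.5))

    # D./E. Blend contrast and texture, borrowing from the infrared image where
    #       the mask is small (direct blend in place of gradient matching).
    fused_low = w * v_low + (1.0 - w) * n_low
    fused_high = w * v_high + (1.0 - w) * n_high

    # F. Recombine the two bands (inverse of the split used in B).
    v_enh = np.clip(fused_low + fused_high, 0.0, 1.0)

    # G. Reassemble with the original hue and saturation.
    hsv[..., 2] = v_enh * 255.0
    return cv2.cvtColor(hsv.astype(np.uint8), cv2.COLOR_HSV2BGR_FULL)
```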
In one embodiment of the invention, the reversible transform is a wavelet transform and the inverse reversible transform is an inverse wavelet transform.
In one embodiment of the invention, the mask W takes smaller values in regions of the visible-light image where the saturation S is too low or where the luminance V is too high or too low.
In one embodiment of the invention, step C comprises: C1. computing an initial mask $\bar{W}$ from the saturation S and the luminance V of the visible-light image; and C2. optimizing the initial mask $\bar{W}$ to obtain the mask W.
In one embodiment of the invention, in step C1 the initial mask $\bar{W}$ is computed from $W_s = e^{-\alpha|S-1|}$ and $W_v = e^{-\beta|V-0.5|}$, where α and β are positive coefficients.
In one embodiment of the invention, in step C2 the mask W is computed by minimizing
$E(W) = W^{T} L W + \lambda (W - \bar{W})^{T} (W - \bar{W})$ (1)
where L is a Laplacian matrix, λ is a regularization coefficient, and the (i, j)-th element of the matrix L is defined as
$\sum_{k \mid (i,j) \in w_k} \left( \delta_{ij} - \frac{1}{|w_k|} \left( 1 + (V_i - \mu_k)^{T} \left( \Sigma_k + \frac{\epsilon}{|w_k|} \right) (V_j - \mu_k) \right) \right)$
where V_i and V_j are the luminance values of the visible-light image at pixels i and j, δ_ij is the impulse (Kronecker delta) function, equal to 1 when i = j and 0 otherwise, μ_k is the mean of the image luminance values in window w_k, Σ_k is the variance of the image luminance values in window w_k, ε is a regularization parameter, and |w_k| is the number of elements in window w_k; differentiating equation (1) yields
$(L - \lambda U) W = \lambda \bar{W}$ (3)
and the final mask W is obtained by solving the linear equation (3).
In one embodiment of the invention, in step D the enhanced contrast information V'_L of the visible-light image is computed as V'_L = W·V_L + (1−W)·V_{L'}, where V_{L'} is a new luminance map obtained by matching the gradient magnitudes of the visible-light image and the infrared image.
In one embodiment of the invention, the new luminance map V_{L'} is computed as follows: let V_G denote the gradient-magnitude map of the visible-light image and N_G the gradient-magnitude map of the infrared image; the probability histograms of V_G and N_G are computed and fitted with Laplacian curves, the cumulative distribution (probability integral) functions of the two fitted curves are derived and matched, and the matching result V'_G gives the gradient of the new visible-light luminance dimension; from this gradient, the new luminance map V_{L'} is obtained by solving a Poisson equation.
In one embodiment of the invention, in step E the enhanced texture information V'_D of the visible-light image is computed as V'_D = W·V_D + (1−W)·N_D.
Additional aspects and advantages of the present invention will be set forth in part in the following description, will become apparent in part from the following description, or will be learned through practice of the present invention.
Description of drawings
The above and/or additional aspects and advantages of the present invention will become apparent and readily understood from the following description of the embodiments taken in conjunction with the accompanying drawings, in which:
Fig. 1 is a flowchart of an image enhancement method according to an embodiment of the present invention.
Detailed description of the embodiments
Embodiments of the present invention are described in detail below, and examples of the embodiments are shown in the accompanying drawings, in which identical or similar reference numerals throughout denote identical or similar elements or elements having identical or similar functions. The embodiments described below with reference to the drawings are exemplary, are intended only to explain the present invention, and are not to be construed as limiting the present invention. On the contrary, the embodiments of the present invention cover all changes, modifications and equivalents falling within the spirit and scope of the appended claims.
The image enhancement method of the present invention recovers the high dynamic range of the real scene from an infrared image in terms of both contrast and texture, thereby enhancing the original visible-light image; the method can automatically recover the information of a high-dynamic-range scene without manual intervention. An advantage of the present invention is that, by combining computational photography with an infrared image, high-quality visible-light image enhancement is achieved in terms of both contrast and texture.
Fig. 1 is a flowchart of an image enhancement method according to an embodiment of the present invention.
As shown in Fig. 1, the image enhancement method comprises the following steps.
Step S101: obtain an aligned visible-light image and infrared image.
Specifically, a hardware setup is used to align the optical centers of a visible-light camera and an infrared camera, for example by means of a beam splitter so that the two cameras photograph an identical scene; alternatively, an existing well-aligned camera is used to collect a visible-light image and an infrared image of the same scene.
Step S102: extract the contrast and texture information of the visible-light image and the infrared image.
Specifically, a reversible transform is applied to the luminance dimension of the visible-light image to obtain the low-frequency visible-light contrast information V_L and the high-frequency visible-light texture information V_D, and the same reversible transform is applied to the luminance dimension of the infrared image to obtain the low-frequency infrared contrast information N_L and the high-frequency infrared texture information N_D. In a preferred embodiment of the invention, the reversible transform is a wavelet transform.
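As an illustration of step S102, a minimal sketch using a single-level two-dimensional discrete wavelet transform from PyWavelets, under the assumption that the approximation band plays the role of the low-frequency contrast information and the detail bands the role of the high-frequency texture information; the disclosure does not name a wavelet family or decomposition depth, so the Haar wavelet below is an arbitrary choice.

```python
import numpy as np
import pywt

def decompose(luminance, wavelet="haar"):
    """Single-level 2-D DWT: approximation band = contrast info, detail bands = texture info."""
    approx, details = pywt.dwt2(luminance.astype(np.float64), wavelet)
    return approx, details  # details = (horizontal, vertical, diagonal) sub-bands

# Usage (V and N are the visible-light and infrared luminance maps in [0, 1]):
# V_L, V_D = decompose(V)
# N_L, N_D = decompose(N)
```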
Step S103: compute a mask W from the saturation S and the luminance V of the visible-light image; the mask W is used to fuse the visible-light image and the infrared image.
Specifically, the saturation S and the luminance V of the image take values between 0 and 1. In general, regions of the visible-light image where the saturation S is too low (S close to 0) or where the luminance V is too high (V close to 1) or too low (V close to 0) lack texture information, and their contrast is insufficiently captured. The initial mask $\bar{W}$ therefore assigns smaller weights to these regions; it can be obtained from $W_s = e^{-\alpha|S-1|}$ and $W_v = e^{-\beta|V-0.5|}$, where α and β are positive coefficients. It should be noted that the function used to compute the initial mask $\bar{W}$ is not unique; other functions may be used as long as they satisfy the condition that the initial mask $\bar{W}$ assigns smaller weights to regions of the visible-light image where the saturation S is too low or where the luminance V is too high or too low.
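A sketch of the initial mask computation. Because the combining formula for $\bar{W}$ appears only as a lost figure placeholder in the source, the product of W_s and W_v is assumed here; the default coefficient values are likewise illustrative.

```python
import numpy as np

def initial_mask(saturation, luminance, alpha=2.0, beta=4.0):
    """Initial mask: small where saturation is very low or luminance is extreme."""
    w_s = np.exp(-alpha * np.abs(saturation - 1.0))  # W_s = e^{-alpha*|S-1|}
    w_v = np.exp(-beta * np.abs(luminance - 0.5))    # W_v = e^{-beta*|V-0.5|}
    return w_s * w_v  # assumed combination: product of the two weights
```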
After the initial mask $\bar{W}$ has been obtained, it is further optimized to obtain a high-quality mask W. The optimization objective is
$E(W) = W^{T} L W + \lambda (W - \bar{W})^{T} (W - \bar{W})$ (1)
where L is a Laplacian matrix and λ is a regularization coefficient. The (i, j)-th element of the matrix L is defined as
$\sum_{k \mid (i,j) \in w_k} \left( \delta_{ij} - \frac{1}{|w_k|} \left( 1 + (V_i - \mu_k)^{T} \left( \Sigma_k + \frac{\epsilon}{|w_k|} \right) (V_j - \mu_k) \right) \right)$ (2)
where V_i and V_j are the image luminance values at pixels i and j, δ_ij is the impulse (Kronecker delta) function, equal to 1 when i = j and 0 otherwise, μ_k and Σ_k are respectively the mean and the variance of the image luminance values in window w_k, ε is a regularization parameter, and |w_k| is the number of elements in window w_k.
Differentiating equation (1) yields
$(L - \lambda U) W = \lambda \bar{W}$ (3)
where U denotes the identity matrix; the optimized high-quality mask W is then obtained by solving the linear equation (3).
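A sketch of the optimization of step C2 under two stated simplifications: a plain 4-neighbour grid Laplacian stands in for the matting-style Laplacian of equation (2), and the sparse system solved is (L + λI)W = λ$\bar{W}$, which is what setting the derivative of equation (1) to zero yields in the usual convention.

```python
import numpy as np
import scipy.sparse as sp
import scipy.sparse.linalg as spla

def grid_laplacian(h, w):
    """4-neighbour graph Laplacian on an h x w pixel grid (Neumann boundaries)."""
    def lap1d(n):
        main = 2.0 * np.ones(n)
        main[0] = main[-1] = 1.0
        off = -np.ones(n - 1)
        return sp.diags([off, main, off], [-1, 0, 1])
    return sp.kron(sp.identity(h), lap1d(w)) + sp.kron(lap1d(h), sp.identity(w))

def refine_mask(w_init, lam=0.05):
    """Solve (L + lam*I) W = lam * W_bar for the refined mask."""
    h, w = w_init.shape
    A = (grid_laplacian(h, w) + lam * sp.identity(h * w)).tocsc()
    b = lam * w_init.ravel()
    return spla.spsolve(A, b).reshape(h, w)
```

For full-resolution masks a direct sparse solve may be slow; an iterative solver such as scipy.sparse.linalg.cg is the more practical choice.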
Step S104: compute the enhanced contrast information V'_L of the visible-light image from the visible-light contrast information V_L, the infrared contrast information N_L and the mask W. This specifically comprises the following.
Experimental verification shows that the gradients of natural images and of near-infrared images both follow a Laplacian distribution. The present invention therefore adjusts the contrast of the visible-light image by matching the gradient magnitudes of the two images. Let V_G denote the gradient-magnitude map of the visible-light image and N_G the gradient-magnitude map of the infrared image. The probability histograms of V_G and N_G are computed and fitted with Laplacian curves, the cumulative distribution (probability integral) functions of the two fitted curves are derived and matched, and the matching result V'_G gives the gradient of the new visible-light luminance dimension. From this gradient, the new visible-light luminance map V_{L'} is obtained by solving a Poisson equation. Combining this with the mask W obtained in step S103, the final enhanced contrast information V'_L of the visible-light image after migration is
$V'_L = W \cdot V_L + (1 - W) \cdot V_{L'}$ (4)
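A sketch of the gradient migration in step S104, with three stated simplifications: empirical quantile matching of the gradient-magnitude distributions replaces the parametric Laplacian-curve fit, the matched magnitudes are applied along the original visible-light gradient directions (the source does not state how the new gradient field is oriented), and a short Jacobi iteration stands in for a proper Poisson solver.

```python
import numpy as np

def match_gradient_magnitudes(v, n, iters=400):
    """Rescale the visible gradients so their magnitude distribution matches the
    infrared one, then reintegrate the gradient field with a Jacobi Poisson solver."""
    gy_v, gx_v = np.gradient(v)
    gy_n, gx_n = np.gradient(n)
    mag_v = np.hypot(gx_v, gy_v)
    mag_n = np.hypot(gx_n, gy_n)

    # Empirical quantile (CDF) matching of the gradient magnitudes, in place of
    # fitting Laplacian curves and matching their probability integral functions.
    q = np.linspace(0.0, 1.0, 256)
    src = np.quantile(mag_v, q)
    dst = np.quantile(mag_n, q)
    matched = np.interp(mag_v, src, dst)

    # New gradient field: matched magnitude, original visible-light direction (assumption).
    scale = matched / (mag_v + 1e-8)
    gx, gy = gx_v * scale, gy_v * scale

    # Divergence of the new gradient field.
    div = np.gradient(gx, axis=1) + np.gradient(gy, axis=0)

    # Jacobi iterations for the Poisson equation lap(u) = div, warm-started at v.
    u = v.copy()
    for _ in range(iters):
        u[1:-1, 1:-1] = 0.25 * (u[:-2, 1:-1] + u[2:, 1:-1] +
                                u[1:-1, :-2] + u[1:-1, 2:] - div[1:-1, 1:-1])
    return u

# Usage within step S104 (equation (4)), given the contrast bands and the mask:
# v_l_new = match_gradient_magnitudes(v_l, n_l)       # V_{L'}
# v_l_enh = w_mask * v_l + (1.0 - w_mask) * v_l_new   # V'_L
```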
Step S105: compute the enhanced texture information V'_D of the visible-light image from the visible-light texture information V_D, the infrared texture information N_D and the mask W.
Using the mask W obtained in step S103, the texture information of the visible-light image and of the near-infrared image is fused, migrating the texture of the near-infrared image to finally obtain the enhanced texture information V'_D of the visible-light image. The migration formula is V'_D = W·V_D + (1−W)·N_D.
Step S106: combine the enhanced contrast information V'_L and the enhanced texture information V'_D of the visible-light image to obtain the high-quality enhanced luminance V' of the visible-light image.
Specifically, the inverse of the reversible transform used in step S102 is applied to the contrast-migrated enhanced contrast information V'_L and the texture-migrated enhanced texture information V'_D, yielding the luminance map V' of the enhanced visible-light image. In a preferred embodiment of the invention, the inverse reversible transform is an inverse wavelet transform.
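Continuing the wavelet sketch of step S102, the texture (detail) bands of step S105 can be blended with the mask and the enhanced luminance of step S106 recovered with the inverse transform. Blending each of the three detail sub-bands separately, and resizing the mask to the sub-band resolution, are assumptions not spelled out in the disclosure.

```python
import cv2
import numpy as np
import pywt

def fuse_and_reconstruct(v_l_enh, v_details, n_details, w_mask, wavelet="haar"):
    """Blend texture (detail) bands with the mask, then invert the wavelet transform."""
    # The mask is defined at full resolution; resize it to the detail-band resolution.
    w_small = cv2.resize(w_mask, (v_details[0].shape[1], v_details[0].shape[0]))
    fused_details = tuple(
        w_small * v_d + (1.0 - w_small) * n_d  # V'_D = W*V_D + (1-W)*N_D, per sub-band
        for v_d, n_d in zip(v_details, n_details)
    )
    # Step S106: the inverse DWT recombines enhanced contrast and texture into V'.
    return pywt.idwt2((v_l_enh, fused_details), wavelet)

# Usage, given the bands from step S102 and the outputs of steps S103-S104:
# v_enh = fuse_and_reconstruct(v_l_enh, V_D, N_D, w_mask)
```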
Step S107: combine the enhanced luminance V' of the visible-light image with the saturation S and the hue H of the original visible-light image to obtain the final visible-light image enhanced by means of the infrared image.
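A sketch of step S107, assuming the hue/saturation/value representation is what is meant by hue H, saturation S and luminance; OpenCV's 8-bit HSV encoding is used here purely for illustration.

```python
import cv2
import numpy as np

def recombine(hue, sat, v_enh):
    """Merge the enhanced luminance with the original hue and saturation (step S107)."""
    hsv = cv2.merge([
        hue.astype(np.float32),                          # H of the original visible image (0..255)
        np.clip(sat, 0, 1).astype(np.float32) * 255.0,   # S of the original visible image
        np.clip(v_enh, 0, 1).astype(np.float32) * 255.0, # enhanced luminance V'
    ])
    return cv2.cvtColor(hsv.astype(np.uint8), cv2.COLOR_HSV2BGR_FULL)
```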
In the description of this specification, references to the terms "one embodiment", "some embodiments", "an example", "a specific example" or "some examples" mean that a specific feature, structure, material or characteristic described in connection with the embodiment or example is included in at least one embodiment or example of the present invention. In this specification, schematic references to the above terms do not necessarily refer to the same embodiment or example. Furthermore, the specific features, structures, materials or characteristics described may be combined in any suitable manner in any one or more embodiments or examples.
Although embodiments of the present invention have been shown and described, it will be understood by those of ordinary skill in the art that various changes, modifications, substitutions and alterations can be made to these embodiments without departing from the principle and spirit of the present invention, the scope of which is defined by the appended claims and their equivalents.

Claims (9)

1. An image enhancement method, characterized by comprising the steps of:
A. collecting a visible-light image and an infrared image, wherein the optical centers of said visible-light image and said infrared image are aligned;
B. performing a reversible transform on the luminance dimension of said visible-light image to obtain visible-light contrast information V_L and visible-light texture information V_D, and performing the reversible transform on the luminance dimension of said infrared image to obtain infrared contrast information N_L and infrared texture information N_D;
C. computing a mask W from the saturation S of said visible-light image and the luminance V of said visible-light image, said mask W being used to fuse said visible-light image and said infrared image;
D. computing enhanced contrast information V'_L of the visible-light image from said visible-light contrast information V_L, said infrared contrast information N_L and said mask W;
E. computing enhanced texture information V'_D of the visible-light image from said visible-light texture information V_D, said infrared texture information N_D and said mask W;
F. performing an inverse reversible transform on said enhanced contrast information V'_L and said enhanced texture information V'_D to obtain the luminance V' of the enhanced visible-light image; and
G. combining the luminance V' of said enhanced visible-light image with the saturation S of said visible-light image and the hue H of said visible-light image to obtain the enhanced visible-light image.
2. The image enhancement method according to claim 1, characterized in that said reversible transform is a wavelet transform and said inverse reversible transform is an inverse wavelet transform.
3. The image enhancement method according to claim 1, characterized in that said mask W takes smaller values in regions of said visible-light image where said saturation S is too low or said luminance V is too high or too low.
4. The image enhancement method according to claim 3, characterized in that said step C comprises:
C1. computing an initial mask $\bar{W}$ from the saturation S of said visible-light image and the luminance V of said visible-light image; and
C2. optimizing said initial mask $\bar{W}$ to obtain the mask W.
5. The image enhancement method according to claim 4, characterized in that in step C1 the initial mask $\bar{W}$ is computed from $W_s = e^{-\alpha|S-1|}$ and $W_v = e^{-\beta|V-0.5|}$, where α and β are positive coefficients.
6. The image enhancement method according to claim 4, characterized in that in step C2 the mask W is computed by minimizing
$E(W) = W^{T} L W + \lambda (W - \bar{W})^{T} (W - \bar{W})$
where L is a Laplacian matrix, λ is a regularization coefficient, and the (i, j)-th element of the matrix L is defined as
$\sum_{k \mid (i,j) \in w_k} \left( \delta_{ij} - \frac{1}{|w_k|} \left( 1 + (V_i - \mu_k)^{T} \left( \Sigma_k + \frac{\epsilon}{|w_k|} \right) (V_j - \mu_k) \right) \right)$
where V_i and V_j are the luminance values of said visible-light image at pixels i and j, δ_ij is the impulse (Kronecker delta) function, equal to 1 when i = j and 0 otherwise, μ_k is the mean of the image luminance values in window w_k, Σ_k is the variance of the image luminance values in window w_k, ε is a regularization parameter, and |w_k| is the number of elements in window w_k;
differentiating the above formula for said mask W yields the linear equation
$(L - \lambda U) W = \lambda \bar{W}$
where U is the identity matrix, and the final said mask W is obtained by solving said linear equation.
7. The image enhancement method according to claim 1, characterized in that in said step D the enhanced contrast information V'_L of the visible-light image is computed as V'_L = W·V_L + (1−W)·V_{L'}, where V_{L'} is a new luminance map obtained by matching the gradient magnitudes of said visible-light image and said infrared image.
8. The image enhancement method according to claim 7, characterized in that said new luminance map V_{L'} is computed as follows: let V_G denote the gradient-magnitude map of said visible-light image and N_G the gradient-magnitude map of said infrared image; the probability histograms of V_G and N_G are computed and fitted with Laplacian curves, the cumulative distribution (probability integral) functions of the two fitted curves are derived and matched, and the matching result V'_G gives the gradient of the new visible-light luminance dimension; from this gradient, said new luminance map V_{L'} is obtained by solving a Poisson equation.
9. The image enhancement method according to claim 1, characterized in that in said step E the enhanced texture information V'_D of the visible-light image is computed as V'_D = W·V_D + (1−W)·N_D.
CN201210157662.0A 2012-05-18 2012-05-18 Image enhancement method Active CN102722864B (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN201210157662.0A CN102722864B (en) 2012-05-18 2012-05-18 Image enhancement method

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
CN201210157662.0A CN102722864B (en) 2012-05-18 2012-05-18 Image enhancement method

Publications (2)

Publication Number Publication Date
CN102722864A true CN102722864A (en) 2012-10-10
CN102722864B CN102722864B (en) 2014-11-26

Family

ID=46948611

Family Applications (1)

Application Number Title Priority Date Filing Date
CN201210157662.0A Active CN102722864B (en) 2012-05-18 2012-05-18 Image enhancement method

Country Status (1)

Country Link
CN (1) CN102722864B (en)

Cited By (10)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN103400128A (en) * 2013-08-09 2013-11-20 深圳市捷顺科技实业股份有限公司 Image processing method and device
CN103973990A (en) * 2014-05-05 2014-08-06 浙江宇视科技有限公司 Wide dynamic fusion method and device
CN105612740A (en) * 2014-09-16 2016-05-25 华为技术有限公司 Image processing method and device
WO2017059774A1 (en) * 2015-10-09 2017-04-13 Zhejiang Dahua Technology Co., Ltd. Methods and systems for fusion display of thermal infrared and visible image
CN106600554A (en) * 2016-12-15 2017-04-26 天津津航技术物理研究所 Infrared image preprocessing method for on-board night vision pedestrian detection
CN106846288A (en) * 2017-01-17 2017-06-13 中北大学 A kind of many algorithm fusion methods of bimodal infrared image difference characteristic Index
WO2017140182A1 (en) * 2016-02-15 2017-08-24 努比亚技术有限公司 Image synthesis method and apparatus, and storage medium
CN108737741A (en) * 2017-12-21 2018-11-02 西安工业大学 A kind of auto Anti-Blooming system of night Computer Vision
CN112132910A (en) * 2020-09-27 2020-12-25 上海科技大学 Infrared-based matte system suitable for low-light environment and containing semi-transparent information
WO2023123927A1 (en) * 2021-12-30 2023-07-06 上海闻泰信息技术有限公司 Image enhancement method and apparatus, and device and storage medium

Citations (3)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN1545064A (en) * 2003-11-27 2004-11-10 上海交通大学 Infrared and visible light image merging method
CN101853492A (en) * 2010-05-05 2010-10-06 浙江理工大学 Method for fusing night-viewing twilight image and infrared image
US20100290703A1 (en) * 2009-05-14 2010-11-18 National University Of Singapore Enhancing Photograph Visual Quality Using Texture and Contrast Data From Near Infra-red Images

Patent Citations (3)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN1545064A (en) * 2003-11-27 2004-11-10 上海交通大学 Infrared and visible light image merging method
US20100290703A1 (en) * 2009-05-14 2010-11-18 National University Of Singapore Enhancing Photograph Visual Quality Using Texture and Contrast Data From Near Infra-red Images
CN101853492A (en) * 2010-05-05 2010-10-06 浙江理工大学 Method for fusing night-viewing twilight image and infrared image

Non-Patent Citations (2)

* Cited by examiner, † Cited by third party
Title
XIAOPENG ZHANG, ET AL.: "Enhancing Photographs with Near Infrared Images", 《IEEE CONFERENCE ON COMPUTER VISION AND PATTERN RECOGNITION CVPR 2008》, 28 June 2008 (2008-06-28) *
WANG SHUANG ET AL.: "Dual-tree complex wavelet image fusion algorithm based on directional contrast and local variance", CHINA STEREOLOGY AND IMAGE ANALYSIS, vol. 14, no. 2, 30 June 2009 (2009-06-30), pages 133-137 *

Cited By (18)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN103400128B (en) * 2013-08-09 2016-12-28 深圳市捷顺科技实业股份有限公司 A kind of image processing method and device
CN103400128A (en) * 2013-08-09 2013-11-20 深圳市捷顺科技实业股份有限公司 Image processing method and device
CN103973990A (en) * 2014-05-05 2014-08-06 浙江宇视科技有限公司 Wide dynamic fusion method and device
CN103973990B (en) * 2014-05-05 2018-12-07 浙江宇视科技有限公司 wide dynamic fusion method and device
US10560644B2 (en) 2014-09-16 2020-02-11 Huawei Technologies Co., Ltd. Image processing method and apparatus
CN105612740A (en) * 2014-09-16 2016-05-25 华为技术有限公司 Image processing method and device
WO2017059774A1 (en) * 2015-10-09 2017-04-13 Zhejiang Dahua Technology Co., Ltd. Methods and systems for fusion display of thermal infrared and visible image
US11354827B2 (en) 2015-10-09 2022-06-07 Zhejiang Dahua Technology Co., Ltd. Methods and systems for fusion display of thermal infrared and visible image
US10719958B2 (en) 2015-10-09 2020-07-21 Zhejiang Dahua Technology Co., Ltd. Methods and systems for fusion display of thermal infrared and visible image
WO2017140182A1 (en) * 2016-02-15 2017-08-24 努比亚技术有限公司 Image synthesis method and apparatus, and storage medium
CN106600554B (en) * 2016-12-15 2020-06-19 天津津航技术物理研究所 Infrared image preprocessing method for vehicle-mounted night vision pedestrian detection
CN106600554A (en) * 2016-12-15 2017-04-26 天津津航技术物理研究所 Infrared image preprocessing method for on-board night vision pedestrian detection
CN106846288B (en) * 2017-01-17 2019-09-06 中北大学 A kind of more algorithm fusion methods of bimodal infrared image difference characteristic Index
CN106846288A (en) * 2017-01-17 2017-06-13 中北大学 A kind of many algorithm fusion methods of bimodal infrared image difference characteristic Index
CN108737741A (en) * 2017-12-21 2018-11-02 西安工业大学 A kind of auto Anti-Blooming system of night Computer Vision
CN112132910A (en) * 2020-09-27 2020-12-25 上海科技大学 Infrared-based matte system suitable for low-light environment and containing semi-transparent information
CN112132910B (en) * 2020-09-27 2023-09-26 上海科技大学 Infrared-based image matting system containing semitransparent information and suitable for low-light environment
WO2023123927A1 (en) * 2021-12-30 2023-07-06 上海闻泰信息技术有限公司 Image enhancement method and apparatus, and device and storage medium

Also Published As

Publication number Publication date
CN102722864B (en) 2014-11-26

Similar Documents

Publication Publication Date Title
CN102722864B (en) Image enhancement method
CN108765336B (en) Image defogging method based on dark and bright primary color prior and adaptive parameter optimization
CN103761710B (en) The blind deblurring method of efficient image based on edge self-adaption
Li et al. Fast multi-exposure image fusion with median filter and recursive filter
CN102831592B (en) Based on the image nonlinearity enhancement method of histogram subsection transformation
CN102096909B (en) Improved unsharp masking image reinforcing method based on logarithm image processing model
CN102231791B (en) Video image defogging method based on image brightness stratification
CN105046658B (en) A kind of low-light (level) image processing method and device
CN107292830B (en) Low-illumination image enhancement and evaluation method
CN111161360B (en) Image defogging method of end-to-end network based on Retinex theory
CN103886565B (en) Nighttime color image enhancement method based on purpose optimization and histogram equalization
CN104700376A (en) Gamma correction and smoothing filtering based image histogram equalization enhancing method
CN103020920A (en) Method for enhancing low-illumination images
CN103325098A (en) High dynamic infrared image enhancement method based on multi-scale processing
CN103996178A (en) Sand and dust weather color image enhancing method
CN102222323A (en) Histogram statistic extension and gradient filtering-based method for enhancing infrared image details
CN105354801B (en) A kind of image enchancing method based on HSV color space
CN110807742B (en) Low-light-level image enhancement method based on integrated network
CN103607589B (en) JND threshold value computational methods based on hierarchy selection visual attention mechanism
CN102831586A (en) Method for enhancing image/video in real time under poor lighting condition
CN102800054A (en) Image blind deblurring method based on sparsity metric
CN103106644A (en) Self-adaptation image quality enhancing method capable of overcoming non-uniform illumination of colored image
CN110009574B (en) Method for reversely generating high dynamic range image from low dynamic range image
CN104299213A (en) Method for synthesizing high-dynamic image based on detail features of low-dynamic images
CN106296626B (en) A kind of night video enhancement method based on gradient fusion

Legal Events

Date Code Title Description
C06 Publication
PB01 Publication
C10 Entry into substantive examination
SE01 Entry into force of request for substantive examination
C14 Grant of patent or utility model
GR01 Patent grant