CN107481214A - A kind of twilight image and infrared image fusion method - Google Patents

A kind of twilight image and infrared image fusion method

Info

Publication number
CN107481214A
CN107481214A (application CN201710754190.XA)
Authority
CN
China
Prior art keywords
image
low
infrared
light
level
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Pending
Application number
CN201710754190.XA
Other languages
Chinese (zh)
Inventor
冯华
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Beijing Huayi New Technology Co Ltd
Original Assignee
Beijing Huayi New Technology Co Ltd
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Beijing Huayi New Technology Co Ltd filed Critical Beijing Huayi New Technology Co Ltd
Priority to CN201710754190.XA
Publication of CN107481214A
Legal status: Pending (Current)

Links

Classifications

    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06T IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T5/00 Image enhancement or restoration
    • G06T5/50 Image enhancement or restoration using two or more images, e.g. averaging or subtraction
    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06T IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T5/00 Image enhancement or restoration
    • G06T5/70 Denoising; Smoothing
    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06T IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T7/00 Image analysis
    • G06T7/10 Segmentation; Edge detection
    • G06T7/12 Edge-based segmentation
    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06T IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T7/00 Image analysis
    • G06T7/30 Determination of transform parameters for the alignment of images, i.e. image registration
    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06T IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T2207/00 Indexing scheme for image analysis or image enhancement
    • G06T2207/10 Image acquisition modality
    • G06T2207/10004 Still image; Photographic image
    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06T IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T2207/00 Indexing scheme for image analysis or image enhancement
    • G06T2207/10 Image acquisition modality
    • G06T2207/10048 Infrared image
    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06T IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T2207/00 Indexing scheme for image analysis or image enhancement
    • G06T2207/20 Special algorithmic details
    • G06T2207/20212 Image combination
    • G06T2207/20221 Image fusion; Image merging

Landscapes

  • Engineering & Computer Science (AREA)
  • Physics & Mathematics (AREA)
  • General Physics & Mathematics (AREA)
  • Theoretical Computer Science (AREA)
  • Computer Vision & Pattern Recognition (AREA)
  • Image Processing (AREA)
  • Closed-Circuit Television Systems (AREA)

Abstract

The invention discloses a method for fusing a low-light-level image and an infrared image, comprising the following steps. S1: for the same target scene, acquire a low-light-level image and an infrared image respectively; S2: perform denoising on the acquired low-light-level image and infrared image respectively; S3: accurately register the denoised low-light-level image and infrared image using an image registration method based on edge features, obtaining the registered low-light-level image and infrared image; S4: perform image fusion on the registered images; S5: perform image enhancement on the fused image to obtain the fused target image. The invention can fuse a low-light-level image and an infrared image, and the fused image has the advantages of a clear low-light-level background and a prominent infrared target.

Description

Low-light-level image and infrared image fusion method
Technical Field
The invention relates to the technical field of image fusion, in particular to a method for fusing a low-light-level image and an infrared image.
Background
Night vision equipment mainly falls into two categories, low-light-level and infrared, each with its own strengths and weaknesses. Low-light-level imaging has the advantage that its visual appearance is close to that of visible light, so objects with different visible-light reflectivities can be distinguished well; its drawbacks are that it is insensitive to temperature and has difficulty detecting concealed targets. Infrared imaging has the advantages of being sensitive to temperature, detecting targets with a temperature difference well, offering a long detection range, and penetrating smoke; its drawback is that targets are difficult to distinguish in scenes where temperatures are similar.
Therefore, fusing and displaying the images from the two sensors so that their information complements each other and observers can see better at night is a problem that currently needs to be solved.
Disclosure of Invention
Aiming at the technical problems in the related art, the invention provides a method for fusing a low-light-level image and an infrared image, which can overcome the defects in the prior art.
In order to achieve the technical purpose, the technical scheme of the invention is realized as follows:
a low-light-level image and infrared image fusion method comprises the following steps:
s1: respectively acquiring low-light images and infrared images aiming at the same target scene;
s2: respectively carrying out denoising processing on the collected low-light-level image and the collected infrared image, wherein the denoising processing comprises the following steps:
s2.1: inputting an image X to be denoised and carrying out noise estimation on it to obtain the noise level σ of the image;
s2.2: setting a noise compression multiple K for the image X to be denoised and obtaining a target weight W_t;
s2.3: according to the noise level σ, calculating the similarity weight W_Y between each pixel X_X in the image X and each pixel X_Y in its neighborhood;
s2.4: according to the similarity weights W_Y and the target weight W_t, normalizing the weights of the pixel X_X to the target weight W_t to obtain the weighted average X_X', which is the noise-reduced pixel value of the pixel X_X;
s2.5: traversing all pixel points of the image X to be denoised to obtain a denoised image;
s3: carrying out effective and accurate registration on the denoised low-light-level image and the infrared image by using an image registration method based on edge features to obtain the registered low-light-level image and the registered infrared image;
s4: carrying out image fusion on the registered images;
s5: and carrying out image enhancement processing on the fused image to obtain a fused target image.
Further, in step S3, the image registration step is as follows:
let the reference image be I1(x, y) and the image to be registered be I2(x, y), both of size P × Q pixels, with gray histograms H1(n) and H2(n) respectively; for each gray level n = i, H1(i) and H2(i) respectively denote the number of pixels whose gray value is i;
s3.1: performing edge extraction on a source image by using a wavelet transform method to obtain an edge image;
s3.2: obtainingAnd
for I1For each gray level n of (x, y), calculate I2(x, y) relative to I1(x, y) Gray value of a set of corresponding pixels having a Gray value of nSum variance
For I1For each gray level n of (x, y), calculate I1(x, y) relative to I2(x, y) Gray value of a set of corresponding pixels having a Gray value of nSum variance
S3.3: calculation of I1(x, y) and I2(x, y) ofExpected varianceAnd
s3.4: obtaining the interaction variance CI and the reciprocal AM of the two images;
s3.5: repeatedly calculating the AM of the two images by using an affine transformation model under the condition of different parameters;
s3.6: searching out a parameter corresponding to the maximum AM, wherein the parameter is a required registration parameter;
s3.7: and correcting the image to be registered according to the obtained registration parameters and the affine transformation model to obtain a registered image.
Further, in step S4, the image fusion step is as follows:
s4.1: performing a dual-tree complex wavelet transform on the registered infrared image and low-light-level image to obtain low-frequency coefficients and high-frequency coefficients;
s4.2: filtering the low-frequency coefficient and the high-frequency coefficient;
s4.3: respectively fusing the low-frequency coefficient and the high-frequency coefficient;
s4.4: and performing an inverse dual-tree complex wavelet transform on the fused low-frequency coefficient group and high-frequency coefficient group to obtain the fused image.
Preferably, in step S4.3, adaptive weighting fusion is used for low-frequency coefficients, and compressive sampling-pulse coupled neural network fusion is used for high-frequency coefficients.
Further, in step S5, the image enhancement step is as follows:
s5.1: enhancing the color saturation value of each pixel of the fused image;
s5.2: correcting the color saturation value of each pixel of the image according to the brightness value of each pixel and a preset brightness threshold, so as to avoid color cast of the image;
s5.3: and converting the corrected hue value, color saturation value and brightness value of each pixel into the red, blue and green gray values of that pixel, so as to facilitate enhanced image display.
The invention has the beneficial effects that: the invention can fuse the low-light-level image and the infrared image, and the fused image has the advantages of clear low-light-level background and prominent infrared target.
Drawings
In order to more clearly illustrate the embodiments of the present invention or the technical solutions in the prior art, the drawings needed in the embodiments are briefly described below. It is apparent that the drawings in the following description show only some embodiments of the present invention, and that other drawings can be obtained from them by those of ordinary skill in the art without creative effort.
Fig. 1 is a flowchart of a method for fusing a low-light-level image and an infrared image according to an embodiment of the invention.
Detailed Description
The technical solutions in the embodiments of the present invention will be clearly and completely described below with reference to the drawings in the embodiments of the present invention. It is apparent that the described embodiments are only a part of the embodiments of the present invention, and not all of them. All other embodiments that can be derived by one of ordinary skill in the art from the embodiments given herein are intended to fall within the scope of the present invention.
As shown in fig. 1, a method for fusing a low-light-level image and an infrared image according to an embodiment of the present invention includes the following steps:
s1: respectively acquiring low-light images and infrared images aiming at the same target scene;
s2: respectively carrying out denoising processing on the collected low-light-level image and the collected infrared image, wherein the denoising processing comprises the following steps:
s2.1: inputting an image X to be denoised and carrying out noise estimation on it to obtain the noise level σ of the image;
s2.2: setting a noise compression multiple K for the image X to be denoised and obtaining a target weight W_t;
s2.3: according to the noise level σ, calculating the similarity weight W_Y between each pixel X_X in the image X and each pixel X_Y in its neighborhood;
s2.4: according to the similarity weights W_Y and the target weight W_t, normalizing the weights of the pixel X_X to the target weight W_t to obtain the weighted average X_X', which is the noise-reduced pixel value of the pixel X_X;
s2.5: traversing all pixel points of the image X to be denoised to obtain the denoised image (an illustrative code sketch of this procedure is given after step S5 below);
s3: carrying out effective and accurate registration on the denoised low-light-level image and the infrared image by using an image registration method based on edge features to obtain the registered low-light-level image and the registered infrared image;
s4: carrying out image fusion on the registered images;
s5: and carrying out image enhancement processing on the fused image to obtain a fused target image.
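The patent does not reproduce the weighting formulas for steps S2.1–S2.5, so the following Python sketch is only one plausible reading of them: the noise level σ is estimated from the median absolute deviation of a high-frequency residual, the compression multiple K scales σ into a filtering strength, Gaussian similarity weights are computed between each pixel and its neighbours, and the weights are normalized before averaging. The function names, the noise estimator, and the exact weight formula are assumptions for illustration, not the patent's definitions.

```python
import numpy as np
from scipy.ndimage import median_filter

def estimate_noise_sigma(img):
    """Rough noise-level estimate (S2.1): median absolute deviation of the
    residual against a median-filtered copy (an assumed estimator)."""
    img = img.astype(np.float64)
    residual = img - median_filter(img, size=3)
    return 1.4826 * np.median(np.abs(residual))

def neighborhood_weighted_denoise(img, K=2.0, radius=2):
    """Sketch of S2.2-S2.5: weight each neighbour X_Y of a pixel X_X by its
    similarity, normalize the weights, and take the weighted average."""
    img = img.astype(np.float64)
    sigma = max(estimate_noise_sigma(img), 1e-6)   # S2.1
    h = K * sigma                                  # S2.2: compression multiple -> filtering strength
    H, W = img.shape
    pad = np.pad(img, radius, mode="reflect")
    acc = np.zeros_like(img)
    wsum = np.zeros_like(img)
    for dy in range(-radius, radius + 1):          # S2.3: similarity weights W_Y
        for dx in range(-radius, radius + 1):
            shifted = pad[radius + dy: radius + dy + H, radius + dx: radius + dx + W]
            w = np.exp(-((shifted - img) ** 2) / (2.0 * h * h))
            acc += w * shifted
            wsum += w
    return acc / wsum                              # S2.4-S2.5: normalized weighted average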
In a specific embodiment, in step S3, the image registration step is as follows:
let the reference image be I1(x, y) and the image to be registered be I2(x, y), both of size P × Q pixels, with gray histograms H1(n) and H2(n) respectively; for each gray level n = i, H1(i) and H2(i) respectively denote the number of pixels whose gray value is i;
s3.1: performing edge extraction on a source image by using a wavelet transform method to obtain an edge image;
s3.2: obtaining the conditional mean gray values and variances:
for each gray level n of I1(x, y), calculating the mean gray value and variance of the set of pixels in I2(x, y) corresponding to the pixels of I1(x, y) whose gray value is n;
for each gray level n of I1(x, y), calculating, relative to I2(x, y), the mean gray value and variance of the corresponding set of pixels in I1(x, y) whose gray value is n;
s3.3: calculating the expected variances of I1(x, y) and I2(x, y);
s3.4: obtaining the interaction variance CI and the reciprocal AM of the two images from these statistics together with the variances of the images I1(x, y) and I2(x, y);
s3.5: repeatedly calculating the AM of the two images by using an affine transformation model under the condition of different parameters;
s3.6: searching out a parameter corresponding to the maximum AM, wherein the parameter is a required registration parameter;
s3.7: and correcting the image to be registered according to the obtained registration parameters and the affine transformation model to obtain a registered image.
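The formulas for the conditional statistics, the interaction variance CI, and the measure AM are not reproduced in this text, so the sketch below only mirrors the structure of steps S3.2–S3.7: it computes histogram-conditioned variances in both directions, combines them into a symmetric score whose reciprocal plays the role of AM, and searches a small grid of transformation parameters for the maximum. The specific CI/AM expressions, the translation-only search (instead of a full affine model), and the function names are assumptions.

```python
import numpy as np

def conditional_variance(ref, mov, levels=256):
    """S3.2-S3.3 sketch: for each gray level n of the reference image, take the
    variance of the co-located pixels of the other image, then average these
    variances weighted by the histogram count H(n)."""
    ref = ref.astype(np.int32).ravel()
    mov = mov.astype(np.float64).ravel()
    total = 0.0
    for n in range(levels):
        sel = mov[ref == n]
        if sel.size > 1:
            total += sel.var() * sel.size
    return total / ref.size

def similarity_AM(i1, i2):
    """S3.4 sketch: combine the two conditional variances into a single score
    (larger = more similar). The reciprocal form is an assumed stand-in for AM."""
    ci = conditional_variance(i1, i2) + conditional_variance(i2, i1)
    return 1.0 / (ci + 1e-9)

def register_by_search(i1, i2, tx_range, ty_range):
    """S3.5-S3.7 sketch: evaluate AM over a grid of translations (a reduced
    affine model) and keep the parameters giving the maximum AM.
    np.roll wraps around at the borders, which is acceptable for a sketch."""
    best, best_p = -np.inf, (0, 0)
    for tx in tx_range:
        for ty in ty_range:
            shifted = np.roll(np.roll(i2, ty, axis=0), tx, axis=1)
            score = similarity_AM(i1, shifted)
            if score > best:
                best, best_p = score, (tx, ty)
    return best_p

# Example usage: register_by_search(i1, i2, range(-10, 11), range(-10, 11))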
In one embodiment, in step S4, the image fusion step is as follows:
s4.1: performing a dual-tree complex wavelet transform on the registered infrared image and low-light-level image to obtain low-frequency coefficients and high-frequency coefficients;
s4.2: filtering the low-frequency coefficient and the high-frequency coefficient;
s4.3: respectively fusing the low-frequency coefficient and the high-frequency coefficient;
s4.4: and performing an inverse dual-tree complex wavelet transform on the fused low-frequency coefficient group and high-frequency coefficient group to obtain the fused image.
In a specific embodiment, in step S4.3, adaptive weighted fusion is used for low-frequency coefficients, and compressive sampling-pulse coupled neural network fusion is used for high-frequency coefficients.
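A compact sketch of steps S4.1–S4.4 is given below. It substitutes PyWavelets' ordinary discrete wavelet transform for the dual-tree complex wavelet transform, a local-energy weighting for the adaptive weighted low-frequency rule, and an absolute-maximum selection for the compressive sampling-pulse coupled neural network high-frequency rule; the coefficient filtering of step S4.2 is omitted. These substitutions and the parameter choices are assumptions made to keep the example short.

```python
import numpy as np
import pywt

def fuse_dwt(ir, ll, wavelet="db4", levels=3):
    """Sketch of S4.1-S4.4 with a standard DWT standing in for the
    dual-tree complex wavelet transform."""
    c_ir = pywt.wavedec2(ir.astype(np.float64), wavelet, level=levels)   # S4.1
    c_ll = pywt.wavedec2(ll.astype(np.float64), wavelet, level=levels)

    # S4.3: low-frequency coefficients -> energy-weighted average
    a_ir, a_ll = c_ir[0], c_ll[0]
    e_ir, e_ll = np.mean(a_ir ** 2), np.mean(a_ll ** 2)
    w = e_ir / (e_ir + e_ll + 1e-12)
    fused = [w * a_ir + (1.0 - w) * a_ll]

    # S4.3: high-frequency coefficients -> keep the coefficient with larger magnitude
    for bands_ir, bands_ll in zip(c_ir[1:], c_ll[1:]):
        fused.append(tuple(np.where(np.abs(x) >= np.abs(y), x, y)
                           for x, y in zip(bands_ir, bands_ll)))

    return pywt.waverec2(fused, wavelet)                                  # S4.4: inverse transform
```

The wavelet ('db4') and the decomposition depth are illustrative; the shift invariance and directional selectivity that motivate the dual-tree complex wavelet transform are not reproduced by this simplification.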
In one embodiment, in step S5, the image enhancement step is as follows:
s5.1: enhancing the color saturation value of each pixel of the fused image;
s5.2: correcting the color saturation value of each pixel of the image according to the brightness value of each pixel and a preset brightness threshold, so as to avoid color cast of the image;
s5.3: and converting the corrected hue value, color saturation value and brightness value of each pixel into the red, blue and green gray values of that pixel, so as to facilitate enhanced image display.
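Steps S5.1–S5.3 can be illustrated with OpenCV's HSV conversion: boost saturation, scale saturation back down where brightness falls below a threshold to limit color cast, and convert back to R/G/B values for display. The gain, threshold, and scaling factor below are illustrative assumptions, not values given by the patent.

```python
import cv2
import numpy as np

def enhance_fused(bgr, sat_gain=1.3, v_threshold=40, dark_sat_scale=0.6):
    """Sketch of S5.1-S5.3: saturation boost, brightness-conditioned saturation
    correction, and HSV -> R/G/B conversion (OpenCV BGR channel order)."""
    hsv = cv2.cvtColor(bgr, cv2.COLOR_BGR2HSV).astype(np.float32)
    h, s, v = cv2.split(hsv)

    s = s * sat_gain                           # S5.1: enhance saturation
    dark = v < v_threshold                     # S5.2: pull saturation back in dark pixels
    s[dark] = s[dark] * dark_sat_scale         #        to avoid color cast
    s = np.clip(s, 0, 255)

    hsv = cv2.merge([h, s, v]).astype(np.uint8)
    return cv2.cvtColor(hsv, cv2.COLOR_HSV2BGR)   # S5.3: back to R/G/B gray values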
In order to facilitate understanding of the above-described technical aspects of the present invention, the above-described technical aspects of the present invention will be described in detail below in terms of specific usage.
When the method is used specifically, the method for fusing the low-light-level image and the infrared image comprises the following steps: respectively acquiring low-light images and infrared images aiming at the same target scene; respectively denoising the collected low-light-level image and the collected infrared image; carrying out effective and accurate registration on the denoised low-light-level image and the infrared image by using an image registration method based on edge features to obtain the registered low-light-level image and the registered infrared image; carrying out image fusion on the registered images; and carrying out image enhancement processing on the fused image to obtain a fused target image. The invention can fuse the low-light-level image and the infrared image, and the fused image has the advantages of clear low-light-level background and prominent infrared target.
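For orientation, the sketch below chains the five steps end to end using off-the-shelf OpenCV routines as stand-ins for the patent's specific algorithms: fast non-local means for the denoising of step S2, ECC-based affine estimation for the registration of step S3, a simple weighted blend for the fusion of step S4, and a contrast stretch for the enhancement of step S5. File names and all parameter values are placeholders.

```python
import cv2
import numpy as np

# Placeholder file names: a low-light-level frame and an infrared frame of the same scene.
lowlight = cv2.imread("lowlight.png", cv2.IMREAD_GRAYSCALE)
infrared = cv2.imread("infrared.png", cv2.IMREAD_GRAYSCALE)

# S2: denoising (OpenCV fast non-local means as a stand-in for the patent's weighting scheme).
lowlight_d = cv2.fastNlMeansDenoising(lowlight, None, 10)
infrared_d = cv2.fastNlMeansDenoising(infrared, None, 10)

# S3: affine registration (ECC maximization as a stand-in for the edge-feature / AM search;
# some OpenCV builds also expect inputMask and gaussFiltSize arguments here).
warp = np.eye(2, 3, dtype=np.float32)
criteria = (cv2.TERM_CRITERIA_EPS | cv2.TERM_CRITERIA_COUNT, 200, 1e-6)
_, warp = cv2.findTransformECC(lowlight_d, infrared_d, warp, cv2.MOTION_AFFINE, criteria)
infrared_r = cv2.warpAffine(infrared_d, warp,
                            (lowlight_d.shape[1], lowlight_d.shape[0]),
                            flags=cv2.INTER_LINEAR | cv2.WARP_INVERSE_MAP)

# S4: fusion (weighted blend as a stand-in for the dual-tree complex wavelet scheme).
fused = cv2.addWeighted(lowlight_d, 0.6, infrared_r, 0.4, 0)

# S5: enhancement (contrast stretch as a stand-in for the HSV-based enhancement).
enhanced = cv2.normalize(fused, None, 0, 255, cv2.NORM_MINMAX)
cv2.imwrite("fused_target.png", enhanced)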
The above description is only for the purpose of illustrating the preferred embodiments of the present invention and is not to be construed as limiting the invention, and any modifications, equivalents, improvements and the like that fall within the spirit and principle of the present invention are intended to be included therein.

Claims (5)

1. A low-light-level image and infrared image fusion method is characterized by comprising the following steps:
s1: respectively acquiring low-light images and infrared images aiming at the same target scene;
s2: respectively carrying out denoising processing on the collected low-light-level image and the collected infrared image, wherein the denoising processing comprises the following steps:
s2.1: inputting an image X to be denoised, and carrying out noise estimation on the image X to be denoised to obtain the noise level of the image;
s2.2: setting a noise compression multiple K for the image X to be denoised and obtaining a target weight W_t;
s2.3: according to the noise level, calculating the similarity weight W_Y between each pixel X_X in the image X and each pixel X_Y in its neighborhood;
s2.4: according to the similarity weights W_Y and the target weight W_t, normalizing the weights of the pixel X_X to the target weight W_t to obtain the weighted average X_X', which is the noise-reduced pixel value of the pixel X_X;
s2.5: traversing all pixel points of the image X to be denoised to obtain a denoised image;
s3: carrying out effective and accurate registration on the denoised low-light-level image and the infrared image by using an image registration method based on edge features to obtain the registered low-light-level image and the registered infrared image;
s4: carrying out image fusion on the registered images;
s5: and carrying out image enhancement processing on the fused image to obtain a fused target image.
2. A method for fusing a low-light image and an infrared image according to claim 1, wherein in step S3, the image registration comprises the following steps:
let the reference image be I1(x, y) and the image to be registered be I2(x, y), both of size P × Q pixels, with gray histograms H1(n) and H2(n) respectively; for each gray level n = i, H1(i) and H2(i) respectively denote the number of pixels whose gray value is i;
s3.1: performing edge extraction on a source image by using a wavelet transform method to obtain an edge image;
s3.2: for each gray level n of I1(x, y), calculating the mean gray value and variance of the set of pixels in I2(x, y) corresponding to the pixels of I1(x, y) whose gray value is n;
for each gray level n of I1(x, y), calculating, relative to I2(x, y), the mean gray value and variance of the corresponding set of pixels in I1(x, y) whose gray value is n;
s3.3: calculating the expected variances of I1(x, y) and I2(x, y);
s3.4: obtaining the interaction variance CI and the reciprocal AM of the two images;
s3.5: repeatedly calculating the AM of the two images by using an affine transformation model under the condition of different parameters;
s3.6: searching out a parameter corresponding to the maximum AM, wherein the parameter is a required registration parameter;
s3.7: and correcting the image to be registered according to the obtained registration parameters and the affine transformation model to obtain a registered image.
3. A method for fusing a low-light image and an infrared image according to claim 1, wherein in step S4, the image fusion step is as follows:
s4.1: performing a dual-tree complex wavelet transform on the registered infrared image and low-light-level image to obtain low-frequency coefficients and high-frequency coefficients;
s4.2: filtering the low-frequency coefficient and the high-frequency coefficient;
s4.3: respectively fusing the low-frequency coefficient and the high-frequency coefficient;
s4.4: and performing an inverse dual-tree complex wavelet transform on the fused low-frequency coefficient group and high-frequency coefficient group to obtain the fused image.
4. A low-light-level image and infrared image fusion method according to claim 3, characterized in that in step S4.3, adaptive weighted fusion is applied to the low-frequency coefficients and compressive sampling-pulse coupled neural network fusion is applied to the high-frequency coefficients.
5. A method for fusing a low-light image and an infrared image according to claim 1, wherein in step S5, the image enhancement step is as follows:
s5.1: enhancing the color saturation value of each pixel of the fused image;
s5.2: correcting the color saturation value of each pixel of the image according to the brightness value of each pixel and a preset brightness threshold, so as to avoid color cast of the image;
s5.3: and converting the corrected hue value, color saturation value and brightness value of each pixel into the red, blue and green gray values of that pixel, so as to facilitate enhanced image display.
CN201710754190.XA 2017-08-29 2017-08-29 A kind of twilight image and infrared image fusion method Pending CN107481214A (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN201710754190.XA CN107481214A (en) 2017-08-29 2017-08-29 A kind of twilight image and infrared image fusion method

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
CN201710754190.XA CN107481214A (en) 2017-08-29 2017-08-29 A kind of twilight image and infrared image fusion method

Publications (1)

Publication Number Publication Date
CN107481214A true CN107481214A (en) 2017-12-15

Family

ID=60604129

Family Applications (1)

Application Number Title Priority Date Filing Date
CN201710754190.XA Pending CN107481214A (en) 2017-08-29 2017-08-29 A kind of twilight image and infrared image fusion method

Country Status (1)

Country Link
CN (1) CN107481214A (en)

Cited By (7)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN109064504A (en) * 2018-08-24 2018-12-21 深圳市商汤科技有限公司 Image processing method, device and computer storage medium
CN110363731A (en) * 2018-04-10 2019-10-22 杭州海康威视数字技术股份有限公司 A kind of image interfusion method, device and electronic equipment
CN110389390A (en) * 2019-05-31 2019-10-29 中国人民解放军陆军工程大学 Large-view-field infrared shimmer naturalness color fusion system
CN110827375A (en) * 2019-10-31 2020-02-21 湖北大学 Infrared image true color coloring method and system based on low-light-level image
CN111445409A (en) * 2020-03-25 2020-07-24 东风汽车集团有限公司 Night AEB function performance improving method and system based on night vision camera assistance
CN115082968A (en) * 2022-08-23 2022-09-20 天津瑞津智能科技有限公司 Behavior identification method based on infrared light and visible light fusion and terminal equipment
CN115689960A (en) * 2022-10-27 2023-02-03 长春理工大学 Illumination self-adaptive infrared and visible light image fusion method in night scene

Citations (4)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN104036457A (en) * 2013-09-02 2014-09-10 上海联影医疗科技有限公司 Image noise reduction method
CN105069756A (en) * 2015-08-10 2015-11-18 深圳市华星光电技术有限公司 Image enhancing method
CN105069769A (en) * 2015-08-26 2015-11-18 哈尔滨工业大学 Low-light and infrared night vision image fusion method
CN105447838A (en) * 2014-08-27 2016-03-30 北京计算机技术及应用研究所 Method and system for infrared and low-level-light/visible-light fusion imaging

Patent Citations (4)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN104036457A (en) * 2013-09-02 2014-09-10 上海联影医疗科技有限公司 Image noise reduction method
CN105447838A (en) * 2014-08-27 2016-03-30 北京计算机技术及应用研究所 Method and system for infrared and low-level-light/visible-light fusion imaging
CN105069756A (en) * 2015-08-10 2015-11-18 深圳市华星光电技术有限公司 Image enhancing method
CN105069769A (en) * 2015-08-26 2015-11-18 哈尔滨工业大学 Low-light and infrared night vision image fusion method

Cited By (11)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN110363731A (en) * 2018-04-10 2019-10-22 杭州海康威视数字技术股份有限公司 A kind of image interfusion method, device and electronic equipment
CN110363731B (en) * 2018-04-10 2021-09-03 杭州海康微影传感科技有限公司 Image fusion method and device and electronic equipment
CN109064504A (en) * 2018-08-24 2018-12-21 深圳市商汤科技有限公司 Image processing method, device and computer storage medium
CN109064504B (en) * 2018-08-24 2022-07-15 深圳市商汤科技有限公司 Image processing method, apparatus and computer storage medium
CN110389390A (en) * 2019-05-31 2019-10-29 中国人民解放军陆军工程大学 Large-view-field infrared shimmer naturalness color fusion system
CN110827375A (en) * 2019-10-31 2020-02-21 湖北大学 Infrared image true color coloring method and system based on low-light-level image
CN110827375B (en) * 2019-10-31 2023-05-30 湖北大学 Infrared image true color coloring method and system based on low-light-level image
CN111445409A (en) * 2020-03-25 2020-07-24 东风汽车集团有限公司 Night AEB function performance improving method and system based on night vision camera assistance
CN111445409B (en) * 2020-03-25 2023-02-28 东风汽车集团有限公司 Night AEB function performance improving method and system based on night vision camera assistance
CN115082968A (en) * 2022-08-23 2022-09-20 天津瑞津智能科技有限公司 Behavior identification method based on infrared light and visible light fusion and terminal equipment
CN115689960A (en) * 2022-10-27 2023-02-03 长春理工大学 Illumination self-adaptive infrared and visible light image fusion method in night scene

Similar Documents

Publication Publication Date Title
CN107481214A (en) A kind of twilight image and infrared image fusion method
CN108549874B (en) Target detection method, target detection equipment and computer-readable storage medium
Zhang et al. Nighttime haze removal based on a new imaging model
CN108090888B (en) Fusion detection method of infrared image and visible light image based on visual attention model
CN103606132B (en) Based on the multiframe Digital Image Noise method of spatial domain and time domain combined filtering
CN106846289A (en) A kind of infrared light intensity and polarization image fusion method based on conspicuousness migration with details classification
Chaudhry et al. A framework for outdoor RGB image enhancement and dehazing
CN107635099B (en) Human body induction double-optical network camera and security monitoring system
KR20130077726A (en) Apparatus and method for noise removal in a digital photograph
CN106600569B (en) Signal lamp color effect enhancement processing method and device
CN108133462B (en) Single image restoration method based on gradient field region segmentation
CN112435184B (en) Image recognition method for haze days based on Retinex and quaternion
CN110866889A (en) Multi-camera data fusion method in monitoring system
CN106846258A (en) A kind of single image to the fog method based on weighted least squares filtering
Lou et al. Integrating haze density features for fast nighttime image dehazing
Choi et al. Human detection using image fusion of thermal and visible image with new joint bilateral filter
CN110675332A (en) Method for enhancing quality of metal corrosion image
Sahu et al. Image dehazing based on luminance stretching
CN110298796B (en) Low-illumination image enhancement method based on improved Retinex and logarithmic image processing
TW202230279A (en) Image enhancement method and image enhncement apparatus
Wang et al. Nighttime image dehazing using color cast removal and dual path multi-scale fusion strategy
Chen et al. Night-time pedestrian detection by visual-infrared video fusion
Hu et al. A low illumination video enhancement algorithm based on the atmospheric physical model
CN102609710A (en) Smoke and fire object segmentation method aiming at smog covering scene in fire disaster image video
Zhang et al. Nighttime haze removal with illumination correction

Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination
RJ01 Rejection of invention patent application after publication (application publication date: 20171215)