WO2019144581A1 - Intelligent infrared image scene enhancement method (一种智能化红外图像场景增强方法) - Google Patents

Intelligent infrared image scene enhancement method

Info

Publication number
WO2019144581A1
WO2019144581A1 PCT/CN2018/096021 CN2018096021W WO2019144581A1 WO 2019144581 A1 WO2019144581 A1 WO 2019144581A1 CN 2018096021 W CN2018096021 W CN 2018096021W WO 2019144581 A1 WO2019144581 A1 WO 2019144581A1
Authority
WO
WIPO (PCT)
Prior art keywords
image
infrared image
detail
kernel function
layer component
Prior art date
Application number
PCT/CN2018/096021
Other languages
English (en)
French (fr)
Inventor
赵毅
张登平
钱晨
刘宁
杨超
马新华
谢小波
宋莽
Original Assignee
江苏宇特光电科技股份有限公司
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by 江苏宇特光电科技股份有限公司
Publication of WO2019144581A1


Classifications

    • G - PHYSICS
    • G06 - COMPUTING; CALCULATING OR COUNTING
    • G06T - IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T5/00 - Image enhancement or restoration
    • G06T5/73 - Deblurring; Sharpening
    • G - PHYSICS
    • G06 - COMPUTING; CALCULATING OR COUNTING
    • G06T - IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T5/00 - Image enhancement or restoration
    • G06T5/90 - Dynamic range modification of images or parts thereof
    • G - PHYSICS
    • G06 - COMPUTING; CALCULATING OR COUNTING
    • G06T - IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T5/00 - Image enhancement or restoration
    • G06T5/40 - Image enhancement or restoration using histogram techniques
    • G - PHYSICS
    • G06 - COMPUTING; CALCULATING OR COUNTING
    • G06T - IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T5/00 - Image enhancement or restoration
    • G06T5/50 - Image enhancement or restoration using two or more images, e.g. averaging or subtraction
    • G - PHYSICS
    • G06 - COMPUTING; CALCULATING OR COUNTING
    • G06T - IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T2207/00 - Indexing scheme for image analysis or image enhancement
    • G06T2207/10 - Image acquisition modality
    • G06T2207/10048 - Infrared image
    • G - PHYSICS
    • G06 - COMPUTING; CALCULATING OR COUNTING
    • G06T - IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T2207/00 - Indexing scheme for image analysis or image enhancement
    • G06T2207/20 - Special algorithmic details
    • G06T2207/20024 - Filtering details
    • G06T2207/20028 - Bilateral filtering
    • G - PHYSICS
    • G06 - COMPUTING; CALCULATING OR COUNTING
    • G06T - IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T2207/00 - Indexing scheme for image analysis or image enhancement
    • G06T2207/20 - Special algorithmic details
    • G06T2207/20076 - Probabilistic image processing
    • G - PHYSICS
    • G06 - COMPUTING; CALCULATING OR COUNTING
    • G06T - IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T2207/00 - Indexing scheme for image analysis or image enhancement
    • G06T2207/20 - Special algorithmic details
    • G06T2207/20172 - Image enhancement details
    • G06T2207/20192 - Edge enhancement; Edge preservation
    • G - PHYSICS
    • G06 - COMPUTING; CALCULATING OR COUNTING
    • G06T - IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T2207/00 - Indexing scheme for image analysis or image enhancement
    • G06T2207/20 - Special algorithmic details
    • G06T2207/20172 - Image enhancement details
    • G06T2207/20208 - High dynamic range [HDR] image processing

Definitions

  • The invention relates to the field of high-dynamic-range display of infrared images, and in particular to an intelligent infrared image scene enhancement method.
  • Infrared thermal imaging is used very widely in both military and civilian fields, for example in system design, system testing, system manufacturing, chemical imaging, night-vision imaging, disaster search and rescue, target recognition and detection, and target tracking.
  • An ordinary thermal imaging camera has a very wide data dynamic range, while conventional display devices do not support such a high dynamic range. Therefore, when displaying infrared images, the common approach is to apply a high-dynamic-range compression technique such as histogram equalization to the infrared image.
  • However, simple histogram equalization does very little to improve the visual perception of infrared images and cannot express all the details of the real scene.
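  • For context, this kind of histogram-equalization-based dynamic range compression can be sketched as follows. This is a generic illustration only; the 14-bit input depth, the frame size and the function name are assumptions and are not taken from the publication:

    import numpy as np

    def histogram_equalize_to_8bit(ir_frame, input_bits=14):
        # Compress a high-dynamic-range IR frame to 8 bits with plain histogram equalization.
        levels = 1 << input_bits
        hist = np.bincount(ir_frame.ravel(), minlength=levels)
        cdf = np.cumsum(hist).astype(np.float64)
        cdf_min = cdf[cdf > 0].min()
        norm = (cdf - cdf_min) / (cdf[-1] - cdf_min + 1e-12)
        lut = np.clip(np.round(norm * 255.0), 0, 255).astype(np.uint8)
        return lut[ir_frame]

    # Example with a synthetic 14-bit frame:
    frame = np.random.randint(0, 1 << 14, size=(288, 384), dtype=np.uint16)
    display = histogram_equalize_to_8bit(frame)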
  • At the same time, infrared thermal imaging is a temperature-difference imaging modality, so its imaging quality is strongly affected by the infrared thermal energy radiated by the scene. If the thermal energy is poorly differentiated, the human eye cannot distinguish the slight temperature differences of details in the scene on a conventional display, and the scene cannot be observed carefully.
  • In recent years, many research institutions and researchers have done a great deal of work on how to restore and enhance the details of infrared images.
  • Among the infrared image detail enhancement methods proposed so far, two classes are reasonably feasible: image edge enhancement methods based on edge gradient operators, and overall image detail enhancement methods based on linear or nonlinear filters.
  • For the edge enhancement methods based on edge gradient operators, the mainstream operators include the Sobel, Prewitt, LoG and Laplacian operators.
  • The disadvantage of this class of algorithms is that they can only effectively enhance details near strong edges in the image; for weak edges or detail components with low temperature discrimination they fail almost completely.
  • The overall detail enhancement methods based on linear or nonlinear filters can effectively enhance both strong and weak edges in the image and are applicable to scenes with either high or low temperature discrimination, but in practical engineering applications their computational complexity prevents real-time display, which greatly limits their use in real systems.
  • The object of the present invention is to provide an intelligent infrared image scene enhancement method.
  • To this end, an embodiment of the present invention provides an intelligent infrared image scene enhancement method, including the following steps:
  • Step S1: jointly compute two adjacent infrared frames with a modified joint bilateral filter, taking the first of the two adjacent frames as the base frame and the second as the reference frame, to obtain the detail layer component and the base layer (base-frequency layer) component of the base frame image;
  • Step S2: use a guided gray-level similarity kernel function to control the enhancement range of the detail layer component and to eliminate the edge gradient reversal effect, and use an improved histogram calculation method to control the gray-level redistribution of the whole image in the base layer component;
  • Step S3: superimpose the detail layer image and the base layer image processed in step S2 to restore and enhance the scene of the original base frame infrared image.
  • In step S1, the joint calculation of the two adjacent infrared frames is performed with the formulas of the modified joint bilateral filter, where the detail layer is obtained as I_d = I_R - I_JBF. Here I_JBF is the base layer, I_d is the detail layer, I_R is the reference frame, I_B is the base frame, Ω is the filter window size, and k is the normalization coefficient term of the modified joint bilateral filter.
  • ω_s and ω_r are two Gaussian kernel functions, ω_s being the spatial-domain kernel and ω_r the intensity-domain kernel. The coefficient k normalizes the two kernels ω_s and ω_r so that the method can cope with infrared thermal images captured by a wide variety of infrared cameras.
  • The two kernels ω_s and ω_r respectively control the weights of the detail components within the filtering window during joint bilateral filtering. σ_s and σ_r are the standard deviations of the spatial domain and of the gray-level intensity domain within the filtering window: σ_r defines the range of the Gaussian kernel ω_r and determines the minimum edge amplitude within the filtering window, while σ_s defines the range of the Gaussian kernel ω_s and determines the size of the filtering window around the corresponding pixel in the adjacent frame; σ_s should vary with the size of the whole image. If the amplitude change within the filtering window of the two frames is smaller than σ_r, that portion of the gray levels is smoothed by the joint bilateral filter and separated into the base layer; conversely, if the amplitude change exceeds σ_r, it is separated into the detail layer.
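  • A minimal sketch of this base/detail separation is given below in Python. The exact modified filter formula is published only as an image, so the code assumes the standard joint bilateral form suggested by the surrounding text: a spatial Gaussian ω_s on pixel distance and an intensity Gaussian ω_r on the cross-frame gray-level difference I_B - I_R, accumulated over the window Ω and normalized by k. The window radius, the σ values and the function name are illustrative assumptions, and both frames are assumed to be pre-normalized to (0, 1):

    import numpy as np

    def joint_bilateral_decompose(I_R, I_B, radius=3, sigma_s=2.0, sigma_r=0.05):
        # Split the reference frame I_R into a base layer I_JBF and a detail layer
        # I_d = I_R - I_JBF, with the base frame I_B steering the intensity kernel.
        I_R = I_R.astype(np.float64)
        I_B = I_B.astype(np.float64)
        H, W = I_R.shape
        pad_R = np.pad(I_R, radius, mode='reflect')

        # Spatial kernel w_s over the (2r+1) x (2r+1) window.
        yy, xx = np.mgrid[-radius:radius + 1, -radius:radius + 1]
        w_s = np.exp(-(xx ** 2 + yy ** 2) / (2.0 * sigma_s ** 2))

        num = np.zeros_like(I_R)
        den = np.zeros_like(I_R)  # per-pixel normalization term k
        for dy in range(-radius, radius + 1):
            for dx in range(-radius, radius + 1):
                shifted_R = pad_R[radius + dy:radius + dy + H, radius + dx:radius + dx + W]
                # Intensity kernel w_r on the cross-frame difference I_B(i,j) - I_R(i',j').
                w_r = np.exp(-(I_B - shifted_R) ** 2 / (2.0 * sigma_r ** 2))
                w = w_s[dy + radius, dx + radius] * w_r
                num += w * shifted_R
                den += w
        I_JBF = num / (den + 1e-12)  # base layer
        I_d = I_R - I_JBF            # detail layer
        return I_JBF, I_d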
  • In step S2, the guided gray-level similarity kernel function is expressed as f(i-i', j-j') = ω_s(i-i', j-j') ω_e(i-i', j-j') f_k, where f_k is the kernel function, ω_e is the gradient term, and ω_d is the guided spatial similarity term.
  • When suppressing the edge gradient reversal effect, the kernel takes the form f_k = α(Ω) ω_r(I_B - I_R) + (1 - α(Ω)) ω_d(i-i', j-j'), where α(Ω) is the adaptive fusion coefficient and σ_d is the standard deviation of the guided spatial similarity term.
  • α(Ω) is the weight used to fuse the gray-level similarity term with the guided spatial similarity term; its expression contains a constraint factor ε that prevents the occurrence of a zero-valued standard deviation.
  • In the gradient term, x and y denote the horizontal and vertical directions, σ_e is the gradient standard deviation, and G_x/y represents the gradient change level at the corresponding pixel position in the adjacent frame.
  • In step S3, after the detail layer component and the base layer component have been extracted, the two components are respectively enhanced and histogram-equalized, and the processed results are superimposed back together to obtain the final enhancement.
  • In summary, the modified joint bilateral filter jointly computes the two adjacent infrared frames, with the first frame as the base frame and the second as the reference frame, to obtain the detail layer and base layer components of the base frame image. The guided gray-level similarity kernel function is applied to the detail layer component to control the enhancement coefficient and eliminate the edge gradient reversal effect, while an improved histogram equalization technique is applied to the base layer to redistribute the gray levels. Finally, the two separately processed sub-images are superimposed to restore and enhance the scene of the original base frame infrared image.
  • The invention effectively overcomes the overly abrupt appearance produced by ordinary infrared detail enhancement methods: the processed infrared image not only shows excellent scene detail enhancement, but its gray-level distribution is also closer to the real scene, which greatly improves the visual perception of the infrared image.
  • The present invention is also very easy to implement in hardware on an FPGA and is highly effective at improving thermal imager performance in engineering practice.
  • FIG. 1 is a flowchart of an intelligent infrared image scene enhancement method according to an embodiment of the present invention;
  • FIG. 2 is a schematic diagram of the overall architecture of an intelligent infrared image scene enhancement method according to an embodiment of the present invention;
  • FIG. 3(a) to FIG. 3(e) show the layer-by-layer suppression of the ghosting effect in the filtered detail components according to an embodiment of the present invention;
  • FIG. 4 shows the detail enhancement obtained with different control coefficients applied to the detail layer component according to an embodiment of the present invention;
  • FIG. 5(a) and FIG. 5(b) show the scene enhancement of an actual infrared image according to an embodiment of the present invention;
  • FIG. 6(a) and FIG. 6(b) show the scene enhancement of an actual infrared image according to an embodiment of the present invention;
  • FIG. 7(a) and FIG. 7(b) show the scene enhancement of an actual infrared image according to an embodiment of the present invention;
  • FIG. 8(a) and FIG. 8(b) show the scene enhancement of an actual infrared image according to an embodiment of the present invention;
  • FIG. 9 compares the metrics of the method with those of existing methods according to an embodiment of the present invention.
  • The invention provides an intelligent infrared image scene enhancement method. Based on two adjacent infrared frames, the gray-level and detail features between the images are computed jointly, and a kernel function is specially designed to counter the edge gradient reversal effect.
  • This kernel function can efficiently and quickly process the detail-component features computed by the joint bilateral filter, distinguish strong from weak edge information, and suppress the gradient reversal effect; the detail features are then effectively enhanced, greatly improving the infrared image display.
  • As shown in FIG. 1 and FIG. 2, the intelligent infrared image scene enhancement method includes the following steps:
  • Step S1: jointly compute the two adjacent infrared frames with the modified joint bilateral filter, taking the first of the two adjacent frames as the base frame and the second as the reference frame, to obtain the detail layer component and the base layer component of the base frame image; that is, the joint calculation separates the base frame, by filtering, into a base layer and a detail layer.
  • In step S1, the joint calculation of the two adjacent infrared frames uses the modified joint bilateral filter, with the detail layer obtained as I_d = I_R - I_JBF, where I_JBF is the base layer, I_d is the detail layer, I_R is the reference frame, I_B is the base frame, Ω is the filter window size, and k is the normalization coefficient term of the modified joint bilateral filter. ω_s and ω_r are two Gaussian kernel functions, ω_s being the spatial-domain kernel and ω_r the intensity-domain kernel.
  • The function of the coefficient k is to normalize the two solved kernel functions ω_s and ω_r; the advantage is that the method can cope with infrared thermal images captured by a wide variety of infrared cameras. Because thermal imagers from different manufacturers and of different models output different gray-level ranges, normalizing during the calculation maps all gray levels into the interval (0, 1), which removes the response differences between devices and improves the universality of the method.
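  • A sketch of this normalization step in Python (the epsilon guard and the function name are assumptions; any raw thermal frame is mapped onto (0, 1) before the kernels are evaluated):

    import numpy as np

    def normalize_gray(frame, eps=1e-6):
        # Map a raw thermal frame onto (0, 1) so the kernels see a device-independent gray range.
        f = frame.astype(np.float64)
        lo, hi = f.min(), f.max()
        return (f - lo) / (hi - lo + eps)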
  • The two kernels ω_s and ω_r respectively control the weights of the detail components within the filtering window during joint bilateral filtering. σ_s and σ_r are the standard deviations of the spatial domain and of the gray-level intensity domain within the filtering window: σ_r defines the range of the Gaussian kernel ω_r and determines the minimum edge amplitude within the filtering window, while σ_s defines the range of the Gaussian kernel ω_s and determines the size of the filtering window around the corresponding pixel in the adjacent frame; σ_s should vary with the size of the whole image.
  • Because the difference between the adjacent frames selected for joint bilateral filtering is very small, if the amplitude change within the filtering window of the two frames is smaller than σ_r, that portion of the gray levels is smoothed by the joint bilateral filter and separated into the base layer; conversely, if the amplitude change exceeds σ_r, it is separated into the detail layer.
  • Step S2: use the guided gray-level similarity kernel function to control the enhancement range of the detail layer component and eliminate the edge gradient reversal effect, and use the improved histogram calculation method to control the gray-level redistribution of the whole image in the base layer component.
  • The conventional adaptive Gaussian filtering approach to "ghost" removal is technically very difficult to implement, and the adaptive process cannot even be realized in a hardware system; a kernel function is therefore specially designed here to effectively suppress the "ghosting" effect.
  • This kernel function is based on a gradient constraint factor: the gradient energy in the detail layer component is generally weak, the stability of these gradients is strongly linked to the instability of the gray levels, and, in terms of the saliency structure of the image, the edge structure of the gradient varies even more sharply than the gray levels do.
  • The guided gray-level similarity kernel is f(i-i', j-j') = ω_s(i-i', j-j') ω_e(i-i', j-j') f_k, where f_k is the kernel function, ω_e is the gradient term, and ω_d is the guided spatial similarity term.
  • When suppressing the edge gradient reversal effect, f_k = α(Ω) ω_r(I_B - I_R) + (1 - α(Ω)) ω_d(i-i', j-j'), where α(Ω) is the adaptive fusion coefficient and σ_d is the standard deviation of the guided spatial similarity term.
  • α(Ω) is the weight used to fuse the gray-level similarity term with the guided spatial similarity term; its expression contains a constraint factor ε that prevents the occurrence of a zero-valued standard deviation. When the gray-level variation becomes small and the spatial fluctuation becomes large, α(Ω) tends to 1 and the intensity similarity term is constrained; conversely, α(Ω) tends to 0 and the spatial similarity term is constrained.
  • FIG. 3(a) to FIG. 3(e) show the layer-by-layer suppression of the ghosting effect in the filtered detail components according to an embodiment of the present invention.
  • Once the ghost-suppressed detail component has been correctly extracted, some high-frequency noise is still filtered into the detail layer along with genuine detail, because noise and detail are hard to distinguish; a suitable criterion is therefore needed to suppress the noise while preserving the detail. The adaptive fusion factor term in the newly designed kernel function is used to achieve this.
  • In flat regions of the image the value of α(Ω) approaches 0, and it rises towards (but never beyond) 1 as the gray level of the pixel fluctuates, so that detail is preserved while noise is suppressed as much as possible. The threshold on this value is set to 0.95 in the invention; once the value exceeds 0.95 it does not rise any further. The control expression is I'_d = I_d · (α(Ω) · a + b), where a and b are control coefficients.
  • FIG. 4 shows the detail enhancement obtained with different control coefficients applied to the detail layer component.
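  • The detail-gain control just described can be sketched as follows in Python. The 0.95 cap on α(Ω) comes from the text; α(Ω) itself is assumed to be available as a precomputed per-pixel map, and the default coefficients a = 0.3 and b = 0.65 are the example values given in the detailed description below:

    import numpy as np

    def enhance_detail_layer(I_d, alpha, a=0.3, b=0.65, cap=0.95):
        # Apply the gain I'_d = I_d * (alpha * a + b); once alpha reaches the cap it rises no further.
        alpha_capped = np.minimum(alpha, cap)
        return I_d * (alpha_capped * a + b)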
  • Step S3: superimpose the detail layer image and the base layer image processed in step S2 to restore and enhance the scene of the original base frame infrared image.
  • In this step, after the detail layer component and the base layer component have been extracted separately, the two components are respectively enhanced and histogram-equalized, and the processing results are superimposed back together, giving the final enhancement and greatly improving the visual experience of the observer.
  • Specifically, the two sub-images of the detail layer and the base layer processed in step S2 are superimposed to restore and enhance the scene of the original base frame infrared image.
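  • A minimal sketch of this recombination in Python, assuming the normalized layers produced by the earlier sketches; plain histogram equalization stands in for the improved histogram method, whose exact form is not spelled out in the text:

    import numpy as np

    def equalize_base_layer(base, bins=256):
        # Histogram-equalize a base layer with values in (0, 1); returns values in (0, 1).
        hist, edges = np.histogram(base.ravel(), bins=bins, range=(0.0, 1.0))
        cdf = np.cumsum(hist).astype(np.float64)
        cdf /= cdf[-1]
        return np.interp(base.ravel(), edges[:-1], cdf).reshape(base.shape)

    def recombine(base_eq, detail_enhanced):
        # Superimpose the processed base and detail layers and map the result to 8 bits for display.
        out = base_eq + detail_enhanced
        out = (out - out.min()) / (out.max() - out.min() + 1e-12)
        return (out * 255.0).astype(np.uint8)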
  • The invention effectively overcomes the overly abrupt appearance produced by ordinary infrared detail enhancement methods, so that the processed infrared image not only shows excellent scene detail enhancement but also has a gray-level distribution closer to the real scene, greatly improving the visual perception of the infrared image.
  • The method is also very easy to implement in hardware on an FPGA and is highly effective at improving thermal imager performance in engineering practice.
  • As the final results in FIG. 4 to FIG. 7 show, the method of the present invention brings a clear improvement to actual infrared images, mainly in two respects:
  • (1) the overall detail of the image is brought out to a very large extent, the image has clear layering and obvious detail, and the visual effect is excellent;
  • (2) unlike traditional methods, which tend to over-brighten the enhanced image, the result of the present method keeps the gray-level appearance of the new image very close to the original scene, so no over-brightening appears to disturb observation by the human eye.
  • As FIG. 8 shows, the method of the present invention is significantly superior to traditional methods on the root-mean-square contrast metric.
  • To judge the effect of the method more thoroughly, a background-foreground fluctuation index is introduced; this index is an important parameter for characterizing the image enhancement effect.
  • If the standard deviation between a pixel's gray value and the remaining pixels in its neighbouring region is small, the pixel is considered a background pixel; otherwise it is considered a foreground pixel.
  • After the image is processed, if the pixel at the same position is a background pixel, its standard deviation should be smaller than the original value, while a foreground pixel's standard deviation should be larger than the original value; the greater the difference, the better the processing effect. The comparison of the method of the invention with the traditional method is shown in Table 1 of the description.
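  • The background-foreground fluctuation measure can be sketched as below in Python: each pixel is classified as background or foreground from the standard deviation of its local neighbourhood, and the mean local standard deviation of each class is compared before and after processing. The neighbourhood size and the median-based classification rule are illustrative assumptions, since the publication does not spell them out:

    import numpy as np

    def local_std(img, radius=1):
        # Standard deviation of each pixel's (2r+1) x (2r+1) neighbourhood.
        img = img.astype(np.float64)
        pad = np.pad(img, radius, mode='reflect')
        H, W = img.shape
        stack = np.stack([pad[dy:dy + H, dx:dx + W]
                          for dy in range(2 * radius + 1)
                          for dx in range(2 * radius + 1)])
        return stack.std(axis=0)

    def background_foreground_fluctuation(original, processed, radius=1):
        # Mean local std of background / foreground pixels before and after enhancement.
        std_orig = local_std(original, radius)
        std_proc = local_std(processed, radius)
        background = std_orig < np.median(std_orig)  # assumed classification rule
        return {
            'background_std_before': float(std_orig[background].mean()),
            'background_std_after': float(std_proc[background].mean()),
            'foreground_std_before': float(std_orig[~background].mean()),
            'foreground_std_after': float(std_proc[~background].mean()),
        }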
  • Taking all of these comparisons together, the method of the present invention gives better scene detail enhancement for infrared images, and it can be implemented in a hardware system, which greatly improves its engineering practicality.
  • According to the intelligent infrared image scene enhancement method of the embodiments, the modified joint bilateral filter jointly computes the two adjacent infrared frames, with the first frame as the base frame and the second as the reference frame, to obtain the detail layer and base layer components of the base frame image; the guided gray-level similarity kernel function is applied to the detail layer component to control the enhancement coefficient and eliminate the edge gradient reversal effect, and an improved histogram equalization technique is applied to the base layer to redistribute the gray levels. Finally, the two separately processed sub-images are superimposed to restore and enhance the scene of the original base frame infrared image.
  • The invention effectively overcomes the overly abrupt appearance produced by ordinary infrared detail enhancement methods: the processed infrared image not only shows excellent scene detail enhancement, but its gray-level distribution is also closer to the real scene, which greatly improves the visual perception of the infrared image.
  • In addition, the present invention is very easy to implement in hardware on an FPGA and is highly effective at improving thermal imager performance in engineering practice.

Landscapes

  • Physics & Mathematics (AREA)
  • General Physics & Mathematics (AREA)
  • Engineering & Computer Science (AREA)
  • Theoretical Computer Science (AREA)
  • Image Processing (AREA)

Abstract

An intelligent infrared image scene enhancement method, comprising: jointly computing two adjacent infrared frames with a modified joint bilateral filter to obtain the detail layer component and the base layer component of the base frame image (S1); using a guided gray-level similarity kernel function to control the enhancement range of the detail layer component and eliminate the edge gradient reversal effect, and using an improved histogram calculation method to control the gray-level redistribution of the whole image in the base layer component (S2); and superimposing the processed detail layer image and base layer image to restore and enhance the scene of the original base frame infrared image (S3). The method overcomes the overly abrupt appearance produced by ordinary infrared image detail enhancement methods, so that the processed infrared image not only has excellent scene detail enhancement capability but also a gray-level distribution closer to the real scene, which greatly improves the visual perception of the infrared image.

Description

Intelligent infrared image scene enhancement method
Technical Field
The present invention relates to the technical field of high-dynamic-range display of infrared images, and in particular to an intelligent infrared image scene enhancement method.
Background Art
Infrared thermal imaging is used extremely widely in both military and civilian fields, for example in system design, system testing, system manufacturing, chemical imaging, night-vision imaging, disaster search and rescue, target recognition and detection, and target tracking. An ordinary infrared thermal imager normally has a very wide data dynamic range, while conventional display devices do not support such a high dynamic range; therefore, when displaying infrared images, the common approach is to apply high-dynamic-range compression techniques such as histogram equalization to the infrared image. However, simple histogram equalization does very little to improve the visual perception of infrared images and cannot express all the details of the real scene. At the same time, because infrared thermal imaging is a temperature-difference imaging modality, its imaging quality is strongly affected by the infrared thermal energy radiated by the scene. On a conventional display device that cannot show a high dynamic range, if the thermal energy is poorly differentiated, the human eye cannot distinguish the slight temperature differences of details in the scene, and the scene cannot be observed carefully. In recent years, many research institutions and researchers have carried out a great deal of work on how to restore and enhance the details of infrared images.
The infrared image detail enhancement methods proposed so far that are reasonably feasible fall into two main classes: image edge enhancement methods based on edge gradient operators, and overall image detail enhancement methods based on linear or nonlinear filters. For the enhancement methods based on edge gradient operators, the mainstream operators include the Sobel, Prewitt, LoG and Laplacian operators; the drawback of this class of algorithms is that they can only effectively enhance details near strong edges in the image, and they fail almost completely for weak edges or detail components with low temperature discrimination. The overall detail enhancement methods based on linear or nonlinear filters can effectively enhance details such as strong and weak edges in the image and are applicable to scenes with either high or low temperature discrimination; in practical engineering applications, however, problems such as computational complexity subject them to extremely high computational requirements that rule out real-time display, which in turn greatly limits their use in real engineering systems for military or civilian products. Consider, for example, the two currently mainstream linear or nonlinear filter enhancement methods: image detail enhancement based on the bilateral filter and image detail enhancement based on the guided filter. The former suffers from the nonlinear edge gradient reversal effect introduced by the bilateral filter during enhancement, so the processed image shows a "ghosting" artifact of one dark and one bright edge near strong edges, which greatly reduces the realism of the image. To overcome this problem, some researchers have processed the strong edges with a Gaussian filter, but because the mathematical model does not match completely, this "ghosting" cannot be eliminated entirely, which greatly reduces its effectiveness in real engineering. The latter uses a guided linear filter and achieves good "ghost" removal, but the computational efficiency constraints of the linear filter lead to mediocre final enhancement that cannot fully bring out all the fine details in the scene, which likewise limits its usefulness in real engineering. Therefore, up to now, no truly excellent scene detail enhancement algorithm has been deployed in an engineering-grade infrared thermal imaging system.
Summary of the Invention
The object of the present invention is to solve at least one of the technical drawbacks described above.
To this end, an object of the present invention is to propose an intelligent infrared image scene enhancement method.
To achieve the above object, an embodiment of the present invention provides an intelligent infrared image scene enhancement method, comprising the following steps:
Step S1: jointly computing two adjacent infrared frames with a modified joint bilateral filter, wherein the first of the two adjacent frames is set as the base frame and the second frame is set as the reference frame, to obtain the detail layer component and the base layer (base-frequency layer) component of the base frame image;
Step S2: using a guided gray-level similarity kernel function to control the enhancement range of the detail layer component and eliminate the edge gradient reversal effect, and using an improved histogram calculation method to control the gray-level redistribution of the whole image in the base layer component;
Step S3: superimposing the detail layer image and the base layer image processed in step S2 to restore and enhance the scene of the original base frame infrared image.
Further, in step S1, the joint calculation of the two adjacent infrared frames is performed using the following formulas:
Figure PCTCN2018096021-appb-000001
I_d = I_R - I_JBF
where I_JBF is the base layer, I_d is the detail layer, I_R is the reference frame, I_B is the base frame, Ω is the filter window size, and k is the normalization coefficient term of the modified joint bilateral filter;
Figure PCTCN2018096021-appb-000002
where ω_s and ω_r are two Gaussian kernel functions, ω_s being the spatial-domain kernel and ω_r the intensity-domain kernel; the role of k is to normalize the two solved kernels ω_s and ω_r so as to cope with infrared thermal images captured by a wide variety of infrared thermal imagers.
Further, the two kernels ω_s and ω_r respectively control the weights of the detail components within the filtering window during joint bilateral filtering, where
Figure PCTCN2018096021-appb-000003
Figure PCTCN2018096021-appb-000004
where σ_r and σ_s are the standard deviations of the gray-level intensity domain and of the spatial domain within the filtering window; σ_r defines the range of the Gaussian kernel ω_r and determines the minimum edge amplitude within the filtering window, σ_s defines the range of the Gaussian kernel ω_s and determines the size of the filtering window around the corresponding pixel in the adjacent frame, and this parameter should vary with the size of the whole image. If the amplitude change within the filtering window of the two frames is smaller than σ_r, that portion of the gray levels is smoothed by the joint bilateral filter and separated into the base layer; conversely, if the amplitude change exceeds σ_r, it is separated into the detail layer.
Further, in step S2, the guided gray-level similarity kernel function is expressed as:
f(i-i', j-j') = ω_s(i-i', j-j') ω_e(i-i', j-j') f_k
where f_k is the kernel function, ω_e is the gradient term, and ω_d is the guided spatial similarity term.
Further, when suppressing the edge gradient reversal effect, the guided gray-level similarity kernel function takes the form:
f_k = α(Ω) ω_r(I_B - I_R) + (1 - α(Ω)) ω_d(i-i', j-j')
where f_k is the kernel function, ω_e is the gradient term, ω_d is the guided spatial similarity term, and α(Ω) is the adaptive fusion coefficient,
Figure PCTCN2018096021-appb-000005
where σ_d is the standard deviation of the guided spatial similarity term, and α(Ω) is the weight used to fuse the gray-level similarity term with the guided spatial similarity term, expressed as:
Figure PCTCN2018096021-appb-000006
Figure PCTCN2018096021-appb-000007
In these formulas, ε is a constraint factor that prevents the problem of a zero-valued standard deviation. When the gray-level variation becomes small and the spatial fluctuation becomes large, α(Ω) tends to 1 and the intensity similarity term is constrained; conversely, α(Ω) tends to 0 and the spatial similarity term is constrained.
Further, the gradient term ω_e is expressed as:
Figure PCTCN2018096021-appb-000008
Figure PCTCN2018096021-appb-000009
Figure PCTCN2018096021-appb-000010
where x and y denote the horizontal and vertical directions,
Figure PCTCN2018096021-appb-000011
Figure PCTCN2018096021-appb-000012
are the gradients in the horizontal and vertical directions, σ_e is the gradient standard deviation, and G_x/y represents the gradient change level at the corresponding pixel position in the adjacent frame.
Further, in step S3, after the detail layer component and the base layer component have been extracted separately, the two components are respectively enhanced and histogram-equalized, and the processing results are superimposed back together to obtain the final enhancement effect.
According to the intelligent infrared image scene enhancement method of the embodiments of the present invention, a modified joint bilateral filter jointly computes two adjacent infrared frames, with the first frame as the base frame and the second frame as the reference frame, so as to obtain the detail layer and base layer components of the base frame image; a guided gray-level similarity kernel function is applied to the detail layer component to control the enhancement coefficient and eliminate the edge gradient reversal effect, an improved histogram equalization technique is applied to the base layer to redistribute the gray levels, and finally the two separately processed sub-images are superimposed to restore and enhance the scene of the original base frame infrared image. The invention effectively overcomes the overly abrupt appearance produced by ordinary infrared image detail enhancement methods, so that the processed infrared image not only has excellent scene detail enhancement capability but also a gray-level distribution closer to the real scene, which greatly improves the visual perception of the infrared image. In addition, the invention is very easy to implement in hardware on an FPGA and is highly effective at improving thermal imager performance in engineering practice.
Additional aspects and advantages of the invention will be given in part in the following description, will in part become apparent from the description, or may be learned through practice of the invention.
Brief Description of the Drawings
The above and/or additional aspects and advantages of the invention will become apparent and easy to understand from the following description of the embodiments taken in conjunction with the accompanying drawings, in which:
FIG. 1 is a flowchart of an intelligent infrared image scene enhancement method according to an embodiment of the present invention;
FIG. 2 is an overall architecture diagram of an intelligent infrared image scene enhancement method according to an embodiment of the present invention;
FIG. 3(a) to FIG. 3(e) show the layer-by-layer suppression of the ghosting effect in the filtered detail components according to an embodiment of the present invention;
FIG. 4 shows the detail enhancement obtained with different control coefficients applied to the detail layer component according to an embodiment of the present invention;
FIG. 5(a) and FIG. 5(b) show the scene enhancement of an actual infrared image according to an embodiment of the present invention;
FIG. 6(a) and FIG. 6(b) show the scene enhancement of an actual infrared image according to an embodiment of the present invention;
FIG. 7(a) and FIG. 7(b) show the scene enhancement of an actual infrared image according to an embodiment of the present invention;
FIG. 8(a) and FIG. 8(b) show the scene enhancement of an actual infrared image according to an embodiment of the present invention;
FIG. 9 compares the metrics of the method with those of existing methods according to an embodiment of the present invention.
Detailed Description of the Embodiments
Embodiments of the present invention are described in detail below; examples of the embodiments are shown in the accompanying drawings, in which identical or similar reference numerals throughout denote identical or similar elements or elements with identical or similar functions. The embodiments described below with reference to the drawings are exemplary and are intended to explain the present invention; they should not be construed as limiting the present invention.
The present invention proposes an intelligent infrared image scene enhancement method. Based on two adjacent infrared frames, the gray-level and detail features between the images are computed jointly, and a kernel function is specially designed to counter the edge gradient reversal effect; this kernel function can efficiently and quickly process the detail-component features computed by the joint bilateral filter, distinguish strong from weak edge information, and suppress the gradient reversal effect. The detail features are then effectively enhanced, greatly improving the infrared image display.
As shown in FIG. 1 and FIG. 2, the intelligent infrared image scene enhancement method of the embodiment of the present invention includes the following steps:
Step S1: jointly compute the two adjacent infrared frames with the modified joint bilateral filter, taking the first of the two adjacent frames as the base frame and the second as the reference frame, to obtain the detail layer component and the base layer component of the base frame image; that is, the joint calculation separates the base frame, by filtering, into a base layer and a detail layer.
In step S1, the joint calculation of the two adjacent infrared frames is performed using the following formulas:
Figure PCTCN2018096021-appb-000013
I_d = I_R - I_JBF    (2)
where I_JBF is the base layer, I_d is the detail layer, I_R is the reference frame, I_B is the base frame, Ω is the filter window size, and k is the normalization coefficient term of the modified joint bilateral filter;
Figure PCTCN2018096021-appb-000014
where ω_s and ω_r are two Gaussian kernel functions, ω_s being the spatial-domain kernel and ω_r the intensity-domain kernel. The role of k is to normalize the two solved kernels ω_s and ω_r so as to cope with infrared thermal images captured by a wide variety of infrared thermal imagers: because thermal imagers from different manufacturers and of different models output different gray-level ranges, normalizing during the calculation maps all gray levels into the interval (0, 1), which removes the response differences between devices and improves the universality of the method.
In one embodiment of the present invention, the two kernels ω_s and ω_r respectively control the weights of the detail components within the filtering window during joint bilateral filtering, where
Figure PCTCN2018096021-appb-000015
Figure PCTCN2018096021-appb-000016
where σ_r and σ_s are the standard deviations of the gray-level intensity domain and of the spatial domain within the filtering window; σ_r defines the range of the Gaussian kernel ω_r and determines the minimum edge amplitude within the filtering window, σ_s defines the range of the Gaussian kernel ω_s and determines the size of the filtering window around the corresponding pixel in the adjacent frame, and this parameter should vary with the size of the whole image. Because the difference between the adjacent frames selected for joint bilateral filtering is very small, if the amplitude change within the filtering window of the two frames is smaller than σ_r, that portion of the gray levels is smoothed by the joint bilateral filter and separated into the base layer; conversely, if the amplitude change exceeds σ_r, it is separated into the detail layer.
Step S2: use the guided gray-level similarity kernel function to control the enhancement range of the detail layer component and eliminate the edge gradient reversal effect, and use the improved histogram calculation method to control the gray-level redistribution of the whole image in the base layer component.
The conventional adaptive Gaussian filtering approach to "ghost" removal is technically very difficult to implement, and the adaptive process cannot even be realized in a hardware system, which greatly reduces the effectiveness of that method. In the present invention, a kernel function is specially designed to effectively suppress the "ghosting" effect. This kernel function is based on a gradient constraint factor: the gradient energy in the detail layer component is generally weak, the stability of these gradients is strongly linked to the instability of the gray levels, and, in terms of the saliency structure of the image, the edge structure of the gradient varies even more sharply than the gray levels do.
The guided gray-level similarity kernel function is expressed as:
f(i-i', j-j') = ω_s(i-i', j-j') ω_e(i-i', j-j') f_k    (6)
where f_k is the kernel function, ω_e is the gradient term, and ω_d is the guided spatial similarity term.
When suppressing the edge gradient reversal effect, the guided gray-level similarity kernel function takes the form:
f_k = α(Ω) ω_r(I_B - I_R) + (1 - α(Ω)) ω_d(i-i', j-j')    (7)
where f_k is the kernel function, ω_e is the gradient term, ω_d is the guided spatial similarity term, and α(Ω) is the adaptive fusion coefficient,
Figure PCTCN2018096021-appb-000017
where σ_d is the standard deviation of the guided spatial similarity term, and α(Ω) is the weight used to fuse the gray-level similarity term with the guided spatial similarity term, expressed as:
Figure PCTCN2018096021-appb-000018
Figure PCTCN2018096021-appb-000019
In these formulas, ε is a constraint factor that prevents the problem of a zero-valued standard deviation. When the gray-level variation becomes small and the spatial fluctuation becomes large, α(Ω) tends to 1 and the intensity similarity term is constrained; conversely, α(Ω) tends to 0 and the spatial similarity term is constrained.
The gradient term ω_e is expressed as:
Figure PCTCN2018096021-appb-000020
Figure PCTCN2018096021-appb-000021
Figure PCTCN2018096021-appb-000022
where x and y denote the horizontal and vertical directions,
Figure PCTCN2018096021-appb-000023
Figure PCTCN2018096021-appb-000024
are the gradients in the horizontal and vertical directions, σ_e is the gradient standard deviation, and G_x/y represents the gradient change level at the corresponding pixel position in the adjacent frame. After this series of processing steps, the ghosting effect can be completely eliminated. FIG. 3(a) to FIG. 3(e) show the layer-by-layer suppression of the ghosting effect in the filtered detail components according to an embodiment of the present invention.
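The gradient term itself is published only as formula images. As a rough illustration under that caveat, a Gaussian weighting of the horizontal and vertical gradient levels, consistent with the symbols G_x/y and σ_e defined above but not necessarily identical to the published formula, might be sketched as follows in Python (the central-difference gradients and the σ_e value are assumptions):

    import numpy as np

    def gradient_term(frame, sigma_e=0.1):
        # Assumed form of the gradient term w_e built from horizontal/vertical gradients.
        f = frame.astype(np.float64)
        Gx = np.zeros_like(f)
        Gy = np.zeros_like(f)
        Gx[:, 1:-1] = (f[:, 2:] - f[:, :-2]) / 2.0  # central differences, horizontal
        Gy[1:-1, :] = (f[2:, :] - f[:-2, :]) / 2.0  # central differences, vertical
        w_e = np.exp(-(Gx ** 2 + Gy ** 2) / (2.0 * sigma_e ** 2))
        return w_e, Gx, Gy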
Once the ghost-suppressed detail component has been correctly extracted, some high-frequency noise is still filtered into the detail layer along with genuine detail, because noise and detail are difficult to distinguish; a suitable criterion is therefore needed to suppress the noise while preserving the detail. In the present invention, the adaptive fusion factor term in the newly designed kernel function is used to achieve this effectively. In flat regions of the image the value of α(Ω) approaches 0, and it rises towards (but not beyond) a level of 1 as the gray level of the pixel fluctuates. To preserve detail while suppressing noise as much as possible, the threshold on this value is set to 0.95 in the present invention; once the value exceeds 0.95 it rises no further. The control expression is as follows:
I'_d = I_d · (α(Ω) · a + b)    (14)
where a and b are control coefficients. In the present invention, image detail and noise are best balanced when a = 0.3 and b = 0.65. FIG. 4 shows the detail enhancement obtained with different control coefficients applied to the detail layer component.
Step S3: superimpose the detail layer image and the base layer image processed in step S2 to restore and enhance the scene of the original base frame infrared image.
In this step, after the detail layer component and the base layer component have been extracted separately, the two components are respectively enhanced and histogram-equalized, and the processing results are superimposed back together, giving the final enhancement and greatly improving the visual experience of the observer.
Specifically, the two sub-images of the detail layer and the base layer processed in step S2 are superimposed to restore and enhance the scene of the original base frame infrared image. The invention effectively overcomes the overly abrupt appearance produced by ordinary infrared detail enhancement methods, so that the processed infrared image not only has excellent scene detail enhancement capability but also a gray-level distribution closer to the real scene, which greatly improves the visual perception of the infrared image; moreover, the method is very easy to implement in hardware on an FPGA and is highly effective at improving thermal imager performance in engineering practice.
From the final results in FIG. 4 to FIG. 7 it can be seen that the method of the present invention brings a clear improvement to actual infrared images, mainly in two respects:
(1) the overall detail of the image is brought out to a very large extent, the image has clear layering and obvious detail, and the visual effect is excellent;
(2) unlike traditional methods, which cause the enhanced image to become over-bright and distorted, the result of the present method keeps the gray-level appearance of the new image very close to the original scene, with no over-brightening that would disturb observation by the human eye.
In conjunction with FIG. 8, it can be seen that the method of the present invention is clearly superior to the traditional methods on the root-mean-square contrast metric. To judge the effect of the method better, a background-foreground fluctuation index is also introduced; this index is an important parameter for characterizing the image enhancement effect. If the standard deviation between a pixel's gray value and the remaining pixels in its neighbouring region is small, the pixel is considered a background pixel; otherwise it is considered a foreground pixel. After the image is processed, if the pixel at the same position is a background pixel its standard deviation should be smaller than the original value, while a foreground pixel's standard deviation should be larger than the original value; the greater the difference, the better the processing effect. The comparison between the method of the invention and the traditional method is shown in the following table:
Figure PCTCN2018096021-appb-000025
Table 1
After all of these comprehensive comparisons, the method of the present invention gives better scene detail enhancement for infrared images, and the method can be implemented in a hardware system, which greatly improves its engineering practicality.
According to the intelligent infrared image scene enhancement method of the embodiments of the present invention, a modified joint bilateral filter jointly computes two adjacent infrared frames, with the first frame as the base frame and the second frame as the reference frame, so as to obtain the detail layer and base layer components of the base frame image; a guided gray-level similarity kernel function is applied to the detail layer component to control the enhancement coefficient and eliminate the edge gradient reversal effect, an improved histogram equalization technique is applied to the base layer to redistribute the gray levels, and finally the two separately processed sub-images are superimposed to restore and enhance the scene of the original base frame infrared image. The invention effectively overcomes the overly abrupt appearance produced by ordinary infrared image detail enhancement methods, so that the processed infrared image not only has excellent scene detail enhancement capability but also a gray-level distribution closer to the real scene, which greatly improves the visual perception of the infrared image. In addition, the invention is very easy to implement in hardware on an FPGA and is highly effective at improving thermal imager performance in engineering practice.
In the description of this specification, reference to the terms "one embodiment", "some embodiments", "example", "specific example" or "some examples" means that a specific feature, structure, material or characteristic described in connection with that embodiment or example is included in at least one embodiment or example of the present invention. In this specification, schematic references to these terms do not necessarily refer to the same embodiment or example, and the specific features, structures, materials or characteristics described may be combined in any suitable manner in any one or more embodiments or examples.
Although embodiments of the present invention have been shown and described above, it should be understood that the above embodiments are exemplary and should not be construed as limiting the present invention; those of ordinary skill in the art may make changes, modifications, substitutions and variations to the above embodiments within the scope of the present invention without departing from the principles and spirit of the present invention. The scope of the present invention is defined by the appended claims and their equivalents.

Claims (7)

  1. An intelligent infrared image scene enhancement method, characterized by comprising the following steps:
    Step S1: jointly computing two adjacent infrared frames with a modified joint bilateral filter, wherein the first of the two adjacent frames is set as the base frame and the second frame is set as the reference frame, to obtain the detail layer component and the base layer component of the base frame image;
    Step S2: using a guided gray-level similarity kernel function to control the enhancement range of the detail layer component and eliminate the edge gradient reversal effect, and using an improved histogram calculation method to control the gray-level redistribution of the whole image in the base layer component;
    Step S3: superimposing the detail layer image and the base layer image processed in step S2 to restore and enhance the scene of the original base frame infrared image.
  2. The intelligent infrared image scene enhancement method according to claim 1, characterized in that, in step S1, the joint calculation of the two adjacent infrared frames is performed using the following formulas:
    Figure PCTCN2018096021-appb-100001
    I_d = I_R - I_JBF
    where I_JBF is the base layer, I_d is the detail layer, I_R is the reference frame, I_B is the base frame, Ω is the filter window size, and k is the normalization coefficient term of the modified joint bilateral filter;
    Figure PCTCN2018096021-appb-100002
    where ω_s and ω_r are two Gaussian kernel functions, ω_s being the spatial-domain kernel and ω_r the intensity-domain kernel; the role of k is to normalize the two solved kernels ω_s and ω_r so as to cope with infrared thermal images captured by a wide variety of infrared thermal imagers.
  3. The intelligent infrared image scene enhancement method according to claim 2, characterized in that the two kernels ω_s and ω_r respectively control the weights of the detail components within the filtering window during joint bilateral filtering, where
    Figure PCTCN2018096021-appb-100003
    Figure PCTCN2018096021-appb-100004
    where σ_r and σ_s are the standard deviations of the gray-level intensity domain and of the spatial domain within the filtering window; σ_r defines the range of the Gaussian kernel ω_r and determines the minimum edge amplitude within the filtering window, σ_s defines the range of the Gaussian kernel ω_s and determines the size of the filtering window around the corresponding pixel in the adjacent frame, and this parameter should vary with the size of the whole image; if the amplitude change within the filtering window of the two frames is smaller than σ_r, that portion of the gray levels is smoothed by the joint bilateral filter and separated into the base layer, and conversely, if the amplitude change exceeds σ_r, it is separated into the detail layer.
  4. The intelligent infrared image scene enhancement method according to claim 1, characterized in that, in step S2, the guided gray-level similarity kernel function is expressed as:
    f(i-i', j-j') = ω_s(i-i', j-j') ω_e(i-i', j-j') f_k
    where f_k is the kernel function, ω_e is the gradient term, and ω_d is the guided spatial similarity term.
  5. The intelligent infrared image scene enhancement method according to claim 4, characterized in that, when suppressing the edge gradient reversal effect, the guided gray-level similarity kernel function takes the form:
    f_k = α(Ω) ω_r(I_B - I_R) + (1 - α(Ω)) ω_d(i-i', j-j')
    where f_k is the kernel function, ω_e is the gradient term, ω_d is the guided spatial similarity term, and α(Ω) is the adaptive fusion coefficient,
    Figure PCTCN2018096021-appb-100005
    where σ_d is the standard deviation of the guided spatial similarity term, and α(Ω) is the weight used to fuse the gray-level similarity term with the guided spatial similarity term, expressed as:
    Figure PCTCN2018096021-appb-100006
    Figure PCTCN2018096021-appb-100007
    where ε is a constraint factor that prevents the problem of a zero-valued standard deviation; when the gray-level variation becomes small and the spatial fluctuation becomes large, α(Ω) tends to 1 and the intensity similarity term is constrained, and conversely α(Ω) tends to 0 and the spatial similarity term is constrained.
  6. The intelligent infrared image scene enhancement method according to claim 5, characterized in that the gradient term ω_e is expressed as:
    Figure PCTCN2018096021-appb-100008
    Figure PCTCN2018096021-appb-100009
    Figure PCTCN2018096021-appb-100010
    where x and y denote the horizontal and vertical directions,
    Figure PCTCN2018096021-appb-100011
    Figure PCTCN2018096021-appb-100012
    are the gradients in the horizontal and vertical directions, σ_e is the gradient standard deviation, and G_x/y represents the gradient change level at the corresponding pixel position in the adjacent frame.
  7. The intelligent infrared image scene enhancement method according to claim 1, characterized in that, in step S3, after the detail layer component and the base layer component have been extracted separately, the two components are respectively enhanced and histogram-equalized, and the processing results are superimposed back together to obtain the final enhancement effect.
PCT/CN2018/096021 2018-01-29 2018-07-17 Intelligent infrared image scene enhancement method WO2019144581A1 (zh)

Applications Claiming Priority (2)

Application Number Priority Date Filing Date Title
CN201810085091.1A CN108376391B (zh) 2018-01-29 2018-01-29 Intelligent infrared image scene enhancement method
CN201810085091.1 2018-01-29

Publications (1)

Publication Number Publication Date
WO2019144581A1 true WO2019144581A1 (zh) 2019-08-01

Family

ID=63016918

Family Applications (1)

Application Number Title Priority Date Filing Date
PCT/CN2018/096021 WO2019144581A1 (zh) 2018-01-29 2018-07-17 Intelligent infrared image scene enhancement method

Country Status (2)

Country Link
CN (1) CN108376391B (zh)
WO (1) WO2019144581A1 (zh)

Families Citing this family (7)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN109741267B (zh) * 2018-12-05 2023-04-25 西安电子科技大学 基于三边滤波和神经网络的红外图像非均匀性校正方法
EP3671625B1 (en) * 2018-12-18 2020-11-25 Axis AB Method, device, and system for enhancing changes in an image captured by a thermal camera
CN110570374B (zh) * 2019-09-05 2022-04-22 湖北南邦创电科技有限公司 一种对红外传感器所获得图像的处理方法
CN111476732B (zh) * 2020-04-03 2021-07-20 江苏宇特光电科技股份有限公司 一种图像融合及去噪的方法及系统
CN113724162B (zh) * 2021-08-31 2023-09-29 南京邮电大学 一种零补光实时全彩夜视成像方法及系统
CN116433035B (zh) * 2023-06-13 2023-09-15 中科数创(临沂)数字科技有限公司 一种基于人工智能的建筑电气火灾风险评估预测方法
CN116452594B (zh) * 2023-06-19 2023-08-29 安徽百胜电子系统集成有限责任公司 一种输电线路状态可视化监测预警方法及系统

Family Cites Families (2)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN103177429A (zh) * 2013-04-16 2013-06-26 南京理工大学 基于fpga的红外图像细节增强系统及其方法
US20170243326A1 (en) * 2016-02-19 2017-08-24 Seek Thermal, Inc. Pixel decimation for an imaging system

Patent Citations (4)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN101399950A (zh) * 2007-09-30 2009-04-01 上海携昌电子科技有限公司 一种基于图像轮廓匹配的帧频加倍的方法
US20110091100A1 (en) * 2009-10-20 2011-04-21 Samsung Electronics Co., Ltd. Apparatus and method of removing false color in image
CN105517671A (zh) * 2015-05-25 2016-04-20 北京大学深圳研究生院 一种基于光流法的视频插帧方法及系统
CN106303546A (zh) * 2016-08-31 2017-01-04 四川长虹通信科技有限公司 一种帧速率上转换方法及系统

Cited By (45)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US11346938B2 (en) 2019-03-15 2022-05-31 Msa Technology, Llc Safety device for providing output to an individual associated with a hazardous environment
CN110852977A (zh) * 2019-10-29 2020-02-28 天津大学 融合边缘灰度直方图与人眼视觉感知特性的图像增强方法
CN110852977B (zh) * 2019-10-29 2023-04-11 天津大学 融合边缘灰度直方图与人眼视觉感知特性的图像增强方法
CN112862665B (zh) * 2019-11-12 2024-01-23 北京华茂通科技有限公司 一种激光驱鸟设备的红外图像动态范围压缩方法
CN112862665A (zh) * 2019-11-12 2021-05-28 北京华茂通科技有限公司 一种激光驱鸟设备的红外图像动态范围压缩方法
CN111080538B (zh) * 2019-11-29 2022-08-16 中国电子科技集团公司第五十二研究所 一种红外融合边缘增强方法
CN111080538A (zh) * 2019-11-29 2020-04-28 中国电子科技集团公司第五十二研究所 一种红外融合边缘增强方法
CN110992287B (zh) * 2019-12-03 2023-02-24 中国电子科技集团公司信息科学研究院 一种非均匀光照视频的清晰化方法
CN110992287A (zh) * 2019-12-03 2020-04-10 中国电子科技集团公司信息科学研究院 一种非均匀光照视频的清晰化方法
CN111369458B (zh) * 2020-02-28 2023-04-07 中国人民解放军空军工程大学 基于多尺度滚动引导滤波平滑的红外弱小目标背景抑制方法
CN111369458A (zh) * 2020-02-28 2020-07-03 中国人民解放军空军工程大学 基于多尺度滚动引导滤波平滑的红外弱小目标背景抑制方法
CN111489319A (zh) * 2020-04-17 2020-08-04 电子科技大学 基于多尺度双边滤波和视觉显著性的红外图像增强方法
CN112819772B (zh) * 2021-01-28 2024-05-03 南京挥戈智能科技有限公司 一种高精度快速图形检测识别方法
CN112819772A (zh) * 2021-01-28 2021-05-18 南京挥戈智能科技有限公司 一种高精度快速图形检测识别方法
CN112950516A (zh) * 2021-01-29 2021-06-11 Oppo广东移动通信有限公司 图像局部对比度增强的方法及装置、存储介质及电子设备
CN112950516B (zh) * 2021-01-29 2024-05-28 Oppo广东移动通信有限公司 图像局部对比度增强的方法及装置、存储介质及电子设备
CN113096053A (zh) * 2021-03-17 2021-07-09 西安电子科技大学 基于多尺度引导滤波的高动态红外图像细节增强方法
CN113096053B (zh) * 2021-03-17 2024-02-09 西安电子科技大学 基于多尺度引导滤波的高动态红外图像细节增强方法
CN113421305A (zh) * 2021-06-29 2021-09-21 上海高德威智能交通系统有限公司 目标检测方法、装置、系统、电子设备及存储介质
CN113421305B (zh) * 2021-06-29 2023-06-02 上海高德威智能交通系统有限公司 目标检测方法、装置、系统、电子设备及存储介质
CN113592729A (zh) * 2021-06-30 2021-11-02 国网吉林省电力有限公司延边供电公司 基于nsct域的电力设备红外图像增强方法
CN113487525A (zh) * 2021-07-06 2021-10-08 河南慧联世安信息技术有限公司 一种基于双平台直方图的自迭代红外图像增强方法
CN113763367A (zh) * 2021-09-13 2021-12-07 中国空气动力研究与发展中心超高速空气动力研究所 一种大尺寸试件红外检测特征综合判读方法
CN113763368A (zh) * 2021-09-13 2021-12-07 中国空气动力研究与发展中心超高速空气动力研究所 一种大尺寸试件多类型损伤检测特征分析方法
CN113822352B (zh) * 2021-09-15 2024-05-17 中北大学 基于多特征融合的红外弱小目标检测方法
CN113822352A (zh) * 2021-09-15 2021-12-21 中北大学 基于多特征融合的红外弱小目标检测方法
CN113902641A (zh) * 2021-10-12 2022-01-07 西安交通大学 一种基于红外图像的数据中心热区判别方法及系统
CN113902641B (zh) * 2021-10-12 2023-09-12 西安交通大学 一种基于红外图像的数据中心热区判别方法及系统
CN113822878A (zh) * 2021-11-18 2021-12-21 南京智谱科技有限公司 一种红外图像处理的方法及装置
CN114092353B (zh) * 2021-11-19 2024-06-04 长春理工大学 一种基于加权引导滤波的红外图像增强方法
CN114092353A (zh) * 2021-11-19 2022-02-25 长春理工大学 一种基于加权引导滤波的红外图像增强方法
CN114742732B (zh) * 2022-04-19 2024-05-28 武汉博宇光电系统有限责任公司 一种基于细节丰富度的红外图像增强方法
CN114742732A (zh) * 2022-04-19 2022-07-12 武汉博宇光电系统有限责任公司 一种基于细节丰富度的红外图像增强方法
CN114862739B (zh) * 2022-07-06 2022-09-23 珠海市人民医院 一种医学影像智能增强方法和系统
CN114862739A (zh) * 2022-07-06 2022-08-05 珠海市人民医院 一种医学影像智能增强方法和系统
CN115619659B (zh) * 2022-09-22 2024-01-23 北方夜视科技(南京)研究院有限公司 基于正则化高斯场模型的低照度图像增强方法与系统
CN115619659A (zh) * 2022-09-22 2023-01-17 北方夜视科技(南京)研究院有限公司 基于正则化高斯场模型的低照度图像增强方法与系统
CN115797453B (zh) * 2023-01-17 2023-06-16 西南科技大学 一种红外微弱目标的定位方法、定位装置及可读存储介质
CN115797453A (zh) * 2023-01-17 2023-03-14 西南科技大学 一种红外微弱目标的定位方法、定位装置及可读存储介质
CN116245880A (zh) * 2023-05-09 2023-06-09 深圳市银河通信科技有限公司 基于红外识别的电动车充电桩火灾风险检测方法
CN116342588B (zh) * 2023-05-22 2023-08-11 徕兄健康科技(威海)有限责任公司 一种脑血管图像增强方法
CN117078568A (zh) * 2023-10-12 2023-11-17 成都智明达电子股份有限公司 一种红外图像增强的方法
CN117078568B (zh) * 2023-10-12 2024-02-23 成都智明达电子股份有限公司 一种红外图像增强的方法
CN117853817A (zh) * 2024-01-24 2024-04-09 江苏电子信息职业学院 一种基于图像识别的智慧社区垃圾分类报警管理方法
CN117853817B (zh) * 2024-01-24 2024-06-04 江苏电子信息职业学院 一种基于图像识别的智慧社区垃圾分类报警管理方法

Also Published As

Publication number Publication date
CN108376391A (zh) 2018-08-07
CN108376391B (zh) 2022-04-05

Similar Documents

Publication Publication Date Title
WO2019144581A1 (zh) Intelligent infrared image scene enhancement method
Chen et al. Robust image and video dehazing with visual artifact suppression via gradient residual minimization
Zhou et al. Retinex-based laplacian pyramid method for image defogging
CN110796616B (zh) 基于范数约束和自适应加权梯度的湍流退化图像恢复方法
Li et al. Contrast enhancement based single image dehazing via TV-L 1 minimization
Liu et al. Single color image dehazing based on digital total variation filter with color transfer
CN107451986B (zh) 一种基于融合技术的单幅红外图像增强方法
Dou et al. Image smoothing via truncated total variation
Chen et al. Improve transmission by designing filters for image dehazing
CN111652809A (zh) 一种增强细节的红外图像噪声抑制方法
CN111899200B (zh) 一种基于3d滤波的红外图像增强方法
Pei et al. Joint edge detector based on Laplacian pyramid
WO2022016326A1 (zh) 图像处理方法、电子设备和计算机可读介质
Yin et al. Combined window filtering and its applications
Zhou et al. Single image dehazing based on weighted variational regularized model
CN110136081A (zh) 一种基于高斯核偏态校正弥撒滤波器的图像增强方法
Dou et al. Image dehaze using alternating Laplacian and Beltrami regularizations
Elhefnawy et al. Effective visibility restoration and enhancement of air polluted images with high information fidelity
Yin et al. Low illumination image Retinex enhancement algorithm based on guided filtering
Chaudhry et al. Model-assisted content adaptive detail enhancement and quadtree decomposition for image visibility enhancement
Naseeba et al. KP Visibility Restoration of Single Hazy Images Captured in Real-World Weather Conditions
Hu et al. A method for dehazed image quality assessment
Park et al. Variational image dehazing using a fuzzy membership function
Rajendran et al. A pixel-based color transfer system to recolor nighttime imagery
Valderrama et al. Single image dehazing using local adaptive signal processing

Legal Events

Code - Description
121 - EP: the EPO has been informed by WIPO that EP was designated in this application (Ref document number: 18902385; Country of ref document: EP; Kind code of ref document: A1)
NENP - Non-entry into the national phase (Ref country code: DE)
122 - EP: PCT application non-entry in European phase (Ref document number: 18902385; Country of ref document: EP; Kind code of ref document: A1)