WO2017014396A1 - Device and method for wavelength-adaptive image dehazing - Google Patents

Device and method for wavelength-adaptive image dehazing

Info

Publication number
WO2017014396A1
WO2017014396A1 (PCT/KR2016/002359; KR2016002359W)
Authority
WO
WIPO (PCT)
Prior art keywords
fog
image
input image
region
wavelength
Prior art date
Application number
PCT/KR2016/002359
Other languages
English (en)
Korean (ko)
Inventor
백준기
윤인혜
정석화
Original Assignee
중앙대학교 산학협력단
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by 중앙대학교 산학협력단 filed Critical 중앙대학교 산학협력단
Priority to CN201680030045.0A (CN107636724A)
Publication of WO2017014396A1

Classifications

    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06T IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T 7/00 Image analysis
    • G06T 7/10 Segmentation; Edge detection
    • G06T 7/11 Region-based segmentation
    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06T IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T 5/00 Image enhancement or restoration
    • G06T 5/10 Image enhancement or restoration using non-spatial domain filtering
    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06T IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T 5/00 Image enhancement or restoration
    • G06T 5/73 Deblurring; Sharpening
    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06T IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T 5/00 Image enhancement or restoration
    • G06T 5/90 Dynamic range modification of images or parts thereof
    • G06T 5/94 Dynamic range modification of images or parts thereof based on local image properties, e.g. for local contrast enhancement
    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06T IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T 7/00 Image analysis
    • G06T 7/10 Segmentation; Edge detection
    • G06T 7/136 Segmentation; Edge detection involving thresholding

Definitions

  • Embodiments of the present invention relate to an apparatus and method for wavelength-adaptive fog removal (dehazing) from an image.
  • The embodiments can be applied to external video surveillance systems, e.g., drone video, black-box video, and CCTV video.
  • The method proposed by Narasimhan acquires two images at the same location whose brightness differs because of atmospheric degradation, and then removes the fog by estimating the transmission and the depth information associated with those degradation factors.
  • Schwartz's method uses two images captured at the same location through different polarization filters. Because the intensity of the light transmitted through each polarization filter differs, the degree of polarization is computed from the two images, and the polarized fog component is removed using that value.
  • The method proposed by Fattal removes the fog under the assumption that the reflectance measured in the image is constant and that the direction of light reflection at a given location is always the same.
  • Tan's method removes atmospheric degradation by exploiting the observation that images with fog have lower contrast than images without fog; however, it increases the contrast excessively, causing partial color distortion.
  • Kong's method obtains an image with improved contrast by performing histogram equalization across the pixels of the image and segmenting the image locally.
  • The method proposed by Xu enhances contrast and clarity by using a virtual histogram distribution technique based on parameter control of the image.
  • The above methods can improve the visibility of the image, but they have difficulty restoring the colors of subjects in a foggy image.
  • Accordingly, the present invention proposes a fog removal apparatus and method that reduce the amount of computation and remove fog adaptively with respect to wavelength.
  • A segmentation unit for segmenting the input image including the fog component by using a brightness value histogram of the input image to generate a level image including a plurality of region classes, wherein the plurality of region classes are composed of a sky region class, a ground region class, and a vertical region class;
  • a transmission map generator for generating a transmission map for removing the fog component by using the input image and the level image;
  • a fog value estimator for estimating a fog value corresponding to the fog component by using the input image and the sky region class;
  • and an image restoration unit which generates a restored image by removing the fog component using the transmission map and the fog value.
  • Here, the brightness value histogram may be divided into a dark region, a middle region, and a bright region according to preset brightness reference values.
  • The transmission map may be expressed as in the following equation.
  • T(λ) is the transmission map
  • β(λ) is the atmospheric scattering coefficient
  • λ is the wavelength
  • γ is the wavelength index
  • W_ij(g_L(λ)) is the filter kernel
  • g_L(λ) is the level image
  • g(λ) is the input image
  • ω_k is the window with the center pixel k
  • |ω_k| is the number of pixels in ω_k
  • i and j are the coordinates of a pixel
  • ε is the normalization parameter
  • μ_k and σ_k² are the mean and the variance of g_L(λ) over ω_k, respectively
  • the wavelength may include a red wavelength, a green wavelength, and a blue wavelength
  • the scattering coefficient of the atmosphere may have a different value for each of the red wavelength, the green wavelength, and the blue wavelength.
  • Here, the atmospheric scattering coefficient is the scattering coefficient for a 60-degree angle of the camera acquiring the input image, and may be expressed by the following equation.
  • the fog value may be expressed as in the following equation.
  • A(λ) is the fog value
  • g(λ) is the input image
  • g_SL(λ) is the level image of the sky region class
  • max() means a maximization function for comparing brightness values.
  • FIG. 1 illustrates a model for forming a wavelength adaptive fog image according to an embodiment of the present invention.
  • FIG. 2 is a diagram illustrating a schematic configuration of an apparatus for wavelength-adaptive fog removal from an image according to an embodiment of the present invention.
  • FIG. 3 is a flowchart illustrating a wavelength adaptive fog removing method of an image according to an exemplary embodiment of the present invention.
  • FIGS. 4 to 6 are views for explaining the operation of the wavelength-adaptive fog removal apparatus and method according to an embodiment of the present invention.
  • FIG. 7 is a view showing a simulation result of the wavelength adaptive fog removing device and method according to an embodiment of the present invention.
  • FIG. 1 illustrates a model for forming a wavelength adaptive fog image according to an embodiment of the present invention.
  • Referring to FIG. 1, an image obtained from a camera mounted on an unmanned aerial vehicle (UAV), that is, the input image, is represented by Equation 1 below.
  • g(λ) is the input image
  • λ (∈ {λ_red, λ_green, λ_blue}) is the wavelength
  • f(λ) is the fog-free image
  • f_sun(λ) is the light that is not scattered in the atmosphere
  • f_sky(λ) denotes the light scattered in the atmosphere
  • A(λ) denotes the atmospheric fog value
  • T(λ) denotes the transmission map.
  • Here, f(λ) is the image without fog, whereas the scattered light f_sky(λ) causes distortion of the subject and of the natural colors of the image.
  • In Equation 1, f(λ)T(λ) is the light that enters the camera without being affected by degradation factors at the object, that is, the direct attenuation, and A(λ)(1 − T(λ)) is the atmospheric scattered light, that is, light scattered by the fog before reaching the camera.
  • Therefore, the fog value A(λ) and the transmission map T(λ) are estimated from the observed image g(λ) in order to restore the fog-free image; a minimal sketch of this formation model is given below.
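  • As an illustration, the two terms above combine as g(λ) = f(λ)T(λ) + A(λ)(1 − T(λ)); the following sketch applies this formation model per color channel (array names and shapes are assumptions, not part of the patent):
```python
import numpy as np

def synthesize_fog(f, T, A):
    """Apply the formation model g = f*T + A*(1 - T) per color channel.

    f : HxWx3 fog-free image with values in [0, 1]
    T : HxWx1 or HxWx3 transmission map with values in [0, 1]
    A : length-3 fog value (one scalar per wavelength/channel)
    """
    A = np.asarray(A, dtype=float).reshape(1, 1, -1)
    return f * T + A * (1.0 - T)
```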
  • FIG. 2 is a diagram illustrating a schematic configuration of an apparatus for wavelength-adaptive fog removal from an image according to an embodiment of the present invention.
  • Referring to FIG. 2, the fog removal apparatus 200 includes an input unit 210, a segmentation unit 220, a transmission map generator 230, a fog value estimator 240, and an image restoration unit 250.
  • FIG. 3 is a flowchart illustrating a wavelength adaptive fog removing method of an image according to an exemplary embodiment of the present invention.
  • In step 310, the input unit 210 receives an image obtained from the camera.
  • The input image is an image including a fog component, and may be an unmanned aerial vehicle image, a black box image, or a CCTV image.
  • Hereinafter, it is assumed that the input image is obtained from an unmanned aerial vehicle, but the present invention is not limited thereto.
  • In step 320, the segmentation unit 220 generates a level image including a plurality of region classes by segmenting the input image including the fog component.
  • Here, the plurality of region classes are composed of a sky region class, a ground region class, and a vertical region class, where the vertical region class includes vertical structures such as buildings.
  • In general, fog of uniform turbidity is not distributed evenly because of the visibility distance and external influences, and the farther away an object is, the weaker the wavelength dependence of the scattering becomes.
  • Therefore, in the present invention, the transmission map is estimated by generating a level image using geometric-class-based image classification and adaptive region merging.
  • The geometric classes are divided into three classes, that is, a sky region class, a ground region class, and a vertical region class, for efficient segmentation of a fog image.
  • The segmentation unit 220 may segment the input image into the sky region class, the ground region class, and the vertical region class by using the brightness value histogram of the input image.
  • Specifically, the segmentation unit 220 divides the image into small regions while preserving object boundaries by using the superpixels illustrated in FIG. 4, and performs region merging using the brightness value histogram distribution illustrated in FIG. 5 to obtain the sky region class, the ground region class, and the vertical region class. This is described in more detail below, starting with the superpixel step sketched after this paragraph.
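  • The patent does not name a specific superpixel algorithm; as an assumption, the sketch below uses SLIC from scikit-image to obtain small, boundary-preserving regions together with the per-region brightness later used for histogram-based merging (the file name is hypothetical):
```python
import numpy as np
from skimage.io import imread
from skimage.segmentation import slic

g = imread("fog_input.png")  # hypothetical RGB fog image

# Over-segment into small regions that preserve object boundaries.
superpixels = slic(g, n_segments=300, compactness=10, start_label=0)

# Mean brightness per superpixel, used later for histogram-based merging.
gray = g.mean(axis=2)
mean_brightness = np.array(
    [gray[superpixels == s].mean() for s in np.unique(superpixels)]
)
```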
  • In FIG. 4A, a sample input image including five superpixels s_i is illustrated.
  • The segmentation unit 220 performs segmentation by evaluating different hypotheses in order to assign each superpixel to one of the three classes (sky region class, ground region class, and vertical region class), {S, V, G}.
  • C is the level confidence value
  • P is the likelihood
  • h_ji is the j-th hypothesis for s_i
  • b_i is the level of the superpixel
  • b_ji is the level of h_ji, respectively.
  • For example, the level confidence value of superpixel s_1 for the sky region class may be expressed as Equation 3 below.
  • First, the segmentation unit 220 estimates superpixels from small, homogeneous areas of the image.
  • the segmentation unit 220 merges adjacent areas using the brightness value histogram classification.
  • The segmentation unit 220 uniformly quantizes each color channel into 16 levels and estimates the histogram of each region in the feature space.
  • The segmentation unit 220 then merges image regions according to brightness value ranges such as dark, medium, and bright.
  • The similarity measure between the regions R_A and R_C may be expressed by Equation 4 below (a sketch of one such histogram-based similarity follows).
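  • Equation 4 itself is not reproduced here; purely as an illustration, one common similarity between two region histograms is the Bhattacharyya coefficient over their 16-level histograms (an assumption, not necessarily the measure used in the patent):
```python
import numpy as np

def region_histogram(brightness_values, bins=16):
    """Normalized 16-level histogram of the brightness values in one region."""
    hist, _ = np.histogram(brightness_values, bins=bins, range=(0, 256))
    return hist / max(hist.sum(), 1)

def histogram_similarity(h_a, h_c):
    """Bhattacharyya coefficient between two normalized histograms (1 = identical)."""
    return float(np.sum(np.sqrt(h_a * h_c)))
```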
  • The segmentation unit 220 computes a plurality of level hypotheses, n_s, based on simple cues such as color, texture, and shape.
  • the segmentation unit 220 determines an optimal class for each region using the estimated probability.
  • The level of a superpixel may be expressed as in Equation 6 below; a sketch of combining these hypothesis votes follows the list.
  • v is the value of a possible level
  • N_h is the number of hypotheses
  • P(h_ji | g(λ)) is the homogeneity likelihood value
  • P(b_j = v | h_ji) is the level likelihood, respectively.
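  • A minimal sketch of combining such hypothesis votes into one label per superpixel (the array names and shapes are assumptions; the exact form of Equation 6 is not reproduced here):
```python
import numpy as np

def label_superpixels(homogeneity, level_likelihood):
    """Pick the most likely class per superpixel from weighted hypothesis votes.

    homogeneity      : (n_superpixels, n_hypotheses) homogeneity likelihoods
    level_likelihood : (n_superpixels, n_hypotheses, 3) level likelihoods for
                       the classes {sky, ground, vertical}
    """
    votes = (homogeneity[..., None] * level_likelihood).sum(axis=1)
    return votes.argmax(axis=1)  # class index per superpixel
```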
  • In other words, the segmentation unit 220 obtains a histogram by uniformly quantizing each color channel of the foggy input image (FIG. 5A), divides the histogram feature space by brightness value into three regions, namely a dark region, a middle region, and a bright region (FIG. 5B), and merges adjacent regions of the segmented image that have similar characteristics (FIG. 5C).
  • Then, information such as color, texture, and shape is used to classify each region into the sky region class, the vertical region class, or the ground region class, and the regions are labeled to generate the level image (FIG. 5D).
  • Here, the brightness value histogram is partitioned by preset brightness reference values; in FIG. 5B, for example, the brightness reference values are 100 and 200, as in the sketch below.
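  • A minimal sketch of this brightness-based partition and of the 16-level channel quantization, assuming 8-bit brightness values and the example thresholds 100 and 200 from FIG. 5B:
```python
import numpy as np

def brightness_regions(gray, low=100, high=200):
    """Split 8-bit brightness values into dark / middle / bright masks."""
    dark = gray < low
    middle = (gray >= low) & (gray < high)
    bright = gray >= high
    return dark, middle, bright

def quantized_histogram(channel, levels=16):
    """Uniformly quantize one 8-bit color channel into 16 levels and histogram it."""
    q = channel.astype(np.uint16) * levels // 256
    return np.bincount(q.ravel(), minlength=levels)
```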
  • In step 330, the transmission map generator 230 generates a transmission map for removing the fog component by using the input image and the level image.
  • Here, the transmission map has a wavelength-adaptive characteristic.
  • That is, the transmission map according to the present invention removes fog adaptively with respect to wavelength by using the segmentation-based level image described above.
  • The transmission map may be expressed by Equation 7 below.
  • T(λ) is the transmission map
  • β(λ) is the atmospheric scattering coefficient
  • λ is the wavelength
  • γ is the wavelength index (for example, 1.5)
  • W_ij(g_L(λ)) is the filter kernel
  • g(λ) is the input image
  • ω_k is the window with the center pixel k
  • |ω_k| is the number of pixels in ω_k
  • i and j are the coordinates of a pixel
  • ε is the normalization parameter
  • μ_k and σ_k² are the mean and the variance of g_L(λ) over ω_k, respectively.
  • By using the edge-preserving property of this filter, a context-adaptive image that retains the edge characteristics of the level image can be generated, as sketched below.
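  • The window, mean, variance, and normalization terms defined above match the standard guided image filter; as an illustration only (not a verbatim transcription of Equation 7), an edge-preserving refinement guided by the level image can be sketched as follows:
```python
import numpy as np
from scipy.ndimage import uniform_filter

def guided_filter(guide, src, radius=20, eps=1e-3):
    """Edge-preserving smoothing of `src` using `guide` (both HxW floats in [0, 1]).

    The implicit kernel W_ij(guide) depends on the window means, variances,
    and the normalization parameter eps, as in the definitions above.
    """
    size = 2 * radius + 1
    mean_g = uniform_filter(guide, size)
    mean_s = uniform_filter(src, size)
    var_g = uniform_filter(guide * guide, size) - mean_g * mean_g
    cov_gs = uniform_filter(guide * src, size) - mean_g * mean_s

    a = cov_gs / (var_g + eps)
    b = mean_s - a * mean_g
    return uniform_filter(a, size) * guide + uniform_filter(b, size)
```
  • In a wavelength-adaptive setting, the refined per-channel transmission could then be raised to a channel-dependent power derived from the scattering coefficients β(λ); this last step is stated here as an assumption consistent with the definitions above, not as the patent's exact Equation 7.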
  • the wavelength may include a red wavelength, a green wavelength, and a blue wavelength
  • the scattering coefficient of the atmosphere may have a different value for each of the red wavelength, the green wavelength, and the blue wavelength.
  • Here, the atmospheric scattering coefficient is the scattering coefficient for a 60-degree angle of the camera that acquires the input image, and may be expressed by Equation 8 below; an illustrative wavelength dependence is sketched after this paragraph.
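  • Equation 8 is not reproduced here; purely to illustrate the wavelength dependence, a power-law of the form β(λ) ∝ λ^(−γ) with the wavelength index γ (for example, 1.5) gives relative coefficients such as the following (an assumption, not the patent's formula):
```python
# Representative wavelengths in nanometers (assumed values).
WAVELENGTHS_NM = {"red": 620.0, "green": 540.0, "blue": 450.0}

def relative_scattering(gamma=1.5, reference="red"):
    """Scattering coefficients relative to the reference channel, beta ~ lambda**-gamma."""
    ref = WAVELENGTHS_NM[reference]
    return {c: (ref / lam) ** gamma for c, lam in WAVELENGTHS_NM.items()}

# Blue light scatters more strongly than red, so its transmission decays faster.
print(relative_scattering())
```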
  • FIG. 6A illustrates the transmission map corresponding to the input image illustrated in FIG. 5A.
  • In step 340, the fog value estimator 240 estimates the fog value corresponding to the fog component by using the input image and the sky region class.
  • Specifically, the fog value estimator 240 estimates the fog value by using, together with the input image, the image leveled as the sky region class among the three classes, taking the highest brightness value of each channel within the sky region as the fog value. This can be expressed as Equation 9 and is sketched below.
  • g_SL(λ) denotes the level image of the sky region class
  • max() denotes a maximization function for comparing brightness values.
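  • A minimal sketch of this per-channel fog value estimate, assuming a boolean sky mask obtained from the segmentation step (variable names are hypothetical):
```python
import numpy as np

def estimate_fog_value(g, sky_mask):
    """Per-channel fog value A(lambda): the maximum brightness within the sky region.

    g        : HxWx3 input image
    sky_mask : HxW boolean mask of pixels labeled as the sky region class
    """
    sky_pixels = g[sky_mask]        # N x 3 brightness values inside the sky region
    return sky_pixels.max(axis=0)   # one fog value per wavelength/channel
```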
  • FIG. 6B illustrates the level image of the sky region class according to an embodiment of the present invention.
  • In step 350, the image restoration unit 250 generates a restored image by removing the fog component using the transmission map and the fog value. This is expressed as Equation 10 below; the standard inversion is sketched after this paragraph.
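  • Equation 10 is not reproduced here; the standard inversion of the formation model in Equation 1, with a small lower bound on the transmission to avoid division by near-zero values (the bound is an assumption), looks like this:
```python
import numpy as np

def restore_image(g, T, A, t_min=0.1):
    """Invert g = f*T + A*(1 - T) to recover the fog-free image f.

    g : HxWx3 input image, T : HxWx1 or HxWx3 transmission map,
    A : length-3 fog value; t_min keeps the division well conditioned.
    """
    A = np.asarray(A, dtype=float).reshape(1, 1, -1)
    f = (g - A) / np.maximum(T, t_min) + A
    return np.clip(f, 0.0, 1.0)
```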
  • FIG. 6C illustrates the restored image according to an embodiment of the present invention.
  • FIG. 7 is a view showing a simulation result of the wavelength adaptive fog removing device and method according to an embodiment of the present invention.
  • FIG. 7 (a) shows an input image
  • FIG. 7 (b) shows the transmission map according to the present invention
  • FIG. 7 (c) shows the restored image, in which no color distortion can be observed.
  • Embodiments of the present invention described above can be implemented in the form of program instructions that are executable by various computer means and recorded on a computer-readable medium.
  • the computer readable medium may include program instructions, data files, data structures, etc. alone or in combination.
  • Program instructions recorded on the media may be those specially designed and constructed for the purposes of the present invention, or they may be of the kind well-known and available to those having skill in the computer software arts.
  • Examples of computer-readable recording media include magnetic media such as hard disks, floppy disks, and magnetic tape; optical media such as CD-ROMs and DVDs; magneto-optical media; and hardware devices such as ROM, RAM, and flash memory that are specially configured to store and execute program instructions.
  • the hardware device described above may be configured to operate as one or more software modules to perform the operations of one embodiment of the present invention, and vice versa.

Landscapes

  • Engineering & Computer Science (AREA)
  • Physics & Mathematics (AREA)
  • General Physics & Mathematics (AREA)
  • Theoretical Computer Science (AREA)
  • Computer Vision & Pattern Recognition (AREA)
  • Image Processing (AREA)
  • Studio Devices (AREA)

Abstract

The invention relates to a device and a method for wavelength-adaptive image dehazing. The device for wavelength-adaptive dehazing of an image according to the invention comprises: a segmentation unit for generating a level image comprising a plurality of region classes by segmenting an input image that includes a fog component using a brightness value histogram of the input image, the plurality of region classes comprising a sky region class, a ground region class, and a vertical region class; a transmission map generation unit for generating a transmission map for removing the fog component by using the input image and the level image; a fog value estimation unit for estimating a fog value corresponding to the fog component by using the input image and the sky region class; and an image restoration unit for generating a restored image in which the fog component is removed by using the transmission map and the fog value.
PCT/KR2016/002359 2015-07-17 2016-03-09 Dispositif et procédé pour débrumage d'image adaptatif à la longueur d'onde WO2017014396A1 (fr)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN201680030045.0A CN107636724A (zh) 2015-07-17 2016-03-09 影像的波长自适应去雾装置及方法

Applications Claiming Priority (2)

Application Number Priority Date Filing Date Title
KR10-2015-0101648 2015-07-17
KR1020150101648A KR101582779B1 (ko) 2015-07-17 2015-07-17 영상의 파장 적응적 안개 제거 장치 및 방법

Publications (1)

Publication Number Publication Date
WO2017014396A1 true WO2017014396A1 (fr) 2017-01-26

Family

ID=55165499

Family Applications (1)

Application Number Title Priority Date Filing Date
PCT/KR2016/002359 WO2017014396A1 (fr) 2015-07-17 2016-03-09 Dispositif et procédé pour débrumage d'image adaptatif à la longueur d'onde

Country Status (3)

Country Link
KR (1) KR101582779B1 (fr)
CN (1) CN107636724A (fr)
WO (1) WO2017014396A1 (fr)


Families Citing this family (3)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
KR101827361B1 (ko) 2016-12-08 2018-02-08 한국항공대학교산학협력단 안개 제거를 위한 대기 강도 추정 장치 및 방법
CN108805190B (zh) * 2018-05-30 2021-06-22 北京奇艺世纪科技有限公司 一种图像处理方法及装置
CN111539891A (zh) * 2020-04-27 2020-08-14 高小翎 单张遥感图像的波段自适应除雾优化处理方法


Family Cites Families (4)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US8396324B2 (en) * 2008-08-18 2013-03-12 Samsung Techwin Co., Ltd. Image processing method and apparatus for correcting distortion caused by air particles as in fog
CN103020917B (zh) * 2012-12-29 2015-05-27 中南大学 一种基于显著性检测的中国古代书法绘画图像复原方法
CN104166968A (zh) * 2014-08-25 2014-11-26 广东欧珀移动通信有限公司 一种图像去雾的方法、装置及移动终端
CN104182943B (zh) * 2014-08-27 2015-12-02 湖南大学 一种融合人眼视觉特性的单幅图像去雾方法

Patent Citations (5)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
KR20100021952A (ko) * 2008-08-18 2010-02-26 삼성테크윈 주식회사 안개 등의 대기 산란 입자로 인한 왜곡 보정용 영상 처리 방법 및 장치
KR20140017776A (ko) * 2012-08-01 2014-02-12 엠텍비젼 주식회사 영상 처리 장치 및 영상 내의 안개 제거 방법
KR20140026747A (ko) * 2012-08-23 2014-03-06 중앙대학교 산학협력단 Hsv 색상 공간에서 영상의 안개 제거 장치 및 방법, 그리고 그 방법을 컴퓨터에서 실행시키기 위한 프로그램을 기록한 기록매체
KR20140140163A (ko) * 2013-05-28 2014-12-09 전남대학교산학협력단 사용자 제어가 가능한 거듭제곱근 연산자를 이용한 안개영상 개선 장치
KR20140142381A (ko) * 2013-05-28 2014-12-12 삼성테크윈 주식회사 단일영상 내의 안개 제거 방법 및 장치

Non-Patent Citations (1)

* Cited by examiner, † Cited by third party
Title
YOON ET AL.: "Wavelength-Adaptive Dehazing Using Histogram Merging-Based Classification for UAV Images", Sensors, vol. 15, no. 3, 2015, XP055348555 *

Cited By (5)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US10970824B2 (en) * 2016-06-29 2021-04-06 Nokia Technologies Oy Method and apparatus for removing turbid objects in an image
CN111489420A (zh) * 2020-02-25 2020-08-04 天津大学 一种多光谱遥感卫星影像的雾霾分量模拟方法
CN111738941A (zh) * 2020-06-05 2020-10-02 大连海事大学 融合光场和偏振信息的水下图像优化方法
CN111738941B (zh) * 2020-06-05 2023-08-29 大连海事大学 融合光场和偏振信息的水下图像优化方法
CN116188331A (zh) * 2023-04-28 2023-05-30 淄博市淄川区市政环卫服务中心 一种建设工程施工状态变化监测方法及系统

Also Published As

Publication number Publication date
KR101582779B1 (ko) 2016-01-06
CN107636724A (zh) 2018-01-26

Similar Documents

Publication Publication Date Title
WO2017014396A1 (fr) Dispositif et procédé pour débrumage d'image adaptatif à la longueur d'onde
WO2014142417A1 (fr) Système pour améliorer une image de luminance brumeuse à l'aide d'un modèle d'estimation de réduction de la brume
Carr et al. Improved single image dehazing using geometry
WO2014193055A1 (fr) Appareil permettant d'améliorer une image floue à l'aide d'un opérateur racinaire pouvant être commandé par l'utilisateur
TWI808406B (zh) 圖像去霧方法和使用圖像去霧方法的圖像去霧設備
WO2013165210A1 (fr) Appareil de traitement d'image pour éliminer de la brume contenue dans une image fixe et procédé à cet effet
Tan et al. Fast single-image defogging
CN111598800B (zh) 基于空间域同态滤波和暗通道先验的单幅图像去雾方法
CN110335210B (zh) 一种水下图像复原方法
Patel et al. A review on methods of image dehazing
Wu et al. Image haze removal: Status, challenges and prospects
Toka et al. A fast method of fog and haze removal
Al-Zubaidy et al. Removal of atmospheric particles in poor visibility outdoor images
Yang et al. A novel single image dehazing method
CN116433513A (zh) 一种道路监视视频去雾方法、系统、电子设备和存储介质
Chu et al. A content-adaptive method for single image dehazing
CN116757949A (zh) 一种大气-海洋散射环境退化图像复原方法及系统
CN109191405B (zh) 一种基于透射率全局估计的航空影像去雾算法
Mai et al. The latest challenges and opportunities in the current single image dehazing algorithms
CN107203979B (zh) 一种低照度图像增强的方法
CN115439349A (zh) 一种基于图像增强的水下slam优化方法
Mittal et al. IoT based image defogging system for road accident control during winters
Thomas et al. Effectual single image dehazing with color correction transform and dark channel prior
CN107845078B (zh) 一种元数据辅助的无人机图像多线程清晰化方法
Kaur et al. Comparative study on various single image defogging techniques

Legal Events

Date Code Title Description
121 Ep: the epo has been informed by wipo that ep was designated in this application

Ref document number: 16827900

Country of ref document: EP

Kind code of ref document: A1

NENP Non-entry into the national phase

Ref country code: DE

122 Ep: pct application non-entry in european phase

Ref document number: 16827900

Country of ref document: EP

Kind code of ref document: A1