CN102999903B - Method for quantitatively evaluating illumination consistency of remote sensing images - Google Patents

Method for quantitatively evaluating illumination consistency of remote sensing images Download PDF

Info

Publication number
CN102999903B
CN102999903B CN201210455359.9A
Authority
CN
China
Prior art keywords
image
illumination
images
histogram
average
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Expired - Fee Related
Application number
CN201210455359.9A
Other languages
Chinese (zh)
Other versions
CN102999903A (en)
Inventor
陈强
孙权森
夏德深
张国际
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Nanjing University of Science and Technology
Original Assignee
Nanjing University of Science and Technology
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Nanjing University of Science and Technology filed Critical Nanjing University of Science and Technology
Priority to CN201210455359.9A priority Critical patent/CN102999903B/en
Publication of CN102999903A publication Critical patent/CN102999903A/en
Application granted granted Critical
Publication of CN102999903B publication Critical patent/CN102999903B/en
Expired - Fee Related legal-status Critical Current
Anticipated expiration legal-status Critical

Links

Landscapes

  • Image Analysis (AREA)
  • Image Processing (AREA)

Abstract

The invention discloses a method for quantitatively evaluating the illumination consistency of remote sensing images. On the basis of dark channels and bilateral filtering, elevation information is extracted from each single remote sensing image. In a first step the air-light vector is estimated: the dark-channel image of the input image is computed, the brightest 0.1% of its pixels are extracted, and among the corresponding pixels of the input image the one with the highest intensity is taken as the air-light vector. In a second step the optimized transmission image and the depth-of-field image are computed: the dark image, i.e. the minimum of the three channels of the color image, is computed; the local mean and local standard deviation of the dark-channel image are computed, and their difference gives the atmospheric veil; the depth-of-field image is then obtained by subtracting from 1 the quotient of the atmospheric veil and the mean of the air-light vector. The heights of objects in the photographed scene can be estimated from the depth-of-field image.

Description

Method for quantitatively evaluating the illumination consistency of remote sensing images
Technical field
The invention belongs to the field of quantitative evaluation methods, and in particular relates to a method for quantitatively evaluating the illumination consistency of remote sensing images.
Background art
High-resolution image comparison is a simple and effective method of image quality evaluation; its prerequisite is that the two high- and low-resolution images being compared have good consistency. In practice these consistency requirements are often hard to meet, so images that do not satisfy illumination consistency must first undergo illumination consistency correction before they can be used for in-orbit satellite parameter monitoring. There is as yet no complete system for handling the illumination consistency of images: most existing methods either extract illumination-invariant features for face recognition, or use gamma correction or Retinex to eliminate over-bright or over-dark regions within a single image. No concrete measurement index exists for the degree of illumination consistency; it can only be judged by subjective human perception. The present invention quantitatively evaluates images before and after processing, so that the various correction methods can be compared quantitatively rather than judged by subjective impression alone.
Illumination correction methods for multiple images include histogram equalization, wavelet histogram equalization, local histogram equalization, and adaptive histogram equalization. Histogram equalization is simple and practical and has value when the illumination consistency requirement is loose, but it shows its limitations when subsequent processing demands strict elimination of the effects of differing illumination. Correction methods for a single image include gamma correction, homomorphic filtering, single-scale Retinex, multi-scale Retinex, the single-scale self-quotient image, the multi-scale self-quotient image, logarithmic total variation, transform-domain methods, and gradient-field methods. These methods adjust the bright and dark regions of a single image, and their dynamic-range compression and color fidelity are satisfactory.
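Since histogram equalization recurs throughout this background discussion, a minimal sketch may help. This is a generic textbook implementation in Python with NumPy, not code from the patent:

```python
import numpy as np

def histogram_equalization(image):
    """Classic histogram equalization for an 8-bit grayscale image.

    Maps each gray level through the normalized cumulative histogram,
    spreading the occupied levels over the full 0-255 range.
    """
    hist = np.bincount(image.ravel(), minlength=256)
    cdf = hist.cumsum().astype(np.float64)
    cdf = (cdf - cdf.min()) / (cdf.max() - cdf.min())  # normalize to [0, 1]
    lut = np.round(255 * cdf).astype(np.uint8)         # per-level lookup table
    return lut[image]
```

As the text notes, applying this to each image of a group stretches every histogram toward the same full range, which loosely aligns their illumination but does not strictly eliminate illumination differences.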
Single-scale Retinex corrects a single image well, but cannot bring the illumination of multiple images into agreement. The wavelet histogram method is suitable for illumination correction between images: the image is decomposed by the wavelet transform into approximation coefficients and detail coefficients, histogram equalization is applied to the approximation coefficients while the detail coefficients are multiplied by a constant greater than 1, and the corrected coefficients are passed through the inverse wavelet transform to obtain images with consistent illumination. The dynamic-range compression method based on the gradient field applies a local gradient attenuation function in the gradient domain, suppressing large gradients and amplifying small ones, so that the contrast of image details is enhanced while the overall dynamic range is compressed; the image is then reconstructed from the modified gradient field. Under this framework, applying histogram equalization to the gradient field yields good illumination consistency between images. Many illumination consistency correction methods already exist (1. Zhu Shulong, Zhang Zhen, Zhu Baoshan, Cao Wen. Comparison of the effectiveness of brightness and contrast non-uniformity correction algorithms for remote sensing images. Journal of Remote Sensing. 2011, 15(1): 111-122. 2. Li Huifang, Shen Huanfeng, Zhang Liangpei, Li Pingxiang. A non-uniformity correction method for remote sensing images based on variational Retinex. Acta Geodaetica et Cartographica Sinica. 2010, vol. 39: 585-598), but their correction results can only be judged by subjective impression; quantitative analysis is lacking, and a quantitative evaluation method is indispensable for comparing the effects of illumination consistency correction methods.
Summary of the invention
The object of the present invention is to provide a method for quantitatively evaluating the illumination consistency of remote sensing images, so that the degree of illumination consistency of images can be quantified.
The technical solution of the present invention is a method for quantitatively evaluating the illumination consistency of remote sensing images, with the following steps:
Step one: compute the illumination component of each image in a group of images to be evaluated. An image consists of an illumination component and a reflectance component; the illumination component is the image obtained by Gaussian filtering of the original image, as shown in the following formula, where L(x, y) denotes the illumination component, S(x, y) the input image, and F(x, y) a Gaussian function. Convolving the Gaussian function with the original image yields the illumination component:
L(x,y)=S(x,y)*F(x,y)
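As an illustration only, the Gaussian filtering of step one might look as follows. This is a sketch assuming a grayscale image array; the value sigma=30 and the 3-sigma kernel radius are assumptions, not values taken from the patent:

```python
import numpy as np

def gaussian_kernel(sigma):
    """Normalized 1-D Gaussian kernel truncated at 3 sigma."""
    radius = int(3 * sigma)
    x = np.arange(-radius, radius + 1, dtype=np.float64)
    k = np.exp(-(x ** 2) / (2 * sigma ** 2))
    return k / k.sum()

def illumination_component(image, sigma=30.0):
    """L(x,y) = S(x,y) * F(x,y): Gaussian smoothing of the input image.

    A large sigma keeps only the low-frequency part of the image,
    which represents the illumination.
    """
    k = gaussian_kernel(sigma)
    img = image.astype(np.float64)
    # The 2-D Gaussian is separable: convolve rows, then columns.
    rows = np.apply_along_axis(np.convolve, 1, img, k, mode='same')
    return np.apply_along_axis(np.convolve, 0, rows, k, mode='same')
```

The separable form gives the same result as a full 2-D convolution with F(x, y) at far lower cost, which is why it is the usual way to implement this step.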
Step two: set the weight W_j of each gray level of the illumination-component histogram. First average the histograms of all the illumination components to obtain the average histogram M, and compute the gray mean m_0 of the average histogram. The weight of each gray level is its distance to the mean, i.e. W_j = |m_0 - j| + 1, where j is a gray level of the histogram.
Step three: compute the difference between the illumination components. Take the absolute value of the difference between each illumination-component histogram G_i and the average histogram M, multiply the pixel count of each gray level by the corresponding weight, and sum to obtain the difference between that illumination-component histogram and the average histogram; summing the differences of all the illumination components from the average histogram gives the total difference.
Step four: compute the illumination consistency coefficient p. The coefficient p is defined to be 1 when the illumination conditions are completely identical and 0 when they are completely different. The total difference is divided by the total difference S_k under completely different illumination, where S_k is the number of pixels of all the images multiplied by the maximum weight; subtracting this quotient from 1 gives the illumination consistency coefficient, namely p = 1 - (Σ_{i=1}^{k} Σ_{j=0}^{255} |G_i(j) - M(j)|·W_j) / S_k, where k is the number of images in the group, i indexes the histogram of the i-th image, G_i(j) is the number of pixels of the i-th image with gray level j, and M(j) is the number of pixels of the average image of the group with gray level j.
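The four steps above can be sketched end to end in Python. This is an illustrative NumPy sketch operating directly on 8-bit illumination components, not the patented implementation; the two constant test images at the end are hypothetical examples of the extreme cases:

```python
import numpy as np

def illumination_consistency(components):
    """Illumination consistency coefficient p for a group of k
    equal-sized 8-bit illumination components (steps two to four)."""
    hists = [np.bincount(c.ravel().astype(np.int64), minlength=256)
             for c in components]
    M = np.mean(hists, axis=0)                  # average histogram
    j = np.arange(256)
    m0 = (j * M).sum() / M.sum()                # gray mean of M
    W = np.abs(m0 - j) + 1                      # weight of each gray level
    total = sum((np.abs(h - M) * W).sum() for h in hists)
    S_k = sum(c.size for c in components) * W.max()
    return 1.0 - total / S_k

a = np.zeros((64, 64), dtype=np.uint8)        # completely dark image
b = np.full((64, 64), 255, dtype=np.uint8)    # completely bright image
print(illumination_consistency([a, a]))       # identical images -> 1.0
print(illumination_consistency([a, b]))       # opposite extremes -> 0.0
```

The two printed values reproduce the boundary behavior the method demands: identical images score 1, and an all-0 image paired with an all-255 image scores 0.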
Compared with the prior art, the present invention has notable advantages: (1) it evaluates the illumination consistency of images quantitatively rather than relying on subjective human impression alone, so the effects of different illumination consistency correction methods can be compared; (2) it is not constrained by the image scenes: although each image shows a different scene, the method compares only their illumination components and thus objectively reflects the degree of illumination consistency of the images; (3) it can quantitatively evaluate the illumination consistency of any number of images: any group of 2 or more images (2, 3, 4, 5, and so on) yields an illumination consistency coefficient between 0 and 1; (4) it is unaffected by image size, provided the images within a group are of the same size: whether the images are 512×512 or 256×256, the coefficient is essentially unchanged, and the results obtained after halving the image size differ from the original results by only about 0.02.
The present invention is described in further detail below with reference to the accompanying drawings.
Brief description of the drawings
Fig. 1 is the flow chart of the method for quantitatively evaluating the illumination consistency of remote sensing images of the present invention.
Fig. 2 is a schematic diagram of an input image and its illumination component. (a) is the input image; (b) is the illumination component.
Fig. 3 is a schematic diagram of the calculation of the illumination consistency coefficient of the present invention.
Fig. 4 shows the experimental results for a group of 2 images. (a) shows the input images (p=0.8045); (b) the result after correction by wavelet histogram equalization (p=0.9587); (c) the result after correction by single-scale Retinex (p=0.9353); (d) the result after correction by the gradient-field method (p=0.9793).
Fig. 5 shows the experimental results for a group of 3 images. (a) shows the input images (p=0.6304); (b) the result after correction by wavelet histogram equalization (p=0.7777); (c) the result after correction by single-scale Retinex (p=0.7705); (d) the result after correction by the gradient-field method (p=0.8818).
Fig. 6 shows the experimental results for a group of 4 images. (a) shows the input images (p=0.8221); (b) the result after correction by wavelet histogram equalization (p=0.8736); (c) the result after correction by single-scale Retinex (p=0.6856); (d) the result after correction by the gradient-field method (p=0.7362).
Fig. 7 shows the experimental results for a group of 5 images. (a) shows the input images (p=0.8221); (b) the result after correction by wavelet histogram equalization (p=0.8736); (c) the result after correction by single-scale Retinex (p=0.6856); (d) the result after correction by the gradient-field method (p=0.7362).
Fig. 8 compares the illumination consistency of the original-size images with that obtained after halving the image size. (a) shows 20 groups of results for the original images; (b) 20 groups for wavelet histogram equalization; (c) 20 groups for single-scale Retinex; (d) 20 groups for the gradient-field method.
Detailed description of the embodiments
With reference to Fig. 1, the method for quantitatively evaluating the illumination consistency of remote sensing images of the present invention proceeds as follows:
Step one: extract the illumination component according to Retinex theory. Because image content differs, comparing the original images directly can hardly reflect their illumination conditions, so the illumination components are first extracted via Retinex theory and then compared. Retinex theory is mainly used to compensate images severely affected by lighting and exposure; its basic purpose is to decompose an image into an illumination component and a reflectance component. For each pixel of the image S, the following holds:
S(x,y)=R(x,y)L(x,y)
where R denotes the reflectance component and L the illumination component. Retinex theory seeks to obtain the reflectance R of the object from the image S, i.e. the appearance of the object independent of the incident light L. It follows that the illumination condition of the image can be estimated from the illumination component L.
Single-scale Retinex computes the reflectance component of an image as
R(x,y)=logS(x,y)-log[F(x,y)*S(x,y)]
where S(x, y) is the input image, R(x, y) is the output, * denotes convolution, and F(x, y) is the Gaussian function
F(x, y) = (1 / (2πσ)) · exp(−(x² + y²) / (2σ²))
Therefore the illumination component can be obtained by Gaussian filtering of the original image, with the standard deviation σ of the Gaussian taking a relatively large value: from a frequency-domain viewpoint, a larger scale corresponds to the low-frequency part of the image, which better represents its illumination. According to Retinex theory, a larger scale also gives better color fidelity. Fig. 2 shows an input image and its illumination component.
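The single-scale Retinex relation above can be sketched as follows. The +1 offset before the logarithm is an assumption added here to keep the logarithm finite; the patent text does not specify one:

```python
import numpy as np

def ssr_reflectance(image, illumination, eps=1.0):
    """Single-scale Retinex: R(x,y) = log S(x,y) - log[F(x,y)*S(x,y)].

    `illumination` is the Gaussian-smoothed image L = F * S, computed
    beforehand; eps is an assumed offset that avoids log(0).
    """
    S = image.astype(np.float64) + eps
    L = illumination.astype(np.float64) + eps
    return np.log(S) - np.log(L)
```

Where the illumination estimate equals the image itself, R is identically zero: the output keeps only what the illumination does not explain, which is exactly why the smoothed image serves as the illumination component.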
Step two: compute the differences between the illumination-component histograms. The present invention starts from the similarity of histogram shapes to measure the degree of illumination consistency of multiple images. For the illumination components of a group of images, first compute their average histogram M; then, at each gray level, take the absolute value of the difference between the pixel count of each illumination-component histogram and that of the average histogram, multiply by the corresponding weight, and sum. Fig. 3 illustrates the computation of the consistency coefficient for five images, whose average histogram is obtained from the histograms of the five illumination images.
Step three: compute the illumination consistency coefficient p, using the following formula:
p = 1 - (Σ_{i=1}^{k} Σ_{j=0}^{255} |G_i(j) - M(j)|·W_j) / S_k
where k is the number of images in the group; G_i is a vector representing the histogram of the i-th image, so that G_i(j) is the number of pixels of the i-th image with pixel value j; M is a vector representing the average histogram, and M(j) is the number of pixels of the average image of the group with pixel value j; W_j = |m_0 - j| + 1, where m_0 is the gray mean of the group; and S_k is the total number of pixels of all the images multiplied by the maximum weight. All of these images are the extracted illumination images. The weight W_j is defined as |m_0 - j| + 1 because the farther a gray level lies from the mean m_0, the more visually apparent the difference, so the larger its weight; the added 1 also guarantees that the weight at m_0 is nonzero. When identical images are compared, p equals 1; when an image whose pixel values are all 0 is compared with an image whose pixel values are all 255, p equals 0.
Figs. 4 to 7 show experimental results for groups of 2, 3, 4, and 5 images respectively. The images are arranged vertically: the first column contains the input images and the following three columns the corrected images, each with its illumination consistency coefficient. Comparing horizontally first: the consistency coefficient of Fig. 4(d) is the highest and that of Fig. 4(a) the lowest. Visually, the brightness gap of the original images (Fig. 4(a)) is large, the shaded side of the mountains in Fig. 4(b) is rather dark, and the illumination of Figs. 4(c) and 4(d) is more consistent, which agrees with the quantitative evaluation by the illumination consistency coefficient. In Figs. 5 to 7 the quantitative results likewise agree with subjective impression. Comparing vertically next: Fig. 4 is a group of 2 images and Fig. 5 a group of 3. Visually, the illumination differences of Fig. 4(a) and Fig. 5(a) are both large, yet the former has the higher consistency coefficient. This matches reality: the fewer the images, the easier it is to reach consistency. The experiments thus show that the quantitative evaluation of illumination consistency agrees with subjective visual perception, and that the coefficient tends to be slightly higher when a group contains fewer images.
Fig. 8 shows experimental results for 20 groups of images (four sets grouped by image count of 2, 3, 4, and 5, with five groups of each). The four panels of Fig. 8 show the illumination consistency coefficients of the original images and of the three correction methods at image sizes 512×512 and 256×256; the two dotted lines in each panel correspond to the two sizes. The abscissa is the group number and the ordinate the illumination consistency coefficient. Most points on the two lines essentially coincide, with only a small number differing noticeably. Taking the difference between the consistency coefficients at 512×512 and at 256×256 gives a mean of 0.0252, a standard deviation of 0.0286, and a maximum of 0.1326. It can therefore be concluded that the illumination consistency coefficient is essentially unaffected by image size and can be used to evaluate the illumination consistency of images of various sizes, provided the images within a group are of the same size.

Claims (3)

1. A method for quantitatively evaluating the illumination consistency of remote sensing images, characterized in that the method comprises the following steps:
a first step of computing the illumination component of each image in a group of images to be evaluated, wherein an image consists of an illumination component and a reflectance component, and the illumination component is the image obtained by Gaussian filtering of the original image;
a second step of setting the weight W_j of each gray level of the illumination-component histogram, wherein the histograms of all the illumination components are first averaged to obtain an average histogram M, the gray mean m_0 of the average histogram is computed, and the weight of each gray level is its distance to the mean, i.e. W_j = |m_0 - j| + 1, where j is a gray level of the histogram;
a third step of computing the difference between the illumination components, wherein the absolute value of the difference between each illumination-component histogram G_i and the average histogram M is taken, the pixel count of each gray level is multiplied by the corresponding weight and summed to obtain the difference between that illumination-component histogram and the average histogram, and the differences of all the illumination components from the average histogram are summed to obtain a total difference;
a fourth step of computing the illumination consistency coefficient p, wherein p is set to 1 when the illumination conditions are completely identical and to 0 when they are completely different; the total difference computed in the third step is divided by the total difference S_k under completely different illumination to obtain a quotient, S_k being the number of pixels of all the images multiplied by the maximum weight; and the quotient is subtracted from 1 to obtain the illumination consistency coefficient p, namely
p = 1 - (Σ_{i=1}^{k} Σ_{j=0}^{255} |G_i(j) - M(j)|·W_j) / S_k;
wherein k is the number of images in the group, i indexes the histogram of the i-th image, G_i(j) is the number of pixels of the i-th image with gray level j, and M(j) is the number of pixels of the average image of the group with gray level j.
2. The method for quantitatively evaluating the illumination consistency of remote sensing images according to claim 1, characterized in that the illumination component of the first step is obtained by the following formula:
L(x,y)=S(x,y)*F(x,y)
wherein L(x, y) denotes the illumination component, S(x, y) the input image, and F(x, y) a Gaussian function.
3. The method for quantitatively evaluating the illumination consistency of remote sensing images according to claim 1, characterized in that the illumination consistency coefficient p lies between 0 and 1.
CN201210455359.9A 2012-11-14 2012-11-14 Method for quantitatively evaluating illumination consistency of remote sensing images Expired - Fee Related CN102999903B (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN201210455359.9A CN102999903B (en) 2012-11-14 2012-11-14 Method for quantitatively evaluating illumination consistency of remote sensing images

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
CN201210455359.9A CN102999903B (en) 2012-11-14 2012-11-14 Method for quantitatively evaluating illumination consistency of remote sensing images

Publications (2)

Publication Number Publication Date
CN102999903A CN102999903A (en) 2013-03-27
CN102999903B true CN102999903B (en) 2014-12-24

Family

ID=47928437

Family Applications (1)

Application Number Title Priority Date Filing Date
CN201210455359.9A Expired - Fee Related CN102999903B (en) 2012-11-14 2012-11-14 Method for quantitatively evaluating illumination consistency of remote sensing images

Country Status (1)

Country Link
CN (1) CN102999903B (en)

Families Citing this family (4)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN105096275A (en) * 2015-08-27 2015-11-25 南京理工大学 Method for extracting elevation value of single mountain remote sensing images on basis of dark channel principle
CN109658364A (en) * 2018-11-29 2019-04-19 深圳市华星光电半导体显示技术有限公司 Image processing method
CN111413278B (en) * 2020-01-19 2021-04-09 中国科学院电子学研究所 Method for calculating solar illumination compensation value
CN111862074B (en) * 2020-07-30 2023-11-24 国网湖南省电力有限公司 Cable water-blocking buffer layer defect identification method and device

Citations (1)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN101014079A (en) * 2005-12-16 2007-08-08 富士施乐株式会社 Image evaluation apparatus, image evaluation method, computer readable medium and computer data signal

Family Cites Families (1)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
DE10225855A1 (en) * 2002-06-07 2003-12-24 Zeiss Carl Jena Gmbh Method and arrangement for evaluating images taken with a fundus camera

Patent Citations (1)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN101014079A (en) * 2005-12-16 2007-08-08 富士施乐株式会社 Image evaluation apparatus, image evaluation method, computer readable medium and computer data signal

Non-Patent Citations (2)

* Cited by examiner, † Cited by third party
Title
High Dynamic Range Image Rendering With a Retinex-Based Adaptive Filter; Laurence Meylan et al.; IEEE Transactions on Image Processing; Sep. 2006; Vol. 15, No. 9; pp. 2820-2830 *
Wavelet-domain fog-image enhancement method based on Retinex theory; Chu Zhaohui et al.; Computer Engineering and Applications; May 2011; Vol. 47, No. 15; pp. 175-179, 190 *

Also Published As

Publication number Publication date
CN102999903A (en) 2013-03-27

Similar Documents

Publication Publication Date Title
US20200273154A1 (en) Image enhancement method and system
Cao et al. Contrast enhancement of brightness-distorted images by improved adaptive gamma correction
Fu et al. A weighted variational model for simultaneous reflectance and illumination estimation
Wang et al. Simple low-light image enhancement based on Weber–Fechner law in logarithmic space
EP2806395B1 (en) Color enhancement method and device
CN108090886B (en) High dynamic range infrared image display and detail enhancement method
Tiwari et al. Brightness preserving contrast enhancement of medical images using adaptive gamma correction and homomorphic filtering
CN102999903B (en) Method for quantitatively evaluating illumination consistency of remote sensing images
CN104318545B (en) A kind of quality evaluating method for greasy weather polarization image
CN103295206B (en) A kind of twilight image Enhancement Method and device based on Retinex
Chen et al. Blind quality index for tone-mapped images based on luminance partition
Yang et al. On the Image enhancement histogram processing
CN103106644B (en) Overcome the self-adaptation picture quality enhancement method of coloured image inhomogeneous illumination
Wang et al. Screen content image quality assessment with edge features in gradient domain
Yang et al. Low-light image enhancement based on Retinex theory and dual-tree complex wavelet transform
CN110969584B (en) Low-illumination image enhancement method
Muniraj et al. Underwater image enhancement by modified color correction and adaptive Look-Up-Table with edge-preserving filter
Trivedi et al. A new contrast measurement index based on logarithmic image processing model
Wang et al. Retinex algorithm on changing scales for haze removal with depth map
CN110706180B (en) Method, system, equipment and medium for improving visual quality of extremely dark image
Jindal et al. Bio-medical image enhancement based on spatial domain technique
Zhang et al. An improved recursive retinex for rendering high dynamic range photographs
Weichao et al. Research on color image defogging algorithm based on MSR and CLAHE
Wen et al. TransIm: Transfer image local statistics across EOTFs for HDR image applications
Wharton et al. Adaptive multi-histogram equalization using human vision thresholding

Legal Events

Date Code Title Description
C06 Publication
PB01 Publication
C10 Entry into substantive examination
SE01 Entry into force of request for substantive examination
C14 Grant of patent or utility model
GR01 Patent grant
CF01 Termination of patent right due to non-payment of annual fee

Granted publication date: 20141224

Termination date: 20171114

CF01 Termination of patent right due to non-payment of annual fee