CN102170574B - Real-time video defogging system - Google Patents

Real-time video defogging system

Info

Publication number
CN102170574B
CN102170574B (granted from application CN201110134572A)
Authority
CN
China
Prior art keywords
image
value
unit
sky
subelement
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Expired - Fee Related
Application number
CN 201110134572
Other languages
Chinese (zh)
Other versions
CN102170574A (en)
Inventor
Yu Jing (禹晶)
Xiao Chuangbai (肖创柏)
Peng Li (彭力)
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Beijing University of Technology
Original Assignee
Beijing University of Technology
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Beijing University of Technology
Priority to CN 201110134572
Publication of CN102170574A
Application granted
Publication of CN102170574B
Legal status: Expired - Fee Related (anticipated expiration)

Landscapes

  • Image Processing (AREA)

Abstract

The invention provides a real-time video defogging system. It belongs to the field of image processing and is realized in a digital integrated circuit comprising a data reading unit, a judgment unit, a sky brightness estimation unit, an atmospheric illumination white balance unit, an atmospheric dissipation image estimation unit and a clear scene recovery unit. For the first K frames of the current shot of the video to be processed, the sky region is estimated and the sky brightness value is computed from it. The atmospheric illumination color of the image to be processed is then corrected according to the sky brightness value using a white balance algorithm, the white-balanced image is normalized, and the minimum over its color components is taken as a rough estimate of the atmospheric dissipation image. From this rough estimate, a refined atmospheric dissipation image is computed with an edge-preserving smoothing method, the scene albedo is computed from the dissipation image, and the defogged scene is recovered. For a common intermediate format (CIF) video with a resolution of 288x352 the processing speed reaches 60 fps (frames per second), and for a D1-format video with a resolution of 578x720 it reaches 15 fps, so the system can be applied in surveillance and meets real-time requirements.

Description

Real-time video defogging system
Technical field
The invention belongs to the field of image processing and, more particularly, relates to a real-time video defogging system.
Background technology
Many outdoor computer vision systems, such as urban traffic, video surveillance and intelligent vehicles, require robust detection of image features. Under fog or haze, however, scattering by the large number of small water droplets and aerosols suspended in the atmosphere severely degrades the captured images, which greatly limits the function of outdoor systems. It is therefore necessary to process degraded video automatically, robustly and in real time to meet the demands of practical applications.
In computer vision, the atmospheric scattering model is commonly used to describe how a scene is imaged under fog and haze. Recently proposed single-image defogging algorithms build on this model and exploit constraints in the image data itself, together with assumptions on the scene albedo and/or the depth of field. Tan assumes that the ambient light in a local region is constant and significantly enhances contrast; this algorithm aims to strengthen image contrast and markedly improves visibility, but because it does not recover the true scene albedo from the physical model, the restored colors appear oversaturated and severe halo artifacts arise at boundaries where the depth changes abruptly. He et al. assume that, in a local region, the scene albedo of at least one color channel tends to zero, use minimum filtering for a rough estimate of the medium propagation function, and then refine it with the closed-form image-matting algorithm. Such refinement is not entirely reasonable: in the cost function used, the data term plays a very small role, and if the regularization parameter is increased, the colors at depth-discontinuity edges are prone to overshoot distortion. Kratz et al. assume that the scene albedo and the depth are statistically independent and model them with canonical probability priors; a depth prior must be chosen for each specific image and the parameters of the prior set empirically. Tarel et al. assume that the atmospheric dissipation function approaches its maximum within a feasible region and varies gently locally, and estimate it with a variant of the median filter; the median filter, however, is not a good edge-preserving filter, and inappropriate parameter settings easily introduce halo artifacts. In addition, all of these assumption-based single-image defogging algorithms involve complex, time-consuming computation and are difficult to apply in practice.
Summary of the invention
In view of this, the invention provides a real-time video defogging system to solve the problem that prior-art algorithms involve complex, time-consuming computation and are difficult to apply in practice.
A real-time video defogging system, characterized in that it is realized in a digital integrated circuit and comprises: a data reading unit U1, a judgment unit U2, a sky brightness estimation unit U3, an atmospheric illumination white balance unit U4, an atmospheric dissipation image estimation unit U5 and a clear scene recovery unit U6, wherein:
Data reading unit U1 reads the data of the image to be processed frame by frame from the video: from the current shot of the video it reads one frame I(x) = (I_R(x), I_G(x), I_B(x))^T composed of the R, G and B color components, abbreviated I(x). Throughout, the height of the image I(x) is N_h pixels and its width is N_w pixels; x denotes the two-dimensional spatial coordinate, written as the vector (m, n) with 0 ≤ m ≤ N_h − 1, 0 ≤ n ≤ N_w − 1, where m, n, N_h, N_w are nonnegative integers;
Judgment unit U2 receives one frame I(x) from the data reading unit as the current frame and performs the following step:
if I(x) is one of the first K frames of the current shot (K = 50), it is sent to sky brightness estimation unit U3; otherwise, being a frame after the first K, it is sent to atmospheric illumination white balance unit U4;
Sky brightness estimation unit U3 comprises a downsampling subunit U31, a center minimum luminance subunit U32, a sky-region subunit U33 and a sky brightness estimation subunit U34, wherein:
Downsampling subunit U31 takes the image I(x) from judgment unit U2, downsamples it and crops out the upper half Ψ(x) of the downsampled image, x being the two-dimensional spatial coordinate;
Center minimum luminance subunit U32 takes the upper half Ψ(x) of the downsampled image from subunit U31 and computes the minimum luminance of each pixel in two steps:

In the first step, for each pixel x, the minimum of the R, G and B color components at that coordinate is computed, forming a minimum color component image L(x):

L(x) = min_{c∈{R,G,B}} Ψ_c(x),

In the second step, the pixels of L(x) are traversed and, for each pixel x, the minimum over all pixels y in the neighborhood Ω(x) centered at x is computed, forming a minimum-filtered image I_min(x):

I_min(x) = min_{y∈Ω(x)} L(y),

where Ω(x) denotes the square neighborhood centered at coordinate x whose side length is 0.025 times the smaller of the width and height of the upper-half image Ψ(x); c denotes the color channel; Ψ_c(x) denotes the color component images of the upper-half image Ψ(x); c ∈ {R, G, B} denotes the R, G, B color channels; and y is a two-dimensional coordinate point within Ω(x);
Sky-region subunit U33 sorts the pixels of I_min(x) output by subunit U32 in descending order of pixel value and sets the luminance threshold T_v to the smallest pixel value among the top 30% of pixels; the pixels of I_min(x) whose luminance is not less than T_v are taken as the sky region;
Sky brightness estimation subunit U34 chooses, within the sky region detected in I_min(x), the pixel value of the brightest pixel in each color component as the sky brightness value A = (A_R, A_G, A_B)^T, and computes the mean Ā of the sky brightness values of the current frame and the preceding frames;
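The pipeline of subunits U31-U34 can be sketched as follows. This is a minimal illustration under stated assumptions: the image is an RGB array in [0, 1], the downsampling factor of 2 and the function name are our choices, and the neighborhood-minimum filter is written as an explicit loop rather than optimized hardware logic.

```python
import numpy as np

def estimate_sky_brightness(img, top_fraction=0.5, bright_fraction=0.3):
    """Sketch of units U31-U34 for an HxWx3 RGB image in [0, 1]."""
    # U31: downsample (factor 2 assumed) and keep the upper half, where sky usually lies.
    small = img[::2, ::2]
    upper = small[: int(small.shape[0] * top_fraction)]
    # U32, first step: per-pixel minimum over the R, G, B components -> L(x).
    L = upper.min(axis=2)
    # U32, second step: minimum filter over a square neighborhood whose side is
    # about 0.025 times the shorter side of the cropped image -> I_min(x).
    r = max(1, int(0.025 * min(upper.shape[:2])) // 2)
    H, W = L.shape
    Imin = np.ones_like(L)
    for dy in range(-r, r + 1):
        for dx in range(-r, r + 1):
            rows = np.clip(np.arange(H) + dy, 0, H - 1)
            cols = np.clip(np.arange(W) + dx, 0, W - 1)
            Imin = np.minimum(Imin, L[rows][:, cols])
    # U33: the threshold T_v is the smallest value among the top 30% of pixels.
    T = np.quantile(Imin, 1.0 - bright_fraction)
    sky = Imin >= T
    # U34: brightest pixel value per channel inside the sky region -> A.
    return np.array([upper[..., c][sky].max() for c in range(3)])
```

Running it on a frame with a bright upper half returns a per-channel A close to the sky intensity, which would then feed the running mean Ā.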
Atmospheric illumination white balance unit U4 is formed by connecting a correction subunit U41 and a normalization subunit U42 in series, wherein:

Correction subunit U41 replaces the per-channel maxima of the Max-RGB white balance algorithm with the color components of the average sky brightness value Ā from sky brightness estimation subunit U34, and corrects the image to be processed to obtain the white-balanced image. Let k denote the sequence number of the current frame: if k ≤ K, the mean Ā of the sky brightness values is input from subunit U34; if k > K, the mean Ā of the sky brightness values of the first K frames is input;

Normalization subunit U42 computes the minimum between the corrected white-balanced image and 1, obtaining the normalized corrected image I′(x):

I′(x) = min(I(x)/Ā, 1);
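A compact sketch of subunits U41/U42, assuming an RGB image in [0, 1] and a per-channel average sky brightness supplied from outside; the function name is illustrative:

```python
import numpy as np

def white_balance_normalize(img, A_mean):
    """Sketch of U41/U42: divide each channel by the average sky brightness
    (Max-RGB with the channel maxima replaced by A_bar), then clip to [0, 1]."""
    corrected = img / np.asarray(A_mean, dtype=float)  # U41: I(x) / A_bar
    return np.minimum(corrected, 1.0)                  # U42: min(., 1)
```

Dividing by Ā neutralizes the color cast of the atmospheric illumination, and the clip keeps all pixel values in [0, 1] so the later minimum-based dissipation estimate stays well defined.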
Atmospheric dissipation image estimation unit U5 is formed by connecting in series an atmospheric dissipation image rough-estimation subunit U51, an atmospheric dissipation image refinement subunit U52 and a medium propagation image estimation subunit U53, wherein:
Atmospheric dissipation image rough-estimation subunit U51 computes, from the normalized corrected image I′(x) input from normalization subunit U42, the minimum over the color components as the rough estimate Ṽ(x) of the atmospheric dissipation image:

Ṽ(x) = min_{c∈{R,G,B}} I′_c(x),
Atmospheric dissipation image refinement subunit U52 computes the refined atmospheric dissipation image V(x) from the rough estimate Ṽ(x) supplied by subunit U51 as

V(x) = R_Ṽ( ε_b^(n)(Ṽ) )(x),

where R_Ṽ(·) denotes morphological reconstruction by dilation and ε_b^(n)(Ṽ) denotes n erosions of the rough-estimate image Ṽ with structuring element b; here n = 1, and b is a disk whose radius is 0.0125 times the smaller of the image width W and height H. Taking the rough-estimate image Ṽ as the template (mask) image and its eroded version as the marker image, the marker is repeatedly dilated subject to the condition that, at each position, its pixel value does not exceed that of the template image; the result of this reconstruction is V(x);
Medium propagation image estimation subunit U53 takes the refined atmospheric dissipation image V(x) from subunit U52 and computes the medium propagation function t(x) as:
t(x)=1-V(x);
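The refinement of U52 followed by U53 can be sketched as opening-by-reconstruction on the rough dissipation estimate. This is an illustration under stated assumptions: a 3x3 square structuring element stands in for the disk of radius 0.0125·min(W, H), the geodesic dilation is iterated to stability in plain numpy, and the function name is ours.

```python
import numpy as np

def refine_dissipation(V_rough, n_iter=100):
    """Sketch of U52/U53: erode once, reconstruct by dilation under the
    template V_rough, then t(x) = 1 - V(x)."""
    def _minmax_filter(a, op):
        p = np.pad(a, 1, mode='edge')
        out = a.copy()
        for dy in (-1, 0, 1):
            for dx in (-1, 0, 1):
                out = op(out, p[1 + dy:1 + dy + a.shape[0],
                                1 + dx:1 + dx + a.shape[1]])
        return out
    marker = _minmax_filter(V_rough, np.minimum)   # one erosion (n = 1)
    template = V_rough                             # mask image
    for _ in range(n_iter):                        # geodesic dilation to stability
        nxt = np.minimum(_minmax_filter(marker, np.maximum), template)
        if np.array_equal(nxt, marker):
            break
        marker = nxt
    V = marker
    return V, 1.0 - V                              # U53: t(x) = 1 - V(x)
```

The reconstruction removes small bright structures (which the erosion wipes out and which cannot regrow without a seed) while plateaus wider than the structuring element are rebuilt exactly, which is the edge-preserving behavior the patent relies on at depth discontinuities.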
Clear scene recovery unit U6 computes the scene albedo ρ(x) from the refined atmospheric dissipation image V(x) and the medium propagation function t(x) supplied by unit U5:

ρ(x) = (I′(x) − κV(x)) / t(x),

where κ is an influence factor with 0 < κ < 1, here κ = 0.95, introduced because the restoration of the sky region would otherwise be forced to white.
The real-time video defogging system disclosed by the invention, by white-balancing the atmospheric illumination and estimating the atmospheric dissipation image with an edge-preserving smoothing method, suppresses the color cast caused by turbid atmospheric particles and the halo artifacts produced at depth discontinuities, effectively improving the defogging quality; and through the acceleration scheme for sky brightness estimation and a fast edge-preserving smoothing method, it effectively speeds up the defogging process. For captured video in the common format used in traffic surveillance, it meets the real-time requirement.
Description of drawings
In order to explain the embodiments of the invention or the prior-art technical solutions more clearly, the drawings needed in the description of the embodiments or the prior art are briefly introduced below. Obviously, the drawings described below are only some embodiments of the invention, and those of ordinary skill in the art can obtain other drawings from them without creative effort.
Fig. 1 is a sketch of the atmospheric scattering model disclosed by the invention;
Fig. 2 is a structural diagram of the real-time video defogging system disclosed by the invention;
Fig. 3 is a structural diagram of the sky brightness estimation unit disclosed by the invention;
Fig. 4 is a structural diagram of the atmospheric illumination white balance unit disclosed by the invention;
Fig. 5 is a structural diagram of the atmospheric dissipation image estimation unit disclosed by the invention;
Fig. 6 is a flow chart of the real-time video defogging disclosed by an embodiment of the invention;
Fig. 7 plots how the sky brightness value varies over 50 frames within the same shot: ○ marks the red (Red) component, * the green (Green) component, and the remaining marker the blue (Blue) component; the axes are sky intensity (Sky intensity) versus frame (Frame);
Fig. 8 is a detailed flow chart of the real-time video defogging disclosed by an embodiment of the invention;
Fig. 9 sketches the process of computing the minimum luminance of each pixel in an embodiment of the invention;
Fig. 10 sketches the grayscale morphological reconstruction of a one-dimensional function: 10-1 shows the marker image, 10-2 the template image, and 10-3 repeated dilations of the marker image;
Fig. 11 sketches the result of grayscale morphological reconstruction of a one-dimensional function: 11-1 shows the reconstructed image.
Embodiment
A real-time video defogging system comprises:
a data reading unit, used to read one frame of the image to be processed from the video;
a judgment unit, used to judge whether the image I(x) belongs to the first K frames of the current shot; if so, the sky brightness estimation unit is executed; if not, the atmospheric illumination white balance unit is executed;
a sky brightness estimation unit, used to estimate the sky region of the image to be processed and to estimate the sky brightness value from that region;
an atmospheric illumination white balance unit, used to correct the color of the atmospheric illumination with a white balance algorithm according to the sky brightness value, and to normalize the white-balanced image;
an atmospheric dissipation image estimation unit, used to estimate the atmospheric dissipation image of the image to be processed from the normalized corrected image with an edge-preserving smoothing method;
a clear scene recovery unit, used to recover the scene albedo of the image to be processed from the atmospheric dissipation image using the atmospheric scattering model.
Preferably, the sky brightness estimation unit comprises:
a downsampling subunit, used to downsample the image to be processed and, exploiting the tendency of the sky region to lie toward the top, to crop out the upper half of the downsampled image;
a center minimum luminance subunit, used to compute, for each pixel of the downsampled image, the minimum luminance across the color components and within a neighborhood;
a sky-region subunit, used to exploit the high brightness of the sky region by taking as the sky region the pixels whose center minimum luminance is not less than a preset luminance threshold;
a sky brightness estimation subunit, used to choose, within the sky region, the maximum pixel value of each color component of the image to be processed as the sky brightness value of that component.
Preferably, the atmospheric illumination white balance unit comprises:
a correction subunit, used to correct the color of the atmospheric illumination in the image to be processed with a white balance algorithm according to the sky brightness values of the color components;
a normalization subunit, used to take the minimum between each pixel value of the white-balanced image and 1, yielding the normalized corrected image.
Preferably, the atmospheric dissipation image estimation unit comprises:
an atmospheric dissipation image rough-estimation subunit, used to compute the minimum across the color components of the normalized corrected image as the rough estimate of the atmospheric dissipation image;
an atmospheric dissipation image refinement subunit, used to refine the rough estimate with an edge-preserving smoothing method, yielding the atmospheric dissipation image;
a medium propagation image estimation subunit, used to compute the estimate of the medium propagation image from the atmospheric dissipation image via the relation between the two.
The technical solutions in the embodiments of the invention are described below clearly and completely with reference to the accompanying drawings. Obviously, the described embodiments are only some, not all, of the embodiments of the invention. All other embodiments obtained by those of ordinary skill in the art from these embodiments without creative effort fall within the scope of protection of the invention.
Under fog and haze, scattering by the large number of small water droplets and aerosols suspended in the atmosphere severely degrades the captured image. As the distance from the object to the imaging device grows, the influence of scattering by atmospheric particles on imaging increases. This influence arises mainly from two scattering processes: 1) light reflected from the object surface is attenuated by particle scattering on its way to the imaging device; 2) natural daylight, scattered by atmospheric particles, enters the imaging device and takes part in the imaging. The degraded image I(x) captured by a narrow-band camera under fog or haze can be described by the atmospheric scattering model of Equation 1:

I(x) = Aρ(x)e^(−βd(x)) + A(1 − e^(−βd(x)))   (1)

where A is the sky brightness (skylight), β is the atmospheric scattering coefficient, and ρ(x) and d(x) are the scene albedo and the depth at spatial coordinate x. The atmospheric scattering model is the physical basis for the color cast, low contrast and other characteristics of haze images; it is the key to understanding how haze images form and the main foundation for restoring atmospherically degraded images. In Equation 1, the sky brightness A, the scattering coefficient β and the depth d(x) are three unknowns; solving inversely for ρ(x) is called image restoration.
To simplify Equation 1, the exponential decay term e^(−βd(x)) is denoted by the medium propagation function t(x):

t(x) = e^(−βd(x))   (2)

where 0 < t(x) < 1. The atmospheric dissipation function is defined as

V(x) = 1 − t(x)   (3)

so that, obviously, 0 < V(x) < 1. The atmospheric dissipation function represents the additional contribution of the ambient light to the imaging of the scene; it is an increasing function of the depth d(x).
As shown in Fig. 1, the atmospheric scattering model consists of two terms. The first term, Aρ(x)e^(−βd(x)), is the attenuation model, also called direct propagation or direct attenuation: because of particle scattering, part of the light reflected from the object surface is lost, the unscattered part reaches the imaging sensor directly, and the arriving intensity decays exponentially with propagation distance. The second term, A(1 − e^(−βd(x))), is the ambient-light model: scattering of natural light by atmospheric particles makes the atmosphere behave like a light source, and the ambient-light intensity grows gradually with propagation distance.
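Equations (1)-(3) can be checked with a small forward simulation; the function name and parameter defaults below are illustrative, not from the patent:

```python
import numpy as np

def apply_atmospheric_scattering(rho, d, A=1.0, beta=0.5):
    """Illustrate Eq. (1)-(3): haze a scene with albedo rho and depth map d
    under sky brightness A and scattering coefficient beta."""
    t = np.exp(-beta * d)        # Eq. (2): medium propagation (direct transmission)
    V = 1.0 - t                  # Eq. (3): atmospheric dissipation
    I = A * rho * t + A * V      # Eq. (1): attenuation term + ambient-light term
    return I, t, V
```

At zero depth the image equals A·ρ (no haze), and as d grows every pixel tends to the sky brightness A, matching the statement that the ambient-light term increases with distance.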
The invention discloses a real-time video defogging system to solve the problem that prior-art algorithms involve complex, time-consuming computation and are difficult to apply in practice. Its structure, shown in Fig. 2, comprises: data reading unit U1, judgment unit U2, sky brightness estimation unit U3, atmospheric illumination white balance unit U4, atmospheric dissipation image estimation unit U5 and clear scene recovery unit U6, wherein:
Data reading unit U1 reads one frame of the image to be processed from the video; judgment unit U2 judges whether the image I(x) belongs to the first K frames of the current shot, executing sky brightness estimation unit U3 if so and atmospheric illumination white balance unit U4 if not; sky brightness estimation unit U3 estimates the sky region of the image and the sky brightness value from it; atmospheric illumination white balance unit U4 corrects the color of the atmospheric illumination with a white balance algorithm according to the sky brightness value and normalizes the white-balanced image; atmospheric dissipation image estimation unit U5 estimates the atmospheric dissipation image from the normalized corrected image with an edge-preserving smoothing method; clear scene recovery unit U6 recovers the scene albedo from the atmospheric dissipation image using the atmospheric scattering model. Details are given in the corresponding embodiments below.
The structure of sky brightness estimation unit U3 is shown in Fig. 3, comprising: downsampling subunit U31, used to downsample the image to be processed and, exploiting the tendency of the sky region to lie toward the top, to crop out the upper half of the downsampled image; center minimum luminance subunit U32, used to compute, for each pixel of the downsampled image, the minimum luminance across the color components and within a neighborhood; sky-region subunit U33, used to exploit the high brightness of the sky region by taking as the sky region the pixels whose center minimum luminance is not less than a preset luminance threshold; sky brightness estimation subunit U34, used to choose, within the sky region, the maximum pixel value of each color component of the image to be processed as the sky brightness value of that component. Details are given in the corresponding embodiments below.
The structure of atmospheric illumination white balance unit U4 is shown in Fig. 4, comprising: correction subunit U41, used to correct the color of the atmospheric illumination in the image to be processed with a white balance algorithm according to the sky brightness values of the color components; normalization subunit U42, used to take the minimum between each pixel value of the white-balanced image and 1, yielding the normalized corrected image. Details are given in the corresponding embodiments below.
Atmospheric dissipation image estimation unit U5, shown in Fig. 5, comprises: atmospheric dissipation image rough-estimation subunit U51, used to compute the minimum across the color components of the normalized corrected image as the rough estimate of the atmospheric dissipation image; atmospheric dissipation image refinement subunit U52, used to refine the rough estimate with an edge-preserving smoothing method, yielding the atmospheric dissipation image; medium propagation image estimation subunit U53, used to compute the estimate of the medium propagation image from the atmospheric dissipation image via the relation between the two. Details are given in the corresponding embodiments below.
The embodiments are described below:
Embodiment one
Referring to Fig. 6, the real-time video defogging flow disclosed by this embodiment comprises the following steps:
Step S1: read one frame of the video to be processed;
One frame I(x) = (I_R(x), I_G(x), I_B(x))^T, composed of the R, G and B color components, is read from the video; its height is N_h pixels and its width N_w pixels; x denotes the two-dimensional spatial coordinate, expressible as the vector (m, n), where 0 ≤ m ≤ N_h − 1, 0 ≤ n ≤ N_w − 1 and m, n, N_h, N_w are nonnegative integers.
Step S2: judge whether the image I(x) belongs to the first K frames of the current shot; if so, execute step S3; if not, execute step S4;
A shot is the basic constituent of a video stream; a stream consists of many shots. A shot is a sequence of video images with similar content, and shot changes are detected mainly from the frame difference of the stream. Two adjacent frames of a video are highly correlated, and video processing should take this temporal correlation into account. If the sky brightness A were recomputed for every frame, or every few frames, then on the one hand the values of A would differ slightly from one computation to the next, as shown in Fig. 7, causing flicker and brightness jumps; on the other hand, it would add processing overhead, and it is unnecessary. The system therefore judges whether the current frame belongs to the first K frames of the current shot. Let k denote the sequence number of the current frame: if k ≤ K, the sky brightness A of the current frame is computed along with the mean Ā of the sky brightness of the current frame and the preceding k − 1 frames; if k > K, the temporal correlation of the video is exploited and the mean Ā of the sky brightness of the first K frames is used directly. In this embodiment K is 50, the number of frames in 2 s of video, but the invention is not limited to this.
Step S3: estimate the sky region of the image to be processed, and estimate the sky brightness value from that region;
The accuracy of the sky brightness estimate is crucial to the quality of the result. The invention locates the sky region using its two characteristics: high brightness and a position toward the top of the frame. Because searching for the sky region accounts for half of the total processing time, two acceleration schemes are adopted: 1) the image is downsampled to reduce its resolution; 2) the upper half of the image is cropped out, following the prior on the sky region's position. For each pixel x of the image, the minimum luminance of x across the color components and within a neighborhood is computed. Following the prior that the sky region is bright, the region formed by the pixels whose minimum luminance is not less than a preset luminance threshold is taken as the sky region. Within the corresponding region of the image to be processed, the maximum pixel value is taken as the estimate of the sky brightness A.
Step S4: correct the color of the atmospheric illumination with a white balance algorithm according to the sky brightness value, and normalize the white-balanced image;
In clean air the size of the atmospheric molecules (about 10^-4 μm) is far smaller than the wavelength of visible light, so short-wavelength blue light dominates the scattering and the sky appears blue. The small water droplets contained in rainy or foggy air (> 1 μm) are large compared with visible wavelengths, so the scattering coefficient hardly depends on wavelength: visible light of different bands is scattered in the same proportion and pure fog looks white or gray. The aerosols contained in gray haze, however, fall in the size range between the two (10^-2 to 1 μm), so gray haze appears bluish or brownish. Therefore, a white balance algorithm is used to correct the sky brightness to white, (1, 1, 1)^T, and by taking the minimum between each pixel value of the white-balanced image and 1, the pixel values are restricted to the range [0, 1].
Step S5: using an edge-preserving smoothing method, estimate the atmospheric dissipation image of said image to be processed from said normalized corrected image;
As the distance from the scene to the imaging equipment increases, the contribution of the ambient light to the image increases, and the image brightness increases accordingly. The ambient light is a function of the scene depth only, so the increment in image brightness is a cue to the scene depth. Under foggy or hazy weather conditions, the gray value of an imaged color band with zero albedo (total absorption) is mainly the contribution of the ambient light; the present invention therefore takes the minimum over the color components of the normalized corrected image I'(x) as the rough estimate of the atmospheric dissipation image. Since the atmospheric dissipation function depends only on the depth of field d(x) and is independent of the albedo ρ(x), the second step performs a piecewise-smoothing operation on the rough estimate Ṽ(x) of the atmospheric dissipation function while preserving edge details at depth discontinuities. The purpose of edge-preserving smoothing (filtering) is to keep the output image as similar as possible to the input image while smoothing away all regions except those with large gradients. Existing filtering methods with an edge-preserving property include median filtering, bilateral filtering, guided filtering, local-extrema smoothing, weighted-least-squares smoothing, second-generation wavelet filtering, and trilateral filtering. Among these, bilateral filtering has become the most widely used edge-preserving filter because of its simple theory and the availability of fast algorithms. However, these filtering methods generally suffer from complicated algorithmic procedures that are difficult to realize in hardware. The present invention therefore refines the rough atmospheric-dissipation estimate with a combination of the opening and closing reconstruction operations of mathematical-morphology image processing. Mathematical-morphology algorithms have a structure amenable to parallel implementation, are suited to parallel operation, and are easy to realize in hardware.
Step S6: using the atmospheric scattering model, recover the scene albedo of said image to be processed from said atmospheric dissipation image;
Using the estimated atmospheric dissipation function V(x) and medium transmission function t(x), the scene albedo ρ(x) is solved according to Formula 1.
In the present embodiment, referring to Fig. 8, the sky brightness estimation of step S3 is specifically realized with steps S31-S34, comprising the following steps:
Step S31: downsample said image to be processed, and, according to the characteristic that the sky region lies toward the top, crop the top half of the downsampled image;

Step S32: for each pixel of said downsampled image, calculate the minimum luminance value over the color components and within a neighborhood;

Step S33: according to the characteristic that the sky region is relatively bright, identify as the sky region the region formed by the pixels whose center minimum luminance value is not less than a preset luminance threshold;

Step S34: within said sky region, choose the maximum pixel value of each color component of said image to be processed, and take it as the sky brightness value of the respective color component.
The implementation of steps S31-S34 is set forth in detail as follows:
Said image to be processed I(x) is downsampled; according to the prior that the sky region generally appears at the top of the image, the top half of the image is cropped, and the new image is denoted Ψ(x). A minimum-value filtering operation is applied to the minimum color component of Ψ(x), as shown in Formula 4:
I_min(x) = min_{y∈Ω(x)} ( min_{c∈{R,G,B}} Ψ^c(y) )        (4)
In the formula, Ψ^c(x) denotes each color-component image of the top half Ψ(x) of the downsampled image, c ∈ {R, G, B} denoting the R, G, B color channels respectively; Ω(x) denotes the neighborhood centered at pixel x, whose size is adaptively proportional to the minimum of the image width and height. In the present embodiment, the image is downsampled until the shorter of its width and height is less than 300; Ω(x) is a square neighborhood whose size equals 0.025 times the minimum of the downsampled image's width and height, but the invention is not limited thereto. The process of Formula 4 is shown in Fig. 9. In the first step, for each pixel x, the minimum of the three color components R, G, B at the corresponding coordinate is calculated, forming a minimum color-component image L(x):
L(x) = min_{c∈{R,G,B}} Ψ^c(x)        (5)
In the second step, the pixels of L(x) are traversed; for each pixel x, the minimum over all pixels y in the neighborhood Ω(x) centered at x is calculated, forming a minimum-value filtered image I_min(x):
I_min(x) = min_{y∈Ω(x)} L(y)        (6)
Calculating the sky brightness involves a luminance threshold T_υ. The present invention sorts the pixels of said I_min(x) by pixel value in descending order and sets T_υ to the smallest pixel value among the top 30% of pixels; the pixels of I_min(x) whose value is not less than the threshold T_υ are taken as the sky region. In the image to be processed, the brightest pixel of each color component within the detected sky region is chosen as the sky brightness A. Finally, a moving-average method is used to update the mean of the sky brightness by the following formula:
Ā_k = ( A + (k-1) Ā_{k-1} ) / k        (7)
In the formula, k is the sequence number of the current frame, and Ā_{k-1} and Ā_k denote the means of the sky brightness over the first k-1 and k frames, respectively.
It should be pointed out that the choice of Ω(x) affects the subsequent identification of candidate sky regions. To filter out the influence of white objects in the image on the sky-brightness estimate, the size of Ω(x) should be larger than the size of white objects in the image. However, if the sky region in the image is smaller than Ω(x), the sky region will be filtered out by mistake. In a concrete realization, those skilled in the art may design flexibly, as long as false detections and missed detections are avoided as far as possible; no further details are given here.
The above gives a concrete description of the sky-brightness estimation process: said image to be processed is downsampled and the top half of the downsampled image is cropped, so as to reduce the processing time and accelerate the sky-brightness estimation. Further, the region formed by the pixels whose minimum luminance value is not less than the preset luminance threshold is identified as the sky region, and the brightest pixel within the detected sky region of the image to be processed is chosen as the sky brightness A. Finally, calculating the average sky brightness with a moving-average method solves the problem of random fluctuation in the sky-brightness sequence and makes the color of the atmospheric illumination change smoothly.
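To make steps S31-S34 concrete, the following Python sketch implements the downsampling, cropping, minimum filtering (Formulas 4-6), 30% thresholding, and moving average (Formula 7). The function and argument names and the 2x downsampling factor are assumptions for illustration; the patent fixes only the top-half crop, the 0.025 neighborhood ratio, and the 30% threshold rule.

```python
import numpy as np

def estimate_sky_brightness(img, prev_mean=None, k=1, top_frac=0.3):
    """Estimate the sky brightness A from one hazy RGB frame.

    img: float array in [0, 1], shape (H, W, 3).
    prev_mean/k: previous moving average and current frame index (Eq. 7).
    """
    # S31: downsample, then keep the top half (the sky tends to sit high).
    small = img[::2, ::2]
    top = small[: small.shape[0] // 2]

    # S32, first step: per-pixel minimum over the colour channels (Eq. 5).
    L = top.min(axis=2)
    # Second step: minimum filter over a square neighbourhood Omega(x) (Eq. 6),
    # whose half-size is proportional to min(width, height).
    h, w = L.shape
    r = max(1, int(0.025 * min(h, w)))
    pad = np.pad(L, r, mode="edge")
    I_min = np.empty_like(L)
    for i in range(h):
        for j in range(w):
            I_min[i, j] = pad[i:i + 2 * r + 1, j:j + 2 * r + 1].min()

    # S33: threshold = smallest value among the brightest 30% of pixels.
    flat = np.sort(I_min.ravel())[::-1]
    T_v = flat[int(top_frac * flat.size) - 1]
    sky_mask = I_min >= T_v

    # S34: per-channel maximum inside the detected sky region.
    A = np.array([top[..., c][sky_mask].max() for c in range(3)])

    # Moving average over frames (Eq. 7).
    if prev_mean is None:
        return A
    return (A + (k - 1) * prev_mean) / k
```

A synthetic frame with a bright top band recovers that band's value as A, and the moving average blends it with earlier frames.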
In the present embodiment, referring to Fig. 8, the atmospheric-illumination white balance of step S4 is specifically realized with steps S41-S42, comprising the following steps:
Step S41: using a white balance algorithm, correct the color of the atmospheric illumination in said image to be processed according to the sky brightness values of the respective color components;

Step S42: choose the minimum between each pixel value of said white-balanced image and 1, obtaining the normalized corrected image.
The implementation of steps S41-S42 is set forth in detail as follows:
The WP (White Point) algorithm, also called the Max-RGB algorithm, estimates the color of the illumination from the maxima of the R, G, B color components. In the present invention, the estimated sky brightness Ā replaces these maxima, and the atmospheric illumination is white-balanced accordingly. Dividing both sides of Formula 1 by Ā, the atmospheric scattering model simplifies to Formula 8:
I(x)/Ā = ρ(x) t(x) + V(x)        (8)
For high-brightness regions of the image, the white-balance-corrected image I(x)/Ā is correspondingly restricted to the range [0, 1]: I'(x) is expressed as the minimum between I(x)/Ā and 1, as shown in Formula 9:
I'(x) = min( I(x)/Ā , 1 )        (9)
The atmospheric scattering model can therefore be expressed as Formula 10:
I'(x) = ρ(x) t(x) + V(x)        (10)
In I'(x), the sky brightness Ā has been corrected to white (1, 1, 1)^T.
The above gives a concrete description of the atmospheric-illumination white-balance process. Correcting the color of the atmospheric illumination according to said sky brightness value with a WP-based white balance algorithm avoids the image color cast caused by turbid atmospheric particles and effectively improves the video defogging result; for example, the atmospheric illumination of video captured in hazy weather is biased toward blue or brown. The present embodiment is not limited to a WP-based white balance algorithm: other white balance algorithms based on low-order statistics of the image may also be adopted, for example those based on the mean (Gray World, GW) or on a p-th-order Minkowski norm assumption (Shades-of-Gray). Methods based on low-order statistics can correct the color of haze images quickly and effectively.
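Steps S41-S42 (Formulas 8-9) amount to a per-channel division followed by clipping. The following sketch assumes the mean sky brightness Ā has already been estimated; the function name is an assumption.

```python
import numpy as np

def white_balance_normalize(img, A_mean):
    """Eqs. 8-9: divide each colour channel by the estimated sky brightness
    (a Max-RGB-style white balance with the sky brightness standing in for
    the per-channel maxima), then restrict values to [0, 1].

    img: (H, W, 3) float array; A_mean: length-3 mean sky brightness."""
    balanced = img / A_mean            # broadcasts over the channel axis
    return np.minimum(balanced, 1.0)   # Eq. 9: I'(x) = min(I(x)/A, 1)
```

After this step the sky brightness maps to white (1, 1, 1)^T, as required by Formula 10.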
In the present embodiment, referring to Fig. 8, the atmospheric-dissipation-image estimation of step S5 is specifically realized with steps S51-S53, comprising the following steps:
Step S51: calculate the minimum over the color components of said normalized corrected image as the rough estimate of the atmospheric dissipation image;

Step S52: using an edge-preserving smoothing method, refine said rough atmospheric-dissipation estimate to obtain the atmospheric dissipation image;

Step S53: using the relation between the medium transmission image and the atmospheric dissipation image, calculate the estimate of the medium transmission image from said atmospheric dissipation image.
The implementation of step S51 is set forth in detail as follows:
Owing to the presence of fog or haze, as the distance from the scene to the imaging equipment increases, the contribution of the ambient light to the image increases, and the image brightness increases accordingly. The ambient light is a function of the scene depth only, so the increment in image brightness is a cue to the scene depth. The assumption of He et al. is that, in a small local region Ω(x), the scene albedo of at least one color channel tends to 0, which can be expressed as Formula 11:
min_{y∈Ω(x)} ( min_{c∈{R,G,B}} ρ^c(y) ) → 0        (11)
Substituting into Formula 8, the rough estimate of the atmospheric dissipation function is obtained as Formula 12:
Ṽ(x) = min_{y∈Ω(x)} ( min_{c∈{R,G,B}} I^c(y)/Ā^c )        (12)
The atmospheric dissipation function V(x) is constrained by two conditions: 1) V(x) ≥ 0, i.e. V(x) is non-negative; 2) V(x) ≤ I'(x), i.e. V(x) is not greater than the minimum color component of I'(x). The present invention estimates the atmospheric dissipation function in two steps, from coarse to fine. In the first step, the minimum color component of I'(x) serves as the rough estimate of the atmospheric dissipation function:
Ṽ(x) = min_{c∈{R,G,B}} I'^c(x)        (13)
This is essentially consistent with the assumption of He et al., namely that under foggy or hazy weather conditions the gray value of an imaged color band with zero albedo (total absorption) is mainly the contribution of the ambient light.
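Formula 13 is a one-line per-pixel reduction. The sketch below, with assumed names, shows it; by construction the result satisfies both constraints on V(x) stated above.

```python
import numpy as np

def rough_dissipation(img_norm):
    """Eq. 13: rough atmospheric-dissipation estimate as the per-pixel
    minimum over the colour channels of the normalized corrected image I'.

    img_norm: (H, W, 3) float array with values in [0, 1]. The result is
    non-negative and never exceeds the minimum colour component of I'."""
    return img_norm.min(axis=2)
```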
The implementation of steps S52-S53 is set forth in detail as follows:
Since the atmospheric dissipation function depends only on the depth of field d(x) and is independent of the albedo ρ(x), the second step performs a piecewise-smoothing operation on the rough estimate Ṽ(x) of the atmospheric dissipation function: edge details at depth discontinuities are preserved, while edges that do not correspond to depth discontinuities, where the depth varies smoothly, are smoothed away, so that, as far as possible, identical depths take identical atmospheric-dissipation values.
The first step of the medium-transmission estimation of He et al. is in fact equivalent to minimum-value filtering of Ṽ(x), but minimum filtering alone produces halo and blocking artifacts; in their second step, the medium transmission function is refined with the closed-form solution of image matting. Tarel et al. estimate the atmospheric dissipation function by median filtering of Ṽ(x).
Gray-scale morphological reconstruction is an important morphological transformation. It uses two images: one, called the marker image, contains the starting point set of the transformation; the other, called the mask image, constrains the transformation. Morphological reconstruction processes the marker image under the constraint of the mask image. Let f and g denote the marker image and the mask image respectively, of identical size, with f ≤ g in gray value. The geodesic dilation of size 1 of f with respect to g is defined as Formula 14:

D_g^(1)(f) = (f ⊕ b) ∧ g        (14)
This shows that the geodesic dilation of size 1 first dilates f by the structuring element b, then takes, at each pixel (x, y), the minimum of the dilation result and g. The geodesic dilation of size n of f with respect to g is defined as Formula 15:
D_g^(n)(f) = D_g^(1)[ D_g^(n-1)(f) ],  D_g^(0)(f) = f        (15)
The above formula is in fact n iterations of Formula 14, with the initial value D_g^(0)(f) being the marker image f.
The dilation-type morphological reconstruction of a gray-scale marker image f with respect to a gray-scale mask image g is defined as the geodesic dilation of f with respect to g, iterating Formula 15 until the dilation no longer changes, which can be expressed as Formula 16:
R_g^D(f) = D_g^(k)(f)        (16)
In the formula, k satisfies D_g^(k)(f) = D_g^(k+1)(f).
Fig. 10 gives an intuitive illustration of gray-scale morphological reconstruction for a one-dimensional function: morphological reconstruction repeatedly dilates the marker image until its border fits the edges in the mask image. In Fig. 10 the upper curve represents the mask image and the lower curve the marker image; in this figure the marker image is obtained by subtracting a constant from the mask image. The pixel values of the marker image can never exceed those at the corresponding positions of the mask image. Each dilation of the marker image is constrained by the mask image; the process is repeated until the pixel values no longer change and the dilation stops. The middle black line shown in Fig. 11 is the reconstructed image.
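Formulas 14-16 can be sketched directly: one geodesic dilation is a grey dilation followed by a pixel-wise minimum with the mask, iterated to stability. The 3x3 flat structuring element and the function names are assumptions for illustration.

```python
import numpy as np

def grey_dilate3(f):
    """Grey dilation by a 3x3 flat structuring element, edge-padded."""
    p = np.pad(f, 1, mode="edge")
    views = [p[i:i + f.shape[0], j:j + f.shape[1]]
             for i in range(3) for j in range(3)]
    return np.max(np.stack(views), axis=0)

def reconstruct_by_dilation(marker, mask):
    """Grey-scale morphological reconstruction (Eqs. 14-16): repeat
    'dilate the marker, then pixel-wise minimum with the mask' until the
    result no longer changes."""
    f = np.minimum(marker, mask)   # enforce f <= g
    while True:
        nxt = np.minimum(grey_dilate3(f), mask)   # Eq. 14
        if np.array_equal(nxt, f):                # stability test of Eq. 16
            return f
        f = nxt
```

Seeding a single peak inside a plateau of the mask spreads the peak value across the plateau, exactly the behaviour illustrated by Fig. 10.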
Opening by reconstruction and closing by reconstruction are two commonly used gray-scale reconstruction techniques; they differ from the ordinary opening and closing operations. In opening by reconstruction, the image is first eroded, but, unlike the opening operation, the erosion is not followed by a dilation; instead, the eroded image serves as the marker image, and the original image serves as the mask image for the morphological reconstruction. The opening by reconstruction of size n of the rough atmospheric-dissipation estimate is defined as the dilation-type reconstruction of its n-fold erosion, which can be expressed as Formula 17:

O_R^(n)(Ṽ) = R_Ṽ^D( Ṽ ⊖ nb )        (17)

In the formula, Ṽ ⊖ nb denotes n erosions of the image Ṽ by the structuring element b, and R^D denotes the dilation-type reconstruction function. The effect of opening by reconstruction is to preserve the global shape of the components that survive the erosion. Closing by reconstruction uses the complement of the dilated image as the marker image and the complement of the original image as the mask image, performs opening by reconstruction, and finally complements the resulting image. Combining opening by reconstruction with closing by reconstruction not only removes over-bright and over-dark details and smooths the image, but also preserves the global shape of the targets well without destroying their original shape. In the present embodiment, the value of n is 1, and the structuring element b is a disk-shaped neighborhood whose radius equals 0.0125 times the minimum of the aforementioned N_h and N_w, but the invention is not limited thereto.
Further, the medium transmission function t(x) is calculated according to Formula 3, which can be expressed as Formula 18:
t(x) = 1 - V(x)        (18)
The above gives a concrete description of the atmospheric-dissipation-image estimation process. The present invention refines the rough atmospheric-dissipation estimate with the combination of the opening and closing reconstruction operations of mathematical-morphology image processing. Mathematical-morphology algorithms have a structure amenable to parallel implementation, are suited to parallel operation, and are easy to realize in hardware, thereby avoiding the complicated algorithmic procedures and hardware-unfriendliness common to the filtering methods listed above. The present embodiment is not limited to the edge-preserving smoothing method that combines opening and closing reconstruction: other fast filtering methods with an edge-preserving property, such as median filtering, bilateral filtering, or guided filtering, may also be adopted, although such filtering methods generally involve complicated procedures that are difficult to realize in hardware. Any method that finally yields an edge-preserving smoothed image falls within the scope of protection of the present invention.
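The refinement of step S52 and the transmission of step S53 (Formulas 17-18) can be sketched as opening by reconstruction followed by closing by reconstruction, the closing realized via the complement as described above. A 3x3 square structuring element stands in for the disk of the patent, and all names are assumptions.

```python
import numpy as np

def _shift_views(f):
    p = np.pad(f, 1, mode="edge")
    return np.stack([p[i:i + f.shape[0], j:j + f.shape[1]]
                     for i in range(3) for j in range(3)])

def _dilate(f):
    return np.max(_shift_views(f), axis=0)

def _erode(f):
    return np.min(_shift_views(f), axis=0)

def _reconstruct(marker, mask):
    """Dilation-type reconstruction (Eqs. 14-16), iterated to stability."""
    f = np.minimum(marker, mask)
    while True:
        nxt = np.minimum(_dilate(f), mask)
        if np.array_equal(nxt, f):
            return f
        f = nxt

def refine_dissipation(v_rough, n=1):
    """Eq. 17 (opening by reconstruction), its closing counterpart on the
    complement, then Eq. 18: t(x) = 1 - V(x). v_rough in [0, 1]."""
    m = v_rough
    for _ in range(n):
        m = _erode(m)
    opened = _reconstruct(m, v_rough)      # opening by reconstruction
    c = 1.0 - opened                       # closing via the complement
    m = c
    for _ in range(n):
        m = _erode(m)
    V = 1.0 - _reconstruct(m, c)
    return V, 1.0 - V
```

An isolated bright pixel (a small white object, not a depth edge) is removed while the flat background value is preserved, which is the smoothing behaviour the patent relies on.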
The implementation of step S6 is set forth in detail as follows:
Using the estimated atmospheric dissipation function V(x) and medium transmission function t(x), the scene albedo ρ(x) is solved according to Formula 10. From Formula 13 it can be seen that the difference between the image I'(x) and the atmospheric dissipation function V(x) is very likely close to 0. Meanwhile, the sky lies at infinity, so its medium transmission t(x) tends to 0. In this case, directly restoring the scene albedo would cause serious color distortion in the sky region. To avoid the indeterminate 0/0 form (the quotient of two very small numbers), a factor κ (0 < κ < 1) is introduced, and the scene albedo ρ(x) is calculated by Formula 19:
ρ(x) = ( I'(x) - κ V(x) ) / t(x)        (19)
In the formula, the introduction of κ forces the sky region of the restored result toward white. In the present embodiment, the value of κ is 0.95, but the invention is not limited thereto. Finally, for scenes with a discontinuous depth of field and severe occlusion, values outside the range [0, 1] are truncated; for scenes with a continuous depth of field, a nonlinear mapping function is used to compress the dynamic range of the scene albedo. Observation of a large number of foggy images shows that the histogram shape of ρ(x) is the basis for distinguishing these two cases.
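Formula 19 with the [0, 1] truncation can be sketched as below. The small eps guard on t(x) is an assumption of ours, not part of the patent, added only to keep the division finite.

```python
import numpy as np

def recover_albedo(img_norm, V, kappa=0.95, eps=1e-6):
    """Eq. 19: rho(x) = (I'(x) - kappa * V(x)) / t(x), with t = 1 - V (Eq. 18).
    kappa < 1 keeps the sky region from the 0/0 blow-up; the clip to [0, 1]
    follows the patent's treatment of scenes with a discontinuous depth map.

    img_norm: (H, W, 3) normalized corrected image; V: (H, W) dissipation."""
    t = np.maximum(1.0 - V, eps)
    rho = (img_norm - kappa * V[..., None]) / t[..., None]
    return np.clip(rho, 0.0, 1.0)
```

For scenes with a continuous depth of field the patent instead compresses the dynamic range with a nonlinear mapping, which is not reproduced here.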
By white-balancing the atmospheric illumination, the present embodiment suppresses the image color cast caused by turbid atmospheric particles; by refining the rough atmospheric-dissipation estimate with an edge-preserving smoothing method, it avoids halo artifacts at the borders of depth discontinuities and effectively improves the video defogging result; the speed-up schemes of the sky-brightness estimation together with the smoothing method combining opening and closing reconstruction effectively accelerate the video defogging and can meet the requirements of real-time application scenarios, and the combined opening-closing reconstruction smoothing has the hardware advantage of easy parallel implementation. For an image of size N_h × N_w, the time complexity of the present invention reaches O(N_h N_w).
The above embodiments have described the real-time video defogging system disclosed by the invention in detail from a theoretical standpoint, and its beneficial effects have likewise been explained theoretically. Below, starting from actual processing, the results of the present invention and of the prior art on the same set of image data are compared, so as to support the object of the invention from the standpoint of practical application.
In most practical applications, the true image or true data is unknown. Since no true image is available for reference, the present invention uses a no-reference objective quality evaluation method to evaluate and compare algorithm performance. In video captured in foggy or hazy weather, the effect of the ambient light on imaging increases image brightness, reduces saturation and contrast, and introduces a color cast. A good defogging algorithm restores the visibility of the scene while restoring its true colors. In other words, the restored image should have good contrast while maintaining high similarity to the degraded image in hue, saturation, and brightness, thereby preserving the natural character of hue, saturation, and brightness. As for color preservation, a hue polar-coordinate histogram can be used to describe it. Hue is measured as the angle about the red axis on the chromaticity disk, hue values being angles in the range [0, 360). The hue polar-coordinate histogram represents, on the unit circle, the probability of occurrence of every hue in the image. Because the hue corresponding to low-saturation pixels is meaningless, the hue polar-coordinate histogram is computed only over pixels whose saturation is greater than 0.05.
The histogram similarity measures the degree of similarity between two histogram distributions; the present invention uses the correlation coefficient of two histogram distributions as the evaluation index of histogram similarity. The correlation coefficient of two histograms h' and h is defined as Formula 20:
d_corr(h', h) = Σ_k (h'_k · h_k) / sqrt( Σ_k h'_k² · Σ_k h_k² )        (20)
The larger d_corr is, the higher the degree of matching: d_corr is 1 for a perfect match and -1 for a complete mismatch. Contrast can be measured through variance: performing principal component analysis (PCA) on the RGB image, the eigenvalue of each principal component is exactly its variance. The present invention defines the variance ratio η_λ as the ratio of the sum of the eigenvalues of the defogged image to the sum of the eigenvalues of the image to be processed. Table 1 compares the hue, saturation, and brightness similarities and the variance ratio of the defogged images of the present invention and of the algorithms of Tan, He et al., Tarel et al., Kratz et al., and others.
Table 1: Comparison of comprehensive evaluation indices
(Table 1 appears as an image in the original document and is not reproduced here.)
According to the comprehensive evaluation method of Wang et al., the comprehensive evaluation index Q_m of the defogged image quality can be expressed as Formula 21:
Q_m = ( h_corr · s_corr · υ_corr )^α · ( η_λ )^β        (21)
In the formula, α and β adjust the relative weights, in the overall evaluation, of the fidelity of the hue, saturation, and brightness restoration and of the contrast. The comprehensive evaluation indices Q_m (×10^-4) of Tan, He et al., Tarel et al., Kratz et al., and the algorithm of the present invention are 0.0349, 0.1667, 0.1136, 0.0747, and 0.1770, respectively. It can be seen that, compared with the defogged images of the other four algorithms, the algorithm of the present invention achieves the best defogging result.
Because the variances of the individual measures differ considerably, each measure is standardized by dividing by its variance so that the variance becomes 1. Before standardization, the variances of the hue similarity h_corr, saturation similarity s_corr, brightness similarity υ_corr, and variance ratio η_λ are 0.0343, 0.0022, 0.0010, and 0.7458, respectively, and the standardized comprehensive evaluation indices Q_m are 62.8330, 299.9157, 204.3461, 134.4478, and 318.4299. It can be seen that the comprehensive evaluation index of the defogging result of the present invention is the highest. This shows that, compared with the processing methods of the prior art, the present invention preserves the similarity of hue, saturation, and brightness while improving image contrast and saturation, giving the processed image a natural character.
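The evaluation indices of Formulas 20-21 can be sketched as follows. The default α = β = 1 is an assumption; the patent leaves these weights as tuning parameters.

```python
import numpy as np

def hist_correlation(h1, h2):
    """Eq. 20: correlation coefficient between two hue histograms,
    equal to 1 when the two histograms match exactly."""
    h1 = np.asarray(h1, float)
    h2 = np.asarray(h2, float)
    return (h1 * h2).sum() / np.sqrt((h1 ** 2).sum() * (h2 ** 2).sum())

def quality_index(h_corr, s_corr, v_corr, eta, alpha=1.0, beta=1.0):
    """Eq. 21: comprehensive quality index combining the hue, saturation,
    and brightness similarities with the variance ratio eta. The alpha
    and beta weights are assumed defaults, not values from the patent."""
    return (h_corr * s_corr * v_corr) ** alpha * eta ** beta
```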
In terms of speed, for CIF-format video with a resolution of 288 × 352, the processing speed of the present invention reaches 60 fps; for D1-format video with a resolution of 576 × 720, the processing speed of the present invention reaches 15 fps. For captured video of the formats common in traffic surveillance, the real-time requirement can be met.
The embodiments in this specification are described in a progressive manner. Those skilled in the art will further appreciate that the units and algorithm steps of the examples described in connection with the embodiments disclosed herein can be realized in electronic hardware, computer software, or a combination of the two; to illustrate the interchangeability of hardware and software clearly, the composition and steps of each example have been described above generally in terms of function. Whether these functions are executed in hardware or in software depends on the specific application and the design constraints of the technical solution. Skilled artisans may use different methods to realize the described functions for each particular application, but such realizations should not be considered beyond the scope of the present invention.
The steps of the methods or algorithms described in connection with the embodiments disclosed herein may be implemented directly in hardware, in a software module executed by a processor, or in a combination of the two. A software module may reside in random access memory (RAM), memory, read-only memory (ROM), electrically programmable ROM, electrically erasable ROM, registers, a hard disk, a removable disk, a CD-ROM, or any other form of storage medium known in the technical field.
The above description of the disclosed embodiments enables those skilled in the art to realize or use the present invention. Various modifications to these embodiments will be apparent to those skilled in the art, and the general principles defined herein may be realized in other embodiments without departing from the spirit or scope of the present invention. Therefore, the present invention is not limited to the embodiments shown herein, but is to be accorded the widest scope consistent with the principles and novel features disclosed herein.

Claims (1)

1. A real-time video defogging system, characterized in that it is realized in a digital integrated circuit and is provided with: a data reading unit (U1), a judging unit (U2), a sky-brightness estimation unit (U3), an atmospheric-illumination white balance unit (U4), an atmospheric-dissipation-image estimation unit (U5), and a clear-scene recovery unit (U6), wherein:
the data reading unit (U1) reads the data of the image to be processed from the video frame by frame: from the current shot of said video to be processed, it reads one frame of image to be processed composed of the three color components R, G, B, I(x) = (I_R(x), I_G(x), I_B(x))^T, abbreviated I(x); hereinafter, the height of said image to be processed I(x) is N_h pixels and its width is N_w pixels; x denotes a two-dimensional spatial coordinate, expressed as a vector (m, n), with 0 ≤ m ≤ N_h - 1, 0 ≤ n ≤ N_w - 1, and m, n, N_h, N_w non-negative integers;
the judging unit (U2) receives one frame of the image to be processed I(x) from said data reading unit as the current frame image and carries out the following step:
if said image to be processed I(x) is among the first K frames of the current shot, K = 50, it is sent to said sky-brightness estimation unit (U3); if not, it is a frame beyond the first K and is sent to the atmospheric-illumination white balance unit (U4);
the sky-brightness estimation unit (U3) is provided with a downsampling subunit (U31), a center-minimum-luminance acquisition subunit (U32), a sky-region acquisition subunit (U33), and a sky-brightness-value estimation subunit (U34), wherein:
the downsampling subunit (U31) receives said image to be processed I(x) from said judging unit (U2), downsamples I(x), and then crops the top half Ψ(x) of the downsampled image, x being a two-dimensional spatial coordinate;
the center-minimum-luminance acquisition subunit (U32) receives from said downsampling subunit (U31) said top half Ψ(x) of the downsampled image, and calculates the minimum luminance value of each pixel successively according to the following steps:
in the first step, for each pixel x, the minimum of the three color components R, G, B at the corresponding coordinate is calculated, forming a minimum color-component image L(x):
L(x) = min_{c∈{R,G,B}} Ψ^c(x),
in the second step, the pixels of said L(x) are traversed; for each pixel x, the minimum over all pixels y in the neighborhood Ω(x) centered at x is calculated, forming a minimum-value filtered image I_min(x):
I_min(x) = min_{y∈Ω(x)} L(y),
wherein Ω(x) denotes the square neighborhood centered at coordinate x, the side of the square being 0.025 times the shorter of the width and height of said top half Ψ(x) of the downsampled image, c denotes a color channel, x denotes a two-dimensional spatial coordinate, Ψ^c(x) denotes each color-component image of said top half Ψ(x) of the downsampled image, c ∈ {R, G, B} denotes the R, G, B color channels respectively, and y is a two-dimensional coordinate point within said Ω(x);
the sky-region acquisition subunit (U33), according to a set luminance threshold T_υ, takes as the sky region those pixels of said I_min(x), output by said center-minimum-luminance acquisition subunit (U32), whose brightness is not less than said threshold T_υ; the pixels of said I_min(x) are sorted by pixel value in descending order, and the smallest pixel value among the top 30% of pixels is set as the luminance threshold T_υ;
Sky brightness estimation subunit (U34): within the detected partial sky region of I_min(x), selects for each color component the value of the brightest pixel as the sky brightness A = (A_R, A_G, A_B)^T, and computes the mean sky brightness Ā over the current frame and all preceding frames;
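Subunit (U34) can be sketched as follows; the running-mean bookkeeping class is an assumption about how the per-frame averaging would be coded:

```python
import numpy as np

def sky_brightness(psi, sky_mask):
    """Subunit U34 sketch: per color component, the value of the brightest
    pixel inside the detected partial sky region."""
    return np.array([psi[..., c][sky_mask].max() for c in range(3)])

class SkyBrightnessAverager:
    """Mean sky brightness A_bar over the frames seen so far (assumed helper)."""
    def __init__(self):
        self.total = np.zeros(3)
        self.count = 0

    def update(self, a):
        self.total += a
        self.count += 1
        return self.total / self.count  # current A_bar
```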
Atmospheric illumination white balance unit (U4): formed by serially connecting a correction subunit (U41) and a normalization subunit (U42), wherein:
Correction subunit (U41): replaces each color-component maximum in the Max-RGB white balance algorithm with the corresponding component of the mean sky brightness Ā from the sky brightness estimation subunit (U34), and corrects the image to be processed to obtain the corrected white-balance image; let k denote the sequence number of the current frame: if k ≤ K, the mean sky brightness Ā from the sky brightness estimation subunit (U34) is input; if k > K, the mean sky brightness Ā of the first K frames is input to the normalization subunit (U42);

Normalization subunit (U42): computes the minimum between the corrected white-balance image and 1, obtaining the normalized corrected image I′(x):

I′(x) = min(I(x)/Ā, 1);
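Subunits (U41)/(U42) amount to dividing each channel by the mean sky brightness and clipping the result at 1; a sketch (function name assumed):

```python
import numpy as np

def white_balance_normalize(image, a_bar):
    """U41/U42 sketch: Max-RGB style correction with the per-channel maxima
    replaced by the mean sky brightness a_bar, then I'(x) = min(I/a_bar, 1)."""
    return np.minimum(image / a_bar, 1.0)
```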
Atmospheric dissipation image estimation unit (U5): formed by serially connecting an atmospheric dissipation image rough-estimation subunit (U51), an atmospheric dissipation image refinement subunit (U52), and a medium transmission estimation subunit (U53), wherein:
Atmospheric dissipation image rough-estimation subunit (U51): from the normalized corrected image I′(x) input from the normalization subunit (U42), computes the minimum over the color components as the rough estimate Ṽ(x) of the atmospheric dissipation image:

Ṽ(x) = min_{c ∈ {R,G,B}} I′_c(x),
Atmospheric dissipation image refinement subunit (U52): computes the refined atmospheric dissipation image V(x) from the rough estimate Ṽ(x) of the atmospheric dissipation image rough-estimation subunit (U51) as follows:

V(x) = R_Ṽ(ε_b^(n)(Ṽ))(x), n = 1,

where R denotes erosion-"dilation" morphological reconstruction; ε_b^(n)(Ṽ) denotes n erosions of the rough-estimate image Ṽ(x) with the structuring element b, with n = 1; the structuring element b is a disk-shaped region whose radius is 0.0125 times the smaller of the image width W and height H; the rough-estimate image Ṽ(x) serves as the template (mask) image and its erosion as the marker image; the marker image is dilated repeatedly under the condition that its pixel value at each position never exceeds that of the template image, yielding the reconstructed image V(x);
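The refinement step is grayscale morphological reconstruction by dilation: erode the rough estimate once with a disk, then repeatedly dilate under the rough estimate as mask. A self-contained SciPy sketch (the explicit iteration loop and the 3x3 geodesic dilation element are implementation assumptions):

```python
import numpy as np
from scipy.ndimage import grey_dilation, grey_erosion

def refine_dissipation(v_rough, radius):
    """Subunit U52 sketch: n = 1 erosion with a disk-shaped structuring
    element b, then reconstruction by dilation under v_rough as template."""
    y, x = np.ogrid[-radius:radius + 1, -radius:radius + 1]
    b = x * x + y * y <= radius * radius   # disk of the given radius
    marker = grey_erosion(v_rough, footprint=b)
    se = np.ones((3, 3), dtype=bool)       # elementary geodesic element
    while True:
        # dilate the marker, never letting it exceed the template image
        nxt = np.minimum(grey_dilation(marker, footprint=se), v_rough)
        if np.array_equal(nxt, marker):
            return marker
        marker = nxt
```

Per the claim, the disk radius would be 0.0125 times the smaller of the image width W and height H.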
Medium transmission estimation subunit (U53): receives the refined atmospheric dissipation image V(x) from the atmospheric dissipation image refinement subunit (U52) and computes the medium transmission function t(x):

t(x) = 1 - V(x);
Clear scene recovery unit (U6): computes the scene albedo ρ(x) from the refined atmospheric dissipation image V(x) and the medium transmission function t(x) of the atmospheric dissipation image estimation unit (U5) by:

ρ(x) = (I′(x) - κV(x)) / t(x),

where κ is an influence factor with 0 < κ < 1, here κ = 0.95, introduced so that the restored sky region is not forced to pure white.
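Putting units (U53) and (U6) together, a sketch of the final recovery (the epsilon floor on t(x) is an assumption to avoid division by zero in dense haze, not part of the claim):

```python
import numpy as np

def recover_scene(i_norm, v_refined, kappa=0.95, eps=1e-3):
    """U53/U6 sketch: t(x) = 1 - V(x), rho(x) = (I'(x) - kappa*V(x)) / t(x).
    i_norm is H x W x 3; v_refined is H x W."""
    t = np.maximum(1.0 - v_refined, eps)  # medium transmission, floored
    return (i_norm - kappa * v_refined[..., None]) / t[..., None]
```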
CN 201110134572 2011-05-23 2011-05-23 Real-time video defogging system Expired - Fee Related CN102170574B (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN 201110134572 CN102170574B (en) 2011-05-23 2011-05-23 Real-time video defogging system

Publications (2)

Publication Number Publication Date
CN102170574A CN102170574A (en) 2011-08-31
CN102170574B true CN102170574B (en) 2012-12-26

Family

ID=44491528

Family Applications (1)

Application Number Title Priority Date Filing Date
CN 201110134572 Expired - Fee Related CN102170574B (en) 2011-05-23 2011-05-23 Real-time video defogging system

Country Status (1)

Country Link
CN (1) CN102170574B (en)

Families Citing this family (30)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN103164845B (en) * 2011-12-16 2016-08-03 中国科学院沈阳自动化研究所 A kind of real-time image mist elimination device and method
CN102968767A (en) * 2012-11-26 2013-03-13 中国科学院长春光学精密机械与物理研究所 Method for real-time restoration of fog-degraded image with white balance correction
JP6249638B2 (en) * 2013-05-28 2017-12-20 キヤノン株式会社 Image processing apparatus, image processing method, and program
CN104281998A (en) * 2013-07-03 2015-01-14 中山大学深圳研究院 Quick single colored image defogging method based on guide filtering
CN103458156B (en) * 2013-08-27 2016-08-10 宁波海视智能系统有限公司 Traffic incidents detection preprocessing method of video signal under a kind of severe weather conditions
CN104008527B (en) * 2014-04-16 2017-02-01 南京航空航天大学 Method for defogging single image
CN104200435A (en) * 2014-08-29 2014-12-10 武汉大学 Single-image defogging method based on fog imaging model
CN105657230A (en) * 2015-04-01 2016-06-08 周杰 Method for automatically reading electric energy meter through automatic meter reading platform
CN104714068B (en) * 2015-04-01 2016-10-26 哈尔滨汇鑫仪器仪表有限责任公司 Robot to outdoor electric energy meter meter reading
CN106965186A (en) * 2015-04-01 2017-07-21 姜敏敏 Outdoor electric energy meter meter reading robot based on the Big Dipper omniselector and vision sensor
CN105635692A (en) * 2015-04-01 2016-06-01 周杰 Method for automatically reading electric energy meter through automatic meter reading platform based on image acquisition
CN105791755A (en) * 2015-04-01 2016-07-20 周杰 Method for automatically reading power meter by using automatic meter reading platform based on image collection
CN105828042A (en) * 2015-04-01 2016-08-03 石梦媛 CMOS image collection-based electric energy meter automatic meter reading platform
CN105791779A (en) * 2015-04-01 2016-07-20 石永录 CMOS image acquisition-based watt-hour meter automatic meter reading platform
CN105791778A (en) * 2015-04-01 2016-07-20 张兰花 CMOS image acquisition-based watt-hour meter automatic meter reading platform
CN105740849A (en) * 2015-04-01 2016-07-06 周杰 Method for performing automatic data logging on electric energy meter based on image collection
CN104732700B (en) * 2015-04-02 2016-07-06 泰州市翔达消防器材有限公司 Based on the fire alarm system taken photo by plane in the air
CN105791780A (en) * 2015-04-02 2016-07-20 李勇 Transmission device identification platform located on unmanned aerial vehicle
CN105791781A (en) * 2015-04-02 2016-07-20 陈思思 Transmission device identification platform located on unmanned aerial vehicle
CN104704990B (en) * 2015-04-08 2016-09-14 南通市广益机电有限责任公司 A kind of pomegranate tree electronization picking method automatically
CN106251301B (en) * 2016-07-26 2019-10-15 北京工业大学 A kind of single image to the fog method based on dark primary priori
US10528842B2 (en) * 2017-02-06 2020-01-07 Mediatek Inc. Image processing method and image processing system
CN107045723A (en) * 2017-03-10 2017-08-15 北京环境特性研究所 Smog recognition methods based on transmissivity dynamic detection
CN107396053B (en) * 2017-08-18 2019-11-15 卓源信息科技股份有限公司 A kind of outdoor safety defense monitoring system
CN111063017B (en) * 2018-10-15 2022-04-12 华为技术有限公司 Illumination estimation method and device
CN109784191B (en) * 2018-12-20 2021-01-01 华南理工大学 Multitask face illumination editing method based on business image
CN109726686B (en) * 2018-12-29 2021-03-30 西安天和防务技术股份有限公司 Scene recognition method and device, computer equipment and storage medium
CN111028175B (en) * 2019-12-10 2023-07-28 嘉应学院 Rapid image defogging method and system
CN110913195B (en) * 2019-12-26 2022-10-04 深圳壹账通智能科技有限公司 White balance automatic adjustment method, device and computer readable storage medium
CN113392801A (en) * 2021-06-30 2021-09-14 深圳市斯博科技有限公司 Image processing method, system, device and storage medium

Family Cites Families (2)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
JP2007033987A (en) * 2005-07-28 2007-02-08 Canon Inc Image forming apparatus
CN101908210B (en) * 2010-08-13 2012-03-14 北京工业大学 Method and system for color image defogging treatment

Similar Documents

Publication Publication Date Title
CN102170574B (en) Real-time video defogging system
CN110232666B (en) Underground pipeline image rapid defogging method based on dark channel prior
CN108596849B (en) Single image defogging method based on sky region segmentation
CN107301623B (en) Traffic image defogging method and system based on dark channel and image segmentation
CN106296612B (en) A kind of stagewise monitor video sharpening system and method for image quality evaluation and weather conditions guidance
CN104794697B (en) A kind of image defogging method based on dark primary priori
Tripathi et al. Single image fog removal using bilateral filter
CN107451966B (en) Real-time video defogging method implemented by guiding filtering through gray level image
Xu et al. An improved guidance image based method to remove rain and snow in a single image
CN107103591B (en) Single image defogging method based on image haze concentration estimation
CN104299192B (en) A kind of single image to the fog method based on atmospheric light scattering physical model
Gao et al. Sand-dust image restoration based on reversing the blue channel prior
CN107301624B (en) Convolutional neural network defogging method based on region division and dense fog pretreatment
CN106548461B (en) Image defogging method
CN110378849B (en) Image defogging and rain removing method based on depth residual error network
CN111062293B (en) Unmanned aerial vehicle forest flame identification method based on deep learning
CN107292830B (en) Low-illumination image enhancement and evaluation method
CN112837233B (en) Polarization image defogging method for acquiring transmissivity based on differential polarization
CN102665034A (en) Night effect removal method for camera-collected video
CN104318524A (en) Method, device and system for image enhancement based on YCbCr color space
CN103578083B (en) Single image defogging method based on associating average drifting
CN105701783B (en) A kind of single image to the fog method and device based on environment light model
CN109389569B (en) Monitoring video real-time defogging method based on improved DehazeNet
CN105447825A (en) Image defogging method and system
CN103077504A (en) Image haze removal method on basis of self-adaptive illumination calculation

Legal Events

Date Code Title Description
C06 Publication
PB01 Publication
C10 Entry into substantive examination
SE01 Entry into force of request for substantive examination
C14 Grant of patent or utility model
GR01 Patent grant
C17 Cessation of patent right
CF01 Termination of patent right due to non-payment of annual fee

Granted publication date: 20121226

Termination date: 20130523