CN105389784A - Image processing method and terminal - Google Patents

Image processing method and terminal

Info

Publication number
CN105389784A
CN105389784A
Authority
CN
China
Prior art keywords
formula
image
sigma
pixel
pending image
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Pending
Application number
CN201510894229.9A
Other languages
Chinese (zh)
Inventor
陈楠
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Meizu Technology China Co Ltd
Original Assignee
Meizu Technology China Co Ltd
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Meizu Technology China Co Ltd filed Critical Meizu Technology China Co Ltd
Priority to CN201510894229.9A priority Critical patent/CN105389784A/en
Publication of CN105389784A publication Critical patent/CN105389784A/en
Pending legal-status Critical Current

Classifications

    • G06T5/73
    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06TIMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T2207/00Indexing scheme for image analysis or image enhancement
    • G06T2207/30Subject of image; Context of image processing
    • G06T2207/30181Earth observation
    • G06T2207/30192Weather; Meteorology

Abstract

An embodiment of the invention discloses an image processing method. The image processing method comprises: determining a brightest area of a preset area size in a gray-scale image corresponding to a to-be-processed image; computing an atmospheric light value of the brightest area; computing an atmospheric propagation coefficient in a preset computing manner; and performing enhancement processing on the to-be-processed image according to the atmospheric light value and the atmospheric propagation coefficient. An embodiment of the invention further discloses a terminal. With the adoption of the embodiments of the invention, the to-be-processed image can be processed via the atmospheric light value and the atmospheric propagation coefficient, more image detail information can be reserved, and computing complexity during an image processing process is relatively low.

Description

Image processing method and terminal
Technical field
Embodiments of the present invention relate to the technical field of image processing, and more particularly to an image processing method and terminal.
Background technology
With the rapid development of information technology, the functions of terminals are becoming more and more complete, and people's requirements for terminals are also becoming higher and higher. Taking photographing as an example, images obtained in foggy or hazy weather are of poor quality. In the prior art, the processing of fog-containing or haze-containing images mainly includes two kinds: image enhancement and image restoration. Image enhancement improves image contrast, but it cannot take the detail information of the image into account well, and some edge textures in the image are easily lost. Image restoration algorithms are effective, but because the calculation procedure is too complicated, the time cost is too high and they lack practical value.
Summary of the invention
Embodiments of the present invention provide an image processing method and terminal, so as to retain more image detail information while keeping the computational complexity of the image processing relatively low.
Embodiment of the present invention first aspect provides a kind of method of image procossing, comprises step:
Determine the brightest area of the predeterminable area size in the gray level image that pending image is corresponding;
Air light value is calculated according to described brightest area;
Atmospheric propagation coefficient is calculated according to default account form;
According to described air light value and described atmospheric propagation coefficient, enhancing process is carried out to described pending image.
In conjunction with the first aspect of the embodiment of the present invention, in the first possible embodiment of first aspect, described pending image is containing mist image, containing haze image, containing at least one in sleet image or iridescent image.
In conjunction with the first aspect of the embodiment of the present invention or the first possible embodiment of first aspect, in the embodiment that the second of first aspect is possible, describedly determine that the step of the brightest area of the predeterminable area size in the gray level image that pending image is corresponding specifically comprises:
Determine the center of the gray level image that pending image is corresponding, and according to described center, described gray level image is divided into quartern region;
Calculate described quartern region corresponding pixel value average u and standard deviation sd respectively, and build score coefficient score:score=u-sd respectively.
Determine the target area that maximal value in described score coefficient score is corresponding;
Judge whether the size of described target area is less than or equal to described predeterminable area size;
If so, described target area is defined as brightest area.
In conjunction with the first aspect of the embodiment of the present invention or the first possible embodiment of first aspect, in the third possible embodiment of first aspect, the described step according to described brightest area calculating air light value specifically comprises:
Ask for the average of pixel value corresponding to the pixel of described brightest area, described average is air light value.
In conjunction with the first aspect of the embodiment of the present invention or the first possible embodiment of first aspect, in the 4th kind of possible embodiment of first aspect, the described step according to default account form calculating atmospheric propagation coefficient specifically comprises:
Construct the contrast function according to formula (1):

$$F_{\mathrm{contrast}}(t)=-\sum_{c\in Gray}\sum_{p\in B}\frac{\bigl(I(p)-\bar{I}\bigr)^{2}}{t^{2}N_{B}} \qquad (1)$$

where, in formula (1), t denotes the atmospheric propagation coefficient; c ∈ Gray indicates that the contrast function is computed on the gray-level image corresponding to the pending image; F_contrast(t) denotes the contrast function with respect to t; p denotes a pixel; I(p) is the gray-scale value of any point p in the pending image; Ī is the mean of the pixel gray-scale values in the moving window B, i.e. the mean brightness of the pixels in the moving window B; B is the first preset moving window; N_B is the number of pixels contained in the moving window B; and p ∈ B indicates that the contrast function is constructed over the moving window B;
Construct the information loss function according to formula (2):

$$F_{\mathrm{loss}}(t)=\sum_{c\in Gray}\left\{\sum_{i=0}^{\alpha}\left(\frac{i-A}{t}+A\right)^{2}h(i)+\sum_{i=\beta}^{255}\left(\frac{i-A}{t}+A-255\right)h(i)\right\} \qquad (2)$$

where, in formula (2), F_loss(t) is the information loss function with respect to t; α is a first threshold and β is a second threshold, with α < β; h(i) denotes the percentage of pixels with gray-scale value i among the total number of pixels of the pending image; and A is the atmospheric light value;
Described contrast function and described information loss function are substituted into formula (3) to calculate described atmospheric propagation coefficient t:
$$\min_{t}F(t)=\min_{t}\bigl(F_{\mathrm{contrast}}(t)+\lambda F_{\mathrm{loss}}(t)\bigr) \qquad (3)$$

where, in formula (3), F(t) is the objective optimization function to be minimized with respect to t, λ is a constant balance factor, and λ ∈ (0, 1).
In conjunction with the fourth possible embodiment of the first aspect of the embodiments of the present invention, in a fifth possible embodiment of the first aspect, after the contrast function and the information loss function are substituted into formula (3) to calculate the atmospheric propagation coefficient t, the method further comprises:
optimizing the atmospheric propagation coefficient t, wherein the optimization step may be:
Set up objective optimization function, as formula (4):
$$\min_{(s,\psi)}F(t)=\min_{(s,\psi)}\sum_{p\in W}\bigl(t(p)-\hat{t}(p)\bigr)^{2}=\min_{(s,\psi)}\sum_{p\in W}\Bigl[\bigl(s(p)\,I(p)+\psi(p)-t(p)\bigr)^{2}+\varepsilon\,s(p)^{2}\Bigr] \qquad (4)$$

where, in formula (4), p denotes a pixel; s(p) is the change-of-scale factor at point p; ψ(p) is the offset component at point p; I(p) is the gray-scale value of point p in the pending image; W is the second preset moving window; ε is a weight factor with ε > 0; t̂ denotes the optimized atmospheric propagation coefficient, and t̂(p) denotes the optimized atmospheric propagation coefficient at point p, computed as shown in formula (5):

$$\hat{t}(p)=s(p)\,I(p)+\psi(p) \qquad (5)$$
Formulas (6) and (7) can be derived from formula (4):

$$s=\frac{\dfrac{1}{N}\sum_{p\in W}I(p)\,t(p)-\dfrac{1}{N}\,\mu\sum_{p\in W}t(p)}{\sigma^{2}+\varepsilon} \qquad (6)$$

$$\psi=\frac{1}{N}\sum_{p\in W}t(p)-s\,\mu \qquad (7)$$

where, in formulas (6) and (7), μ and σ² are respectively the mean and the variance of the pixel gray-scale values in the corresponding window W, and N is the number of pixels in the second preset moving window;
Substitute the calculation results of formulas (6) and (7) into formulas (8) and (9) to calculate, respectively, the change-of-scale factor and the offset component at point p of the pending image:

$$s(p)=\frac{1}{N}\sum_{k\in W_{p}}s_{k} \qquad (8)$$

$$\psi(p)=\frac{1}{N}\sum_{k\in W_{p}}\psi_{k} \qquad (9)$$

where, in formulas (8) and (9), W_p denotes the set of moving windows containing the pixel p; s_k is the intermediate result calculated by formula (6) for the k-th moving window passing through point p, and ψ_k is the intermediate result calculated by formula (7) for the k-th moving window passing through point p; s(p) is the change-of-scale factor at the position of pixel p after all window filtering operations are completed, and ψ(p) is the offset component at the position of pixel p after all window filtering operations are completed;
Substitute the change-of-scale factor and the offset component at point p of the pending image, as determined by formulas (8) and (9), into formula (5) to calculate the optimized atmospheric propagation coefficient t̂(p).
In conjunction with the first possible embodiment of embodiment of the present invention first aspect or first aspect, in the 6th kind of possible embodiment of first aspect, describedly according to described air light value and described atmospheric propagation coefficient, enhancing process is carried out to described pending image, comprising:
Enhancement processing is performed on each of the three channels R, G and B corresponding to the pending image using formula (10):

$$J_{c}(p)=\frac{1}{t(p)}\bigl(I_{c}(p)-A\bigr)+A \qquad (10)$$

where, in formula (10), A denotes the atmospheric light value, t(p) denotes the value of the atmospheric propagation coefficient at point p, c denotes one of the three channels R, G and B, J_c(p) is the image of channel c after enhancement processing, and I_c(p) is the pending image.
Correspondingly, embodiment of the present invention second aspect provides a kind of terminal, comprising:
Determining unit, for determining the brightest area of the predeterminable area size in the gray level image that pending image is corresponding;
First computing unit, calculates air light value for the brightest area determined according to described determining unit;
Second computing unit, for calculating atmospheric propagation coefficient according to default account form;
Enhancement unit, the atmospheric propagation coefficient calculated for the air light value that calculates according to described first computing unit and described second computing unit carries out enhancing process to described pending image.
In conjunction with embodiment of the present invention second aspect, in the first possible embodiment of second aspect, described pending image is containing mist image, containing haze image, containing at least one in sleet image or iridescent image.
In conjunction with the second aspect of the embodiments of the present invention or the first possible embodiment of the second aspect, in a second possible embodiment of the second aspect, the determining unit specifically comprises:
First determines subelement, for determining the center of the gray level image that pending image is corresponding, and according to described center, described gray level image is divided into quartern region;
Score coefficient construction unit, determines the pixel value average u that quartern region that subelement is determined is corresponding and standard deviation sd for calculating described first respectively, and builds score coefficient score:score=u-sd respectively.
Second determines subelement, for determining the target area that maximal value in the score coefficient score that described score coefficient construction unit constructs is corresponding;
Judging unit, for judging that described second determines whether the size of the target area that subelement is determined is less than or equal to described predeterminable area size;
3rd determines subelement, if judge that the size of target area is less than or equal to described predeterminable area size for described judging unit, described target area is defined as brightest area.
In conjunction with the first possible embodiment of embodiment of the present invention second aspect or second aspect, in the third possible embodiment of second aspect, described second computing unit comprises:
A first construction unit, configured to construct the contrast function according to formula (11):

$$F_{\mathrm{contrast}}(t)=-\sum_{c\in Gray}\sum_{p\in B}\frac{\bigl(I(p)-\bar{I}\bigr)^{2}}{t^{2}N_{B}} \qquad (11)$$

where, in formula (11), t denotes the atmospheric propagation coefficient; c ∈ Gray indicates that the contrast function is computed on the gray-level image corresponding to the pending image; F_contrast(t) denotes the contrast function with respect to t; p denotes a pixel; I(p) is the gray-scale value of any point p in the pending image; Ī is the mean of the pixel gray-scale values in the moving window B, i.e. the mean brightness of the pixels in the moving window B; B is the first preset moving window; N_B is the number of pixels contained in the moving window B; and p ∈ B indicates that the contrast function is constructed over the moving window B;
A second construction unit, configured to construct the information loss function according to formula (12):

$$F_{\mathrm{loss}}(t)=\sum_{c\in Gray}\left\{\sum_{i=0}^{\alpha}\left(\frac{i-A}{t}+A\right)^{2}h(i)+\sum_{i=\beta}^{255}\left(\frac{i-A}{t}+A-255\right)h(i)\right\} \qquad (12)$$

where, in formula (12), F_loss(t) is the information loss function with respect to t; α is a first threshold and β is a second threshold, with α < β; h(i) denotes the percentage of pixels with gray-scale value i among the total number of pixels of the pending image; and A is the atmospheric light value;
First computation subunit, for described contrast function and described information loss function are substituted into formula (13) to calculate described atmospheric propagation coefficient t:
$$\min_{t}F(t)=\min_{t}\bigl(F_{\mathrm{contrast}}(t)+\lambda F_{\mathrm{loss}}(t)\bigr) \qquad (13)$$

where, in formula (13), F(t) is the objective optimization function to be minimized with respect to t, λ is a constant balance factor, and λ ∈ (0, 1).
Set up objective optimization function, as formula (14):
$$\min_{(s,\psi)}F(t)=\min_{(s,\psi)}\sum_{p\in W}\bigl(t(p)-\hat{t}(p)\bigr)^{2}=\min_{(s,\psi)}\sum_{p\in W}\Bigl[\bigl(s(p)\,I(p)+\psi(p)-t(p)\bigr)^{2}+\varepsilon\,s(p)^{2}\Bigr] \qquad (14)$$

where, in formula (14), p denotes a pixel; s(p) is the change-of-scale factor at point p; ψ(p) is the offset component at point p; I(p) is the gray-scale value of point p in the pending image; W is the second preset moving window; ε is a weight factor with ε > 0; t̂ denotes the optimized atmospheric propagation coefficient, and t̂(p) denotes the optimized atmospheric propagation coefficient at point p, computed as shown in formula (15):

$$\hat{t}(p)=s(p)\,I(p)+\psi(p) \qquad (15)$$
Formulas (16) and (17) can be derived from formula (14):

$$s=\frac{\dfrac{1}{N}\sum_{p\in W}I(p)\,t(p)-\dfrac{1}{N}\,\mu\sum_{p\in W}t(p)}{\sigma^{2}+\varepsilon} \qquad (16)$$

$$\psi=\frac{1}{N}\sum_{p\in W}t(p)-s\,\mu \qquad (17)$$

where, in formulas (16) and (17), μ and σ² are respectively the mean and the variance of the pixel gray-scale values in the corresponding window W, and N is the number of pixels in the second preset moving window;
Substitute the calculation results of formulas (16) and (17) into formulas (18) and (19) to calculate, respectively, the change-of-scale factor and the offset component at point p of the pending image:

$$s(p)=\frac{1}{N}\sum_{k\in W_{p}}s_{k} \qquad (18)$$

$$\psi(p)=\frac{1}{N}\sum_{k\in W_{p}}\psi_{k} \qquad (19)$$

where, in formulas (18) and (19), W_p denotes the set of moving windows containing the pixel p; s_k is the intermediate result calculated by formula (16) for the k-th moving window passing through point p, and ψ_k is the intermediate result calculated by formula (17) for the k-th moving window passing through point p; s(p) is the change-of-scale factor at the position of pixel p after all window filtering operations are completed, and ψ(p) is the offset component at the position of pixel p after all window filtering operations are completed;
Substitute the change-of-scale factor and the offset component at point p of the pending image, as determined by formulas (18) and (19), into formula (15) to calculate the optimized atmospheric propagation coefficient t̂(p).
In conjunction with the first possible embodiment of embodiment of the present invention second aspect or second aspect, in the 4th kind of possible embodiment of second aspect, described enhancement unit specifically for:
Enhancement processing is performed on each of the three channels R, G and B corresponding to the pending image using formula (20):

$$J_{c}(p)=\frac{1}{t(p)}\bigl(I_{c}(p)-A\bigr)+A \qquad (20)$$

where, in formula (20), A denotes the atmospheric light value, t(p) denotes the value of the atmospheric propagation coefficient at point p, c denotes one of the three channels R, G and B, J_c(p) is the image of channel c after enhancement processing, and I_c(p) is the pending image.
Implement the embodiment of the present invention, there is following beneficial effect:
The embodiment of the present invention determines the brightest area of the predeterminable area size in the gray level image that pending image is corresponding; Air light value is calculated according to described brightest area; Atmospheric propagation coefficient is calculated according to default account form; According to described air light value and described atmospheric propagation coefficient, enhancing process is carried out to described pending image.So, by air light value and atmospheric propagation coefficient, pending image is processed, more image detail information can be retained and image processing process complexity is lower.
Accompanying drawing explanation
In order to be illustrated more clearly in the technical scheme in the embodiment of the present invention, be briefly described to the accompanying drawing used required in embodiment, description below, apparently, accompanying drawing in the following describes is only some embodiments of the embodiment of the present invention, for those of ordinary skill in the art, under the prerequisite not paying creative work, other accompanying drawing can also be obtained according to these accompanying drawings.
The schematic flow sheet of the embodiment of the method for a kind of image procossing that Fig. 1 provides for the embodiment of the present invention;
The structural representation of the first embodiment of a kind of terminal that Fig. 2 a provides for the embodiment of the present invention;
The another structural representation of the first embodiment of a kind of terminal that Fig. 2 b provides for the embodiment of the present invention;
The another structural representation of the first embodiment of a kind of terminal that Fig. 2 c provides for the embodiment of the present invention;
The another structural representation of the first embodiment of a kind of terminal that Fig. 2 d provides for the embodiment of the present invention;
The structural representation of the second embodiment of a kind of terminal that Fig. 3 provides for the embodiment of the present invention.
Embodiment
Below in conjunction with the accompanying drawing in the embodiment of the present invention, be clearly and completely described the technical scheme in the embodiment of the present invention, obviously, described embodiment is only embodiment of the present invention part embodiment, instead of whole embodiments.Based on the embodiment in the embodiment of the present invention, those of ordinary skill in the art are not making the every other embodiment obtained under creative work prerequisite, all belong to the scope of embodiment of the present invention protection.
In implementation, in the embodiment of the present invention, the terminal may include but is not limited to: a notebook computer, a mobile phone, a tablet computer, an intelligent wearable device, etc. The system of the terminal refers to the operating system of the device, which may include but is not limited to: the Android system, the Symbian system, the Windows system, iOS (the mobile operating system developed by Apple), etc. It should be noted that an Android terminal refers to a terminal with the Android system, a Symbian terminal refers to a terminal with the Symbian system, and so on. The above terminals are only examples and not exhaustive; the terminal includes but is not limited to the above terminals.
Embodiment of the present invention composition graphs 1 to Fig. 3 is described the method for a kind of image procossing that the embodiment of the present invention provides and terminal.
Refer to Fig. 1. Fig. 1 is a schematic flowchart of an embodiment of an image processing method provided by an embodiment of the present invention. The image processing method described in this embodiment is illustrated only by taking image defogging or image dehazing as an example, and comprises the following steps:
S101, determine the brightest area of the predeterminable area size in the gray level image that pending image is corresponding.
Wherein, the terminal can determine the brightest area of the preset area size in the gray-level image corresponding to the pending image, and the preset area size may include but is not limited to: 3×3, 5×5, 7×7, 9×9, 11×11, etc.
Alternatively, the pending image may include but is not limited to: a mist-containing image, a haze-containing image, a sleet-containing image, an iridescent (glare) image, etc. A mist-containing image may be an image photographed by a camera in foggy weather, where the mist may be produced in a natural environment or may be smoke produced in the process of burning objects. A haze-containing image may be an image photographed in hazy weather; haze may be fine dust particles suspended in the air, for example solid particles suspended in the air in specific working environments (construction sites, powder-producing factories such as cement mills and flour mills), a dust scene produced in a windy environment, or even chalk dust drifting down in a classroom when writing with chalk. A sleet-containing image may be an image containing raindrops or snow-like grains. An iridescent image may be an image containing a reflective or glassy material whose surface produces a reflection that affects the image effect; for example, reflective glass appears in the picture as a very bright region, a photograph taken through window glass may have part of the image degraded by the reflection of the glass, or a photograph of water may have part of the image blurred because of the light.
As a kind of possible embodiment, terminal can adopt quartern method to determine the brightest area of the predeterminable area size of the gray level image that pending image is corresponding, and key step is:
Step1, determine the center of the gray level image that pending image is corresponding, and according to described center, described gray level image is divided into quartern region;
Step2, calculate described quartern region corresponding pixel value average u and standard deviation sd respectively, and build score coefficient score:score=u-sd respectively.
Step 3: take the region corresponding to the maximum value of the score coefficients determined in Step 2 as the new image region, and return to Step 1;
Repeat the above steps; when the size of a quarter region is less than the preset threshold, the region with the maximum score coefficient determined in Step 3 is defined as the brightest area.
In a specific implementation, if the preset area size is one quarter of the image size, the above method divides the pending image into four equal regions only once, and the region with the largest score coefficient among the four regions is determined as the brightest area.
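As an illustration only (not code from the patent), the quartering search described above can be sketched in Python with NumPy; the function name, the stopping size and the loop structure are assumptions made for this example:

```python
import numpy as np

def find_brightest_region(gray, preset_size=9):
    """Repeatedly quarter the gray-level image about its center and keep the
    quadrant with the largest score = mean - standard deviation, until the
    quadrant is no larger than the preset area size."""
    region = gray.astype(np.float64)
    while region.shape[0] > preset_size and region.shape[1] > preset_size:
        h, w = region.shape
        quads = [region[:h // 2, :w // 2], region[:h // 2, w // 2:],
                 region[h // 2:, :w // 2], region[h // 2:, w // 2:]]
        scores = [q.mean() - q.std() for q in quads]   # score = u - sd
        region = quads[int(np.argmax(scores))]
    return region
```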
S102, calculate air light value according to described brightest area.
Wherein, the terminal can calculate the atmospheric light value according to the brightest area; for example, the maximum pixel value among the pixels in the brightest area may be used as the atmospheric light value.
As a possible embodiment, the average of the pixel values of the pixels in the brightest area is taken, and this average is the atmospheric light value.
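Continuing the sketch above (again an illustration, not the patent's code), the atmospheric light value A can then be taken as the mean of the pixel values in the brightest area:

```python
def atmospheric_light(brightest_region):
    """Mean pixel value of the brightest area, used as the atmospheric light value A."""
    return float(brightest_region.mean())
```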
S103, calculate atmospheric propagation coefficient according to default account form.
Wherein, terminal can calculate atmospheric propagation coefficient according to default account form, and preset algorithm can be as follows:
1. Construct the contrast function according to formula (1):

$$F_{\mathrm{contrast}}(t)=-\sum_{c\in Gray}\sum_{p\in B}\frac{\bigl(I(p)-\bar{I}\bigr)^{2}}{t^{2}N_{B}} \qquad (1)$$

where, in formula (1), t denotes the atmospheric propagation coefficient; c ∈ Gray indicates that the contrast function is computed on the gray-level image corresponding to the pending image; F_contrast(t) denotes the contrast function with respect to t; p denotes a pixel; I(p) is the gray-scale value of any point p in the pending image; Ī is the mean of the pixel gray-scale values in the moving window B, i.e. the mean brightness of the pixels in the moving window B; B is the first preset moving window; N_B is the number of pixels contained in the moving window B; and p ∈ B indicates that the contrast function is constructed over the moving window B.
2. Construct the information loss function according to formula (2):

$$F_{\mathrm{loss}}(t)=\sum_{c\in Gray}\left\{\sum_{i=0}^{\alpha}\left(\frac{i-A}{t}+A\right)^{2}h(i)+\sum_{i=\beta}^{255}\left(\frac{i-A}{t}+A-255\right)h(i)\right\} \qquad (2)$$

where, in formula (2), F_loss(t) is the information loss function with respect to t; α is a first threshold and β is a second threshold, with α < β; h(i) denotes the percentage of pixels with gray-scale value i among the total number of pixels of the pending image; and A is the atmospheric light value;
3, described contrast function and described information loss function are substituted into formula (3) to calculate described atmospheric propagation coefficient t:
$$\min_{t}F(t)=\min_{t}\bigl(F_{\mathrm{contrast}}(t)+\lambda F_{\mathrm{loss}}(t)\bigr) \qquad (3)$$

where, in formula (3), F(t) is the objective optimization function to be minimized with respect to t, λ is a constant balance factor, and λ ∈ (0, 1).
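For illustration, a brute-force search over t that minimizes F(t) = F_contrast(t) + λ·F_loss(t) for one gray-level block could look as follows. The search grid, the value of λ, and the treatment of α and β (taken implicitly as the gray levels whose restored values fall below 0 or above 255) are assumptions of this sketch, not values given by the patent:

```python
import numpy as np

def transmission_for_block(block, A, lam=0.5):
    """Pick t in (0, 1] minimizing F_contrast(t) + lam * F_loss(t)
    for a single gray-level block (formulas (1)-(3))."""
    hist, _ = np.histogram(block, bins=256, range=(0, 256))
    h = hist / block.size                      # h(i): fraction of pixels with gray value i
    i = np.arange(256, dtype=np.float64)
    variance = block.astype(np.float64).var()  # (1/N_B) * sum of (I(p) - mean)^2
    best_t, best_f = 1.0, np.inf
    for t in np.linspace(0.05, 1.0, 96):
        restored = (i - A) / t + A             # gray value after restoration
        f_contrast = -variance / (t * t)       # formula (1), summed over the block
        low, high = restored < 0, restored > 255
        f_loss = np.sum(restored[low] ** 2 * h[low]) \
               + np.sum((restored[high] - 255) * h[high])   # formula (2)
        f = f_contrast + lam * f_loss
        if f < best_f:
            best_f, best_t = f, t
    return best_t
```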
Further, although the above steps can already obtain a good atmospheric propagation coefficient, it can also be further optimized to obtain a better defogging and dehazing effect:
After substituting the contrast function and the information loss function into formula (3) to calculate the atmospheric propagation coefficient t, the following optimization may also be performed.
The atmospheric propagation coefficient t is optimized, and the optimization step may be:
Set up objective optimization function, as formula (4):
$$\min_{(s,\psi)}F(t)=\min_{(s,\psi)}\sum_{p\in W}\bigl(t(p)-\hat{t}(p)\bigr)^{2}=\min_{(s,\psi)}\sum_{p\in W}\Bigl[\bigl(s(p)\,I(p)+\psi(p)-t(p)\bigr)^{2}+\varepsilon\,s(p)^{2}\Bigr] \qquad (4)$$

where, in formula (4), p denotes a pixel; s(p) is the change-of-scale factor at point p; ψ(p) is the offset component at point p; I(p) is the gray-scale value of point p in the pending image; W is the second preset moving window; ε is a weight factor with ε > 0; t̂ denotes the optimized atmospheric propagation coefficient, and t̂(p) denotes the optimized atmospheric propagation coefficient at point p, computed as shown in formula (5):

$$\hat{t}(p)=s(p)\,I(p)+\psi(p) \qquad (5)$$
Formulas (6) and (7) can be derived from formula (4):

$$s=\frac{\dfrac{1}{N}\sum_{p\in W}I(p)\,t(p)-\dfrac{1}{N}\,\mu\sum_{p\in W}t(p)}{\sigma^{2}+\varepsilon} \qquad (6)$$

$$\psi=\frac{1}{N}\sum_{p\in W}t(p)-s\,\mu \qquad (7)$$

where, in formulas (6) and (7), μ and σ² are respectively the mean and the variance of the pixel gray-scale values in the corresponding window W, and N is the number of pixels in the second preset moving window;
Substitute the calculation results of formulas (6) and (7) into formulas (8) and (9) to calculate, respectively, the change-of-scale factor and the offset component at point p of the pending image:

$$s(p)=\frac{1}{N}\sum_{k\in W_{p}}s_{k} \qquad (8)$$

$$\psi(p)=\frac{1}{N}\sum_{k\in W_{p}}\psi_{k} \qquad (9)$$

where, in formulas (8) and (9), W_p denotes the set of moving windows containing the pixel p; s_k is the intermediate result calculated by formula (6) for the k-th moving window passing through point p, and ψ_k is the intermediate result calculated by formula (7) for the k-th moving window passing through point p; s(p) is the change-of-scale factor at the position of pixel p after all window filtering operations are completed, and ψ(p) is the offset component at the position of pixel p after all window filtering operations are completed.
Substitute the change-of-scale factor and the offset component at point p of the pending image, as determined by formulas (8) and (9), into formula (5) to calculate the optimized atmospheric propagation coefficient t̂(p).
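The window-wise optimization of formulas (4)-(9) can be illustrated with the following sketch. The window size, the value of ε, and the normalization of the gray-level image to [0, 1] are assumptions of this example, and `scipy.ndimage.uniform_filter` is used here only as a convenient box-filter implementation of the window averages:

```python
import numpy as np
from scipy.ndimage import uniform_filter

def refine_transmission(gray, t, win=15, eps=1e-3):
    """Refine the block-wise transmission map t following formulas (4)-(9):
    per-window scale s and offset psi, then window-averaged s(p), psi(p),
    and finally t_hat(p) = s(p) * I(p) + psi(p)."""
    I = gray.astype(np.float64) / 255.0        # normalization is an implementation choice
    t = t.astype(np.float64)
    mean_I = uniform_filter(I, win)            # window mean (mu) of the guide image
    mean_t = uniform_filter(t, win)            # window mean of the raw transmission
    mean_It = uniform_filter(I * t, win)
    var_I = uniform_filter(I * I, win) - mean_I ** 2   # window variance (sigma^2)
    s = (mean_It - mean_I * mean_t) / (var_I + eps)    # formula (6)
    psi = mean_t - s * mean_I                          # formula (7)
    s_p = uniform_filter(s, win)               # formula (8): average s over windows through p
    psi_p = uniform_filter(psi, win)           # formula (9): average psi over windows through p
    return s_p * I + psi_p                     # formula (5): refined transmission t_hat(p)
```

The closed-form s and ψ here have the same structure as the guided image filter, which is why simple window box-filters suffice for the averages in formulas (6)-(9).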
As a possible embodiment, in the process of calculating the atmospheric propagation coefficient, the block-wise computation with moving windows involves a large number of vector multiply and add operations, which is a scenario well suited to single-instruction multiple-data (SIMD) architectures: the ARM platform provides the NEON instruction set for this kind of programming, and the Intel platform has the SSE instruction set with similar functionality. Meanwhile, to ensure that this technique can be accelerated in parallel on a graphics processing unit (GPU), the per-pixel operations in some steps are chosen accordingly. In particular, mobile platforms that currently support OpenGL ES 2.0 support only a very limited set of texture download formats; most platforms support only a few formats such as RGBA8888 for the multi-channel texture data generated during fast rendering and download. Considering the above reasons, the image restoration operation of the final step is placed on the GPU, which greatly improves the overall performance of the algorithm. Considering only the restoration operation of the final step, the parallel computation of the GPU can increase the speed by a factor of 2 to 3 or more.
S104, according to described air light value and described atmospheric propagation coefficient, enhancing process is carried out to described pending image.
Wherein, terminal can carry out enhancing process according to air light value and atmospheric propagation coefficient to pending image, specific as follows:
Enhancement processing is performed on each of the three channels R, G and B corresponding to the pending image using formula (10):

$$J_{c}(p)=\frac{1}{t(p)}\bigl(I_{c}(p)-A\bigr)+A \qquad (10)$$

where, in formula (10), A denotes the atmospheric light value, t(p) denotes the value of the atmospheric propagation coefficient at point p, c denotes one of the three channels R, G and B, J_c(p) is the image of channel c after enhancement processing, and I_c(p) is the pending image.
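Finally, a sketch of the per-channel enhancement of formula (10); the lower bound on t(p) and the clipping to [0, 255] are safeguards added for this example and are not stated in the patent:

```python
import numpy as np

def enhance(image_rgb, t_hat, A, t_min=0.1):
    """Apply J_c(p) = (I_c(p) - A) / t(p) + A to each of the R, G, B channels."""
    I = image_rgb.astype(np.float64)
    t = np.maximum(t_hat, t_min)[..., None]    # broadcast the transmission over the 3 channels
    J = (I - A) / t + A
    return np.clip(J, 0, 255).astype(np.uint8)
```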
Alternatively, in the embodiment of the present invention, not only can defogging be performed when the pending image is a mist-containing or haze-containing image, but enhancement processing can also be performed when the pending image is a sleet-containing image or an iridescent image.
The embodiment of the present invention determines the brightest area of the preset area size in the gray-level image corresponding to the pending image; calculates the atmospheric light value according to the brightest area; calculates the atmospheric propagation coefficient according to the preset calculation manner; and performs enhancement processing on the pending image according to the atmospheric light value and the atmospheric propagation coefficient. In this way, the pending image is processed by means of the atmospheric light value and the atmospheric propagation coefficient, more image detail information can be retained, and the computational complexity of the image processing is relatively low.
Refer to Fig. 2 a, Fig. 2 a is the structural representation of the first embodiment of a kind of terminal that the embodiment of the present invention provides.Terminal described in the embodiment of the present invention can comprise: determining unit 201, first computing unit 202, second computing unit 203 and enhancement unit 204, specific as follows:
Determining unit 201, for determining the brightest area of the predeterminable area size in the gray level image that pending image is corresponding.
Wherein, the determining unit 201 can determine the brightest area of the preset area size in the gray-level image corresponding to the pending image, and the preset area size may include but is not limited to: 3×3, 5×5, 7×7, 9×9, 11×11, etc.
In specific implementation, determining unit 201 can determine the brightest area of the predeterminable area size in the gray level image that pending image is corresponding according to following step, as follows:
Step1, determine the center of the gray level image that pending image is corresponding, and according to described center, described gray level image is divided into quartern region;
Step2, calculate described quartern region corresponding pixel value average u and standard deviation sd respectively, and build score coefficient score:score=u-sd respectively.
Step 3: take the region corresponding to the maximum value of the score coefficients determined in Step 2 as the new image region, and return to Step 1;
Repeat the above steps; when the size of a quarter region is less than the preset threshold, the region with the maximum score coefficient determined in Step 3 is defined as the brightest area.
First computing unit 202, calculates air light value for the brightest area determined according to described determining unit 201.
Wherein, the first computing unit 202 can calculate air light value according to brightest area, can using max pixel value corresponding for pixel in brightest area as air light value.
As a kind of possible embodiment, the first computing unit 202 can ask for the average of pixel value corresponding to the pixel of brightest area, and this average is air light value.
Second computing unit 203, for calculating atmospheric propagation coefficient according to default account form.
Wherein, the second computing unit 203 can calculate atmospheric propagation coefficient according to default account form.
As a kind of possible embodiment, as shown in Figure 2 b, the determining unit 201 in the terminal described by Fig. 2 a also can comprise: first determine subelement 2011, score coefficient construction unit 2012, second determines that subelement 2013, judging unit 2014, the 3rd determine subelement 2015.
First determines subelement 2011, for determining the center of the gray level image that pending image is corresponding, and according to described center, described gray level image is divided into quartern region;
Score coefficient construction unit 2012, determines the pixel value average u that quartern region that subelement 2011 is determined is corresponding and standard deviation sd for calculating described first respectively, and builds score coefficient score:score=u-sd respectively.
Second determines subelement 2013, for determining the target area that maximal value in the score coefficient score that described score coefficient construction unit 2012 constructs is corresponding;
Judging unit 2014, for judging that described second determines whether the size of the target area that subelement 2013 is determined is less than or equal to described predeterminable area size;
3rd determines subelement 2015, if judge that the size of target area is less than or equal to described predeterminable area size for described judging unit 2014, described target area is defined as brightest area.
As a possible embodiment, as shown in Fig. 2c, the second computing unit 203 described in Fig. 2a can further include: a first construction unit 2031, a second construction unit 2032 and a first computation subunit 2033, specifically as follows:
A first construction unit 2031, configured to construct the contrast function according to formula (11):

$$F_{\mathrm{contrast}}(t)=-\sum_{c\in Gray}\sum_{p\in B}\frac{\bigl(I(p)-\bar{I}\bigr)^{2}}{t^{2}N_{B}} \qquad (11)$$

where, in formula (11), t denotes the atmospheric propagation coefficient; c ∈ Gray indicates that the contrast function is computed on the gray-level image corresponding to the pending image; F_contrast(t) denotes the contrast function with respect to t; p denotes a pixel; I(p) is the gray-scale value of any point p in the pending image; Ī is the mean of the pixel gray-scale values in the moving window B, i.e. the mean brightness of the pixels in the moving window B; B is the first preset moving window; N_B is the number of pixels contained in the moving window B; and p ∈ B indicates that the contrast function is constructed over the moving window B.
A second construction unit 2032, configured to construct the information loss function according to formula (12):

$$F_{\mathrm{loss}}(t)=\sum_{c\in Gray}\left\{\sum_{i=0}^{\alpha}\left(\frac{i-A}{t}+A\right)^{2}h(i)+\sum_{i=\beta}^{255}\left(\frac{i-A}{t}+A-255\right)h(i)\right\} \qquad (12)$$

where, in formula (12), F_loss(t) is the information loss function with respect to t; α is a first threshold and β is a second threshold, with α < β; h(i) denotes the percentage of pixels with gray-scale value i among the total number of pixels of the pending image; and A is the atmospheric light value;
First computation subunit 2033, for described contrast function and described information loss function are substituted into formula (13) to calculate described atmospheric propagation coefficient t:
$$\min_{t}F(t)=\min_{t}\bigl(F_{\mathrm{contrast}}(t)+\lambda F_{\mathrm{loss}}(t)\bigr) \qquad (13)$$

where, in formula (13), F(t) is the objective optimization function to be minimized with respect to t, λ is a constant balance factor, and λ ∈ (0, 1).
As a kind of possible embodiment, as shown in Figure 2 d, the second computing unit 203 described in Fig. 2 a can further include: optimize unit 2034, specific as follows:
Optimize unit, for being optimized described atmospheric propagation coefficient t, the step of described optimization can be:
(1), objective optimization function is set up, as formula (14):
$$\min_{(s,\psi)}F(t)=\min_{(s,\psi)}\sum_{p\in W}\bigl(t(p)-\hat{t}(p)\bigr)^{2}=\min_{(s,\psi)}\sum_{p\in W}\Bigl[\bigl(s(p)\,I(p)+\psi(p)-t(p)\bigr)^{2}+\varepsilon\,s(p)^{2}\Bigr] \qquad (14)$$

where, in formula (14), p denotes a pixel; s(p) is the change-of-scale factor at point p; ψ(p) is the offset component at point p; I(p) is the gray-scale value of point p in the pending image; W is the second preset moving window; ε is a weight factor with ε > 0; t̂ denotes the optimized atmospheric propagation coefficient, and t̂(p) denotes the optimized atmospheric propagation coefficient at point p, computed as shown in formula (15):

$$\hat{t}(p)=s(p)\,I(p)+\psi(p) \qquad (15)$$
Formulas (16) and (17) can be derived from formula (14):

$$s=\frac{\dfrac{1}{N}\sum_{p\in W}I(p)\,t(p)-\dfrac{1}{N}\,\mu\sum_{p\in W}t(p)}{\sigma^{2}+\varepsilon} \qquad (16)$$

$$\psi=\frac{1}{N}\sum_{p\in W}t(p)-s\,\mu \qquad (17)$$

where, in formulas (16) and (17), μ and σ² are respectively the mean and the variance of the pixel gray-scale values in the corresponding window W, and N is the number of pixels in the second preset moving window;
(2) Substitute the calculation results of formulas (16) and (17) into formulas (18) and (19) to calculate, respectively, the change-of-scale factor and the offset component at point p of the pending image:

$$s(p)=\frac{1}{N}\sum_{k\in W_{p}}s_{k} \qquad (18)$$

$$\psi(p)=\frac{1}{N}\sum_{k\in W_{p}}\psi_{k} \qquad (19)$$

where, in formulas (18) and (19), W_p denotes the set of moving windows containing the pixel p; s_k is the intermediate result calculated by formula (16) for the k-th moving window passing through point p, and ψ_k is the intermediate result calculated by formula (17) for the k-th moving window passing through point p; s(p) is the change-of-scale factor at the position of pixel p after all window filtering operations are completed, and ψ(p) is the offset component at the position of pixel p after all window filtering operations are completed;
(3) Substitute the change-of-scale factor and the offset component at point p of the pending image, as determined by formulas (18) and (19), into formula (15) to calculate the optimized atmospheric propagation coefficient t̂(p).
Enhancement unit 204, the atmospheric propagation coefficient calculated for the air light value that calculates according to described first computing unit 202 and described second computing unit 203 carries out enhancing process to described pending image.
Wherein, the enhancement unit 204 can perform enhancement processing on the pending image according to the atmospheric light value calculated by the first computing unit 202 and the atmospheric propagation coefficient calculated by the second computing unit 203. Preferably, the enhancement unit 204 can perform enhancement processing on each of the three channels R, G and B corresponding to the pending image using formula (20):

$$J_{c}(p)=\frac{1}{t(p)}\bigl(I_{c}(p)-A\bigr)+A \qquad (20)$$

where, in formula (20), A denotes the atmospheric light value, t(p) denotes the value of the atmospheric propagation coefficient at point p, c denotes one of the three channels R, G and B, J_c(p) is the image of channel c after enhancement processing, and I_c(p) is the pending image.
The terminal described in the embodiment of the present invention determines the brightest area of the preset area size in the gray-level image corresponding to the pending image; calculates the atmospheric light value according to the brightest area; calculates the atmospheric propagation coefficient according to the preset calculation manner; and performs enhancement processing on the pending image according to the atmospheric light value and the atmospheric propagation coefficient. In this way, the pending image is processed by means of the atmospheric light value and the atmospheric propagation coefficient, more image detail information can be retained, and the computational complexity of the image processing is relatively low.
Refer to Fig. 3, the structural representation of the second embodiment of a kind of terminal that Fig. 3 provides for the embodiment of the present invention.Terminal described in the present embodiment comprises: at least one input equipment 1000; At least one output device 2000; At least one processor 3000, such as CPU; With storer 4000, above-mentioned input equipment 1000, output device 2000, processor 3000 are connected by bus 5000 with storer 4000.
Wherein, above-mentioned input equipment 1000 can be contact panel, common PC, liquid crystal display, touch screen, push button etc.
The memory 4000 may be a high-speed RAM memory, or may be a non-volatile memory, such as a disk memory. The memory 4000 is used to store a set of program codes, and the input device 1000, the output device 2000 and the processor 3000 are used to call the program codes stored in the memory 4000 to perform the following operations:
Above-mentioned processor 3000, for: the brightest area determining the predeterminable area size in the gray level image that pending image is corresponding.
As a kind of possibility embodiment, the brightest area of the predeterminable area size in the gray level image that pending image is corresponding determined by above-mentioned processor 3000, is specially:
Adopt quartern method to determine the brightest area of the predeterminable area size of the gray level image that pending image is corresponding, key step is:
Step1, determine the center of the gray level image that pending image is corresponding, and according to described center, described gray level image is divided into quartern region;
Step2, calculate described quartern region corresponding pixel value average u and standard deviation sd respectively, and build score coefficient score:score=u-sd respectively.
Step 3: take the region corresponding to the maximum value of the score coefficients determined in Step 2 as the new image region, and return to Step 1;
Repeat the above steps; when the size of a quarter region is less than the preset threshold, the region with the maximum score coefficient determined in Step 3 is defined as the brightest area.
Alternatively, described pending image is containing mist image, containing haze image, containing at least one in sleet image or iridescent image.
Above-mentioned processor 3000, also specifically for:
Air light value is calculated according to described brightest area;
As a kind of possibility embodiment, above-mentioned processor 3000 calculates air light value according to described brightest area, is specially:
Ask for the average of pixel value corresponding to the pixel of described brightest area, described average is air light value.
Above-mentioned processor 3000, also specifically for:
Atmospheric propagation coefficient is calculated according to default account form;
As a kind of possibility embodiment, above-mentioned processor 3000 calculates atmospheric propagation coefficient according to default account form, is specially:
Construct the contrast function according to formula (21):

$$F_{\mathrm{contrast}}(t)=-\sum_{c\in Gray}\sum_{p\in B}\frac{\bigl(I(p)-\bar{I}\bigr)^{2}}{t^{2}N_{B}} \qquad (21)$$

where, in formula (21), t denotes the atmospheric propagation coefficient; c ∈ Gray indicates that the contrast function is computed on the gray-level image corresponding to the pending image; F_contrast(t) denotes the contrast function with respect to t; p denotes a pixel; I(p) is the gray-scale value of any point p in the pending image; Ī is the mean of the pixel gray-scale values in the moving window B, i.e. the mean brightness of the pixels in the moving window B; B is the first preset moving window; N_B is the number of pixels contained in the moving window B; and p ∈ B indicates that the contrast function is constructed over the moving window B.
Construct the information loss function according to formula (22):

$$F_{\mathrm{loss}}(t)=\sum_{c\in Gray}\left\{\sum_{i=0}^{\alpha}\left(\frac{i-A}{t}+A\right)^{2}h(i)+\sum_{i=\beta}^{255}\left(\frac{i-A}{t}+A-255\right)h(i)\right\} \qquad (22)$$

where, in formula (22), F_loss(t) is the information loss function with respect to t; α is a first threshold and β is a second threshold, with α < β; h(i) denotes the percentage of pixels with gray-scale value i among the total number of pixels of the pending image; and A is the atmospheric light value;
Described contrast function and described information loss function are substituted into formula (23) to calculate described atmospheric propagation coefficient t:
$$\min_{t}F(t)=\min_{t}\bigl(F_{\mathrm{contrast}}(t)+\lambda F_{\mathrm{loss}}(t)\bigr) \qquad (23)$$

where, in formula (23), F(t) is the objective optimization function to be minimized with respect to t, λ is a constant balance factor, and λ ∈ (0, 1).
Further, above-mentioned processor 3000, substitutes into formula (23) with after calculating described atmospheric propagation coefficient t by described contrast function and described information loss function, also specifically for:
Be optimized described atmospheric propagation coefficient t, the step of described optimization can be:
Set up objective optimization function, as formula (24):
$$\min_{(s,\psi)}F(t)=\min_{(s,\psi)}\sum_{p\in W}\bigl(t(p)-\hat{t}(p)\bigr)^{2}=\min_{(s,\psi)}\sum_{p\in W}\Bigl[\bigl(s(p)\,I(p)+\psi(p)-t(p)\bigr)^{2}+\varepsilon\,s(p)^{2}\Bigr] \qquad (24)$$

where, in formula (24), p denotes a pixel; s(p) is the change-of-scale factor at point p; ψ(p) is the offset component at point p; I(p) is the gray-scale value of point p in the pending image; W is the second preset moving window; ε is a weight factor with ε > 0; t̂ denotes the optimized atmospheric propagation coefficient, and t̂(p) denotes the optimized atmospheric propagation coefficient at point p, computed as shown in formula (25):

$$\hat{t}(p)=s(p)\,I(p)+\psi(p) \qquad (25)$$
Formulas (26) and (27) can be derived from formula (24):

$$s=\frac{\dfrac{1}{N}\sum_{p\in W}I(p)\,t(p)-\dfrac{1}{N}\,\mu\sum_{p\in W}t(p)}{\sigma^{2}+\varepsilon} \qquad (26)$$

$$\psi=\frac{1}{N}\sum_{p\in W}t(p)-s\,\mu \qquad (27)$$

where, in formulas (26) and (27), μ and σ² are respectively the mean and the variance of the pixel gray-scale values in the corresponding window W, and N is the number of pixels in the second preset moving window;
Substitute the calculation results of formulas (26) and (27) into formulas (28) and (29) to calculate, respectively, the change-of-scale factor and the offset component at point p of the pending image:

$$s(p)=\frac{1}{N}\sum_{k\in W_{p}}s_{k} \qquad (28)$$

$$\psi(p)=\frac{1}{N}\sum_{k\in W_{p}}\psi_{k} \qquad (29)$$

where, in formulas (28) and (29), W_p denotes the set of moving windows containing the pixel p; s_k is the intermediate result calculated by formula (26) for the k-th moving window passing through point p, and ψ_k is the intermediate result calculated by formula (27) for the k-th moving window passing through point p; s(p) is the change-of-scale factor at the position of pixel p after all window filtering operations are completed, and ψ(p) is the offset component at the position of pixel p after all window filtering operations are completed;
Substitute the change-of-scale factor and the offset component at point p of the pending image, as determined by formulas (28) and (29), into formula (25) to calculate the optimized atmospheric propagation coefficient t̂(p).
Above-mentioned processor 3000, also specifically for:
According to described air light value and described atmospheric propagation coefficient, enhancing process is carried out to described pending image.
As a kind of possibility embodiment, above-mentioned processor 3000 carries out enhancing process according to described air light value and described atmospheric propagation coefficient to described pending image, is specially:
Enhancement processing is performed on each of the three channels R, G and B corresponding to the pending image using formula (30):

$$J_{c}(p)=\frac{1}{t(p)}\bigl(I_{c}(p)-A\bigr)+A \qquad (30)$$

where, in formula (30), A denotes the atmospheric light value, t(p) denotes the value of the atmospheric propagation coefficient at point p, c denotes one of the three channels R, G and B, J_c(p) is the image of channel c after enhancement processing, and I_c(p) is the pending image.
The terminal described in the embodiment of the present invention determines the brightest area of the preset area size in the gray-level image corresponding to the pending image; calculates the atmospheric light value according to the brightest area; calculates the atmospheric propagation coefficient according to the preset calculation manner; and performs enhancement processing on the pending image according to the atmospheric light value and the atmospheric propagation coefficient. In this way, the pending image is processed by means of the atmospheric light value and the atmospheric propagation coefficient, more image detail information can be retained, and the computational complexity of the image processing is relatively low.
The embodiment of the present invention also provides a kind of computer-readable storage medium, and wherein, this computer-readable storage medium can have program stored therein, and comprises the part or all of step of any one signal processing method recorded in said method embodiment when this program performs.
In the above-described embodiments, the description of each embodiment is all emphasized particularly on different fields, in certain embodiment, there is no the part described in detail, can see the associated description of other embodiments.
It should be noted that, for aforesaid each embodiment of the method, in order to simple description, therefore it is all expressed as a series of combination of actions, but those skilled in the art should know, the present invention is not by the restriction of described sequence of movement, because according to the present invention, some step may can adopt other orders or carry out simultaneously.Secondly, those skilled in the art also should know, the embodiment described in instructions all belongs to preferred embodiment, and involved action and module might not be that the present invention is necessary.
In several embodiments that the application provides, should be understood that, disclosed device, the mode by other realizes.Such as, device embodiment described above is only schematic, the division of such as said units, be only a kind of logic function to divide, actual can have other dividing mode when realizing, such as multiple unit or assembly can in conjunction with or another system can be integrated into, or some features can be ignored, or do not perform.Another point, shown or discussed coupling each other or direct-coupling or communication connection can be by some interfaces, and the indirect coupling of device or unit or communication connection can be electrical or other form.
The above-mentioned unit illustrated as separating component or can may not be and physically separates, and the parts as unit display can be or may not be physical location, namely can be positioned at a place, or also can be distributed in multiple network element.Some or all of unit wherein can be selected according to the actual needs to realize the object of the present embodiment scheme.
In addition, each functional unit in various embodiments of the present invention can be integrated in a processing unit, also can be that the independent physics of unit exists, also can two or more unit in a unit integrated.Above-mentioned integrated unit both can adopt the form of hardware to realize, and the form of SFU software functional unit also can be adopted to realize.
If the integrated unit is implemented in the form of a software functional unit and sold or used as an independent product, it may be stored in a computer-readable storage medium. Based on such an understanding, the essence of the technical solutions of the present invention, or the part contributing to the prior art, or all or part of the technical solutions, may be embodied in the form of a software product. The computer software product is stored in a storage medium and includes several instructions for causing a computer device (which may be a personal computer, a server, a network device or the like, and may specifically be a processor in a computer device) to perform all or part of the steps of the methods described in the embodiments of the present invention. The aforementioned storage medium may include: a USB flash drive, a portable hard disk, a magnetic disk, an optical disc, a read-only memory (ROM), a random access memory (RAM), or any other medium capable of storing program code.
The foregoing embodiments are merely intended to describe the technical solutions of the present invention, not to limit them. Although the present invention has been described in detail with reference to the foregoing embodiments, a person of ordinary skill in the art should understand that modifications may still be made to the technical solutions recorded in the foregoing embodiments, or equivalent replacements may be made to some of the technical features therein, and such modifications or replacements do not make the essence of the corresponding technical solutions depart from the spirit and scope of the technical solutions of the embodiments of the present invention.

Claims (12)

1. An image processing method, characterized in that the method comprises:
determining a brightest area of a preset area size in a gray-scale image corresponding to a to-be-processed image;
calculating an atmospheric light value according to the brightest area;
calculating an atmospheric propagation coefficient in a preset calculation manner; and
performing enhancement processing on the to-be-processed image according to the atmospheric light value and the atmospheric propagation coefficient.
2. The method according to claim 1, characterized in that the to-be-processed image is at least one of a fog-containing image, a haze-containing image, an image containing rain or snow, or an iridescent image.
3. The method according to claim 1 or 2, characterized in that the step of determining the brightest area of the preset area size in the gray-scale image corresponding to the to-be-processed image specifically comprises:
determining the center of the gray-scale image corresponding to the to-be-processed image, and dividing the gray-scale image into four equal regions according to the center;
calculating the pixel-value mean u and the standard deviation sd corresponding to each of the four regions, and constructing a score coefficient for each region as score = u - sd;
determining the target area corresponding to the maximum value among the score coefficients;
judging whether the size of the target area is less than or equal to the preset area size; and
if so, determining the target area as the brightest area.
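As a concrete reading of claim 3, the sketch below (illustrative only) scores the four equal regions with score = mean - standard deviation and keeps the best-scoring one; the recursion used when that region is still larger than the preset area size is an assumption, since the claim leaves that branch open:

    import numpy as np

    def brightest_area(gray, max_pixels):
        """Split the gray-scale image about its center, score the four equal
        regions as score = mean - std, and keep the best-scoring region."""
        h, w = gray.shape
        if gray.size <= max_pixels or h < 2 or w < 2:
            return gray
        cy, cx = h // 2, w // 2
        regions = [gray[:cy, :cx], gray[:cy, cx:], gray[cy:, :cx], gray[cy:, cx:]]
        scores = [float(r.mean()) - float(r.std()) for r in regions]
        target = regions[int(np.argmax(scores))]
        # Claim 3 does not spell out the "otherwise" branch; recursing on the
        # winning region until it fits the preset area size is an assumption.
        return brightest_area(target, max_pixels)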
4. The method according to claim 1 or 2, characterized in that the step of calculating the atmospheric light value according to the brightest area specifically comprises:
calculating the mean of the pixel values of the pixels in the brightest area, the mean being the atmospheric light value.
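A one-line illustrative sketch of claim 4, assuming the brightest area is available as a NumPy array:

    def atmospheric_light(brightest):
        # Claim 4: A is the mean pixel value of the pixels in the brightest area.
        return float(brightest.mean())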
5. The method according to claim 1 or 2, characterized in that the step of calculating the atmospheric propagation coefficient in the preset calculation manner specifically comprises:
constructing a contrast function according to formula (1):
F_{contrast}(t) = -\sum_{c \in Gray} \sum_{p \in B} \frac{(I(p) - \bar{I})^{2}}{t^{2} N_{B}} \qquad (1)
wherein, in formula (1), t denotes the atmospheric propagation coefficient; c ∈ Gray indicates that the contrast function is calculated on the gray-scale image corresponding to the to-be-processed image; F_contrast(t) denotes the contrast function with respect to t; p denotes a pixel; I(p) is the gray value of any point p in the to-be-processed image; \bar{I} denotes the mean of the pixel gray values in the sliding window B, that is, the mean brightness of the pixels in the sliding window B; B is a first preset sliding window; N_B is the number of pixels contained in the sliding window B; and p ∈ B indicates that the contrast function is constructed over the sliding window B;
constructing an information loss function according to formula (2):
F_{loss}(t) = \sum_{c \in Gray} \left\{ \sum_{i=0}^{\alpha} \left( \frac{i - A}{t} + A \right)^{2} h(i) + \sum_{i=\beta}^{255} \left( \frac{i - A}{t} + A - 255 \right) h(i) \right\} \qquad (2)
wherein, in formula (2), F_loss(t) is the information loss function with respect to t; α is a first threshold and β is a second threshold, with α < β; h(i) denotes the percentage of the total number of pixels of the to-be-processed image whose gray value is i; and A is the atmospheric light value; and
substituting the contrast function and the information loss function into formula (3) to calculate the atmospheric propagation coefficient t:
\min_{t} F(t) = \min_{t} \left( F_{contrast}(t) + \lambda F_{loss}(t) \right) \qquad (3)
wherein, in formula (3), F(t) is the objective optimization function to be minimized with respect to t, λ is a constant balance factor, and λ ∈ (0, 1).
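A possible sketch of the cost terms of claim 5 is given below; the grid search over candidate t values, the default λ = 0.5 and the candidate count are illustrative assumptions, since the claim only fixes formulas (1)-(3):

    import numpy as np

    def block_transmission(block, h, A, alpha, beta, lam=0.5, candidates=64):
        """Grid-search t in (0, 1] minimizing F_contrast(t) + lam * F_loss(t)
        for one sliding window B (formulas (1)-(3) of claim 5).

        block: gray values inside the window B
        h:     256-bin normalized histogram of the whole to-be-processed image
        A:     atmospheric light value; alpha, beta: thresholds of formula (2)
        """
        pixels = block.astype(np.float64).ravel()
        n_b = pixels.size
        var_term = np.sum((pixels - pixels.mean()) ** 2)
        i = np.arange(256, dtype=np.float64)

        best_t, best_cost = 1.0, np.inf
        for t in np.linspace(0.05, 1.0, candidates):
            f_contrast = -var_term / (t ** 2 * n_b)                   # formula (1)
            j = (i - A) / t + A
            f_loss = (np.sum(j[:alpha + 1] ** 2 * h[:alpha + 1])
                      + np.sum((j[beta:] - 255.0) * h[beta:]))        # formula (2)
            cost = f_contrast + lam * f_loss                          # formula (3)
            if cost < best_cost:
                best_t, best_cost = t, cost
        return best_t

In this sketch h could be obtained as np.bincount(gray.ravel(), minlength=256) / gray.size, and the returned t would then be assigned to the pixels of the window before the refinement of claim 6.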
6. The method according to claim 5, characterized in that after the contrast function and the information loss function are substituted into formula (3) to calculate the atmospheric propagation coefficient t, the method further comprises:
optimizing the atmospheric propagation coefficient t, wherein the optimization step may be:
establishing an objective optimization function as formula (4):
\min_{(s,\psi)} F(t) = \min_{(s,\psi)} \sum_{p \in W} \left( t(p) - \hat{t}(p) \right)^{2} = \min_{(s,\psi)} \sum_{p \in W} \left( \left( s(p) I(p) + \psi(p) - t(p) \right)^{2} + \epsilon \, s(p)^{2} \right) \qquad (4)
wherein, in formula (4), p denotes a pixel; s(p) is the scale transformation factor at point p; ψ(p) is the offset component at point p; I(p) is the gray value of point p in the to-be-processed image; W is a second preset sliding window; ε is a weight factor with ε > 0; \hat{t} denotes the optimized atmospheric propagation coefficient; and \hat{t}(p) denotes the optimized atmospheric propagation coefficient at point p, calculated as shown in formula (5):
\hat{t}(p) = s(p) I(p) + \psi(p) \qquad (5)
wherein formulas (6) and (7) can be derived from formula (4):
s = \frac{ \frac{1}{N} \sum_{p \in W} I(p) t(p) - \frac{1}{N} \mu \sum_{p \in W} t(p) }{ \sigma^{2} + \epsilon } \qquad (6)
\psi = \frac{1}{N} \sum_{p \in W} t(p) - s \mu \qquad (7)
wherein, in formulas (6) and (7), μ and σ² are respectively the mean and the variance of the pixel gray values in the corresponding window W, and N is the number of pixels in the second preset sliding window;
substituting the calculation results of formulas (6) and (7) into formulas (8) and (9) to calculate, respectively, the scale transformation factor and the offset component at point p of the to-be-processed image:
s(p) = \frac{1}{N} \sum_{k \in W_{p}} s_{k} \qquad (8)
\psi(p) = \frac{1}{N} \sum_{k \in W_{p}} \psi_{k} \qquad (9)
wherein, in formulas (8) and (9), W_p denotes the sliding windows containing pixel p; s_k is the intermediate result calculated by formula (6) when the k-th sliding window passes through point p; ψ_k is the intermediate result calculated by formula (7) when the k-th sliding window passes through point p; s(p) is the scale transformation factor at pixel p after all window filtering operations are completed; and ψ(p) is the offset component at pixel p after all window filtering operations are completed; and
substituting the scale transformation factor and the offset component at point p of the to-be-processed image, as determined by formulas (8) and (9), into formula (5) to calculate the optimized atmospheric propagation coefficient \hat{t}(p).
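The refinement of claim 6 has the structure of a local linear smoothing of t guided by the gray-scale image. A sketch under stated assumptions (SciPy box filters for the window means, the gray image normalized to [0, 1], and illustrative radius and ε values) might be:

    import numpy as np
    from scipy.ndimage import uniform_filter

    def refine_transmission(gray, t, radius=20, eps=1e-3):
        """Refine the coarse map t with formulas (4)-(9): per window,
        s = (mean(I*t) - mean(I)*mean(t)) / (var(I) + eps) and
        psi = mean(t) - s*mean(I); average s and psi over the windows
        containing each pixel; evaluate t_hat = s(p)*I(p) + psi(p)."""
        size = 2 * radius + 1
        I = gray.astype(np.float64) / 255.0   # normalization is an assumption
        mean_I = uniform_filter(I, size)
        mean_t = uniform_filter(t, size)
        mean_It = uniform_filter(I * t, size)
        var_I = uniform_filter(I * I, size) - mean_I ** 2

        s = (mean_It - mean_I * mean_t) / (var_I + eps)   # formula (6)
        psi = mean_t - s * mean_I                         # formula (7)
        s_bar = uniform_filter(s, size)                   # formula (8)
        psi_bar = uniform_filter(psi, size)               # formula (9)
        return s_bar * I + psi_bar                        # formula (5)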
7. The method according to claim 1 or 2, characterized in that the performing enhancement processing on the to-be-processed image according to the atmospheric light value and the atmospheric propagation coefficient comprises:
performing enhancement processing on each of the three channels R, G and B corresponding to the to-be-processed image by using formula (10):
J_{c}(p) = \frac{1}{t(p)} \left( I_{c}(p) - A \right) + A \qquad (10)
wherein, in formula (10), A denotes the atmospheric light value, t(p) denotes the value of the atmospheric propagation coefficient at point p, c denotes one of the three channels R, G and B, J_c(p) is the image of channel c after enhancement processing, and I_c(p) is the to-be-processed image.
8. A terminal, characterized in that the terminal comprises:
a determining unit, configured to determine a brightest area of a preset area size in a gray-scale image corresponding to a to-be-processed image;
a first calculating unit, configured to calculate an atmospheric light value according to the brightest area determined by the determining unit;
a second calculating unit, configured to calculate an atmospheric propagation coefficient in a preset calculation manner; and
an enhancing unit, configured to perform enhancement processing on the to-be-processed image according to the atmospheric light value calculated by the first calculating unit and the atmospheric propagation coefficient calculated by the second calculating unit.
9. The terminal according to claim 8, characterized in that the to-be-processed image is at least one of a fog-containing image, a haze-containing image, an image containing rain or snow, or an iridescent image.
10. The terminal according to claim 8 or 9, characterized in that the determining unit specifically comprises:
a first determining subunit, configured to determine the center of the gray-scale image corresponding to the to-be-processed image, and divide the gray-scale image into four equal regions according to the center;
a score coefficient constructing unit, configured to calculate the pixel-value mean u and the standard deviation sd corresponding to each of the four regions determined by the first determining subunit, and construct a score coefficient for each region as score = u - sd;
a second determining subunit, configured to determine the target area corresponding to the maximum value among the score coefficients constructed by the score coefficient constructing unit;
a judging unit, configured to judge whether the size of the target area determined by the second determining subunit is less than or equal to the preset area size; and
a third determining subunit, configured to determine the target area as the brightest area if the judging unit judges that the size of the target area is less than or equal to the preset area size.
11. The terminal according to claim 8 or 9, characterized in that the second calculating unit comprises:
a first constructing unit, configured to construct a contrast function according to formula (11):
F_{contrast}(t) = -\sum_{c \in Gray} \sum_{p \in B} \frac{(I(p) - \bar{I})^{2}}{t^{2} N_{B}} \qquad (11)
wherein, in formula (11), t denotes the atmospheric propagation coefficient; c ∈ Gray indicates that the contrast function is calculated on the gray-scale image corresponding to the to-be-processed image; F_contrast(t) denotes the contrast function with respect to t; p denotes a pixel; I(p) is the gray value of any point p in the to-be-processed image; \bar{I} denotes the mean of the pixel gray values in the sliding window B, that is, the mean brightness of the pixels in the sliding window B; B is a first preset sliding window; N_B is the number of pixels contained in the sliding window B; and p ∈ B indicates that the contrast function is constructed over the sliding window B;
a second constructing unit, configured to construct an information loss function according to formula (12):
F_{loss}(t) = \sum_{c \in Gray} \left\{ \sum_{i=0}^{\alpha} \left( \frac{i - A}{t} + A \right)^{2} h(i) + \sum_{i=\beta}^{255} \left( \frac{i - A}{t} + A - 255 \right) h(i) \right\} \qquad (12)
wherein, in formula (12), F_loss(t) is the information loss function with respect to t; α is a first threshold and β is a second threshold, with α < β; h(i) denotes the percentage of the total number of pixels of the to-be-processed image whose gray value is i; and A is the atmospheric light value; and
a first calculating subunit, configured to substitute the contrast function and the information loss function into formula (13) to calculate the atmospheric propagation coefficient t:
\min_{t} F(t) = \min_{t} \left( F_{contrast}(t) + \lambda F_{loss}(t) \right) \qquad (13)
wherein, in formula (13), F(t) is the objective optimization function to be minimized with respect to t, λ is a constant balance factor, and λ ∈ (0, 1);
wherein an objective optimization function is established as formula (14):
\min_{(s,\psi)} F(t) = \min_{(s,\psi)} \sum_{p \in W} \left( t(p) - \hat{t}(p) \right)^{2} = \min_{(s,\psi)} \sum_{p \in W} \left( \left( s(p) I(p) + \psi(p) - t(p) \right)^{2} + \epsilon \, s(p)^{2} \right) \qquad (14)
wherein, in formula (14), p denotes a pixel; s(p) is the scale transformation factor at point p; ψ(p) is the offset component at point p; I(p) is the gray value of point p in the to-be-processed image; W is a second preset sliding window; ε is a weight factor with ε > 0; \hat{t} denotes the optimized atmospheric propagation coefficient; and \hat{t}(p) denotes the optimized atmospheric propagation coefficient at point p, calculated as shown in formula (15):
\hat{t}(p) = s(p) I(p) + \psi(p) \qquad (15)
wherein formulas (16) and (17) can be derived from formula (14):
s = \frac{ \frac{1}{N} \sum_{p \in W} I(p) t(p) - \frac{1}{N} \mu \sum_{p \in W} t(p) }{ \sigma^{2} + \epsilon } \qquad (16)
\psi = \frac{1}{N} \sum_{p \in W} t(p) - s \mu \qquad (17)
wherein, in formulas (16) and (17), μ and σ² are respectively the mean and the variance of the pixel gray values in the corresponding window W, and N is the number of pixels in the second preset sliding window;
the calculation results of formulas (16) and (17) are substituted into formulas (18) and (19) to calculate, respectively, the scale transformation factor and the offset component at point p of the to-be-processed image:
s(p) = \frac{1}{N} \sum_{k \in W_{p}} s_{k} \qquad (18)
\psi(p) = \frac{1}{N} \sum_{k \in W_{p}} \psi_{k} \qquad (19)
wherein, in formulas (18) and (19), W_p denotes the sliding windows containing pixel p; s_k is the intermediate result calculated by formula (16) when the k-th sliding window passes through point p; ψ_k is the intermediate result calculated by formula (17) when the k-th sliding window passes through point p; s(p) is the scale transformation factor at pixel p after all window filtering operations are completed; and ψ(p) is the offset component at pixel p after all window filtering operations are completed; and
the scale transformation factor and the offset component at point p of the to-be-processed image, as determined by formulas (18) and (19), are substituted into formula (15) to calculate the optimized atmospheric propagation coefficient \hat{t}(p).
12. The terminal according to claim 8 or 9, characterized in that the enhancing unit is specifically configured to:
perform enhancement processing on each of the three channels R, G and B corresponding to the to-be-processed image by using formula (20):
J_{c}(p) = \frac{1}{t(p)} \left( I_{c}(p) - A \right) + A \qquad (20)
wherein, in formula (20), A denotes the atmospheric light value, t(p) denotes the value of the atmospheric propagation coefficient at point p, c denotes one of the three channels R, G and B, J_c(p) is the image of channel c after enhancement processing, and I_c(p) is the to-be-processed image.
CN201510894229.9A 2015-12-07 2015-12-07 Image processing method and terminal Pending CN105389784A (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN201510894229.9A CN105389784A (en) 2015-12-07 2015-12-07 Image processing method and terminal

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
CN201510894229.9A CN105389784A (en) 2015-12-07 2015-12-07 Image processing method and terminal

Publications (1)

Publication Number Publication Date
CN105389784A true CN105389784A (en) 2016-03-09

Family

ID=55422037

Family Applications (1)

Application Number Title Priority Date Filing Date
CN201510894229.9A Pending CN105389784A (en) 2015-12-07 2015-12-07 Image processing method and terminal

Country Status (1)

Country Link
CN (1) CN105389784A (en)

Patent Citations (4)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN102938136A (en) * 2012-07-19 2013-02-20 中国人民解放军国防科学技术大学 Method for defogging single images based on Bayer formats rapidly
CN104281999A (en) * 2013-07-12 2015-01-14 东北师范大学 Single image defogging method based on structural information
CN104252698A (en) * 2014-06-25 2014-12-31 西南科技大学 Semi-inverse method-based rapid single image dehazing algorithm
CN104200445A (en) * 2014-09-26 2014-12-10 常熟理工学院 Image defogging method with optimal contrast ratio and minimal information loss

Non-Patent Citations (1)

* Cited by examiner, † Cited by third party
Title
KAIMING HE ET AL.: "Single Image Haze Removal Using Dark Channel Prior", 《IEEE CONFERENCE ON COMPUTER VISION & PATTERN RECOGNITION》 *

Cited By (1)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN106971166A (en) * 2017-03-29 2017-07-21 纵目科技(上海)股份有限公司 The image pre-processing method and system of parking stall detection

Legal Events

Date Code Title Description
C06 Publication
PB01 Publication
C10 Entry into substantive examination
SE01 Entry into force of request for substantive examination
RJ01 Rejection of invention patent application after publication
Application publication date: 20160309