CN107424133B - Image defogging method and device, computer storage medium and mobile terminal - Google Patents


Info

Publication number
CN107424133B
CN107424133B (application CN201710624518.6A)
Authority
CN
China
Prior art keywords
defogging
image
preview image
haze concentration
haze
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Active
Application number
CN201710624518.6A
Other languages
Chinese (zh)
Other versions
CN107424133A (en)
Inventor
Yuan Quan (袁全)
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Guangdong Oppo Mobile Telecommunications Corp Ltd
Original Assignee
Guangdong Oppo Mobile Telecommunications Corp Ltd
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Guangdong Oppo Mobile Telecommunications Corp Ltd filed Critical Guangdong Oppo Mobile Telecommunications Corp Ltd
Priority to CN201710624518.6A priority Critical patent/CN107424133B/en
Publication of CN107424133A publication Critical patent/CN107424133A/en
Application granted granted Critical
Publication of CN107424133B publication Critical patent/CN107424133B/en
Legal status: Active

Classifications

    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06T IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T5/00 Image enhancement or restoration
    • G06T5/73 Deblurring; Sharpening
    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06T IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T5/00 Image enhancement or restoration
    • G06T5/40 Image enhancement or restoration using histogram techniques
    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06T IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T2207/00 Indexing scheme for image analysis or image enhancement
    • G06T2207/10 Image acquisition modality
    • G06T2207/10004 Still image; Photographic image
    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06T IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T2207/00 Indexing scheme for image analysis or image enhancement
    • G06T2207/10 Image acquisition modality
    • G06T2207/10024 Color image
    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06T IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T2207/00 Indexing scheme for image analysis or image enhancement
    • G06T2207/20 Special algorithmic details
    • G06T2207/20024 Filtering details

Landscapes

  • Physics & Mathematics (AREA)
  • General Physics & Mathematics (AREA)
  • Engineering & Computer Science (AREA)
  • Theoretical Computer Science (AREA)
  • Image Processing (AREA)
  • Studio Devices (AREA)
  • Facsimile Image Signal Circuits (AREA)

Abstract

The invention relates to an image defogging method and device, a computer storage medium and a mobile terminal. An image defogging method comprising: entering a photographing preview mode and displaying a preview image; acquiring the current visibility of the area where the preview image is located; determining the level of haze concentration according to the visibility; and carrying out defogging treatment on the preview image by adopting a corresponding defogging method according to the haze concentration level. According to the image defogging method, the haze concentration level can be determined according to the obtained visibility, and the corresponding defogging method is adopted for the preview images with different haze concentration levels to perform self-adaptive defogging processing, so that the defogging effect is enhanced, and the user experience is improved.

Description

Image defogging method and device, computer storage medium and mobile terminal
Technical Field
The invention relates to the technical field of computers, in particular to an image defogging method and device, a computer storage medium and a mobile terminal.
Background
With the popularization of mobile terminals, smart phones, tablet computers and other mobile terminals have become indispensable in daily life, and the various functions of mobile terminals are attracting more and more attention. In particular, as the camera function of mobile terminals has become widespread, the demand for image processing on mobile terminals keeps increasing, and various kinds of image processing software have begun to emerge. Under weather conditions such as fog and haze, suspended matter in the atmosphere reduces visibility, which affects the quality of pictures taken in such weather. Image defogging technology has therefore been proposed to remove the influence of weather factors such as fog and haze on the quality of a captured image and to enhance the visibility of objects in the image.
Conventional defogging technology applies the same defogging algorithm regardless of how hazy the weather is, so the degree of defogging in the finally processed image is always the same, which results in a poor user experience.
Disclosure of Invention
The embodiment of the invention provides an image defogging method and device, a computer storage medium and a mobile terminal, which can perform adaptive defogging processing of different algorithms on a preview image, enhance the defogging effect and improve the user experience.
An image defogging method comprising:
entering a photographing preview mode and displaying a preview image;
acquiring the current visibility of the area where the preview image is located;
determining the level of haze concentration according to the visibility;
and carrying out defogging treatment on the preview image by adopting a corresponding defogging method according to the haze concentration level.
The image defogging method enters a photographing preview mode and displays a preview image; acquiring the current visibility of the area where the preview image is located; determining the level of haze concentration according to the visibility; and carrying out defogging treatment on the preview image by adopting a corresponding defogging method according to the haze concentration level. According to the image defogging method, the haze concentration level can be determined according to the obtained visibility, and the corresponding defogging method is adopted for the preview images with different haze concentration levels to perform self-adaptive defogging processing, so that the defogging effect is enhanced, and the user experience is improved.
An embodiment of the present invention further provides an image defogging device, including:
the display module is used for displaying the preview image when entering a photographing preview mode;
the acquisition module is used for acquiring the current visibility of the area where the preview image is located;
the determining module is used for determining a haze concentration level according to the visibility, wherein the haze concentration level and the visibility are in an inverse proportional relation; and
and the defogging module is used for performing defogging treatment on the preview image by adopting a corresponding defogging method according to the haze concentration level.
Embodiments of the present invention also provide a computer-readable storage medium having stored thereon a computer program that, when executed by a processor, implements an image defogging method.
A mobile terminal comprising a memory, a processor and a computer program stored on the memory and executable on the processor, the processor implementing an image defogging method when executing the program.
Drawings
FIG. 1 is a flow diagram of a method for image defogging in one embodiment;
FIG. 2 is a flowchart of an image defogging method in another embodiment;
FIG. 3 is a flow diagram of defogging a preview image using a dark channel prior based algorithm in one embodiment;
FIG. 4 is a flowchart illustrating defogging of the preview image based on the haze concentration factor and the guided filtering method in one embodiment;
FIG. 5 is a physical model of imaging in haze weather in one embodiment;
FIG. 6 is a diagram showing the internal structure of an image defogging device in one embodiment;
FIG. 7 is a schematic diagram of an image processing circuit in one embodiment.
Detailed Description
In order to make the objects, technical solutions and advantages of the present invention more apparent, the present invention is described in further detail below with reference to the accompanying drawings and embodiments. It should be understood that the specific embodiments described herein are merely illustrative of the invention and are not intended to limit the invention.
An embodiment of the invention provides an image defogging method, and fig. 1 is a flowchart of the image defogging method in the embodiment. An image defogging method comprising the steps of:
step 102: and entering a photographing preview mode and displaying a preview image.
It should be noted that the image defogging method provided by the embodiment of the invention is applied to the scenario of taking pictures with a mobile terminal. When a user wants to take a picture, the imaging device of the mobile terminal is started; the imaging device may be a front camera, a rear camera, a dual camera, or the like. After the imaging device of the mobile terminal is started, it enters the photographing preview mode, the photographed object is displayed in a display window of the mobile terminal, and the image displayed in the display window at this time is defined as the preview image.
In hardware, the imaging device generally comprises five parts: a housing (with motor), a lens, an infrared filter, an image sensor (e.g., CCD or CMOS) and a flexible printed circuit board (FPCB). In the photographing preview mode, while the preview image is displayed, the lens is driven by the motor and the photographed object is imaged on the image sensor through the lens. The image sensor converts the optical signal into an electrical signal through photoelectric conversion and transmits it to the image processing circuit for subsequent processing. The image processing circuit may be implemented using hardware and/or software components, and may include various processing units that define an ISP (Image Signal Processing) pipeline.
Step 104: and acquiring the current visibility of the area where the preview image is located.
Specifically, the weather information of the area where the mobile terminal is located is obtained online through a weather forecast plug-in built into the mobile terminal. The weather information includes the weather forecast (rain, fog, snow, fog warnings, etc.), air quality (air quality index, haze warnings), visibility, and so on. Visibility is an indicator of atmospheric transparency and, in aviation, is defined as the maximum distance at which a person with normal vision can clearly see the outline of an object under the prevailing weather conditions. Visibility is closely related to the current weather: during rainfall, fog, haze, smoke, snow, sandstorms and similar weather processes, atmospheric transparency is low and visibility is therefore poor. Visibility is the most direct measure of haze concentration: high visibility indicates low haze concentration, and low visibility indicates high haze concentration. When the visibility is less than 1000 meters, the weather is regarded as haze weather; poor visibility indicates that fog, haze, smoke and the like are present in the air.
Optionally, the visibility may also be obtained by accessing a corresponding application server, for example a local server of a weather station or an aviation station, to obtain the current local visibility.
Step 106: and determining a haze concentration level according to the visibility, wherein the haze concentration level and the visibility are in an inverse proportional relation.
Higher visibility (a longer visibility distance) indicates a lower haze concentration, and lower visibility indicates a higher haze concentration. As shown in Table 1, the correspondence between the haze concentration level V and the visibility L can be given according to the statistical relationship between haze concentration and visibility. According to actual requirements, the haze concentration can be divided into four levels: level 0, level 1, level 2 and level 3. Level 0 corresponds to a visibility distance greater than 1000 m, i.e. clear weather with no haze. Level 1 corresponds to light haze weather with a visibility distance between 500 and 1000 m, level 2 corresponds to dense haze weather with a visibility distance between 50 and 500 m, and level 3 corresponds to extremely dense haze weather with a visibility distance below 50 m.
Table 1 Correspondence between haze concentration level and visibility
Haze concentration level V    Visibility L (m)
0                             >1000
1                             500~1000
2                             50~500
3                             <50
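For illustration only, the Table 1 correspondence could be sketched in Python as follows; the function name and the handling of the boundary values are assumptions rather than part of the disclosure.

    def haze_level_from_visibility(visibility_m: float) -> int:
        """Map visibility L (meters) to haze concentration level V per Table 1."""
        if visibility_m > 1000:
            return 0  # clear weather, no haze
        if visibility_m >= 500:
            return 1  # light haze
        if visibility_m >= 50:
            return 2  # dense haze
        return 3      # extremely dense haze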
Step 108: and carrying out defogging treatment on the preview image by adopting a corresponding defogging method according to the haze concentration level.
The preview image is subjected to defogging treatment with a corresponding defogging method according to the acquired haze concentration level. When the haze concentration level is level 0, the visibility is very high, the weather is clear and the haze concentration in the air is very low, so no defogging of the preview image is needed. When the haze concentration level is level 1, corresponding to light haze weather, a defogging algorithm for light haze weather can be used to defog the preview image. When the haze concentration level is level 2, corresponding to dense haze weather, a defogging algorithm for dense haze weather can be used. When the haze concentration level is level 3, the visibility is below 50 meters and the haze concentration is extremely high; fully defogging such an image would make the defogged image noisier, so only incomplete defogging is performed, and the defogging algorithm for dense haze weather can be used to incompletely defog the preview image.
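A minimal dispatch sketch of this level-based selection is shown below; the function names and the callable-based structure are assumptions, with the actual defogging routines supplied by the caller.

    from typing import Callable
    import numpy as np

    def defog_by_level(preview: np.ndarray,
                       level: int,
                       light_haze_defog: Callable[[np.ndarray], np.ndarray],
                       dense_haze_defog: Callable[[np.ndarray], np.ndarray]) -> np.ndarray:
        """Choose a defogging routine according to the haze concentration level."""
        if level == 0:
            return preview                     # clear weather: no defogging needed
        if level == 1:
            return light_haze_defog(preview)   # dark channel prior method for light haze
        # Levels 2 and 3 use the haze-concentration-factor + guided-filtering method;
        # for level 3 only incomplete defogging is performed to limit noise.
        return dense_haze_defog(preview)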
The image defogging method enters a photographing preview mode and displays a preview image; acquiring the current visibility of the area where the preview image is located; determining the level of haze concentration according to the visibility; and carrying out defogging treatment on the preview image by adopting a corresponding defogging method according to the haze concentration level. According to the image defogging method, the haze concentration level can be determined according to the obtained visibility, and the corresponding defogging method is adopted for the preview images with different haze concentration levels to perform self-adaptive defogging processing, so that the defogging effect is enhanced, and the user experience is improved.
FIG. 2 is a flowchart of an image defogging method in another embodiment. In the embodiment of the invention, the image defogging method comprises the following steps:
step 202: entering a photographing preview mode and displaying a preview image;
step 204: acquiring the current visibility of the area where the preview image is located;
step 206: determining the level of haze concentration according to the visibility;
step 208: and determining the accuracy of the haze concentration level according to the color histogram of the HSV component of the preview image.
Step 210: and carrying out defogging treatment on the preview image by adopting a corresponding defogging method according to the haze concentration level.
Wherein, steps 202, 204, 206, and 210 correspond to steps 102, 104, 106, and 108 in the embodiment of fig. 1, and are not described herein again.
In order to determine the accuracy of the information of the haze concentration level obtained according to the visibility, in the embodiment of the invention, the accuracy of the haze concentration level can be determined according to the color histogram of the HSV component of the preview image.
Specifically, the three elements constituting the HSV color space are the hue component (Hue), the saturation component (Saturation) and the value component (Value). The hue component H, saturation component S and lightness component V of a preview image acquired under haze weather conditions contain markedly different information, so the haze level can be determined from the color histograms of the H, S, V components of the image. The decision formulas are as follows:
AveH < 100 && AveS < 0.07 && AveV < 0.5    (1-1)
AveH < 125 && AveS < 0.2 && AveV < 0.48    (1-2)
where AveH, AveS and AveV are the average features of the hue (H), saturation (S) and value (V) components of the image, respectively, and are calculated as follows:
AveH=SumH/M1 (1-3)
AveS=SumS/M2 (1-4)
AveV=SumV/M3 (1-5)
wherein, SumH, SumS and SumV are the sum of hue component H, saturation component S and lightness component V of all pixel points in the image respectively; m1, M2, and M3 are the numbers of pixels having hue component H, saturation component S, and value component V not equal to 0, respectively.
If the hue component H, saturation component S and lightness component V features of the preview image satisfy formula (1-1), the weather is determined to be light haze weather and the haze concentration level is level 1.
If the hue component H, saturation component S and lightness component V features of the preview image satisfy formula (1-2), the weather is determined to be dense haze weather and the haze concentration level is level 2.
If the hue component H, saturation component S and lightness component V of the preview image satisfy neither formula (1-1) nor formula (1-2), the preview image is considered to contain no haze, and no defogging of the preview image is needed.
Whether the haze concentration level reflected in the preview image is consistent with the haze concentration level determined according to the visibility can thus be judged from the hue component H, saturation component S and lightness component V of the preview image. If they are consistent, the haze concentration level determined according to the visibility is accurate, and the corresponding defogging processing can be performed. If they are inconsistent, the obtained visibility is inaccurate, and the current visibility of the area where the preview image is located needs to be acquired again to determine the haze concentration level, or the haze concentration level can be determined directly according to the color histogram of the HSV components of the preview image and the corresponding defogging processing performed.
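A sketch of the decision formulas (1-1) and (1-2) is given below, assuming an 8-bit BGR preview image and OpenCV's HSV conversion (H in 0 to 179, S and V rescaled to 0 to 1); the patent does not state its component scales, so the thresholds may need rescaling.

    import cv2
    import numpy as np

    def haze_level_from_hsv(preview_bgr: np.ndarray) -> int:
        """Estimate the haze level from the average H, S, V features of the preview."""
        hsv = cv2.cvtColor(preview_bgr, cv2.COLOR_BGR2HSV).astype(np.float32)
        h, s, v = hsv[..., 0], hsv[..., 1] / 255.0, hsv[..., 2] / 255.0
        # AveX = SumX / Mx: average over pixels whose component is non-zero
        ave_h = h[h != 0].mean() if np.any(h != 0) else 0.0
        ave_s = s[s != 0].mean() if np.any(s != 0) else 0.0
        ave_v = v[v != 0].mean() if np.any(v != 0) else 0.0
        if ave_h < 100 and ave_s < 0.07 and ave_v < 0.5:   # formula (1-1)
            return 1                                        # light haze
        if ave_h < 125 and ave_s < 0.2 and ave_v < 0.48:    # formula (1-2)
            return 2                                        # dense haze
        return 0                                            # no haze detected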
Optionally, the quality of the preview image may be evaluated by Visual Contrast Measure (VCM), and the quality of the obtained preview image is compared with a preset value, so as to determine the haze concentration level.
In one embodiment, the preview image is defogged by adopting a corresponding defogging method according to the haze concentration level.
Specifically, when the haze concentration level is level 1, the preview image is defogged using the dark channel prior defogging method. Level 1 corresponds to light haze weather: the contrast of the preview image is not severely reduced and the dark-primary-color information in the image remains prominent, so an image-restoration-based dark channel prior defogging algorithm can bring the defogged image closer to the real scene.
When the haze concentration level is level 2, the preview image is defogged using the haze concentration factor and guided filtering method. Level 2 corresponds to dense haze weather: the contrast of the preview image is severely reduced, much of the detail and color information is lost, and the dark-primary-color information is covered by haze, so the defogging effect of the dark channel prior algorithm on such an image is unsatisfactory. In this case the preview image can be defogged using the haze concentration factor and guided filtering method, so that the defogged image is closer to the real scene; alternatively, image-enhancement-based algorithms such as global histogram equalization, homomorphic filtering or multi-scale Retinex, which do not depend on conditions such as dark-primary-color information, can be used.
As shown in fig. 3, further, the defogging processing on the preview image by using the dark channel prior defogging method includes the following steps:
step 302: and acquiring a dark primary color image of the preview image.
The dark primary color image J_dark(x) is obtained from the preview image and can be expressed by the following formula:

J_dark(x) = min_{y∈Ω(x)} ( min_{c∈{R,G,B}} J_c(y) )

where c denotes one of the R, G, B color channels of the image, Ω(x) denotes a local window centered at x in the image excluding the sky region, and J_c denotes one color channel of the color image. Two minimum operations are performed when computing the dark primary color image of the preview image: first, the minimum of the R, G, B color channels is taken for the central pixel, and then minimum-value filtering is applied over the local region Ω(x) of the image.
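A minimal sketch of this dark primary color computation is given below; the 15-pixel window size is an assumption, since the patent does not specify the size of Ω(x).

    import cv2
    import numpy as np

    def dark_channel(image: np.ndarray, window: int = 15) -> np.ndarray:
        """Per-pixel minimum over R, G, B followed by a minimum filter over the window."""
        per_pixel_min = image.min(axis=2)                        # first minimum: color channels
        kernel = cv2.getStructuringElement(cv2.MORPH_RECT, (window, window))
        return cv2.erode(per_pixel_min, kernel)                  # second minimum: local region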
Step 304: and calculating an atmospheric light value A and a rough transmittance t (x) according to the dark primary color image.
In physical-model-based image defogging, the value of A directly influences the brightness of the restored image: too large a value of A makes the restored image darker, while too small a value causes overexposure and distorts the colors of the restored image. According to the statistical rule, A may be taken as 0.98, or another value may be used; the embodiment of the invention is not limited in this respect.
Meanwhile, the rough transmittance t(x) may be defined as:

t(x) = 1 - ω · min_{y∈Ω(x)} ( min_c ( I_c(y) / A ) )

where I_c(y) denotes one of the R, G, B channels of the preview image I(x), and ω is called the defogging degree factor, with a value range 0 < ω < 1; specifically, ω = 0.95.
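Assuming an image normalized to [0, 1] and reusing the dark_channel helper sketched above, the rough transmittance could be computed as follows; the normalization convention is an assumption, while A = 0.98 and ω = 0.95 follow the values quoted in the text.

    import numpy as np

    def rough_transmittance(image_01: np.ndarray, A: float = 0.98, omega: float = 0.95) -> np.ndarray:
        """t(x) = 1 - omega * dark_channel(I / A)."""
        normalized = np.clip(image_01 / A, 0.0, 1.0)
        return 1.0 - omega * dark_channel(normalized)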
Step 306: bilateral filtering of the coarse transmittance t (x) results in a fine transmittance.
First, the rough transmittance t(x) is down-sampled at a ratio of 1/4 to obtain a sampled transmittance t'(x); down-sampling the rough transmittance t(x) reduces the time consumed by the bilateral filtering used to optimize the transmittance. Then, the sampled transmittance t'(x) is smoothed and optimized by bilateral filtering to obtain the fine transmittance t_f. The fine transmittance t_f(i, j) obtained after optimization by bilateral filtering is:

t_f(i, j) = (1/C) · Σ_{(k,l)∈Ω} w_s(i, j, k, l) · w_r(i, j, k, l) · t(k, l)

where C is a normalization parameter, w_s(i, j, k, l) is the spatial-domain weight coefficient, w_r(i, j, k, l) is the value-domain weight coefficient, Ω is a window of size (2N+1) × (2N+1) centered on the pixel (i, j), and t(i, j) is the coarse transmittance t(x). The bilateral filter radius is N = 10 and the spatial similarity factor is σ_s = 5.
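A sketch of this down-sampling and bilateral filtering step is shown below; the value-domain parameter sigmaColor is an assumption, since the patent only quotes the radius N = 10 and the spatial factor σ_s = 5.

    import cv2
    import numpy as np

    def refine_transmittance(t_rough: np.ndarray) -> np.ndarray:
        """Down-sample the rough transmittance by 1/4 and smooth it with a bilateral filter."""
        t_small = cv2.resize(t_rough, None, fx=0.25, fy=0.25,
                             interpolation=cv2.INTER_AREA).astype(np.float32)
        # d = 2 * N + 1 gives the (2N+1) x (2N+1) window described in the text
        return cv2.bilateralFilter(t_small, d=21, sigmaColor=0.1, sigmaSpace=5)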
Step 308: for the fine transmittance
t_f, bilinear interpolation is performed to restore it to the original size, yielding the estimated transmittance t_e(x).
Bilinear interpolation takes the weighted average of the pixel values of the 4 sampling points in the 2 × 2 area around the target point as the output; this interpolation mode gives a good trade-off between the quality of the result and the processing speed for the input image.
Step 310: the preview image is defogged according to the atmospheric light value A and the estimated transmittance t_e(x). The preset recovery formula is expressed as:

J(x) = (I(x) - A) / max(t_e(x), t_0) + A

where I(x) denotes the preview image, J(x) is the restored fog-free image, A is the atmospheric light value, and t_e(x) is the estimated transmittance. To avoid excessive white-field exposure in the defogged image, the lower bound t_0 is set to 0.1 in the calculation.
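Steps 308 and 310 can be sketched together as below, assuming an image in [0, 1]; the clipping of the result is added only for display and is not taken from the patent.

    import cv2
    import numpy as np

    def recover_scene(image_01: np.ndarray, t_fine: np.ndarray,
                      A: float = 0.98, t0: float = 0.1) -> np.ndarray:
        """Upsample the fine transmittance bilinearly and apply J = (I - A) / max(t, t0) + A."""
        h, w = image_01.shape[:2]
        t_full = cv2.resize(t_fine, (w, h), interpolation=cv2.INTER_LINEAR)  # bilinear upsampling
        t_full = np.maximum(t_full, t0)[..., np.newaxis]                     # lower-bound t
        return np.clip((image_01 - A) / t_full + A, 0.0, 1.0)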
Optimizing the transmittance with adaptive bilateral filtering improves the speed of the algorithm, and performing the down-sampling operation on the transmittance before it is optimized improves the speed further.
As shown in FIG. 4, the defogging processing of the preview image based on the haze concentration factor and the guided filtering method further includes the following steps:
step 402: constructing a haze weather imaging physical model, wherein the physical model is expressed as: e ═ Gβ(I-Ig)+IgIn the formula, GβIs a haze concentration factor, I is a preview image, IgIs ambient atmospheric light in the environment.
The haze weather imaging physical model is an important basis for studying image degradation in haze weather; a commonly used model is illustrated in FIG. 5. As can be seen from FIG. 5, the factors that blur an image captured in haze weather mainly include the absorption and scattering of the light reflected by the imaged object by the turbid medium in the air, together with the light reflected by atmospheric particles and by the ground. During scattering, the light undergoes multiple scattering, which interferes with the image being formed. The process is expressed by the following mathematical model:

I = I_g · ρ(x) · e^(-βd(x)) + I_g · (1 - e^(-βd(x)))

where I_g is the ambient atmospheric light, ρ(x) is the standard light irradiation intensity, I is the preview image, d(x) is the scene depth, and β is the haze concentration influence coefficient. Compared with the long-distance transmission of skylight, the change in scene depth in the image acquired by the imaging device is slight under heavy haze conditions. If E is the clear image before degradation, the haze concentration factor G_β is used in place of e^(βd(x)); processing the formula according to the radiation principle yields the haze weather imaging physical model:

E = G_β · (I - I_g) + I_g
step 404: and obtaining the haze concentration factor according to the visibility.
Visibility is the most direct standard of measuring haze concentration, and the visibility is big and indicates that haze concentration is little, otherwise indicates that haze concentration is big. When the visibility is used for representing the haze concentration information, the haze concentration is too high under the condition of ultralow visibility, and the noise of the defogged image is large during the defogging treatment of the image, so that incomplete defogging is needed. When the visibility is a large distance, the haze concentration is low, and the defogging treatment on the preview image is not needed. The visibility is within 1000 m and is taken as the haze weather, and when the visibility L belongs to [50, 1000 ∈]Time, haze concentration factor GβAnd visibility 1/L. Therefore, visibility L and haze concentration factor G can be setβThe corresponding relationship between:
Figure BDA0001362511360000091
according to the corresponding relation, the visibility can be directly obtained according to the obtained visibilityDetermining haze concentration factor Gβ
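The specific correspondence formula between L and G_β is not given in the text above, so the following is purely an illustrative inverse-proportional mapping; the scaling constant k is an assumption, not a value from the patent.

    def haze_concentration_factor(visibility_m: float, k: float = 1000.0) -> float:
        """Illustrative G_beta that grows as visibility drops (assumed G_beta proportional to 1/L)."""
        L = min(max(visibility_m, 50.0), 1000.0)   # clamp to the haze range [50, 1000] m
        return k / L                               # assumed proportionality constant k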
Step 406: the atmospheric light is estimated using guided filtering.
First, the preview image is converted into a gray-scale image, and the gray-scale image is used as the guide for guided filtering to obtain the atmospheric light. When the atmospheric light is estimated by the guided filtering method, let the preview image be p, the guide image be I, and the filtered output image be q; within a window W_k centered on pixel k, the filtered output image q has a linear relationship with the guide image I. To optimize the effect of the guided filtering, the difference between the output image q and the preview image p must be minimized, which requires the cost function E(a_k, b_k) to satisfy:

E(a_k, b_k) = Σ_{i∈W_k} ( (a_k · I_i + b_k - p_i)^2 + ε · a_k^2 )

where a_k and b_k are linear coefficients that take fixed values within the window, W_k is a square window of radius r, and ε is a regularization parameter. The local linear coefficients a_k and b_k can be obtained by the least-squares method. The atmospheric light I_g can then be expressed as:

I_g = (1/|ω|) · Σ_{k: i∈W_k} ( a_k · I_i + b_k )

where p_i is a pixel point in the preview image, i is the pixel index, W_i is the unit window centered on p_i, |ω| is the number of pixels in the unit window, and the filtering is applied to every pixel point.
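A sketch of this guided-filtering estimate of the atmospheric light is given below; the window radius r, the regularization eps and the per-channel output are assumptions, since the patent specifies none of them.

    import cv2
    import numpy as np

    def _box(img: np.ndarray, r: int) -> np.ndarray:
        """Mean of img over the (2r+1) x (2r+1) window W_k."""
        return cv2.blur(img, (2 * r + 1, 2 * r + 1))

    def guided_filter(guide: np.ndarray, src: np.ndarray, r: int = 40, eps: float = 1e-3) -> np.ndarray:
        """Standard guided filter: averaged least-squares coefficients a_k, b_k applied per pixel."""
        mean_i, mean_p = _box(guide, r), _box(src, r)
        corr_ip, corr_ii = _box(guide * src, r), _box(guide * guide, r)
        a = (corr_ip - mean_i * mean_p) / (corr_ii - mean_i * mean_i + eps)  # coefficient a_k
        b = mean_p - a * mean_i                                              # coefficient b_k
        return _box(a, r) * guide + _box(b, r)

    def estimate_atmospheric_light(preview_01: np.ndarray) -> np.ndarray:
        """Estimate I_g by guided-filtering each channel with the grayscale preview as guide."""
        gray = cv2.cvtColor((preview_01 * 255).astype(np.uint8), cv2.COLOR_BGR2GRAY)
        gray = (gray / 255.0).astype(np.float32)
        channels = [guided_filter(gray, preview_01[..., c].astype(np.float32)) for c in range(3)]
        return np.stack(channels, axis=-1)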
Step 408: the acquired haze concentration factor and atmospheric light are substituted into the haze weather imaging physical model to defog the preview image.
Substituting the obtained haze concentration factor G_β and atmospheric light I_g into the haze weather imaging physical model E = G_β · (I - I_g) + I_g realizes the defogging processing of the preview image I.
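Reusing the two helpers sketched above, step 408 reduces to substituting G_β and I_g into the model; the clipping to [0, 1] is an assumption added for display purposes.

    import numpy as np

    def defog_with_model(preview_01: np.ndarray, visibility_m: float) -> np.ndarray:
        """E = G_beta * (I - I_g) + I_g, using the helpers sketched for steps 404 and 406."""
        g_beta = haze_concentration_factor(visibility_m)
        i_g = estimate_atmospheric_light(preview_01)
        return np.clip(g_beta * (preview_01 - i_g) + i_g, 0.0, 1.0)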
In this embodiment of the invention, calculating the value of the haze concentration factor has low computational complexity, and the execution speed of the guided filtering is independent of the size of the filter window, so the response is fast. With this method, defogging of a dense-haze preview image can be achieved, and the defogged image has realistic colors, a strong sense of depth, and high brightness and definition.
In one embodiment, the method further comprises the step of performing exposure processing and automatic tone scale processing on the defogged preview image.
In the defogging photographing preview mode, the first preview image is defogged at the level corresponding to the visibility to obtain a second preview image, and exposure processing and automatic color level processing are then performed on the defogged second preview image to enhance its display effect. In general, the second preview image obtained after defogging is rather dark; during post-processing, the exposure and the automatic color levels of an overly dark second preview image can be increased, so that the display effect of the defogged image is presented more perfectly.
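The patent does not detail the exposure and automatic color level processing; as one plausible sketch, a fixed exposure gain followed by a per-channel level stretch could look like the following, with the gain and clip percentage chosen arbitrarily.

    import numpy as np

    def exposure_and_auto_levels(image_01: np.ndarray, gain: float = 1.2,
                                 clip_percent: float = 1.0) -> np.ndarray:
        """Brighten the defogged preview, then stretch levels by clipping 1% per channel."""
        out = np.clip(image_01 * gain, 0.0, 1.0)                          # exposure compensation
        lo = np.percentile(out, clip_percent, axis=(0, 1))
        hi = np.percentile(out, 100.0 - clip_percent, axis=(0, 1))
        return np.clip((out - lo) / np.maximum(hi - lo, 1e-6), 0.0, 1.0)  # automatic color levels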
In one embodiment, the image defogging method further comprises the step of generating the second preview image into an image file in response to a photographing instruction.
The photographing instruction may be an instruction input by a user to control the imaging device to record an image of an object, and the user may trigger the photographing instruction through a physical or virtual start key, a gesture action, voice, and the like. And when receiving a photographing instruction, generating an image file (such as BMP format, JPEG format and the like) of the second preview image after the defogging processing, and storing the image file. And when the image file is generated, an image corresponding to the image file can be displayed on a screen of the mobile terminal for a user to view.
An embodiment of the present invention further provides an image defogging device, and fig. 6 is a schematic structural diagram of the image defogging device in an embodiment.
An image defogging device comprising:
the display module 610 is configured to display a preview image when entering a photographing preview mode;
an obtaining module 620, configured to obtain current visibility of an area where the preview image is located;
a determining module 630, configured to determine a level of haze concentration according to the visibility; and
and the defogging module 640 is used for performing defogging treatment on the preview image by adopting a corresponding defogging method according to the haze concentration level.
The image defogging device can determine the haze concentration levels according to the obtained visibility, and self-adaptive defogging processing is carried out on preview images of different haze concentration levels by adopting the corresponding defogging method, so that the defogging effect is enhanced, and the user experience is improved.
In one embodiment, the image defogging device further comprises:
and the evaluation module 650 is configured to evaluate the accuracy of the haze concentration level of the preview image according to the color histogram of the HSV components of the preview image.
The embodiment of the invention also provides a computer readable storage medium. A computer-readable storage medium, on which a computer program is stored which, when executed by a processor, carries out the steps of:
entering a photographing preview mode and displaying a preview image;
acquiring the current visibility of the area where the preview image is located;
determining the level of haze concentration according to the visibility;
and carrying out defogging treatment on the preview image by adopting a corresponding defogging method according to the haze concentration level.
When the computer program (instruction) in the computer-readable storage medium is executed, the haze concentration level can be determined according to the obtained visibility, and the corresponding defogging method is adopted for the preview images with different haze concentration levels to perform self-adaptive defogging processing, so that the defogging effect is enhanced, and the user experience is improved.
In one embodiment, the image defogging method further comprises:
and determining the accuracy of the haze concentration level according to the color histogram of the HSV component of the preview image.
In one embodiment, the haze concentration level comprises: level 1 corresponding to light haze and level 2 corresponding to dense haze;
when the haze concentration level is level 1, defogging the preview image by adopting a dark channel prior algorithm;
and when the haze concentration level is level 2 or higher than level 2, defogging processing is carried out on the preview image by adopting an atmosphere extinction coefficient-based guiding filtering method.
In one embodiment, the defogging processing on the preview image by using the dark channel prior algorithm includes:
acquiring a dark primary color image of a preview image;
obtaining an atmospheric light value and a rough transmittance according to the dark primary color image;
bilateral filtering is carried out on the rough transmissivity to obtain fine transmissivity;
carrying out bilinear interpolation on the fine transmissivity to restore the fine transmissivity to the original size so as to obtain estimated transmissivity;
and defogging the preview image according to the atmospheric light value and the estimated transmittance.
In one embodiment, the defogging processing on the preview image based on the haze concentration factor and the guiding filtering method includes:
constructing a haze weather imaging physical model, wherein the physical model is expressed as E = G_β(I - I_g) + I_g, where G_β is the haze concentration factor, I is the preview image, and I_g is the ambient atmospheric light in the environment;
obtaining the haze concentration factor according to the visibility;
estimating the atmospheric light by adopting a guided filtering method;
and bringing the acquired haze concentration factor and atmospheric light into the haze weather imaging physical model to carry out defogging treatment on the preview image.
In one embodiment, the image defogging method further comprises:
and carrying out exposure processing and automatic color gradation processing on the preview image after the defogging processing.
The embodiment of the invention also provides computer equipment. The computer device includes therein an Image processing circuit, which may be implemented using hardware and/or software components, and may include various processing units defining an ISP (Image signal processing) pipeline. FIG. 7 is a schematic diagram of an image processing circuit in one embodiment. As shown in fig. 7, for ease of explanation, only aspects of the image processing techniques related to embodiments of the present invention are shown.
As shown in fig. 7, the image processing circuit includes an ISP processor 740 and control logic 750. The image data captured by the imaging device 710 is first processed by the ISP processor 740, and the ISP processor 740 analyzes the image data to capture image statistics that may be used to determine and/or control one or more parameters of the imaging device 710. The imaging device 710 may include a camera having one or more lenses 712 and an image sensor 714. The image sensor 714 may include an array of color filters (e.g., Bayer filters), and the image sensor 714 may acquire light intensity and wavelength information captured with each imaging pixel of the image sensor 714 and provide a set of raw image data that may be processed by the ISP processor 740. The sensor 720 may provide raw image data to the ISP processor 740 based on the sensor 720 interface type. The sensor 720 interface may utilize a SMIA (Standard Mobile Imaging Architecture) interface, other serial or parallel camera interfaces, or a combination of the above.
ISP processor 740 processes the raw image data pixel by pixel in a variety of formats. For example, each image pixel may have a bit depth of 8, 10, 12, or 14 bits, and ISP processor 740 may perform one or more image processing operations on the raw image data, collecting statistical information about the image data. Wherein the image processing operations may be performed with the same or different bit depth precision.
ISP processor 740 may also receive pixel data from image memory 730. For example, raw pixel data is sent from the sensor 720 interface to the image memory 730, and the raw pixel data in the image memory 730 is then provided to the ISP processor 740 for processing. The image Memory 730 may be a portion of a Memory device, a storage device, or a separate dedicated Memory within an electronic device, and may include a DMA (Direct Memory Access) feature.
ISP processor 740 may perform one or more image processing operations, such as temporal filtering, upon receiving the raw image data from the sensor 720 interface or from the image memory 730. The processed image data may be sent to the image memory 730 for additional processing before being displayed. ISP processor 740 may also receive processed data from image memory 730 for image data processing in the raw domain and in the RGB and YCbCr color spaces. The processed image data may be output to a display 780 for viewing by a user and/or further processing by a graphics engine or GPU (Graphics Processing Unit). Further, the output of ISP processor 740 may also be sent to image memory 730, and display 780 may read image data from image memory 730. In one embodiment, image memory 730 may be configured to implement one or more frame buffers. In addition, the output of the ISP processor 740 may be sent to the encoder/decoder 770 to encode/decode the image data. The encoded image data may be saved and decompressed before being displayed on the display 780 device.
The ISP processed image data may be sent to a defogging module 760 for defogging the image before being displayed. And the defogging module 760 is used for defogging the preview image according to the determined haze concentration level by adopting a corresponding defogging method. Meanwhile, the accuracy of the haze concentration level can be determined according to the color histogram of the HSV component of the preview image. The defogging module 760 may be a Central Processing Unit (CPU) or a Graphics Processing Unit (GPU) in the mobile terminal. After the defogging module 760 defoggs the image data, the defogged image data may be transmitted to the encoder/decoder 770 to encode/decode the image data. The encoded image data may be saved and decompressed prior to display on the display 780 device. It is understood that the image data processed by the defogging module 760 may be sent directly to the display 780 for display without passing through the encoder/decoder 770. The image data processed by the ISP processor 740 may also be processed by the encoder/decoder 770 and then processed by the defogging module 760.
The statistical data determined by ISP processor 740 may be sent to control logic 750 unit. For example, the statistical data may include image sensor 714 statistics such as auto-exposure, auto-white balance, auto-focus, flicker detection, black level compensation, lens 712 shading correction, and the like. Control logic 750 may include a processor and/or microcontroller that executes one or more routines (e.g., firmware) that may determine control parameters of imaging device 710 and, thus, control parameters based on the received statistical data. For example, the control parameters may include sensor 720 control parameters (e.g., gain, integration time for exposure control), camera flash control parameters, lens 712 control parameters (e.g., focal length for focusing or zooming), or a combination of these parameters. The ISP control parameters may include gain levels and color correction matrices for automatic white balance and color adjustment (e.g., during RGB processing), as well as lens 712 shading correction parameters.
Based on the image processing technology in FIG. 7, the following steps of the image defogging method are implemented:
entering a photographing preview mode and displaying a preview image;
acquiring the current visibility of the area where the preview image is located;
determining the level of haze concentration according to the visibility;
and carrying out defogging treatment on the preview image by adopting a corresponding defogging method according to the haze concentration level.
When the computer program running on the processor is executed, the haze concentration level can be determined according to the obtained visibility, and the corresponding defogging method is adopted for preview images with different haze concentration levels to perform adaptive defogging processing, so that the defogging effect is enhanced and the user experience is improved.
In one embodiment, the image defogging method further comprises:
and determining the accuracy of the haze concentration level according to the color histogram of the HSV component of the preview image.
In one embodiment, the haze concentration level comprises: level 1 corresponding to light haze and level 2 corresponding to dense haze;
when the haze concentration level is level 1, defogging the preview image by adopting a dark channel prior algorithm;
and when the haze concentration level is level 2 or higher than level 2, defogging processing is carried out on the preview image by adopting an atmosphere extinction coefficient-based guiding filtering method.
In one embodiment, the defogging processing on the preview image by using the dark channel prior algorithm includes:
acquiring a dark primary color image of a preview image;
obtaining an atmospheric light value and a rough transmittance according to the dark primary color image;
bilateral filtering is carried out on the rough transmissivity to obtain fine transmissivity;
carrying out bilinear interpolation on the fine transmissivity to restore the fine transmissivity to the original size so as to obtain estimated transmissivity;
and defogging the preview image according to the atmospheric light value and the estimated transmittance.
In one embodiment, the defogging processing on the preview image based on the haze concentration factor and the guiding filtering method includes:
constructing a haze weather imaging physical model, wherein the physical model is expressed as E = G_β(I - I_g) + I_g, where G_β is the haze concentration factor, I is the preview image, and I_g is the ambient atmospheric light in the environment;
obtaining the haze concentration factor according to the visibility;
estimating the atmospheric light by adopting a guided filtering method;
and bringing the acquired haze concentration factor and atmospheric light into the haze weather imaging physical model to carry out defogging treatment on the preview image.
In one embodiment, the image defogging method further comprises:
and carrying out exposure processing and automatic color gradation processing on the preview image after the defogging processing.
It will be understood by those skilled in the art that all or part of the processes of the methods of the embodiments described above can be implemented by a computer program, which can be stored in a non-volatile computer-readable storage medium, and can include the processes of the embodiments of the methods described above when the program is executed. The storage medium may be a magnetic disk, an optical disk, a Read-Only Memory (ROM), or the like.
The above-mentioned embodiments only express several embodiments of the present invention, and the description thereof is more specific and detailed, but not construed as limiting the scope of the present invention. It should be noted that, for a person skilled in the art, several variations and modifications can be made without departing from the inventive concept, which falls within the scope of the present invention. Therefore, the protection scope of the present patent shall be subject to the appended claims.

Claims (9)

1. An image defogging method, comprising:
entering a photographing preview mode and displaying a preview image;
acquiring the current visibility of the area where the preview image is located;
determining the level of haze concentration according to the visibility;
performing self-adaptive defogging treatment on the preview image by adopting corresponding different defogging methods according to different haze concentration levels so as to perform defogging imaging;
if the haze concentration level corresponds to the thick haze weather, defogging the preview image by adopting a haze concentration factor-based guiding filtering method;
the defogging treatment is carried out on the preview image based on the haze concentration factor and the guiding filtering method, and the defogging treatment comprises the following steps:
constructing a haze weather imaging physical model;
obtaining the haze concentration factor according to the visibility;
estimating the atmospheric light by adopting a guided filtering method;
bringing the acquired haze concentration factor and atmospheric light into the haze weather imaging physical model to carry out defogging treatment on the preview image;
the haze concentration levels include: level 1 corresponding to light haze and level 2 corresponding to dense haze;
when the haze concentration level is level 1, defogging the preview image by adopting a dark channel prior algorithm;
and when the haze concentration level is level 2 or higher than level 2, defogging processing is carried out on the preview image by adopting a haze concentration factor-based guide filtering method.
2. The image defogging method according to claim 1, further comprising:
and determining the accuracy of the haze concentration level according to the color histogram of the HSV component of the preview image.
3. The image defogging method according to claim 1, wherein the defogging process is performed on the preview image by adopting a dark channel prior algorithm, and comprises the following steps:
acquiring a dark primary color image of a preview image;
obtaining an atmospheric light value and a rough transmittance according to the dark primary color image;
bilateral filtering is carried out on the rough transmissivity to obtain fine transmissivity;
carrying out bilinear interpolation on the fine transmissivity to restore the fine transmissivity to the original size so as to obtain estimated transmissivity;
and defogging the preview image according to the atmospheric light value and the estimated transmittance.
4. The image defogging method according to claim 1,
the physical model is represented as: e ═ Gβ(I-Ig)+IgIn the formula, GβIs a haze concentration factor, I is a preview image, IgIs ambient atmospheric light in the environment.
5. The image defogging method according to claim 1, further comprising:
and carrying out exposure processing and automatic color gradation processing on the preview image after the defogging processing.
6. An image defogging device, comprising:
the display module is used for displaying the preview image when entering a photographing preview mode;
the acquisition module is used for acquiring the current visibility of the area where the preview image is located;
the determining module is used for determining a haze concentration level according to the visibility, wherein the haze concentration level and the visibility are in an inverse proportional relation;
and
the defogging module is used for performing self-adaptive defogging treatment on the preview image by adopting corresponding different defogging methods according to different haze concentration levels so as to perform defogging imaging;
the defogging module is further used for defogging the preview image by adopting a haze concentration factor-based guiding filtering method if the haze concentration grade corresponds to the thick haze weather;
the defogging treatment is carried out on the preview image based on the haze concentration factor and the guiding filtering method, and the defogging treatment comprises the following steps:
constructing a haze weather imaging physical model;
obtaining the haze concentration factor according to the visibility;
estimating the atmospheric light by adopting a guided filtering method;
bringing the acquired haze concentration factor and atmospheric light into the haze weather imaging physical model to carry out defogging treatment on the preview image;
the haze concentration levels include: level 1 corresponding to light haze and level 2 corresponding to dense haze;
the defogging module is further used for defogging the preview image by adopting a dark primary color prior algorithm when the haze concentration level is level 1; and when the haze concentration level is level 2 or higher than level 2, defogging processing is carried out on the preview image by adopting a haze concentration factor-based guide filtering method.
7. The image defogging device according to claim 6, further comprising:
and the evaluation module is used for evaluating the accuracy of the haze concentration level of the preview image according to the color histogram of the HSV component of the preview image.
8. A computer-readable storage medium, on which a computer program is stored, which, when being executed by a processor, carries out the image defogging method according to any one of claims 1 to 5.
9. A mobile terminal comprising a memory, a processor and a computer program stored on the memory and executable on the processor, characterized in that the processor implements the image defogging method according to any one of claims 1 to 5 when the program is executed.
CN201710624518.6A 2017-07-27 2017-07-27 Image defogging method and device, computer storage medium and mobile terminal Active CN107424133B (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN201710624518.6A CN107424133B (en) 2017-07-27 2017-07-27 Image defogging method and device, computer storage medium and mobile terminal

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
CN201710624518.6A CN107424133B (en) 2017-07-27 2017-07-27 Image defogging method and device, computer storage medium and mobile terminal

Publications (2)

Publication Number Publication Date
CN107424133A CN107424133A (en) 2017-12-01
CN107424133B true CN107424133B (en) 2020-01-10

Family

ID=60430486

Family Applications (1)

Application Number Title Priority Date Filing Date
CN201710624518.6A Active CN107424133B (en) 2017-07-27 2017-07-27 Image defogging method and device, computer storage medium and mobile terminal

Country Status (1)

Country Link
CN (1) CN107424133B (en)

Families Citing this family (11)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN107767353A (en) * 2017-12-04 2018-03-06 河南工业大学 A kind of adapting to image defogging method based on definition evaluation
CN108038831A (en) * 2017-12-19 2018-05-15 北京理工大学 A kind of color video defogging method based on atmospherical scattering model
TWI724375B (en) * 2018-02-23 2021-04-11 富智捷股份有限公司 Haze removal method, electronic device and computer readable recording media
CN108460743A (en) * 2018-03-19 2018-08-28 西安因诺航空科技有限公司 A kind of unmanned plane image defogging algorithm based on dark
CN109584186A (en) * 2018-12-25 2019-04-05 西北工业大学 A kind of unmanned aerial vehicle onboard image defogging method and device
CN109658359B (en) * 2018-12-26 2023-06-13 联创汽车电子有限公司 Atmospheric suspended matter detection system and detection method thereof
CN110992293A (en) * 2019-12-13 2020-04-10 杭州电子科技大学 Self-adaptive video defogging method and device
CN111738959B (en) * 2020-06-30 2022-08-19 福州大学 Real-time defogging method for video image based on FPGA
CN111899309A (en) * 2020-07-31 2020-11-06 上海眼控科技股份有限公司 Uphill fog detection method and device, computer equipment and readable storage medium
CN114648467B (en) * 2022-05-18 2022-08-16 中山大学深圳研究院 Image defogging method and device, terminal equipment and computer readable storage medium
CN115412669B (en) * 2022-08-26 2023-06-06 清华大学 Foggy day imaging method and device based on image signal-to-noise ratio analysis

Patent Citations (5)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
EP2213219A1 (en) * 2007-11-21 2010-08-04 Panasonic Corporation Endoscope device, camera device for endoscope, and defogging method
KR20100104262A (en) * 2009-03-17 2010-09-29 주식회사 현대오토넷 Apparatus for auto defogging and method using thereof
CN105631827A (en) * 2015-12-28 2016-06-01 四川大学 Vehicle video demisting method and system
CN105809647A (en) * 2016-03-31 2016-07-27 北京奇虎科技有限公司 Automatic defogging photographing method, device and equipment
CN105913390A (en) * 2016-04-07 2016-08-31 潍坊学院 Image defogging method and system

Also Published As

Publication number Publication date
CN107424133A (en) 2017-12-01

Similar Documents

Publication Publication Date Title
CN107424133B (en) Image defogging method and device, computer storage medium and mobile terminal
CN110149482B (en) Focusing method, focusing device, electronic equipment and computer readable storage medium
CN107424198B (en) Image processing method, image processing device, mobile terminal and computer readable storage medium
EP3496383A1 (en) Image processing method, apparatus and device
CN107451969B (en) Image processing method, image processing device, mobile terminal and computer readable storage medium
KR102261532B1 (en) Method and system for image dehazing using single scale image fusion
CN107481186B (en) Image processing method, image processing device, computer-readable storage medium and computer equipment
CN107395991B (en) Image synthesis method, image synthesis device, computer-readable storage medium and computer equipment
CN107493432A (en) Image processing method, device, mobile terminal and computer-readable recording medium
CN107317967B (en) Image processing method, image processing device, mobile terminal and computer readable storage medium
CN107277299B (en) Image processing method, image processing device, mobile terminal and computer readable storage medium
CN110121031B (en) Image acquisition method and device, electronic equipment and computer readable storage medium
CN107341782B (en) Image processing method, image processing device, computer equipment and computer readable storage medium
CN107454317B (en) Image processing method, image processing device, computer-readable storage medium and computer equipment
CN108989699A (en) Image composition method, device, imaging device, electronic equipment and computer readable storage medium
CN107317969A (en) Image defogging method, device, computer can storage medium and mobile terminals
CN107424134B (en) Image processing method, image processing device, computer-readable storage medium and computer equipment
CN107464225B (en) Image processing method, image processing device, computer-readable storage medium and mobile terminal
CN107392870B (en) Image processing method, image processing device, mobile terminal and computer readable storage medium
CN107454318B (en) Image processing method, image processing device, mobile terminal and computer readable storage medium
CN107295261B (en) Image defogging method and device, storage medium and mobile terminal
CN107454319B (en) Image processing method, image processing device, mobile terminal and computer readable storage medium
CN114140481A (en) Edge detection method and device based on infrared image
CN112825189B (en) Image defogging method and related equipment
CN107292853B (en) Image processing method, image processing device, computer-readable storage medium and mobile terminal

Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination
CB02 Change of applicant information
Address after: No. 18 Wusha Beach Road, Chang'an Town, Dongguan, Guangdong 523860
Applicant after: OPPO Guangdong Mobile Communications Co., Ltd.
Address before: No. 18 Wusha Beach Road, Chang'an Town, Dongguan, Guangdong 523860
Applicant before: Guangdong Opel Mobile Communications Co., Ltd.
GR01 Patent grant