CN113379631B - Image defogging method and device

Image defogging method and device

Info

Publication number
CN113379631B
CN113379631B (application CN202110655785.6A)
Authority
CN
China
Prior art keywords
parameter value
defogging
brightness
value
specified parameter
Prior art date
Legal status
Active
Application number
CN202110655785.6A
Other languages
Chinese (zh)
Other versions
CN113379631A (en)
Inventor
郭莎
姜俊锟
朱飞
杜凌霄
Current Assignee
Bigo Technology Pte Ltd
Original Assignee
Bigo Technology Pte Ltd
Priority date
Filing date
Publication date
Application filed by Bigo Technology Pte Ltd
Priority to CN202110655785.6A
Publication of CN113379631A
Application granted
Publication of CN113379631B
Legal status: Active
Anticipated expiration


Classifications

    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06T IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T5/00 Image enhancement or restoration
    • G06T5/73 Deblurring; Sharpening
    • G06T5/90 Dynamic range modification of images or parts thereof
    • G06T7/00 Image analysis
    • G06T7/90 Determination of colour characteristics

Landscapes

  • Engineering & Computer Science (AREA)
  • Physics & Mathematics (AREA)
  • General Physics & Mathematics (AREA)
  • Theoretical Computer Science (AREA)
  • Computer Vision & Pattern Recognition (AREA)
  • Image Processing (AREA)

Abstract

The application discloses an image defogging method and device, wherein the method comprises the following steps: acquiring a first specified parameter value of a region of interest of an original image; defogging the original image by adopting a defogging algorithm to obtain a defogged image; determining a second specified parameter value of the region of interest after defogging according to the first specified parameter value; determining a brightness compensation coefficient according to the first specified parameter value and the second specified parameter value; and performing brightness compensation on the defogged image by adopting the brightness compensation coefficient, so as to make up for the brightness loss of the region of interest after defogging and raise the brightness of the region of interest, thereby improving the visual effect after defogging.

Description

Image defogging method and device
Technical Field
The embodiment of the application relates to the technical field of image processing, in particular to a method and a device for defogging an image.
Background
In outdoor foggy scenes, captured pictures are affected by haze particles and the image quality is poor, which affects subsequent tasks such as image classification and segmentation and target detection and tracking. Image defogging technology therefore emerged. Image defogging is an important direction in the field of computer vision; it refers to using image processing techniques to remove the noise caused by haze in an image, so as to recover a clear, haze-free image.
In the related art, one class of image defogging algorithms is defogging algorithms based on image restoration, which basically perform the corresponding defogging processing based on an atmospheric light degradation model. A representative algorithm is the dark channel prior defogging algorithm based on guided filtering. This method derives a rule from statistics over a large number of natural images: in a haze-free image, most local regions are likely to have at least one color channel with a very low value. This statistical rule is called the dark channel prior and is the basic rule underlying dark-channel-based haze removal. Although such a defogging algorithm can improve image quality in the sharpness dimension, the defogging result is always darker than the input before defogging, i.e., in the brightness dimension there is a certain subjective loss of brightness.
Disclosure of Invention
The application provides an image defogging method and device, which are used to solve the problem of image brightness loss caused by defogging in existing image defogging algorithms.
In a first aspect, an embodiment of the present application provides a method for defogging an image, the method including:
acquiring a first specified parameter value of a region of interest of an original image;
defogging the original image by adopting a defogging algorithm to obtain a defogged image;
determining a second specified parameter value of the region of interest after defogging according to the first specified parameter value;
determining a brightness compensation coefficient according to the first specified parameter value and the second specified parameter value;
and performing brightness compensation on the defogged image by adopting the brightness compensation coefficient.
In a second aspect, an embodiment of the present application further provides an image defogging device, including:
The first specified parameter value acquisition module is used for acquiring a first specified parameter value of a region of interest of the original image;
The defogging processing module is used for defogging the original image by adopting a defogging algorithm to obtain a defogged image;
the second specified parameter value acquisition module is used for determining a second specified parameter value of the region of interest after defogging according to the first specified parameter value;
A brightness compensation coefficient determining module, configured to determine a brightness compensation coefficient according to the first specified parameter value and the second specified parameter value;
and the brightness compensation module is used for carrying out brightness compensation on the defogged image by adopting the brightness compensation coefficient.
In a third aspect, an embodiment of the present application further provides an electronic device, including a memory, a processor, and a computer program stored in the memory and capable of running on the processor, where the processor implements the method described above when executing the program.
In a fourth aspect, embodiments of the present application also provide a computer readable storage medium having stored thereon a computer program which when executed by a processor implements the above-described method.
The application has the following beneficial effects:
In this embodiment, after the region of interest is detected in the original image, a first specified parameter value of the region of interest may be obtained. After the original image is defogged with a defogging algorithm to obtain a defogged image, a second specified parameter value of the region of interest after defogging may be determined from the first specified parameter value, and a brightness compensation coefficient may be determined from the first specified parameter value and the second specified parameter value. The defogged image is then brightness-compensated with this coefficient, which makes up for the brightness loss of the region of interest after defogging and raises the brightness of the region of interest, thereby improving the visual effect after defogging.
Drawings
FIG. 1 is a flow chart of an embodiment of a method for defogging an image according to a first embodiment of the present application;
FIG. 2 is a flow chart of an embodiment of a method for defogging an image according to a second embodiment of the present application;
FIG. 3 is a block diagram of an embodiment of an image defogging device according to a third embodiment of the present application;
FIG. 4 is a schematic structural diagram of an electronic device according to a fourth embodiment of the present application.
Detailed Description
The application is described in further detail below with reference to the drawings and examples. It is to be understood that the specific embodiments described herein are merely illustrative of the application and are not limiting thereof. It should be further noted that, for convenience of description, only some, but not all of the structures related to the present application are shown in the drawings.
Example 1
Fig. 1 is a flowchart of an embodiment of an image defogging method provided in the first embodiment of the present application. The embodiment can be applied to applications with image processing functions, such as an image processing tool, a video production tool, a short-video app client, a live-streaming app client, and the like. The embodiment mainly addresses the drawback that, in the related art, defogging is performed globally with an atmospheric light degradation model; brightness compensation is used to correct the defogged image so that it achieves a better visual effect.
The embodiment specifically may include the following steps:
Step 110, obtaining first specified parameter values of a region of interest of an original image.
In this step, a region of interest (Region Of Interest, abbreviated as ROI) may first be detected in the original image. In machine vision and image processing, a region to be processed that is outlined in the processed image in the form of a square, circle, ellipse, irregular polygon, or the like is called an ROI region. In practice, the ROI may differ for different targets. In implementation, a corresponding ROI detection model may be trained for a specific target and then used to extract the ROI from the original image. For example, if the ROI includes a face region, a deep-learning-based face recognition model may be used to perform face detection on the original image. Alternatively, if the terminal provides a face detection function, that function can be used directly to perform face detection on the original image.
After detecting the ROI area, a first specified parameter value in the ROI area may be acquired. Since the present embodiment needs to solve the problem of brightness loss after image defogging in the related art, the first specified parameter value may be a parameter value related to brightness. In one example, the first specified parameter value may include, but is not limited to: the average brightness of the Y channel of the ROI area and the average color of each of R, G, B channels of the ROI area.
For example, if the ROI region is a face region, the color average value of each of the R, G, B channels of the ROI region may be the skin color average value of that channel. The skin color average value of each of the three RGB channels of the ROI region may be calculated as follows:
First, skin color detection is performed on the face region with a skin color detection algorithm to determine the set of pixels in the face region whose colors belong to the skin color range; then, for each of the R, G, B channels, the channel values of the pixels in this set are averaged to obtain the skin color average value of that channel.
Specifically, because of the facial features, the face region is not entirely skin, and only some of its pixels belong to the skin color range. It is therefore necessary to first determine the set of pixels in the face region whose colors fall within the skin color range. In one example, if a pixel in the face region satisfies all of the conditions of the following skin color detection rule, its color may be determined to belong to the skin color range:
Y > 0.35, R > 0.39, G > 0.15, B > 0.09, V1 - V2 > 0.07, R/G < 1.8, R/B < 3.0, G/B < 3.0, R > G and R > B,
where Y = R×0.299 + G×0.587 + B×0.114, V1 = max(R, G, B), V2 = min(R, G, B), and R, G, B are the red, green and blue component values of the pixel, respectively.
After the set of pixels whose colors belong to the skin color range in the face region is determined, the red component values of all pixels in the set are averaged to give the skin color average value of the red (R) channel; the green component values of all pixels in the set are averaged to give the skin color average value of the green (G) channel; and the blue component values of all pixels in the set are averaged to give the skin color average value of the blue (B) channel.
The average brightness value of the Y channel of the face region can be calculated by averaging the brightness values of the pixels in the region.
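As an illustration only, the following Python/NumPy sketch computes the first specified parameter values of an ROI, assuming the image is an RGB float array normalized to [0, 1] and using the skin color rule above; the function and variable names are illustrative, not taken from the patent.

import numpy as np

def first_specified_params(roi_rgb):
    # roi_rgb: H x W x 3 float array in [0, 1], RGB order (e.g. a face crop).
    # Returns the Y-channel brightness mean of the ROI and the per-channel skin colour means.
    R, G, B = roi_rgb[..., 0], roi_rgb[..., 1], roi_rgb[..., 2]
    Y = 0.299 * R + 0.587 * G + 0.114 * B      # luminance, as defined in the text
    V1 = roi_rgb.max(axis=-1)
    V2 = roi_rgb.min(axis=-1)
    eps = 1e-6                                 # avoid division by zero in the ratio tests
    skin = ((Y > 0.35) & (R > 0.39) & (G > 0.15) & (B > 0.09) &
            ((V1 - V2) > 0.07) &
            (R / (G + eps) < 1.8) & (R / (B + eps) < 3.0) & (G / (B + eps) < 3.0) &
            (R > G) & (R > B))
    y_mean = float(Y.mean())                   # brightness mean over the whole ROI
    if skin.any():                             # skin colour means over skin pixels only
        rgb_means = tuple(float(c[skin].mean()) for c in (R, G, B))
    else:                                      # fall back to plain channel means if no skin pixel is found
        rgb_means = tuple(float(c.mean()) for c in (R, G, B))
    return y_mean, rgb_means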
Step 120, defogging the original image by adopting a defogging algorithm to obtain a defogged image.
In one implementation, the defogging algorithm may include a dark channel prior defogging algorithm based on guided filtering, in which the following defogging model may be employed:
I(x)=J(x)t(x)+A(1-t(x)) (1)
where I(x) is the original image, J(x) is the defogged image, A is the atmospheric light component, and t(x) is the transmittance of the original image.
By rearranging the above formula (1), the following formula (2) can be obtained:
J(x)=(I(x)-A)/t(x)+A (2)
It can be seen that, by determining A and t(x), the defogged image J(x) can be obtained.
In implementation, in a dark channel defogging algorithm, the global atmospheric light value A can be estimated from the original image using the dark channel. Specifically, a certain proportion (e.g., the brightest 0.1%) of pixels may be selected from the dark channel map according to their brightness, and among the positions of these pixels, the value of the corresponding point with the highest brightness in the original image is taken as the A value; each of the three channels thus has its own atmospheric light value. Of course, other general methods may also be used to calculate the atmospheric light value A, for example estimating the global atmospheric light value by block-wise recursion or block-based partitioning, or using fast estimation methods, which is not limited in this embodiment.
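Purely as a sketch of the estimation just described (the 0.1% fraction and the 15-pixel patch size are illustrative choices, not values fixed by the patent), the atmospheric light value could be computed as follows, continuing the NumPy conventions of the previous sketch:

import numpy as np
from scipy.ndimage import minimum_filter

def dark_channel(img, patch=15):
    # per-pixel minimum over the colour channels, followed by a local minimum filter
    return minimum_filter(img.min(axis=-1), size=patch)

def estimate_atmospheric_light(img, patch=15, top_frac=0.001):
    # select the brightest `top_frac` of dark-channel pixels and, among those positions,
    # take the channel values of the brightest pixel of the original image as A
    dark = dark_channel(img, patch)
    n = max(1, int(dark.size * top_frac))
    idx = np.argsort(dark.ravel())[-n:]        # positions of the brightest dark-channel pixels
    candidates = img.reshape(-1, 3)[idx]
    return candidates[candidates.sum(axis=1).argmax()]   # one A value per channel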
For t(x), in the dark channel defogging algorithm, after a series of derivations, the following formula (3) for the initial estimate of the transmittance can be obtained:
t(x)=1-min_{y∈Ω(x)}(min_{c}(I^c(y)/A^c)) (3)
where Ω(x) is a local window centered at pixel x and c indexes the R, G, B channels. Finally, in order to preserve the naturalness of the picture, a parameter ω can be added to adjust the transmittance, as shown in formula (4):
t(x)=1-ω·min_{y∈Ω(x)}(min_{c}(I^c(y)/A^c)) (4)
where ω∈(0,1] (typically around 0.95) retains a small amount of haze so that the result looks natural.
Of course, the transmittance may also be calculated in other general manners, such as a guided-filtering approach or an iterative estimation, which is not limited in this embodiment.
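For illustration, here is a minimal sketch of the transmittance estimate of formula (4) and the recovery of formula (2), without the guided-filter refinement mentioned above; ω = 0.95 and the lower bound t_min = 0.1 are values commonly used with the dark channel prior and are assumptions here, not values fixed by the patent.

import numpy as np
from scipy.ndimage import minimum_filter

def estimate_transmission(img, A, patch=15, omega=0.95):
    # formula (4): t(x) = 1 - omega * dark_channel(I / A)
    normalized = img / A.reshape(1, 1, 3)
    return 1.0 - omega * minimum_filter(normalized.min(axis=-1), size=patch)

def recover(img, A, t, t_min=0.1):
    # formula (2): J(x) = (I(x) - A) / t(x) + A, with t lower-bounded so that
    # noise is not amplified where the estimated transmittance is very small
    t = np.clip(t, t_min, 1.0)[..., None]
    A = A.reshape(1, 1, 3)
    return np.clip((img - A) / t + A, 0.0, 1.0)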
Step 130, determining a second specified parameter value of the region of interest after defogging according to the first specified parameter value.
In this embodiment, the second specified parameter value may also be a parameter value related to brightness. Since defogging an image in the related art causes a loss of brightness, for the region of interest the first specified parameter value before defogging is larger than the second specified parameter value after defogging.
In one embodiment, step 130 may further comprise the steps of:
Taking the first specified parameter value as the input of the defogging algorithm, and taking the output result of the defogging algorithm as the second specified parameter value.
In this embodiment, the first specified parameter value may be substituted into the defogging formula to obtain the second specified parameter value, which is darker after defogging.
In one embodiment, the step of taking the first specified parameter value as the input of the defogging algorithm and obtaining the output result of the defogging algorithm as the second specified parameter value may further include the steps of:
determining the dark channel value of the region of interest from the color average values of the R, G, B channels of the region of interest; calculating the transmittance of the region of interest from the dark channel value; and calculating the average brightness value of the defogged Y channel as the second specified parameter value based on the transmittance of the region of interest and the average brightness value of the Y channel.
Specifically, after the color average value of each of the R, G, B channels of the ROI region is obtained, for example the skin color average value of each of the three RGB channels of the face region, the minimum of these average values can be taken as the dark channel value of the region according to the dark channel prior. The dark channel prior refers to the observation that, in most local regions of a haze-free image, at least one of the three RGB color channels has a very low gray value that tends to 0.
The transmittance of the ROI region can then be calculated using the following formula (5):
t(k)=1-ω·k_dark/A (5)
where k_dark is the minimum of the color average values of the R, G, B channels, i.e., the dark channel value, A is the atmospheric light value, and t(k) is the transmittance of the region of interest.
The average brightness value of the Y channel of the ROI region after defogging can then be calculated using the following formula (6):
k'=(k-A)/t(k)+A (6)
where k is the average brightness value of the Y channel of the ROI region, A is the atmospheric light value, t(k) is the transmittance of the region of interest from formula (5), and k' is the average brightness value of the Y channel after defogging, i.e., the second specified parameter value.
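As an illustration of step 130, the following sketch feeds the first specified parameter values through the defogging model along the lines of formulas (5) and (6); the scalar atmospheric light A (e.g. the luminance of the per-channel estimate), ω and the lower bound on t(k) are assumptions carried over from the sketches above rather than values fixed by the patent.

def second_specified_param(y_mean, rgb_means, A, omega=0.95, t_min=0.1):
    # y_mean: brightness mean k of the ROI; rgb_means: (R, G, B) channel means of the ROI;
    # A: scalar atmospheric light value. Returns k', the predicted post-defogging brightness mean.
    k_dark = min(rgb_means)                     # dark channel value of the ROI
    t_k = max(1.0 - omega * k_dark / A, t_min)  # transmittance of the ROI (assumed form of formula (5))
    return (y_mean - A) / t_k + A               # formula (6)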
Step 140, determining a brightness compensation coefficient according to the first specified parameter value and the second specified parameter value.
By comparing the first specified parameter value with the second specified parameter value, the brightness loss after defogging can be analyzed, and a brightness compensation coefficient obtained accordingly.
In one embodiment, the ratio of the first specified parameter value to the second specified parameter value may be calculated as the brightness compensation coefficient, that is: brightness compensation coefficient = k/k'.
In other embodiments, besides the ratio form, the brightness compensation coefficient may also be expressed as a difference, that is, brightness compensation coefficient = k - k'.
Step 150, performing brightness compensation on the defogged image by adopting the brightness compensation coefficient.
Compensating the brightness of the defogged image makes up for the darkening of the image after defogging. In one embodiment, the defogged image may be brightness-compensated in the following manner:
multiplying the pixel value of each pixel in the defogged image by the brightness compensation coefficient to obtain the brightness-compensated image.
Specifically, the following formula (7) may be used for brightness compensation:
J'(x)=(k/k')·J(x) (7)
where J(x) is the defogged image, k/k' is the brightness compensation coefficient, and J'(x) is the defogged image after brightness compensation.
In other embodiments, brightness compensation may also be performed additively, that is, J'(x)=J(x)+(k-k'), which is not limited in this embodiment.
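A minimal sketch of steps 140 and 150, assuming the ratio-form coefficient of formula (7) and images normalized to [0, 1]; the clipping and the small epsilon guard are illustrative additions.

import numpy as np

def brightness_compensate(dehazed, k, k_prime):
    # brightness compensation coefficient k / k', then formula (7): J'(x) = (k / k') * J(x)
    coeff = k / max(k_prime, 1e-6)
    return np.clip(dehazed * coeff, 0.0, 1.0)
    # additive alternative mentioned above: np.clip(dehazed + (k - k_prime), 0.0, 1.0)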
In this embodiment, after the region of interest is detected in the original image, a first specified parameter value of the region of interest may be obtained. After the original image is defogged with a defogging algorithm to obtain a defogged image, a second specified parameter value of the region of interest after defogging may be determined from the first specified parameter value, and a brightness compensation coefficient may be determined from the first specified parameter value and the second specified parameter value. The defogged image is then brightness-compensated with this coefficient, which makes up for the brightness loss of the region of interest after defogging and raises the brightness of the region of interest, thereby improving the visual effect after defogging.
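Putting the sketches together, one possible end-to-end flow for the first embodiment looks as follows; every name comes from the illustrative sketches above, not from the patent, and the scalar A passed to second_specified_param is taken as the luminance of the per-channel estimate, which is an assumption.

# img: H x W x 3 RGB float image in [0, 1]; roi: the detected region of interest (e.g. a face crop)
A = estimate_atmospheric_light(img)
t = estimate_transmission(img, A)
dehazed = recover(img, A, t)

k, rgb_means = first_specified_params(roi)
A_y = float(0.299 * A[0] + 0.587 * A[1] + 0.114 * A[2])   # scalar (luminance) atmospheric light
k_prime = second_specified_param(k, rgb_means, A_y)
result = brightness_compensate(dehazed, k, k_prime)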
Example two
Fig. 2 is a flowchart of an embodiment of an image defogging method provided in the second embodiment of the present application, in which, building on the first embodiment, the image is further processed in the contrast dimension to obtain a better visual effect.
The embodiment specifically may include the following steps:
Step 210, obtaining first specified parameter values of a region of interest of an original image.
Step 220, defogging the original image by adopting a defogging algorithm to obtain a defogged image.
Step 230, determining a second specified parameter value of the region of interest after defogging according to the first specified parameter value.
Step 240, determining a brightness compensation coefficient according to the first specified parameter value and the second specified parameter value.
Step 250, performing brightness compensation on the defogged image by adopting the brightness compensation coefficient.
Step 260, performing contrast adjustment on the brightness-compensated image by using the first specified parameter value.
In this embodiment, according to the actual service requirements, the first specified parameter value is taken as the balance point and an S-shaped contrast stretch is applied to the tone curve of the brightness-compensated image, so as to obtain a better visual effect.
In one embodiment, when the first specified parameter value is the average value of the brightness of the Y channel of the region of interest, step 260 may further include the steps of:
Comparing the brightness value of each pixel in the brightness-compensated image with the brightness average value; raising the brightness value of pixels whose brightness is greater than the average value, and lowering the brightness value of pixels whose brightness is less than the average value.
In this embodiment, the brightness average value of the Y channel of the region of interest in the original image may be used as the balance point. The brightness value of each pixel in the brightness-compensated image is compared with this average value: pixels below the average are darkened, pixels above the average are brightened, and pixels whose brightness equals the average are left unchanged. In this way the darker regions are suppressed and the bright regions are slightly enhanced, which increases the contrast between the bright and dark portions and further improves definition.
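The rule above only fixes the qualitative behaviour (brighten above the average, darken below it, leave the average unchanged). A simple piecewise-linear approximation of the S-shaped stretch, with an illustrative gain of 1.2, might look like this:

import numpy as np

def contrast_adjust(img, pivot, gain=1.2):
    # pivot: brightness average value of the Y channel of the ROI in the original image
    Y = 0.299 * img[..., 0] + 0.587 * img[..., 1] + 0.114 * img[..., 2]
    Y_new = np.clip(pivot + gain * (Y - pivot), 0.0, 1.0)  # stretch about the pivot; clipping gives the S shape
    scale = Y_new / np.maximum(Y, 1e-6)                    # carry the brightness change to all three channels
    return np.clip(img * scale[..., None], 0.0, 1.0)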
In this embodiment, the contrast of the image after brightness compensation is adjusted by taking the first specified parameter value as the balance point, so that the contrast of the bright portion and the dark portion is increased, the definition of the image is improved, and a better visual contrast effect is obtained.
Example III
Fig. 3 is a block diagram of an embodiment of an image defogging device according to a third embodiment of the present application, where the device may include the following modules:
a first specified parameter value obtaining module 310, configured to obtain a first specified parameter value of a region of interest of an original image;
The defogging processing module 320 is configured to perform defogging processing on the original image by using a defogging algorithm, so as to obtain a defogged image;
A second specified parameter value acquisition module 330, configured to determine a second specified parameter value after defogging the region of interest according to the first specified parameter value;
a brightness compensation coefficient determining module 340, configured to determine a brightness compensation coefficient according to the first specified parameter value and the second specified parameter value;
And the brightness compensation module 350 is configured to perform brightness compensation on the defogging image by using the brightness compensation coefficient.
In one embodiment, the second specified parameter value acquisition module 330 is specifically configured to:
and taking the first specified parameter value as an input of the defogging algorithm, and obtaining an output result of the defogging algorithm as a second specified parameter value.
In one embodiment, the first specified parameter value includes a luminance average value of a Y channel of the region of interest and a color average value of each of R, G, B channels of the region of interest;
The second specified parameter value acquisition module 330 is further configured to:
Determining a dark channel value of the region of interest according to the average value of the colors of all the R, G, B channels of the region of interest;
calculating the transmissivity of the region of interest according to the dark channel value;
And calculating the average brightness value of the defogged Y channel as a second specified parameter value based on the transmissivity of the region of interest and the average brightness value of the Y channel.
In one embodiment, the brightness compensation coefficient determining module 340 is specifically configured to:
And calculating the ratio of the first specified parameter value to the second specified parameter value as a brightness compensation coefficient.
In one embodiment, the brightness compensation module 350 is specifically configured to:
And multiplying the pixel value of each pixel point in the defogging image by the brightness compensation coefficient to obtain a brightness compensated image.
In one embodiment, the apparatus may further comprise the following modules:
And the contrast adjustment module is used for carrying out contrast adjustment on the image subjected to brightness compensation by adopting the first specified parameter value.
In one embodiment, the first specified parameter value includes an average value of brightness of a Y channel of the region of interest, and the contrast adjustment module is specifically configured to:
comparing the brightness value of each pixel point in the brightness compensated image with the brightness average value;
and adjusting the brightness value of the pixel point with the brightness value larger than the brightness average value to be high, and adjusting the brightness value of the pixel point with the brightness value smaller than the brightness average value to be low.
It should be noted that, the image defogging device provided by the embodiment of the present application may execute the image defogging method provided by the first embodiment or the second embodiment of the present application, and has the corresponding functional module and beneficial effects of the execution method.
Example IV
Fig. 4 is a schematic structural diagram of an electronic device according to the fourth embodiment of the present application. The electronic device may run applications such as an image processing tool, a video production tool, a short-video app client, or a live-streaming app client. As shown in Fig. 4, the electronic device includes a processor 410, a memory 420, an input device 430, and an output device 440; the number of processors 410 in the electronic device may be one or more, and one processor 410 is taken as an example in Fig. 4; the processor 410, memory 420, input device 430, and output device 440 in the electronic device may be connected by a bus or in other manners, a bus connection being taken as an example in Fig. 4.
The memory 420 is a computer readable storage medium that can be used to store software programs, computer executable programs, and modules, such as program instructions/modules corresponding to the methods in embodiments of the present application. The processor 410 executes various functional applications of the electronic device and data processing, i.e., implements the methods described above, by running software programs, instructions, and modules stored in the memory 420.
Memory 420 may include primarily a program storage area and a data storage area, wherein the program storage area may store an operating system, at least one application program required for functionality; the storage data area may store data created according to the use of the terminal, etc. In addition, memory 420 may include high-speed random access memory, and may also include non-volatile memory, such as at least one magnetic disk storage device, flash memory device, or other non-volatile solid-state storage device. In some examples, memory 420 may further include memory remotely located relative to processor 410, which may be connected to the electronic device via a network. Examples of such networks include, but are not limited to, the internet, intranets, local area networks, mobile communication networks, and combinations thereof.
The input device 430 may be used to receive input numeric or character information and to generate key signal inputs related to user settings and function control of the electronic device. The output 440 may include a display device such as a display screen.
Example five
The fifth embodiment of the present application also provides a storage medium containing computer-executable instructions which, when executed by a processor of a server, perform the method of any of the above embodiments.
From the above description of embodiments, it will be clear to a person skilled in the art that the present application may be implemented by means of software and necessary general purpose hardware, but of course also by means of hardware, although in many cases the former is a preferred embodiment. Based on such understanding, the technical solution of the present application may be embodied essentially or in a part contributing to the prior art in the form of a software product, which may be stored in a computer readable storage medium, such as a floppy disk, a Read-Only Memory (ROM), a random access Memory (Random Access Memory, RAM), a FLASH Memory (FLASH), a hard disk, or an optical disk of a computer, etc., and include several instructions for causing a computer device (which may be a personal computer, a server, or a network device, etc.) to execute the method according to the embodiments of the present application.
It should be noted that, in the embodiment of the apparatus, each unit and module included are only divided according to the functional logic, but not limited to the above-mentioned division, so long as the corresponding function can be implemented; in addition, the specific names of the functional units are also only for distinguishing from each other, and are not used to limit the protection scope of the present application.
Note that the above is only a preferred embodiment of the present application and the technical principle applied. It will be understood by those skilled in the art that the present application is not limited to the particular embodiments described herein, but is capable of various obvious changes, rearrangements and substitutions as will now become apparent to those skilled in the art without departing from the scope of the application. Therefore, while the application has been described in connection with the above embodiments, the application is not limited to the embodiments, but may be embodied in many other equivalent forms without departing from the spirit or scope of the application, which is set forth in the following claims.

Claims (9)

1. A method of defogging an image, the method comprising:
acquiring a first specified parameter value of a region of interest of an original image;
defogging the original image by adopting a defogging algorithm to obtain a defogged image;
determining a second specified parameter value of the region of interest after defogging according to the first specified parameter value;
determining a brightness compensation coefficient according to the first specified parameter value and the second specified parameter value;
performing brightness compensation on the defogged image by adopting the brightness compensation coefficient;
Wherein the first specified parameter value and the second specified parameter value are both luminance-related parameter values; the defogging algorithm comprises a dark channel priori defogging algorithm based on guide filtering;
the determining a second specified parameter value after defogging the region of interest according to the first specified parameter value comprises:
and taking the first specified parameter value as an input of the defogging algorithm, and obtaining an output result of the defogging algorithm as a second specified parameter value.
2. The method of claim 1, wherein the first specified parameter value comprises a luminance average of a Y channel of the region of interest and a color average of each of R, G, B channels of the region of interest;
the step of obtaining an output result of the defogging algorithm by taking the first specified parameter value as an input of the defogging algorithm, and taking the output result as a second specified parameter value comprises the following steps:
Determining a dark channel value of the region of interest according to the average value of the colors of all the R, G, B channels of the region of interest;
calculating the transmissivity of the region of interest according to the dark channel value;
And calculating the average brightness value of the defogged Y channel as a second specified parameter value based on the transmissivity of the region of interest and the average brightness value of the Y channel.
3. The method according to any one of claims 1-2, wherein said determining a luminance compensation coefficient from said first specified parameter value and said second specified parameter value comprises:
And calculating the ratio of the first specified parameter value to the second specified parameter value as a brightness compensation coefficient.
4. The method according to any one of claims 1-2, wherein said employing said brightness compensation coefficients to brightness compensate said defogged image comprises:
And multiplying the pixel value of each pixel point in the defogging image by the brightness compensation coefficient to obtain a brightness compensated image.
5. The method according to any one of claims 1-2, wherein after said employing said brightness compensation coefficients to brightness compensate said defogging image, said method further comprises:
And carrying out contrast adjustment on the image subjected to brightness compensation by adopting the first specified parameter value.
6. The method of claim 5, wherein the first specified parameter value comprises an average value of brightness of a Y channel of the region of interest, and wherein using the first specified parameter value to contrast-adjust the brightness-compensated image comprises:
comparing the brightness value of each pixel point in the brightness compensated image with the brightness average value;
and adjusting the brightness value of the pixel point with the brightness value larger than the brightness average value to be high, and adjusting the brightness value of the pixel point with the brightness value smaller than the brightness average value to be low.
7. An apparatus for defogging an image, the apparatus comprising:
The first specified parameter value acquisition module is used for acquiring a first specified parameter value of a region of interest of the original image;
The defogging processing module is used for defogging the original image by adopting a defogging algorithm to obtain a defogged image;
the second specified parameter value acquisition module is used for determining a second specified parameter value of the region of interest after defogging according to the first specified parameter value;
A brightness compensation coefficient determining module, configured to determine a brightness compensation coefficient according to the first specified parameter value and the second specified parameter value;
The brightness compensation module is used for carrying out brightness compensation on the defogging image by adopting the brightness compensation coefficient;
Wherein the first specified parameter value and the second specified parameter value are both luminance-related parameter values; the defogging algorithm comprises a dark channel priori defogging algorithm based on guide filtering;
The second specified parameter value obtaining module is specifically configured to: and taking the first specified parameter value as an input of the defogging algorithm, and obtaining an output result of the defogging algorithm as a second specified parameter value.
8. An electronic device comprising a memory, a processor and a computer program stored on the memory and executable on the processor, wherein the processor implements the method of any of claims 1-6 when the program is executed by the processor.
9. A computer readable storage medium, on which a computer program is stored, characterized in that the program, when being executed by a processor, implements the method according to any of claims 1-6.
CN202110655785.6A 2021-06-11 2021-06-11 Image defogging method and device Active CN113379631B (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN202110655785.6A CN113379631B (en) 2021-06-11 2021-06-11 Image defogging method and device

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
CN202110655785.6A CN113379631B (en) 2021-06-11 2021-06-11 Image defogging method and device

Publications (2)

Publication Number Publication Date
CN113379631A CN113379631A (en) 2021-09-10
CN113379631B (en) 2024-05-17

Family

ID=77574148

Family Applications (1)

Application Number Title Priority Date Filing Date
CN202110655785.6A Active CN113379631B (en) 2021-06-11 2021-06-11 Image defogging method and device

Country Status (1)

Country Link
CN (1) CN113379631B (en)

Families Citing this family (1)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN118172284B (en) * 2024-05-10 2024-07-12 山西省财政税务专科学校 Processing method, device, medium and equipment for visual data of remote computer

Citations (6)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
JP2002281312A (en) * 2001-03-15 2002-09-27 Minolta Co Ltd Device, method and program for processing image
CN107292853A (en) * 2017-07-27 2017-10-24 广东欧珀移动通信有限公司 Image processing method, device, computer-readable recording medium and mobile terminal
CN107317970A (en) * 2017-07-27 2017-11-03 广东欧珀移动通信有限公司 Image processing method, device, computer-readable recording medium and computer equipment
CN107424198A (en) * 2017-07-27 2017-12-01 广东欧珀移动通信有限公司 Image processing method, device, mobile terminal and computer-readable recording medium
CN107451969A (en) * 2017-07-27 2017-12-08 广东欧珀移动通信有限公司 Image processing method, device, mobile terminal and computer-readable recording medium
CN111192205A (en) * 2019-11-22 2020-05-22 晏子俊 Image defogging method and system and computer readable storage medium

Family Cites Families (1)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
KR101583947B1 (en) * 2014-06-20 2016-01-08 현대자동차주식회사 Apparatus and method for image defogging

Patent Citations (6)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
JP2002281312A (en) * 2001-03-15 2002-09-27 Minolta Co Ltd Device, method and program for processing image
CN107292853A (en) * 2017-07-27 2017-10-24 广东欧珀移动通信有限公司 Image processing method, device, computer-readable recording medium and mobile terminal
CN107317970A (en) * 2017-07-27 2017-11-03 广东欧珀移动通信有限公司 Image processing method, device, computer-readable recording medium and computer equipment
CN107424198A (en) * 2017-07-27 2017-12-01 广东欧珀移动通信有限公司 Image processing method, device, mobile terminal and computer-readable recording medium
CN107451969A (en) * 2017-07-27 2017-12-08 广东欧珀移动通信有限公司 Image processing method, device, mobile terminal and computer-readable recording medium
CN111192205A (en) * 2019-11-22 2020-05-22 晏子俊 Image defogging method and system and computer readable storage medium

Non-Patent Citations (3)

* Cited by examiner, † Cited by third party
Title
Improved Dark Channel Prior for Image Defogging Using RGB and YCbCr Color Space; Zahid Tufail et al.; IEEE Access; Vol. 6; full text *
Research on Traffic Image Defogging Methods and Applications; 杜晶晶; China Excellent Doctoral Dissertations Full-text Database, Engineering Science and Technology II; full text *
Research on Vehicle Departure Warning in Foggy Weather Based on an Improved Dark Channel Algorithm; 周劲草; 魏朗; 张在吉; Journal of Northeast Normal University (Natural Science Edition) (01); full text *

Also Published As

Publication number Publication date
CN113379631A (en) 2021-09-10

Similar Documents

Publication Publication Date Title
Liang et al. Single underwater image enhancement by attenuation map guided color correction and detail preserved dehazing
US11127122B2 (en) Image enhancement method and system
Bai et al. Underwater image enhancement based on global and local equalization of histogram and dual-image multi-scale fusion
CN107767354B (en) Image defogging algorithm based on dark channel prior
WO2016206087A1 (en) Low-illumination image processing method and device
Gao et al. Sand-dust image restoration based on reversing the blue channel prior
Zotin Fast algorithm of image enhancement based on multi-scale retinex
CN108133462B (en) Single image restoration method based on gradient field region segmentation
CN105959510B (en) A kind of video rapid defogging method
CN111861896A (en) UUV-oriented underwater image color compensation and recovery method
CN115578297A (en) Generalized attenuation image enhancement method for self-adaptive color compensation and detail optimization
CN111192205A (en) Image defogging method and system and computer readable storage medium
Yu et al. Image and video dehazing using view-based cluster segmentation
CN110175967B (en) Image defogging processing method, system, computer device and storage medium
Rahman et al. Efficient contrast adjustment and fusion method for underexposed images in industrial cyber-physical systems
CN113379631B (en) Image defogging method and device
CN110852971B (en) Video defogging method based on dark channel prior and Retinex and storage medium
CN112825189B (en) Image defogging method and related equipment
CN117218039A (en) Image processing method, device, computer equipment and storage medium
CN115564682A (en) Uneven-illumination image enhancement method and system
Negru et al. Exponential image enhancement in daytime fog conditions
CN114266803A (en) Image processing method, image processing device, electronic equipment and storage medium
CN114119411A (en) Fog noise video image recovery method, device, equipment and medium
Tang et al. Sky-preserved image dehazing and enhancement for outdoor scenes
Kumari et al. Improved single image and video dehazing using morphological operation

Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination
GR01 Patent grant