CN113344802A - Underwater image restoration method based on self-adaptive atmospheric light fusion - Google Patents

Info

Publication number: CN113344802A (application CN202110419038.2A); granted as CN113344802B
Authority: CN (China)
Prior art keywords: image, atmospheric light, value, channel, red
Inventors: 张维石 (Zhang Weishi), 周景春 (Zhou Jingchun), 王燕云 (Wang Yanyun)
Assignee (original and current): Dalian Maritime University
Legal status: Active (granted)


Classifications

    • G06T 5/90: Dynamic range modification of images or parts thereof
    • G06T 5/92: Dynamic range modification based on global image properties
    • G06T 5/73: Deblurring; sharpening
    • G06T 7/90: Image analysis; determination of colour characteristics


Abstract

The invention provides an underwater image restoration method based on self-adaptive atmospheric light fusion. First, three candidate atmospheric light values are determined: the first and second are found by a quadtree search method, the third is the average of the largest 0.1% of pixel values in the red-dark channel map, and the three values are fused through two adaptive parameters. Second, the saturation of the original image is computed. A rough transmission map is then solved from the underwater imaging model and decomposed by guided filtering into a content map and a contour map; the contour map is adaptively processed according to the local variance, and the two processed maps are recombined to obtain a refined transmission map. Finally, automatic color-level adjustment is applied to the restored image to obtain the final restored image. By exploiting adaptive atmospheric light fusion and an optimized transmission map, the invention not only effectively improves image clarity but also resolves color distortion, and has strong scene adaptability.

Description

Underwater image restoration method based on self-adaptive atmospheric light fusion
Technical Field
The invention relates to the technical field of image processing, in particular to an underwater image restoration method based on self-adaptive atmospheric light fusion.
Background
More than seventy percent of the Earth's surface is ocean, and the ocean contains vast resources; with surging land populations, dwindling resources, and rapid environmental deterioration, developing and protecting marine resources has become ever more important. Underwater images are an important carrier of ocean information. However, because of the absorption and scattering of light, underwater images are usually degraded: the true colors of objects are altered and image quality rarely reaches a satisfactory level, which severely restricts the application and development of imaging in underwater operations. Current underwater image sharpening technologies fall roughly into two categories: underwater image enhancement methods and underwater image restoration methods.
Underwater image enhancement methods ignore the imaging mechanism and focus only on improving visual quality by adjusting the pixel values of the image. Although many enhancement methods can improve underwater image quality, they do not model the physical process of underwater image degradation; they generally enhance only visual perception, lean on subjective human judgment, and cannot recover lost detail information. Underwater image restoration methods, by contrast, consider the imaging mechanism of the underwater image, establish an effective degradation model, derive the restoration parameters from the physical model and prior knowledge, and invert the degradation process to bring the underwater image back toward its pre-degradation state.
At present, most restoration methods based on physical models require long processing times and cannot be applied in real environments; they also lack robustness and scene adaptability, and cannot adjust adaptively when restoring different types of degraded images.
Disclosure of Invention
In view of these technical problems, an underwater image restoration method based on adaptive atmospheric light fusion is provided. The method determines candidate atmospheric light values by a quadtree search method, fuses them adaptively, computes the saturation, obtains a rough transmission map from the underwater image restoration model, refines the transmission map, produces an initial restored image, and finally applies automatic color-level adjustment to obtain the final restored image.
The technical means adopted by the invention are as follows:
an underwater image restoration method based on self-adaptive atmospheric light fusion comprises the following steps:
step S01: acquiring an original image, and establishing an underwater image restoration model for the original image;
step S02: obtaining a red-dark channel map, in which the red channel is inverted, by using red-dark channel prior knowledge;
step S03: solving a first atmospheric light value and a second atmospheric light value by using a quadtree search method, taking the average value of the maximum 0.1% pixel values in all pixels in the red-dark channel image as a third atmospheric light value, and fusing the three atmospheric light values to obtain a global atmospheric light value;
step S04: estimating the saturation of the original image;
step S05: according to the global atmospheric light value obtained in step S03 and the saturation obtained in step S04, solving a rough transmission map by applying dark channel prior knowledge;
step S06: decomposing and refining the rough transmission map through guided filtering to obtain a refined transmission map;
step S07: substituting the global atmospheric light value obtained in step S03 and the refined transmission map obtained in step S06 into the underwater image restoration model to obtain an initial restored image;
step S08: and carrying out automatic color gradation on the initial restoration image to obtain a final restoration image.
Further, the underwater image restoration model in step S01 is:
I_c(x) = J_c(x)t_c(x) + B_c(1 - t_c(x))
where I_c is the original input image, J_c is the restored output image, c denotes one of the R, G, B color channels of the image, t_c(x) denotes the transmission, and B_c denotes the global atmospheric light value.
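The imaging model above can be sketched as a forward synthesis step. A minimal NumPy sketch (the function name and array layout are illustrative assumptions, not from the patent):

```python
import numpy as np

def degrade(J, t, B):
    # Forward underwater imaging model: I_c = J_c * t_c + B_c * (1 - t_c).
    # J: (H, W, 3) scene radiance in [0, 1]; t: (H, W, 3) transmission;
    # B: (3,) global atmospheric (background) light.
    return J * t + B * (1.0 - t)
```

Running the model in this forward direction is useful for generating test data; restoration inverts it to recover J_c.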
Further, the formula of the red dark channel prior knowledge used in the red dark channel map calculation in step S02 is as follows:
J_RED(x) = min_{y∈Ω(x)} min{ 1 - J_R(y), J_G(y), J_B(y) }
where x, y denote pixel positions, J_RED(x) denotes the red-dark channel map, Ω(x) denotes a local area centered on x, and J_R(y), J_G(y) and J_B(y) denote the red, green and blue channels of the image, respectively.
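The equation for the prior is rendered as an image in the source; consistent with the listed symbols, the red-dark channel is the per-pixel minimum of (1 - R, G, B) followed by a local minimum filter over Ω(x). A brute-force sketch (function name and window radius are illustrative):

```python
import numpy as np

def red_dark_channel(img, radius=3):
    # Per-pixel minimum of (1 - R, G, B), then a local minimum filter over
    # the (2*radius+1)^2 window Omega(x).  The radius is illustrative.
    per_pixel = np.minimum(np.minimum(1.0 - img[..., 0], img[..., 1]), img[..., 2])
    padded = np.pad(per_pixel, radius, mode='edge')
    k = 2 * radius + 1
    out = np.empty_like(per_pixel)
    for i in range(per_pixel.shape[0]):
        for j in range(per_pixel.shape[1]):
            out[i, j] = padded[i:i + k, j:j + k].min()
    return out
```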
Further, the fusion of the three atmospheric light values in step S03 includes the following steps:
step S31: uniformly dividing the original image into four regions and computing a score for each; when solving for the first atmospheric light value, the score of a region is the variance of its pixel values and the best region is the one with the smallest variance; the best region is again uniformly divided into four, and selection continues until the termination condition
[equation rendered as an image in the source: the stopping criterion, expressed through the per-channel count |I_c > 0.5| and the threshold ε_n]
is met; the average pixel value of the final region is then taken as the first atmospheric light value B_1,
where c is one of the red, green and blue channels; ε_n takes the value 0.2; and |I_c > 0.5| denotes the number of pixel values greater than 0.5 in a given channel of the image;
the second atmospheric light value B_2 is obtained by the same process as B_1, except that the score of a region is the sum of the absolute differences between the red-channel and the green- and blue-channel pixel values, and the best region is the one with the smallest sum;
in computing B_2, this sum is calculated as:
C_2 = |I_R(x) - I_G(x)| + |I_R(x) - I_B(x)|
where I_R, I_G and I_B denote the pixel values of the red, green and blue channels of the image, respectively;
step S32: taking the average of the largest 0.1% of pixel values in the red-dark channel map as the third atmospheric light value B_3;
step S33: fusing B_1, B_2 and B_3 through two adaptive parameters α and β to obtain the global atmospheric light value B; the formulas for α and β are:
[equations rendered as images in the source: the definitions of α and β in terms of the Sigmoid function S, the grayscale map grI, the red-channel values I_r, and the thresholds ε_α and ε_β]
where grI is the grayscale map of the original image; I_r is the red-channel pixel value in the original image; ε_α = 0.3 and ε_β = 0.2; and S is a Sigmoid function defined as:
S(a, v) = [1 + e^(-s(a - v))]^(-1)
where s is an empirical constant with value 32; the formula for the per-channel fusion of B_1, B_2 and B_3 is:
B = β(αB_1 + (1 - α)B_2) + (1 - β)B_3
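Steps S31 to S33 can be sketched together as follows. The exact termination condition and the α, β formulas appear only as equation images in the source, so this sketch stops at a minimum region size and feeds the sigmoids the mean grayscale and mean red-channel values; both are stand-in assumptions, flagged in the comments:

```python
import numpy as np

def quadtree_light(img, score, min_size=8):
    # Repeatedly split the current region into four quadrants and keep the
    # best-scoring one (lowest score, since both criteria in the text are
    # minima).  The patent's termination condition is an equation image in
    # the source, so a minimum-region-size stop is used here as a stand-in.
    region = img
    while min(region.shape[:2]) >= 2 * min_size:
        h, w = region.shape[0] // 2, region.shape[1] // 2
        quads = [region[:h, :w], region[:h, w:], region[h:, :w], region[h:, w:]]
        region = min(quads, key=score)
    return region.reshape(-1, region.shape[-1]).mean(axis=0)

# Score for B_1: variance of the pixel values in the region.
score_b1 = lambda r: r.var()
# Score for B_2: sum of |R - G| + |R - B| over the region.
score_b2 = lambda r: (np.abs(r[..., 0] - r[..., 1]) + np.abs(r[..., 0] - r[..., 2])).sum()

def sigmoid(a, v, s=32):
    # S(a, v) = 1 / (1 + exp(-s * (a - v))), with s = 32 per the text.
    return 1.0 / (1.0 + np.exp(-s * (a - v)))

def fuse_atmospheric_light(B1, B2, B3, gray_mean, red_mean,
                           eps_alpha=0.3, eps_beta=0.2):
    # Per-channel fusion B = beta*(alpha*B1 + (1-alpha)*B2) + (1-beta)*B3.
    # The exact sigmoid arguments are equation images in the source; the
    # mean grayscale value and mean red-channel value used here are
    # assumptions consistent with the listed symbols grI and I_r.
    alpha = sigmoid(gray_mean, eps_alpha)
    beta = sigmoid(red_mean, eps_beta)
    return beta * (alpha * B1 + (1 - alpha) * B2) + (1 - beta) * B3
```

With a bright red-rich image both sigmoids saturate toward 1 and the fused value approaches B_1, matching the intent that the quadtree estimates dominate for well-lit scenes.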
further, the saturation estimation formula in step S04 is as follows:
Sat(x) = [max_c I_c(x) - min_c I_c(x)] / max_c I_c(x), c ∈ {R, G, B}
where max() and min() are taken over the red, green and blue channels of the image; after adding the saturation component, the red-dark channel prior is modified to:
[equation rendered as an image in the source: the saturation-corrected red-dark channel prior]
where J_RED-SAT(x) denotes the corrected red-dark channel map.
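The saturation used here matches the standard HSV definition implied by the max()/min() description; a minimal sketch:

```python
import numpy as np

def saturation(img, eps=1e-6):
    # Sat(x) = (max_c I_c(x) - min_c I_c(x)) / max_c I_c(x) over the
    # R, G, B channels, i.e. the standard HSV saturation.
    mx = img.max(axis=-1)
    mn = img.min(axis=-1)
    return (mx - mn) / np.maximum(mx, eps)
```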
Further, the formula for solving the rough transmission map in step S05 is as follows:
[equation rendered as an image in the source: the rough transmission map, expressed through the atmospheric light components B_R, B_G, B_B, the saturation Sat(y), and the weight λ]
where B_R, B_G and B_B are the RGB channel components of the global atmospheric light value obtained in step S03; Sat(y) is the saturation obtained in step S04; and λ = 0.6.
Further, the finding of the refined transmission map in step S06 includes the steps of:
step S61: performing guided filtering on the rough transmission map from step S05 to obtain a content map; the guided filter takes the form:
c = guided_filter(I, p, r, eps)
where I is the grayscale map of the original image; p is the rough transmission map; r is the local window radius, with value 22; and eps is the regularization parameter, with value 0.0001;
step S62: decomposing the rough transmission map into a content map and a contour map; the decomposition is:
d = t - c
where d denotes the contour map, t the rough transmission map, and c the content map obtained in step S61;
step S63: optimizing the detail of the contour map with a sharpening formula based on local variance; let x_kl be the gray value at a point of the image, and define the local area as a window of size (2m+1)(2n+1) centered at (i, j), where m and n are integers; then:
μ_ij = [1 / ((2m+1)(2n+1))] Σ_{k=i-m}^{i+m} Σ_{l=j-n}^{j+n} x_kl
v_ij = [1 / ((2m+1)(2n+1))] Σ_{k=i-m}^{i+m} Σ_{l=j-n}^{j+n} (x_kl - μ_ij)²
where μ_ij denotes the local mean, v_ij the local variance, (2m+1)(2n+1) the size of the local window, and x_kl the gray values within it; the sharpening then applies:
V = var(t)
D = (1 + v_ij / V)(t - μ_ij)
where V denotes the overall variance of the rough transmission map and D the refined contour map;
step S64: fusing the content map and the contour map to obtain the refined transmission map:
t = c + D
where c is the content map obtained in step S61 and D is the contour map optimized in step S63.
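Steps S62 to S64 can be sketched as follows, taking the content map c as given (the patent computes it by guided filtering with r = 22, eps = 0.0001; any edge-preserving smoother can stand in). The brute-force box mean is for clarity, not speed:

```python
import numpy as np

def box_mean(x, radius):
    # Mean over a (2r+1)^2 window with edge padding (brute force, for clarity).
    p = np.pad(x, radius, mode='edge')
    k = 2 * radius + 1
    out = np.empty(x.shape, dtype=float)
    for i in range(x.shape[0]):
        for j in range(x.shape[1]):
            out[i, j] = p[i:i + k, j:j + k].mean()
    return out

def refine_transmission(t, c, radius=2):
    # Steps S62-S64: d = t - c; sharpen with D = (1 + v_ij/V)(t - mu_ij),
    # V = var(t); recombine as t_refined = c + D.  The content map c is
    # taken as given here (the patent computes it by guided filtering).
    mu = box_mean(t, radius)                  # local mean mu_ij
    v = box_mean(t * t, radius) - mu * mu     # local variance v_ij
    V = t.var()                               # variance of the rough map
    D = (1.0 + v / max(V, 1e-6)) * (t - mu)   # amplified contour detail
    return np.clip(c + D, 0.0, 1.0)
```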
Further, the formula for solving the initial restored image in step S07 is as follows:
J_c(x) = (I_c(x) - B_c) / max(t_c(x), t_0) + B_c
where J_c(x) is the initial restored image; I_c(x) is the original image; B_c is the global atmospheric light value obtained in step S03; t_c(x) is the refined transmission map obtained in step S06; and t_0, with value 0.1, is a lower bound introduced so that a too-low transmission does not produce an over-bright restored image.
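The restoration step is then a direct inversion of the imaging model with the transmission floored at t_0 = 0.1; a minimal sketch:

```python
import numpy as np

def restore(I, B, t, t0=0.1):
    # J_c(x) = (I_c(x) - B_c) / max(t_c(x), t0) + B_c, with t0 = 0.1
    # as the lower bound on the transmission.
    return (I - B) / np.maximum(t, t0) + B
```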
Compared with the prior art, the invention has the following advantages:
1. To address the color distortion caused by inaccurate atmospheric light estimation in restoration methods, the atmospheric light value is solved from the characteristics of the original image and adaptively fused according to the image brightness and the magnitude of the red-channel pixel values. The atmospheric light value can thus be found accurately regardless of whether the image has a blue, green, blue-green or other color cast, so the restored image has better color fidelity.
2. The invention decomposes the rough transmission map, sharpens the contour map with a local-variance-based method, and then fuses the results, which enhances the detail information of the transmission map and ultimately greatly improves the clarity of the restored image.
For the above reasons, the present invention can be widely applied to the fields of image processing and the like.
Drawings
In order to more clearly illustrate the embodiments of the present invention or the technical solutions in the prior art, the drawings needed to be used in the description of the embodiments or the prior art will be briefly introduced below, and it is obvious that the drawings in the following description are some embodiments of the present invention, and for those skilled in the art, other drawings can be obtained according to these drawings without creative efforts.
FIG. 1 is a schematic flow chart of the present invention.
Fig. 2 compares the restoration effect of the present invention on a diver image with other methods, where (a) shows the initial image before restoration, (b) the result of the ULAP method, (c) the result of the RGHS method, (d) the result of the IBLA method, and (e) the result of the restoration method of the present invention.
Fig. 3 compares the restoration effect of the present invention on a fish image with other methods, where (a) shows the initial image before restoration, (b) the result of the IBLA method, (c) the result of the RGHS method, (d) the result of the ULAP method, and (e) the result of the restoration method of the present invention.
Fig. 4 compares the restoration effect of the present invention on a grouper image with other methods, where (a) shows the initial image before restoration, (b) the result of the IBLA method, (c) the result of the RGHS method, (d) the result of the ULAP method, and (e) the result of the restoration method of the present invention.
Detailed Description
In order to make the technical solutions of the present invention better understood, the technical solutions in the embodiments of the present invention will be clearly and completely described below with reference to the drawings in the embodiments of the present invention, and it is obvious that the described embodiments are only a part of the embodiments of the present invention, and not all of the embodiments. All other embodiments, which can be derived by a person skilled in the art from the embodiments given herein without making any creative effort, shall fall within the protection scope of the present invention.
It should be noted that the terms "first," "second," and the like in the description and claims of the present invention and in the drawings described above are used for distinguishing between similar elements and not necessarily for describing a particular sequential or chronological order. It is to be understood that the data so used is interchangeable under appropriate circumstances such that the embodiments of the invention described herein are capable of operation in sequences other than those illustrated or described herein. Furthermore, the terms "comprises," "comprising," and "having," and any variations thereof, are intended to cover a non-exclusive inclusion, such that a process, method, system, article, or apparatus that comprises a list of steps or elements is not necessarily limited to those steps or elements expressly listed, but may include other steps or elements not expressly listed or inherent to such process, method, article, or apparatus.
Example 1
As shown in fig. 1, the present invention provides an underwater image restoration method based on adaptive atmospheric light fusion, which includes the following steps:
step S01: acquiring an original image, and establishing an underwater image restoration model for the original image;
the underwater image restoration model in step S01 is:
I_c(x) = J_c(x)t_c(x) + B_c(1 - t_c(x))
where I_c is the original input image, J_c is the restored output image, c denotes one of the R, G, B color channels of the image, t_c(x) denotes the transmission, and B_c denotes the global atmospheric light value;
step S02: obtaining a red-dark channel map, in which the red channel is inverted, by using red-dark channel prior knowledge;
the formula of the prior knowledge of the red dark channel used in the step S02 to calculate the red dark channel map is:
J_RED(x) = min_{y∈Ω(x)} min{ 1 - J_R(y), J_G(y), J_B(y) }
where x, y denote pixel positions, J_RED(x) denotes the red-dark channel map, Ω(x) denotes a local area centered on x, and J_R(y), J_G(y) and J_B(y) denote the red, green and blue channels of the image, respectively;
step S03: solving a first atmospheric light value and a second atmospheric light value by using a quadtree search method, taking the average value of the maximum 0.1% pixel values in all pixels in the red-dark channel image as a third atmospheric light value, and fusing the three atmospheric light values to obtain a global atmospheric light value;
the fusion of the three atmospheric light values in step S03 includes the following steps:
step S31: uniformly dividing the original image into four regions and computing a score for each; when solving for the first atmospheric light value, the score of a region is the variance of its pixel values and the best region is the one with the smallest variance; the best region is again uniformly divided into four, and selection continues until the termination condition
[equation rendered as an image in the source: the stopping criterion, expressed through the per-channel count |I_c > 0.5| and the threshold ε_n]
is met; the average pixel value of the final region is then taken as the first atmospheric light value B_1,
where c is one of the red, green and blue channels; ε_n takes the value 0.2; and |I_c > 0.5| denotes the number of pixel values greater than 0.5 in a given channel of the image;
the second atmospheric light value B_2 is obtained by the same process as B_1, except that the score of a region is the sum of the absolute differences between the red-channel and the green- and blue-channel pixel values, and the best region is the one with the smallest sum;
in computing B_2, this sum is calculated as:
C_2 = |I_R(x) - I_G(x)| + |I_R(x) - I_B(x)|
where I_R, I_G and I_B denote the pixel values of the red, green and blue channels of the image, respectively;
step S32: taking the average of the largest 0.1% of pixel values in the red-dark channel map as the third atmospheric light value B_3;
step S33: fusing B_1, B_2 and B_3 through two adaptive parameters α and β to obtain the global atmospheric light value B; the formulas for α and β are:
[equations rendered as images in the source: the definitions of α and β in terms of the Sigmoid function S, the grayscale map grI, the red-channel values I_r, and the thresholds ε_α and ε_β]
where grI is the grayscale map of the original image; I_r is the red-channel pixel value in the original image; ε_α = 0.3 and ε_β = 0.2; and S is a Sigmoid function defined as:
S(a, v) = [1 + e^(-s(a - v))]^(-1)
where s is an empirical constant with value 32; the formula for the per-channel fusion of B_1, B_2 and B_3 is:
B = β(αB_1 + (1 - α)B_2) + (1 - β)B_3
step S04: estimating the saturation of the original image;
the saturation estimation formula in step S04 is as follows:
Sat(x) = [max_c I_c(x) - min_c I_c(x)] / max_c I_c(x), c ∈ {R, G, B}
where max() and min() are taken over the red, green and blue channels of the image; after adding the saturation component, the red-dark channel prior is modified to:
[equation rendered as an image in the source: the saturation-corrected red-dark channel prior]
where J_RED-SAT(x) denotes the corrected red-dark channel map;
step S05: according to the global atmospheric light value obtained in step S03 and the saturation obtained in step S04, solving a rough transmission map by applying dark channel prior knowledge;
the formula for solving the rough transmission map in step S05 is as follows:
[equation rendered as an image in the source: the rough transmission map, expressed through the atmospheric light components B_R, B_G, B_B, the saturation Sat(y), and the weight λ]
where B_R, B_G and B_B are the RGB channel components of the global atmospheric light value obtained in step S03; Sat(y) is the saturation obtained in step S04; and λ = 0.6;
step S06: decomposing and refining the rough transmission map through guided filtering to obtain a refined transmission map;
the step of obtaining the refined transmission map in the step S06 includes the steps of:
step S61: performing guided filtering on the rough transmission map from step S05 to obtain a content map; the guided filter takes the form:
c = guided_filter(I, p, r, eps)
where I is the grayscale map of the original image; p is the rough transmission map; r is the local window radius, with value 22; and eps is the regularization parameter, with value 0.0001;
step S62: decomposing the rough transmission map into a content map and a contour map; the decomposition is:
d = t - c
where d denotes the contour map, t the rough transmission map, and c the content map obtained in step S61;
step S63: optimizing the detail of the contour map with a sharpening formula based on local variance; let x_kl be the gray value at a point of the image, and define the local area as a window of size (2m+1)(2n+1) centered at (i, j), where m and n are integers; then:
μ_ij = [1 / ((2m+1)(2n+1))] Σ_{k=i-m}^{i+m} Σ_{l=j-n}^{j+n} x_kl
v_ij = [1 / ((2m+1)(2n+1))] Σ_{k=i-m}^{i+m} Σ_{l=j-n}^{j+n} (x_kl - μ_ij)²
where μ_ij denotes the local mean, v_ij the local variance, (2m+1)(2n+1) the size of the local window, and x_kl the gray values within it; the sharpening then applies:
V = var(t)
D = (1 + v_ij / V)(t - μ_ij)
where V denotes the overall variance of the rough transmission map and D the refined contour map;
step S64: fusing the content map and the contour map to obtain the refined transmission map:
t = c + D
where c is the content map obtained in step S61 and D is the contour map optimized in step S63;
step S07: substituting the global atmospheric light value obtained in step S03 and the refined transmission map obtained in step S06 into the underwater image restoration model to obtain an initial restored image;
the formula for solving the initial restored image in step S07 is as follows:
J_c(x) = (I_c(x) - B_c) / max(t_c(x), t_0) + B_c
where J_c(x) is the initial restored image; I_c(x) is the original image; B_c is the global atmospheric light value obtained in step S03; t_c(x) is the refined transmission map obtained in step S06; and t_0, with value 0.1, is a lower bound introduced so that a too-low transmission does not produce an over-bright restored image;
step S08: and carrying out automatic color gradation on the initial restoration image to obtain a final restoration image.
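"Automatic color gradation" (auto levels) is commonly implemented as a per-channel percentile stretch; the sketch below is one such reading, with clip percentages that are illustrative and not specified by the patent:

```python
import numpy as np

def auto_levels(img, low_pct=0.5, high_pct=99.5):
    # Per-channel linear stretch between low/high percentiles, one common
    # reading of "automatic color gradation" (auto levels).  The clip
    # percentages are illustrative, not taken from the patent.
    out = np.empty_like(img, dtype=float)
    for ch in range(img.shape[-1]):
        lo = np.percentile(img[..., ch], low_pct)
        hi = np.percentile(img[..., ch], high_pct)
        out[..., ch] = np.clip((img[..., ch] - lo) / max(hi - lo, 1e-6), 0.0, 1.0)
    return out
```

Stretching each channel independently also counteracts any residual color cast, since the channel histograms are re-aligned to span the same range.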
To verify the defogging effectiveness of the present invention, this embodiment selects underwater images of different scenes as a test set and performs comparative analysis, both qualitative and quantitative, against the experimental results of the IBLA method (Underwater Image Restoration Based on Image Blurriness and Light Absorption), the RGHS method (Shallow-Water Image Enhancement Using Relative Global Histogram Stretching Based on Adaptive Parameter Acquisition), and the ULAP method (A Rapid Scene Depth Estimation Model Based on Underwater Light Attenuation Prior for Underwater Image Restoration).
As shown in Fig. 2, which compares the restoration of the diver image with the other methods, all four methods restore the underwater image to some extent and improve its contrast. However, the ULAP result is poor: its color cast is more serious than that of the original image, and the contrast of the distant region is not enhanced. Although the IBLA and RGHS methods improve the contrast of the distant region to a degree, RGHS does not remove the color cast thoroughly, and the foreground processed by IBLA loses detail and is unclear. Compared with these methods, the underwater image processed by the method of the present invention better resolves the color cast and improves the contrast of both the distant and near regions. The method therefore achieves a good restoration effect, effectively removes the color cast of the underwater image, raises the overall contrast and clarity, and realizes detail enhancement and color fidelity.
As shown in Fig. 3, which compares the restoration of the fish image with the other methods, all four methods restore the underwater image to some extent and improve its contrast. However, the ULAP result is poor: its output is yellowish compared to the original image, and the contrast of the distant region is not enhanced. Although the IBLA and RGHS methods improve the contrast of the distant region to a degree, RGHS does not remove the color cast thoroughly, and the foreground processed by IBLA loses detail and is unclear. Compared with these methods, the underwater image processed by the method of the present invention better resolves the color cast and improves the contrast of both the distant and near regions. The method therefore achieves a good restoration effect, effectively removes the color cast of the underwater image, raises the overall contrast and clarity, and realizes detail enhancement and color fidelity.
As shown in Fig. 4, which compares the restoration of the grouper image with the other methods, all four methods restore the underwater image to some extent and improve its contrast. However, the ULAP result is poor: its color cast is more serious than that of the original image, and the contrast of the distant region is not enhanced. Although the IBLA and RGHS methods improve the contrast of the distant region to a degree, RGHS does not remove the color cast thoroughly, and the foreground processed by IBLA loses detail and is overexposed. Compared with these methods, the underwater image processed by the method of the present invention better resolves the color cast and improves the contrast of both the distant and near regions. The method therefore achieves a good restoration effect, effectively removes the color cast of the underwater image, raises the overall contrast and clarity, and realizes detail enhancement and color fidelity.
In this embodiment, the experimental results of the different methods are compared on two objective indexes: average gradient (AG) and UIQM. As the data in Tables 1 and 2 show, the average gradient and UIQM of the IBLA, RGHS and ULAP methods and of the present invention are all larger than those of the original image. Although the UIQM of the three comparison methods exceeds that of the original image, their improvement in average gradient is small, indicating that while IBLA, RGHS and ULAP can improve image quality to some extent, their clarity and contrast remain insufficient and their color-cast removal is incomplete. The present invention obtains the atmospheric light value more accurately through adaptive atmospheric light fusion, which effectively resolves the color cast; guided filtering preserves edges and enhances details; and automatic color-level adjustment improves the contrast. The method therefore improves the average gradient and UIQM of the original image by a larger margin and outperforms the other underwater image restoration methods.
TABLE 1 average gradient AG comparison of treatment results of the inventive and other methods
TABLE 2 UIQM comparison of results of the inventive and other methods
Finally, it should be noted that: the above examples are only intended to illustrate the technical solution of the present invention, but not to limit it; although the present invention has been described in detail with reference to the foregoing embodiments, it will be understood by those of ordinary skill in the art that: the technical solutions described in the foregoing embodiments may still be modified, or some or all of the technical features may be equivalently replaced; and the modifications or the substitutions do not make the essence of the corresponding technical solutions depart from the scope of the technical solutions of the embodiments of the present invention.

Claims (8)

1. An underwater image restoration method based on self-adaptive atmospheric light fusion is characterized by comprising the following steps:
step S01: acquiring an original image, and establishing an underwater image restoration model for the original image;
step S02: obtaining a red and dark channel image after the inversion of a red channel by using the priori knowledge of the red and dark channel;
step S03: solving a first atmospheric light value and a second atmospheric light value by using a quadtree search method, taking the average value of the maximum 0.1% pixel values in all pixels in the red-dark channel image as a third atmospheric light value, and fusing the three atmospheric light values to obtain a global atmospheric light value;
step S04: estimating the saturation of the original image;
step S05: according to the global atmospheric light value obtained in the step S03 and the saturation obtained in the step S04, solving a rough transmission diagram by applying dark channel priori knowledge;
step S06: decomposing and refining the rough transmission diagram through guiding filtering to obtain a refined transmission diagram;
step S07: substituting the global atmospheric light value obtained in step S03 and the refined transmission map obtained in step S06 into the underwater image restoration model to obtain an initial restored image;
step S08: and carrying out automatic color gradation on the initial restoration image to obtain a final restoration image.
2. The underwater image restoration method based on adaptive atmospheric light fusion according to claim 1, wherein the underwater image restoration model in step S01 is:
Ic(x)=Jc(x)tc(x)+Bc(1-tc(x))
wherein Ic is the original input image, Jc is the restored output image, c represents one of the R, G, B color channels of the image, tc(x) represents the transmission rate, and Bc represents the global atmospheric light value.
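The degradation model of claim 2 can be illustrated with a small numeric sketch; the pixel, transmission and atmospheric light values below are arbitrary toy numbers (assumptions), not data from the patent:

```python
import numpy as np

# Toy scene radiance J, per-channel transmission t, atmospheric light B.
J = np.array([0.8, 0.6, 0.4])      # true colour of one pixel (R, G, B)
t = np.array([0.5, 0.7, 0.9])      # per-channel transmission t_c(x)
B = np.array([0.2, 0.5, 0.7])      # global atmospheric light B_c

# Forward model: I_c(x) = J_c(x) * t_c(x) + B_c * (1 - t_c(x))
I = J * t + B * (1 - t)

# Inversion (the basis of claim 8): J_c = (I_c - B_c) / t_c + B_c
J_rec = (I - B) / t + B
print(np.allclose(J_rec, J))  # True
```

The whole restoration pipeline amounts to estimating B and t from the degraded image alone, then applying this inversion.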
3. The underwater image restoration method based on adaptive atmospheric light fusion and transmittance optimization according to claim 1, wherein the formula of the red dark channel prior knowledge used in step S02 to find the red dark channel map is as follows:
JRED(x) = min_{y∈Ω(x)} min{ 1 − JR(y), JG(y), JB(y) }
where x, y denote different pixel positions, JRED(x) denotes the red dark channel map, Ω(x) denotes a local area centered on x, and JR(y), JG(y) and JB(y) denote the red, green and blue channels of the image, respectively.
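The red dark channel of claim 3 can be sketched as follows, following the standard red channel prior construction (inverted red channel joined with green and blue under a local minimum filter); the square window radius is an arbitrary illustrative choice:

```python
import numpy as np
from scipy.ndimage import minimum_filter

def red_dark_channel(img, radius=3):
    """Red dark channel prior: local minimum over {1 - R, G, B}.

    `img` is an H x W x 3 float array in [0, 1] (R, G, B order).
    The red channel is inverted before the per-pixel minimum, then a
    local minimum filter over the window Omega(x) is applied; a
    (2*radius+1)-pixel square window stands in for Omega(x).
    """
    inv_red = 1.0 - img[..., 0]
    per_pixel_min = np.minimum(inv_red, np.minimum(img[..., 1], img[..., 2]))
    size = 2 * radius + 1
    return minimum_filter(per_pixel_min, size=size, mode='nearest')
```

On a reddish underwater-style pixel such as (R, G, B) = (0.9, 0.5, 0.3), the channel value is min(0.1, 0.5, 0.3) = 0.1: strong red attenuation drives the red dark channel toward zero, which is the prior the method exploits.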
4. The underwater image restoration method based on the adaptive atmospheric light fusion of claim 1, wherein the fusion of the three atmospheric light values in the step S03 comprises the following steps:
step S31: uniformly dividing the original image into four regions and calculating the score of each region; when obtaining the first atmospheric light value, the score of each region is defined as the minimum value of the variance of the pixel values in the region; selecting the region with the highest score from the four regions, uniformly dividing that region into four regions again, and continuing to select the region with the highest score until the end condition is met
Figure RE-FDA0003131693500000021
the process then stops, and the average value of the pixels in the final region is taken as the first atmospheric light value B1;
wherein c is one of the red, green and blue channels; εn takes the value 0.2; and |Ic > 0.5| denotes the number of pixel values greater than 0.5 in channel c of the image;
the second atmospheric light value B2 is obtained in the same way as B1, except that the score of each region is defined as the minimum value of the sum of the absolute differences between the red-channel pixel values and the green- and blue-channel pixel values in the region;
in the process of calculating B2, the sum of the absolute differences between the red channel and the green and blue channel pixel values is calculated by the following formula:
C2=|IR(x)-IG(x)|+|IR(x)-IB(x)|
wherein IR, IG and IB represent the pixel values of the red, green and blue channels of the image, respectively;
step S32: taking the average value of the largest 0.1% of pixel values among all pixels in the red dark channel map as the third atmospheric light value B3;
step S33: fusing B1, B2 and B3 through two adaptive parameters α and β to obtain the global atmospheric light value B, wherein the formulas for calculating α and β are as follows:
α = S(avg(grI), εα)
β = S(avg(Ir), εβ)
wherein grI is the grayscale map of the original image; Ir is the red-channel pixel value of the original image; εα = 0.3 and εβ = 0.2; and S is the Sigmoid function defined as follows:
S(a, v) = [1 + e^(−s(a−v))]^(−1)
wherein s is an empirical constant with a value of 32; the formula for fusing B1, B2 and B3 channel by channel is as follows:
B=β(αB1+(1-α)B2)+(1-β)B3
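The fusion step of claim 4 can be sketched as below. Note that the exact α and β formulas appear only as images in the source document; driving them by the mean grey level and the mean red-channel level, respectively, is an assumption made for illustration:

```python
import numpy as np

def sigmoid_S(a, v, s=32):
    """S(a, v) = [1 + exp(-s * (a - v))]^(-1), with s = 32 per the claim."""
    return 1.0 / (1.0 + np.exp(-s * (a - v)))

def fuse_atmospheric_light(B1, B2, B3, img, eps_alpha=0.3, eps_beta=0.2):
    """Fuse three candidate atmospheric light values channel by channel.

    B1, B2, B3 are length-3 arrays (R, G, B); `img` is H x W x 3 in
    [0, 1].  ASSUMPTION: alpha is driven by the mean grey level and
    beta by the mean red level, since the source shows the alpha/beta
    formulas only as images.
    """
    gray = img.mean(axis=2)
    alpha = sigmoid_S(gray.mean(), eps_alpha)
    beta = sigmoid_S(img[..., 0].mean(), eps_beta)
    # B = beta * (alpha * B1 + (1 - alpha) * B2) + (1 - beta) * B3
    return beta * (alpha * B1 + (1 - alpha) * B2) + (1 - beta) * B3
```

Because α, β ∈ (0, 1), the fused B is always a convex combination of the three candidates, so no single estimator can push the atmospheric light outside the range the candidates span.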
5. the underwater image restoration method based on adaptive atmospheric light fusion according to claim 3, wherein the saturation estimation formula in step S04 is as follows:
Sat(x) = [max(IR(x), IG(x), IB(x)) − min(IR(x), IG(x), IB(x))] / max(IR(x), IG(x), IB(x))
wherein R, G, B denote the red, green and blue channels of the image; the max() function takes the maximum of the RGB channel values in the formula, and the min() function takes the minimum; after the saturation component is added, the red dark channel prior knowledge is modified to:
JRED-SAT(x) = min_{y∈Ω(x)} min{ 1 − JR(y), JG(y), JB(y), Sat(y) }
wherein, JRED-SAT(x) A corrected red dark channel map is shown.
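The saturation estimate of claim 5 is the standard HSV-style saturation and can be sketched directly; the epsilon guard against pure-black pixels is an implementation detail added here:

```python
import numpy as np

def saturation(img):
    """Sat(x) = (max_c I_c(x) - min_c I_c(x)) / max_c I_c(x).

    `img` is H x W x 3 in [0, 1]; a small epsilon (an added guard,
    not from the claim) avoids division by zero on pure-black pixels.
    """
    mx = img.max(axis=2)
    mn = img.min(axis=2)
    return (mx - mn) / np.maximum(mx, 1e-6)
```

A grey pixel has saturation 0 while a pure-colour pixel has saturation 1, so the component distinguishes strongly colour-cast regions from neutral ones before it is folded into the red dark channel.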
6. The underwater image restoration method based on adaptive atmospheric light fusion according to claim 1, wherein the formula for solving the rough transmission map in step S05 is as follows:
t(x) = 1 − min_{y∈Ω(x)} min{ (1 − IR(y)) / (1 − BR), IG(y) / BG, IB(y) / BB, λ·Sat(y) }
wherein BR, BG and BB are respectively the RGB channel components of the global atmospheric light value obtained in step S03; Sat(y) is the saturation obtained in step S04; and λ takes the value 0.6.
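The coarse transmission of claim 6 can be sketched as follows. The exact formula is rendered only as an image in the source, so this sketch follows the usual red-channel-prior form (inverted red normalised by 1 − BR, green/blue by BG, BB, with a λ-weighted saturation term joining the minimum); that reading is an assumption:

```python
import numpy as np
from scipy.ndimage import minimum_filter

def coarse_transmission(img, B, sat, radius=3, lam=0.6):
    """Coarse transmission from the red dark channel with a saturation term.

    `img` is H x W x 3 in [0, 1], `B` a length-3 atmospheric light,
    `sat` the H x W saturation map from step S04.  ASSUMPTION: the
    four terms below reproduce the red-channel-prior literature; the
    patent shows the formula only as an image.
    """
    size = 2 * radius + 1
    terms = np.stack([
        (1.0 - img[..., 0]) / max(1.0 - B[0], 1e-6),  # inverted red term
        img[..., 1] / max(B[1], 1e-6),                # green term
        img[..., 2] / max(B[2], 1e-6),                # blue term
        lam * sat,                                    # saturation term
    ], axis=-1)
    local_min = minimum_filter(terms.min(axis=-1), size=size, mode='nearest')
    return np.clip(1.0 - local_min, 0.0, 1.0)
```

The clip keeps the map in [0, 1]; the lower bound t0 of claim 8 is applied later, at restoration time.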
7. The underwater image restoration method based on the adaptive atmospheric light fusion of claim 1, wherein the step of obtaining the refined transmission map in the step S06 comprises the following steps:
step S61: performing guided filtering on the rough transmission map of step S05 to obtain a content map, wherein the formula of the guided filtering is as follows:
c=guided_filter(I,p,r,eps)
wherein I is the grayscale map of the original image; p is the rough transmission map; r is the local window radius, with a value of 22; and eps is the regularization parameter, with a value of 0.0001;
step S62: decomposing the rough transmission map into a content map and a contour map, the decomposition adopting the following formula:
d = t − c
wherein d represents the contour map, t represents the rough transmission map, and c represents the content map obtained in step S61;
step S63: optimizing the detail of the contour map using a sharpening formula based on local variance; assuming xkl is the gray value of a certain point in the image, the local area is defined as a region with a window size of (2m+1)(2n+1) centered on (i, j), where m and n are integers; the formulas are as follows:
μij = [1 / ((2m+1)(2n+1))] · Σ(k=i−m to i+m) Σ(l=j−n to j+n) xkl
vij = [1 / ((2m+1)(2n+1))] · Σ(k=i−m to i+m) Σ(l=j−n to j+n) (xkl − μij)²
wherein μij denotes the local mean, vij denotes the local variance, (2m+1)(2n+1) denotes the size of the local window, and xkl denotes a gray value within the local window;
V=var(t)
D=(1+vij/V)(t-μij)
wherein V represents the variance of the rough transmission map, and D represents the refined contour map;
step S64: and fusing the content graph and the outline graph to obtain a refined transmission graph, wherein the fusion formula is as follows:
t=c+D
where c is the content map obtained in step S61, and D is the contour map optimized in step S63.
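The refinement of claim 7 can be sketched end to end with a box-filter guided filter (He et al.'s formulation; the patent does not name a specific variant, so that choice is an assumption). The d = t − c decomposition of step S62 is implicit here, since the sharpening formula of step S63 operates on t − μij directly:

```python
import numpy as np
from scipy.ndimage import uniform_filter

def guided_filter(I, p, r=22, eps=1e-4):
    """Box-filter guided filter: guide image I, filtering input p."""
    size = 2 * r + 1
    mean_I = uniform_filter(I, size)
    mean_p = uniform_filter(p, size)
    corr_Ip = uniform_filter(I * p, size)
    corr_II = uniform_filter(I * I, size)
    var_I = corr_II - mean_I * mean_I
    cov_Ip = corr_Ip - mean_I * mean_p
    a = cov_Ip / (var_I + eps)
    b = mean_p - a * mean_I
    # Average the linear coefficients over all windows covering each pixel.
    return uniform_filter(a, size) * I + uniform_filter(b, size)

def refine_transmission(gray, t, r=22, eps=1e-4, m=2, n=2):
    """Claim 7 sketch: content map via guided filtering, contour map
    sharpened by local variance, then recombined as t = c + D."""
    c = guided_filter(gray, t, r, eps)            # step S61: content map
    win = (2 * m + 1, 2 * n + 1)                  # local window (2m+1)(2n+1)
    mu = uniform_filter(t, win)                   # local mean mu_ij
    v = uniform_filter(t * t, win) - mu * mu      # local variance v_ij
    V = max(t.var(), 1e-6)                        # variance of the rough map
    D = (1.0 + v / V) * (t - mu)                  # step S63: sharpened contour
    return np.clip(c + D, 0.0, 1.0)               # step S64: refined map
```

High-variance neighbourhoods receive a gain above 1 via (1 + vij/V), so edges in the transmission map are amplified while flat regions pass through the guided-filter content map essentially unchanged.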
8. The underwater image restoration method based on adaptive atmospheric light fusion according to claim 1, wherein the formula for solving the initial restoration image in step S07 is as follows:
Jc(x) = (Ic(x) − Bc) / max(tc(x), t0) + Bc
wherein Jc(x) is the initial restored image, Ic(x) is the original image, Bc is the global atmospheric light value obtained in step S03, and tc(x) is the refined transmission map obtained in step S06; to prevent an excessively low transmittance from making the restored image too bright, t0 is introduced as a lower limit, with a value of 0.1.
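The restoration step of claim 8 is a direct inversion of the model in claim 2 with the transmission bounded below by t0; a minimal sketch:

```python
import numpy as np

def restore(img, B, t, t0=0.1):
    """J_c(x) = (I_c(x) - B_c) / max(t_c(x), t0) + B_c, per channel.

    `img` is H x W x 3 in [0, 1]; `B` is a length-3 atmospheric light;
    `t` may be a single H x W map shared by the channels or an
    H x W x 3 per-channel map.  t0 = 0.1 bounds the transmission from
    below so that near-zero transmission cannot over-amplify pixels.
    """
    t = np.asarray(t)
    if t.ndim == 2:
        t = t[..., None]                  # broadcast one map over channels
    J = (img - B) / np.maximum(t, t0) + B
    return np.clip(J, 0.0, 1.0)           # keep the result displayable
```

With t = 1 everywhere the formula reduces to the identity, and as t approaches t0 the correction (I − B)/t0 is the strongest amplification the method will apply.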
CN202110419038.2A 2021-04-19 2021-04-19 Underwater image restoration method based on self-adaptive atmosphere light fusion Active CN113344802B (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN202110419038.2A CN113344802B (en) 2021-04-19 2021-04-19 Underwater image restoration method based on self-adaptive atmosphere light fusion

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
CN202110419038.2A CN113344802B (en) 2021-04-19 2021-04-19 Underwater image restoration method based on self-adaptive atmosphere light fusion

Publications (2)

Publication Number Publication Date
CN113344802A true CN113344802A (en) 2021-09-03
CN113344802B CN113344802B (en) 2024-08-20

Family

ID=77468126

Family Applications (1)

Application Number Title Priority Date Filing Date
CN202110419038.2A Active CN113344802B (en) 2021-04-19 2021-04-19 Underwater image restoration method based on self-adaptive atmosphere light fusion

Country Status (1)

Country Link
CN (1) CN113344802B (en)

Cited By (1)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN114119383A (en) * 2021-09-10 2022-03-01 大连海事大学 Underwater image restoration method based on multi-feature fusion

Citations (11)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN102243758A (en) * 2011-07-14 2011-11-16 浙江大学 Fog-degraded image restoration and fusion based image defogging method
WO2013029337A1 (en) * 2011-08-30 2013-03-07 Fujitsu Limited Image defogging method and system
US20150243003A1 (en) * 2014-02-27 2015-08-27 Samsung Techwin Co., Ltd. Method and apparatus for processing image
US20160071244A1 (en) * 2014-09-04 2016-03-10 National Taipei University Of Technology Method and system for image haze removal based on hybrid dark channel prior
CN106846258A (en) * 2016-12-12 2017-06-13 西北工业大学 A kind of single image to the fog method based on weighted least squares filtering
WO2017175231A1 (en) * 2016-04-07 2017-10-12 Carmel Haifa University Economic Corporation Ltd. Image dehazing and restoration
US20180225545A1 (en) * 2017-02-06 2018-08-09 Mediatek Inc. Image processing method and image processing system
US20190287219A1 (en) * 2018-03-15 2019-09-19 National Chiao Tung University Video dehazing device and method
CN110689504A (en) * 2019-10-11 2020-01-14 大连海事大学 Underwater image restoration method based on secondary guide transmission diagram
CN111754433A (en) * 2020-06-22 2020-10-09 哈尔滨理工大学 Aerial image defogging method
CN112488955A (en) * 2020-12-08 2021-03-12 大连海事大学 Underwater image restoration method based on wavelength compensation


Non-Patent Citations (3)

* Cited by examiner, † Cited by third party
Title
DAI, Chenggang et al.: "Single hazy image restoration using robust atmospheric scattering model", Signal Processing, vol. 166, no. 0, 31 December 2020 (2020-12-31), pages 1 - 11 *
YU Kun et al.: "Non-local prior image dehazing algorithm based on neutrosophy", Optical Technique, vol. 46, no. 4, 31 July 2020 (2020-07-31), pages 476 - 482 *
GAO Qiang; HU Liaolin; CHEN Xin: "Image dehazing method based on dark channel compensation and improved atmospheric light value", Laser & Optoelectronics Progress, no. 06, 31 December 2020 (2020-12-31), pages 150 - 156 *

Cited By (2)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN114119383A (en) * 2021-09-10 2022-03-01 大连海事大学 Underwater image restoration method based on multi-feature fusion
CN114119383B (en) * 2021-09-10 2024-04-26 大连海事大学 Underwater image restoration method based on multi-feature fusion

Also Published As

Publication number Publication date
CN113344802B (en) 2024-08-20

Similar Documents

Publication Publication Date Title
CN109816605B (en) MSRCR image defogging method based on multi-channel convolution
CN111047530B (en) Underwater image color correction and contrast enhancement method based on multi-feature fusion
CN108876743B (en) Image rapid defogging method, system, terminal and storage medium
Emberton et al. Hierarchical rank-based veiling light estimation for underwater dehazing.
CN110570365B (en) Image defogging method based on prior information
CN110889812B (en) Underwater image enhancement method for multi-scale fusion of image characteristic information
CN110689504B (en) Underwater image restoration method based on secondary guide transmission diagram
CN110689587A (en) Underwater image enhancement method based on color correction and detail enhancement
Soni et al. An improved image dehazing technique using CLAHE and guided filter
CN109118446B (en) Underwater image restoration and denoising method
CN110782407B (en) Single image defogging method based on sky region probability segmentation
Pei et al. Effective image haze removal using dark channel prior and post-processing
CN108182671B (en) Single image defogging method based on sky area identification
CN108133462B (en) Single image restoration method based on gradient field region segmentation
CN109272475B (en) Method for rapidly and effectively repairing and strengthening underwater image color
CN114331873A (en) Non-uniform illumination color image correction method based on region division
CN114119383B (en) Underwater image restoration method based on multi-feature fusion
CN115937019A (en) Non-uniform defogging method combining LSD (local Scale decomposition) quadratic segmentation and deep learning
Chaudhry et al. Underwater visibility restoration using dehazing, contrast enhancement and filtering
CN113344802B (en) Underwater image restoration method based on self-adaptive atmosphere light fusion
Pramunendar et al. A novel approach for underwater image enhancement based on improved dark channel prior with colour correction
CN116883259A (en) Underwater image enhancement method based on denoising diffusion probability model
Dhanya et al. L-CLAHE intensification filter (L-CIF) algorithm for underwater image enhancement and colour restoration
CN109064425A (en) A kind of image de-noising method of adaptive non local total variation
de Dravo et al. An adaptive combination of dark and bright channel priors for single image dehazing

Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination
GR01 Patent grant