CN111861896A: UUV-oriented underwater image color compensation and recovery method

Info

Publication number
CN111861896A
Authority
CN
China
Prior art keywords
image
underwater
color
pixel
channel
Prior art date
Legal status
Pending
Application number
CN201910358840.8A
Other languages
Chinese (zh)
Inventor
梁洪涛
朱鑫
尚茹月
申秀清
史永然
曹仁杰
Current Assignee
Shaanxi Normal University
Original Assignee
Shaanxi Normal University
Priority date
Filing date
Publication date
Application filed by Shaanxi Normal University filed Critical Shaanxi Normal University
Priority to CN201910358840.8A
Publication of CN111861896A
Legal status: Pending


Classifications

    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06TIMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T5/00Image enhancement or restoration
    • G06T5/90Dynamic range modification of images or parts thereof
    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06TIMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T5/00Image enhancement or restoration
    • G06T5/40Image enhancement or restoration using histogram techniques
    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06TIMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T5/00Image enhancement or restoration
    • G06T5/50Image enhancement or restoration using two or more images, e.g. averaging or subtraction
    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06TIMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T7/00Image analysis
    • G06T7/90Determination of colour characteristics
    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06TIMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T2207/00Indexing scheme for image analysis or image enhancement
    • G06T2207/10Image acquisition modality
    • G06T2207/10024Color image
    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06TIMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T2207/00Indexing scheme for image analysis or image enhancement
    • G06T2207/20Special algorithmic details
    • G06T2207/20212Image combination
    • G06T2207/20221Image fusion; Image merging

Landscapes

  • Engineering & Computer Science (AREA)
  • Physics & Mathematics (AREA)
  • General Physics & Mathematics (AREA)
  • Theoretical Computer Science (AREA)
  • Computer Vision & Pattern Recognition (AREA)
  • Image Processing (AREA)
  • Facsimile Image Signal Circuits (AREA)

Abstract

The invention belongs to the field of underwater optical imaging and image enhancement, and particularly relates to a UUV-oriented underwater image color compensation and recovery method. First, a bright-channel underwater image is obtained through an underwater optical imaging model, color compensation is applied to it, and the visual effect of the image is further enhanced by histogram equalization. Second, on the basis of the underwater optical imaging model, multiple water-body types are considered, the attenuation ratios of the red, green and blue channels are analyzed, the background light and the underwater transmittance are estimated, and the optimal underwater recovered image is selected using a white-balance hypothesis. Finally, the color-compensated image and the color-recovered image of the original image are combined by pixel-level weighted fusion, which alleviates problems such as severe image fogging and low contrast and improves image quality.

Description

UUV-oriented underwater image color compensation and recovery method
Technical Field
The invention belongs to the field of underwater optical imaging and image enhancement, and particularly relates to a UUV-oriented underwater image color compensation and recovery method.
Background
An unmanned underwater vehicle (UUV) is a novel underwater weapon succeeding torpedoes and similar weapons; it is an unmanned underwater system integrating environment sensing, dynamic planning, behavior control and energy adaptation, and is of great significance for military surveying, underwater reconnaissance, and attack and defense. At present, the underwater vision technologies applied to UUVs can be mainly divided into acoustic vision and optical vision, and optical-vision image enhancement has become a research hotspot for enhancing UUV underwater images.
Because light is attenuated and scattered in water, the UUV optical vision system suffers from uneven illumination and obtains underwater images that are blurred, low in contrast, color-distorted and noisy. At present, domestic and foreign research on underwater image enhancement mainly follows three approaches: spatial-domain methods, frequency-domain methods and color-correction methods. Spatial-domain methods are essentially based on gray-level mapping transformations and process the image pixels directly; this underwater image enhancement technology is mature, the algorithms are simple to implement, and the contrast of the processed underwater image is obviously enhanced, but color cast is not considered, and problems such as introduced artifacts, color distortion and amplified noise are not well solved. Transform-domain (frequency-domain) methods use transform techniques and adjust image sharpness by digital filtering; they remove underwater image noise well, but do not adequately address the combined problems of low contrast, color deviation and blur, and compared with spatial-domain enhancement, research on frequency-domain underwater image enhancement is limited and progresses slowly. Color-correction methods extract the red, green and blue channels based on physical assumptions and apply color compensation according to the wavelength-dependent propagation and absorption characteristics, so as to recover the underwater image.
The present invention mainly addresses the problems of underwater image blur, low contrast and color distortion caused by the attenuation of light under water. The Jaffe-McGlamery underwater optical imaging model is used to separate the red, green and blue channels and to compensate and restore their colors respectively, and a pixel-level weighted fusion method is used to fuse the color-compensated image and the color-recovered image of the original image, thereby alleviating the problems of underwater image blur, low contrast and color distortion.
Disclosure of Invention
In order to overcome the problems of blurred underwater images, low contrast and color distortion pointed out in the background art, the invention provides a UUV-oriented underwater image color compensation and recovery method, with the aim of improving the quality of degraded images and highlighting the detailed information in the images.
A UUV-oriented underwater image color compensation and recovery method comprises the following steps:

Step one: extract the bright channel from the image captured by the UUV
The expression of the total light intensity can be obtained from the Jaffe-McGlamery underwater optical imaging model, namely

Ic(x,y) = Ed(x,y) + Ef(x,y) + Eb(x,y), with Ed(x,y) = Jc(x,y)exp[-c(λ)d(x)], Ef(x,y) = Ed(x,y)*g(x,y) and Eb(x,y) = A∞[1 - exp(-c(λ)d(x))]    (1)

where * denotes convolution. In formula (1), (x, y) represents the position of a pixel in the image; C ∈ {R, G, B} represents the three channels red, green and blue; Jc(x, y) is the original image; Ic(x, y) is the attenuated image; A∞ is the background light; Ed(x, y) is the direct light intensity received by the camera; c(λ) is the total attenuation coefficient, including the attenuation caused by absorption and scattering, and λ corresponds to one of the three color channels R, G, B of the image; d(x) is the distance between the object and the UUV; g(x, y) is the point spread function; Ef(x, y) is the forward-scattered intensity; Eb(x, y) is the back-scattered intensity.

The light received by the UUV consists of a direct component, a forward-scattering component and a back-scattering component, and the transmittance Sc(x) is defined as:

Sc(x) = exp[-c(λ)d(x)]    (2)

When the object is close to the camera, the blur caused by forward-scattered light is negligible, so formula (1) can be simplified as:

Ic(x,y) = Jc(x,y)Sc(x) + A∞[1 - Sc(x)]    (3)
Taking the maximum over the red, green and blue channels and over a local neighborhood on both sides of formula (3) extracts the bright channel, i.e.

Ibright(x,y) = Jbright(x,y)Sc(x) + A∞[1 - Sc(x)]    (4)

where Ibright(x,y) and Jbright(x,y) are the bright-channel maps of Ic(x,y) and Jc(x,y), obtained by taking the maximum over the three channels in the neighborhood Ω(x,y) of each pixel;
based on the above analysis, it is proposed to compensate the color of each channel with reference to the bright channel. The bright channel acquisition method comprises the following steps:
Figure BDA0002046277930000033
in the formula (I), the compound is shown in the specification,
Figure BDA0002046277930000034
for obtaining an image IcThe maximum value of the red, green and blue channels at each pixel point in (x, y),
Figure BDA0002046277930000035
for maximum filtering, the filter template Ω (x, y) size is taken to be 15.
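For illustration, the following Python sketch computes the bright-channel map of formulas (4)-(5): the per-pixel maximum over the R, G, B channels followed by maximum filtering with a square template (assumed 15×15); function and parameter names are illustrative, not part of the patent.

```python
import numpy as np
from scipy.ndimage import maximum_filter

def bright_channel(image_rgb: np.ndarray, window: int = 15) -> np.ndarray:
    """Bright-channel map per formulas (4)-(5).

    image_rgb: H x W x 3 array with values in [0, 1].
    window:    side length of the square filter template Omega(x, y).
    """
    # Maximum over the R, G, B channels at each pixel.
    channel_max = image_rgb.max(axis=2)
    # Maximum filtering over the local neighborhood Omega(x, y).
    return maximum_filter(channel_max, size=window)
```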
Step two: carrying out bright channel color compensation on the image obtained by extracting the bright channel in the step one
Gray values with the same mean value of the red channel, the green channel and the blue channel in a noise-free image can be obtained by the gray world white balance hypothesis, and the difference value of the mean value of the compensated image gray value and the mean value of the attenuated image gray value is in direct proportion to the mean value of the original image gray value through a large number of experiments and theoretical derivation. Therefore, the attenuation image needs to be multiplied by the difference value between the average value of the gray value of the compensated image and the average value of the gray value of the attenuated image, and meanwhile, the attenuation of the channels of the three colors is different underwater.
Based on the above analysis, a bright-channel-based color compensation algorithm is proposed: the compensated image IBC(x,y) is computed from the attenuated image Ic(x,y) by formula (6), where C ∈ {R, G, B} denotes the red, green and blue channels, Ic(x,y) is the attenuated image and IBC(x,y) is the color-compensated image; the formula uses the mean gray value of the three color channels after bright-channel compensation and the mean gray value of the three channels of the attenuated image. All image pixel values involved in formula (6) are first normalized to [0,1] by dividing by 255. Statistically, the average brightness of a naturally exposed image is about 125; η adjusts the brightness of the compensated image so that the compensated image satisfies this rule, and η is taken as 3.5; ⌊·⌋ denotes the rounding-down operation, and N<·> indicates that the pixel values are linearly stretched to the interval [0,255].
Step three: performing histogram equalization processing on the image subjected to the two-color compensation in the step
Under the complex underwater environment, the scattering effects caused by different degrees of turbidity of water are different, so that the pictures show different degrees of blur, and further processing is required to be carried out by using a histogram equalization algorithm.
In discrete form, the probability can be replaced by the frequency. Here rk denotes the discrete gray levels of IBC(x,y) and k indexes the gray levels, and the following holds:

P(rk) = nk / n    (7)

where 0 ≤ rk ≤ 1, k = 0, 1, 2, 3, …, n-1, nk is the number of pixels with gray level rk in the image, n is the total number of pixels in the image, and P(rk) is the frequency that replaces the probability.
The histogram-equalization transform of the image is:

Sk = T(rk) = Σ_{j=0}^{k} P(rj) = Σ_{j=0}^{k} nj/n    (8)

The transform T must satisfy two conditions: (a) T is a single-valued, monotonically increasing function; (b) 0 ≤ T(r) ≤ L-1 for 0 ≤ r ≤ L-1. When these two conditions are satisfied, the cumulative distribution function converts the original histogram with uneven gray-level distribution into a histogram with even gray-level distribution, and the corresponding inverse transform is:
ri = T^(-1)(Si)    (9)

Using the transformed discrete gray levels ri, the equalized image is recovered as Iz = F(ri), where F is a mapping function.
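As a concrete illustration of formulas (7)-(9), the following sketch (numpy only; names are illustrative) equalizes an 8-bit gray image by building the normalized histogram P(rk), accumulating it into the transform T, and remapping the pixels.

```python
import numpy as np

def equalize_histogram(gray_u8: np.ndarray) -> np.ndarray:
    """Histogram equalization of an 8-bit single-channel image (formulas (7)-(9))."""
    # P(r_k) = n_k / n : normalized histogram of the 256 gray levels.
    hist, _ = np.histogram(gray_u8, bins=256, range=(0, 256))
    prob = hist / gray_u8.size
    # S_k = T(r_k) = sum_{j<=k} P(r_j) : cumulative distribution (monotone, in [0, 1]).
    cdf = np.cumsum(prob)
    # Map each gray level to the equalized level in [0, 255].
    lut = np.round(cdf * 255.0).astype(np.uint8)
    return lut[gray_u8]
```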
Step four: improve the underwater image formation model, estimate the background light and the underwater transmittance in the improved model, and further estimate the scene recovery Jc

Step 4.1: obtain the simplified underwater optical imaging model from the Jaffe-McGlamery model of formula (1):
Ic(x) = tc(x)Jc(x) + (1 - tc(x))·Ac    (10)

where x denotes a pixel in the image, Ic denotes the image captured by the UUV, tc denotes the underwater transmittance of color channel c, Jc denotes the clear original image, Ac is the background light, and c ∈ {R, G, B}.
The underwater transmittance tc(x) depends on the distance z(x) between the object and the UUV and on the water attenuation coefficient βc of each channel, i.e.

tc(x) = exp(-βc z(x))    (11)
Step 4.2: estimate the background light in the underwater image formation model improved in step 4.1

The background light Ac is estimated using linear contrast stretching and an edge detection model. The principle is as follows: first, an edge-detection image Ie is obtained from Ic by linear contrast stretching and structured edge detection, and the pixels of Ie are sorted from large to small; second, the pixels are partitioned according to a decision threshold Th: if a pixel x is larger than the threshold, it belongs to the largest connected component Lc, otherwise it does not, specifically:

x ∈ Lc if Ie(x) > Th, otherwise x ∉ Lc    (12)

Further, the sum Sum and the average Av of the pixels in Lc are calculated, and Av is taken as the estimated background light.
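A minimal sketch of the background-light estimation of step 4.2, assuming a simple gradient-magnitude edge map in place of the structured edge detector, a percentile-based threshold Th, and a threshold mask in place of the largest-connected-component step; all names and the chosen percentile are illustrative.

```python
import numpy as np

def estimate_background_light(image_rgb, percentile=99.0):
    """Estimate A_c as the mean color of the brightest edge-detected pixels (step 4.2 sketch)."""
    gray = image_rgb.mean(axis=2)
    # Linear contrast stretch of the gray image to [0, 1].
    stretched = (gray - gray.min()) / (gray.max() - gray.min() + 1e-12)
    # Simple gradient magnitude stands in for structured edge detection.
    gy, gx = np.gradient(stretched)
    edges = np.hypot(gx, gy)
    # Threshold Th: keep the largest responses as the candidate set Lc (formula (12)).
    th = np.percentile(edges, percentile)
    lc_mask = edges > th
    # Av: per-channel mean over Lc, taken as the estimated background light.
    return image_rgb[lc_mask].mean(axis=0)
```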
Step 4.3: estimate the underwater transmittance in the underwater image formation model improved in step 4.1

Combining formulas (10) and (11), the increment of the blue channel is:

IB(x) - AB = exp(-βB z(x))·(JB(x) - AB)    (13)

and the increment of the red channel is:

IR(x) - AR = exp(-βR z(x))·(JR(x) - AR)    (14)

Raising both sides of formula (14) to the power βB/βR gives:

[(IR(x) - AR) / (JR(x) - AR)]^(βB/βR) = exp(-βB z(x))    (15)
The ratios between the attenuation coefficients are defined as:

βBR = βB/βR,  βBG = βB/βG    (16)

Further, using the ratios of formula (16), the underwater transmittance yields a structure similar to formula (10):

Ic(x) = tB(x)^(βc/βB)·Jc(x) + [1 - tB(x)^(βc/βB)]·Ac    (17)
Since Ic - Ac in formula (10) may be negative, the power βBc is applied to the absolute value while the sign is preserved, thereby avoiding problems caused by negative values of Ic - Ac.
To account for the different attenuation coefficients of the different color channels, the bound Jc ≥ 0 is substituted into formula (10), giving the lower limit tLB on the underwater transmittance of the blue channel:

tLB(x) = max{ 1 - IB(x)/AB, [1 - IG(x)/AG]^βBG, [1 - IR(x)/AR]^βBR }    (18)
The underwater transmittance of pixel x is initialized to the value tLB calculated by formula (18). Since the binary classification of pixels into Lc often produces abrupt discontinuities, the underwater transmittance is refined by soft matting according to formula (19), whose terms are: the average Mahalanobis distance DM(I(x)) of the background-light pixels x ∈ Lc, the maximum Mahalanobis distance, the standard deviation σM, and the extinction coefficient α(x) of a pixel belonging to an object or to the water area.
step 4.4: estimation scene recovery Jc
Color fading compensation is performed and a clear original image is estimated according to equations (12), (18), and (19):
Figure BDA0002046277930000067
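To make steps 4.3-4.4 concrete, the sketch below computes the lower bound tLB of formula (18) and then recovers the scene with formula (20), for one assumed pair of attenuation ratios (βBR, βBG); the soft-matting refinement of formula (19) is omitted, and all names and the example ratio values are illustrative, not values taken from the patent.

```python
import numpy as np

def recover_scene(image_rgb, A, beta_BR=2.0, beta_BG=1.2, t_min=0.1):
    """Sketch of formulas (18) and (20): lower-bound blue-channel transmittance, then recovery.

    image_rgb: H x W x 3 attenuated image in [0, 1] (channel order R, G, B).
    A:         length-3 background light (A_R, A_G, A_B).
    beta_BR, beta_BG: assumed attenuation ratios beta_B/beta_R and beta_B/beta_G.
    """
    I_R, I_G, I_B = image_rgb[..., 0], image_rgb[..., 1], image_rgb[..., 2]
    A_R, A_G, A_B = A
    # Formula (18): t_LB = max over channels of (1 - I_c/A_c) raised to beta_B/beta_c.
    t_lb = np.maximum.reduce([
        np.clip(1.0 - I_B / A_B, 0.0, 1.0),
        np.clip(1.0 - I_G / A_G, 0.0, 1.0) ** beta_BG,
        np.clip(1.0 - I_R / A_R, 0.0, 1.0) ** beta_BR,
    ])
    t_B = np.clip(t_lb, t_min, 1.0)
    # Formula (20): J_c = (I_c - A_c) / t_B^(beta_c/beta_B) + A_c, clipped to [0, 1].
    J = np.empty_like(image_rgb)
    exponents = (1.0 / beta_BR, 1.0 / beta_BG, 1.0)   # beta_c / beta_B for R, G, B
    for c, exp_c in enumerate(exponents):
        J[..., c] = (image_rgb[..., c] - A[c]) / (t_B ** exp_c) + A[c]
    return np.clip(J, 0.0, 1.0)
```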
step five: for J recovered in step fourcSelecting the optimal underwater recovery image based on the white balance hypothesis
If the wrong attenuation coefficient beta is usedcCan lead to color deviation and repair of underwater transmittance image errors, for which the best results are automatically selected by grey world assumption according to different water area types of Jerlov
Figure BDA0002046277930000071
Assuming that the water area type is N, i is 1,2, … N, piIndicates the ith water area type, J cAnd if three color channels of R, G and B exist, the specific gray world algorithm processing steps are as follows:
(a) calculate the three-channel mean Grayi of R, G and B for the water-body-type image Jc:

Grayi = (J̄R + J̄G + J̄B) / 3    (21)

where J̄R, J̄G and J̄B are the mean values of the R, G and B channels of Jc;

(b) calculate the gain coefficients of the three channels R, G and B of the image Jc:

KR = Grayi / J̄R,  KG = Grayi / J̄G,  KB = Grayi / J̄B    (22)

(c) according to the Von Kries diagonal model, adjust the R, G and B components of each pixel x in the image to Cr, Cg and Cb:

Cr = KR·JR(x),  Cg = KG·JG(x),  Cb = KB·JB(x)    (23)

(d) with Cr, Cg and Cb obtained from formula (23) and M pixels in total, sum the pixels Crj contained in Cr, Cgj contained in Cg and Cbj contained in Cb, and take the averages avr:

avrr = (1/M)·Σj Crj,  avrg = (1/M)·Σj Cgj,  avrb = (1/M)·Σj Cbj    (24)

(e) take the difference of the channel averages avr and its absolute value pi:

pi = |avrr - avrg - avrb|    (25)
The differences pi calculated for the different water-body types are compared, and the image corresponding to the water-body type with the minimum difference is selected as the optimal recovered image Ĵc.
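The water-type selection of step five can be sketched as follows (illustrative names; the candidate images are assumed to have been recovered with different attenuation-ratio pairs): for each candidate, the gray-world deviation pi of formulas (21)-(25) is computed and the candidate with the smallest pi is kept.

```python
import numpy as np

def gray_world_deviation(J):
    """p_i of formulas (21)-(25) for one recovered image J (H x W x 3, values in [0, 1])."""
    means = J.reshape(-1, 3).mean(axis=0)     # per-channel means of R, G, B
    gray = means.mean()                       # formula (21)
    gains = gray / (means + 1e-12)            # formula (22)
    C = J * gains                             # formula (23): Von Kries adjustment
    avr = C.reshape(-1, 3).mean(axis=0)       # formula (24): channel averages
    return abs(avr[0] - avr[1] - avr[2])      # formula (25)

def select_best(candidates):
    """Pick the candidate recovered image with the smallest gray-world deviation."""
    deviations = [gray_world_deviation(J) for J in candidates]
    return candidates[int(np.argmin(deviations))]
```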
Step six: pixel-level image fusion

Iz is the image after bright-channel color compensation and histogram processing, Ĵc is the image recovered by underwater 3D scene reconstruction, and J is the image after fusion processing:

J(x,y) = α·Iz(x,y) + β·Ĵc(x,y)    (26)

where α and β are the respective weights and satisfy α + β = 1.
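Step six reduces to a per-pixel weighted average; a minimal sketch follows (the weights are free parameters subject to α + β = 1).

```python
import numpy as np

def fuse(I_z, J_hat, alpha=0.5):
    """Formula (26): pixel-level weighted fusion, with beta = 1 - alpha."""
    beta = 1.0 - alpha
    return np.clip(alpha * I_z + beta * J_hat, 0.0, 1.0)
```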
The method mainly addresses the problems of blurred underwater images, low contrast and color distortion caused by the attenuation of light under water: a bright-channel image is obtained from the red, green and blue channels, each channel is color-compensated with the bright-channel image as reference, and a relatively ideal underwater enhanced image is obtained by multi-scale image fusion together with histogram enhancement and bilateral-filtering denoising.
Compared with a single underwater image enhancement algorithm, the method has the following beneficial effects:
1. The color information is complete: with the bright-channel color compensation method, the compensation factor increases with the depth of field of the shot and is never negative, so color information is not lost;
2. The contrast is high: equalizing the histogram of the color-compensated underwater image solves the problems of dark images and low contrast;
3. The method is highly comprehensive: by fusing several techniques it efficiently solves the problems of underwater image blur, low contrast and color distortion.
Drawings
FIG. 1 is a flow chart of the method of the present invention;
FIG. 2 is a comparison of color restoration effect of color chart, original image and four methods; wherein (a) is a standard color card; (b) is an original picture; (c) is the bright channel processing result; (d) is the histogram processing result; (e) is the 3D scene restoration processing result; (f) is the algorithm processing result of the invention;
FIG. 3 is a comparison of the restoration effect of various underwater images according to the present invention; wherein, the group a of pictures are original pictures; b group of diagrams are processing effect diagrams of bright channel color compensation technology; c group of diagrams are processing effect diagrams of the histogram stretching technology; the D group of diagrams are the processing effect diagrams of the 3D scene color recovery technology, and the e group of diagrams are the effect diagrams of the technical processing proposed by the invention;
FIG. 4 is a graph of application test effects; wherein, the group (a) is SURF feature matching of the original image; (b) the set is the SURF feature matching of the present invention.
Detailed Description
The present invention will be described in further detail with reference to specific examples, but the embodiments of the present invention are not limited thereto.
The invention provides a UUV-oriented underwater image color compensation and recovery method. First, a bright-channel underwater image is obtained through an underwater optical imaging model and color compensation is applied to it, and the visual effect of the image is further enhanced by histogram equalization; second, on the basis of the underwater optical imaging model, multiple water-body types are considered, the attenuation ratios of the red, green and blue channels are analyzed, the background light and the underwater transmittance are estimated, and the optimal underwater recovered image is selected using a white-balance hypothesis; finally, the color-compensated image and the color-recovered image of the original image are combined by pixel-level weighted fusion, which alleviates problems such as severe image fogging and low contrast and improves image quality.
Color correction is realized by bright-channel color compensation, so color information is not easily lost; on this basis, adaptive contrast stretching and bilateral filtering keep the mean image standard deviation at about 0.5, preserving underwater image edge details while reducing noise. The underwater 3D scene is reconstructed by estimating the unknown parameters of the image, recovering the color of distant scenery and reducing the halo caused by abrupt edges. By weighting and fusing the two images, the contrast of the underwater image is greatly improved, color recovery is obvious, details are handled more finely, and underwater image quality is enhanced.
Example 1:
1. color recovery test
The color recovery results are shown in FIG. 2. As can be seen from FIG. 2, the contrast of the image processed by the histogram algorithm is enhanced, but its colors are distorted to different degrees; the image processed by the bright-channel algorithm shows white light; the color of the image processed by the underwater 3D scene recovery algorithm is reddish and its color recovery effect is poor. Compared against the color test card, the colors of the image processed by the method of the invention are recovered correctly.
2. Visual effect contrast
In order to examine the effect of the invention on underwater image fogging, blurring and similar problems, six images (image1 to image6) are selected, as shown in FIG. 3, where group (a) are the original images, group (b) the results of the bright-channel color compensation technique, group (c) the results of the histogram stretching technique, group (d) the results of the 3D scene color recovery technique, and group (e) the results of the technique proposed by the invention.
It is not difficult to see that bright-channel color compensation improves the brightness of the picture to a certain extent, but its defogging effect is poor and it cannot restore the real colors of the image well. The histogram algorithm defogs obviously and enhances the contrast of the picture, but a large amount of white light appears at the edges of the picture and the whole picture shows color distortion to different degrees. The 3D underwater scene reconstruction algorithm reconstructs the underwater 3D scene to obtain two global attenuation-ratio parameters and a color-calibrated image of distant scenery, aiming at the problems of color distortion in distant regions and halos at abrupt edges; its color-correction effect is better, but the processed image has low contrast, its defogging is less obvious than that of the histogram method, and its color is slightly reddish. The proposed method has a good processing effect: it removes the excessive white light and blurred edges of the histogram-processed pictures, overcomes the dim tone and weak contrast of the 3D underwater scene reconstruction algorithm, and better restores the real underwater scene.
3. Image quality evaluation
The underwater images image1 to image6 in FIG. 3 are evaluated quantitatively by the peak signal-to-noise ratio (PSNR); a higher PSNR indicates that the algorithm introduces less noise and retains more valuable image information.

The evaluation results are shown in Table 1. The data show that the image processed by the algorithm of the invention has low introduced noise and retains more valuable information, and the processing effect is superior to histogram equalization and the 3D scene recovery algorithm; compared with the bright-channel compensation method, slightly less information is retained.
TABLE 1 Peak signal-to-noise ratio (PSNR) comparison analysis
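For reference, the PSNR used above can be computed as follows; this is the standard definition for 8-bit images, not the authors' evaluation script.

```python
import numpy as np

def psnr(reference_u8, test_u8):
    """Peak signal-to-noise ratio in dB for 8-bit images of identical shape."""
    mse = np.mean((reference_u8.astype(np.float64) - test_u8.astype(np.float64)) ** 2)
    if mse == 0:
        return float("inf")
    return 10.0 * np.log10(255.0 ** 2 / mse)
```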
4. Application testing
In order to verify the application effect of images processed by the method, comparative application experiments before and after image processing were set up. A speeded-up robust features (SURF) feature-matching test is adopted, comparing the number of matched feature points before and after image processing under the same feature-similarity threshold; the application test effect is shown in FIG. 4. When SURF feature matching is performed on the original image after it is rotated by a right angle, only a small number of feature points can be matched correctly under the same threshold condition, whereas the number of correctly matched feature points of the image processed by the method increases. Referring to FIG. 4, the results show that the image processed by the method of the invention performs well in subsequent feature extraction.
The foregoing is a more detailed description of the invention in connection with specific preferred embodiments and it is not intended that the invention be limited to these specific details. For those skilled in the art to which the invention pertains, several simple deductions or substitutions can be made without departing from the spirit of the invention, and all shall be considered as belonging to the protection scope of the invention.

Claims (2)

1. A UUV-oriented underwater image color compensation and recovery method, characterized by comprising the following steps:

Step one: extract the bright channel from the image captured by the UUV;
Obtaining the expression of the total light intensity from the Jaffe-McGlamery underwater optical imaging model, namely

Ic(x,y) = Ed(x,y) + Ef(x,y) + Eb(x,y), with Ed(x,y) = Jc(x,y)exp[-c(λ)d(x)], Ef(x,y) = Ed(x,y)*g(x,y) and Eb(x,y) = A∞[1 - exp(-c(λ)d(x))]    (1)

where * denotes convolution; in formula (1), (x, y) represents the position of a pixel in the image; C ∈ {R, G, B} represents the three channels red, green and blue; Jc(x, y) is the original image; Ic(x, y) is the attenuated image; A∞ is the background light; Ed(x, y) is the direct light intensity received by the camera; c(λ) is the overall attenuation coefficient, including the attenuation caused by absorption and scattering; λ is one of the three color channels R, G, B of the image; d(x) is the distance between the object and the UUV; g(x, y) is the point spread function; Ef(x, y) is the forward-scattered intensity; Eb(x, y) represents the intensity of the back-scattered light;

the light received by the UUV consists of a direct component, a forward-scattering component and a back-scattering component, and the transmittance Sc(x) is defined as:

Sc(x) = exp[-c(λ)d(x)]    (2)

when the object is close to the camera, the blur caused by forward-scattered light is negligible, so formula (1) is simplified as:

Ic(x,y) = Jc(x,y)Sc(x) + A∞[1 - Sc(x)]    (3)
taking the maximum over the red, green and blue channels and over a local neighborhood on both sides of formula (3) extracts the bright channel, i.e.

Ibright(x,y) = Jbright(x,y)Sc(x) + A∞[1 - Sc(x)]    (4)

where Ibright(x,y) and Jbright(x,y) are the bright-channel maps of Ic(x,y) and Jc(x,y), obtained by taking the maximum over the three channels in the neighborhood Ω(x,y) of each pixel; Ibright(x,y) is the extracted bright-channel map;
the bright channel acquisition method is expressed as equation (5):
Ibright(x,y) = max_{(x',y')∈Ω(x,y)} [ max_{C∈{R,G,B}} IC(x',y') ]    (5)

where max_{C∈{R,G,B}} IC(x',y') obtains the maximum of the red, green and blue channels at each pixel of the image IC(x,y), and max_{(x',y')∈Ω(x,y)} denotes maximum filtering with filter template Ω(x,y);
step two: carrying out bright channel color compensation on the image obtained by extracting the bright channel in the step one
According to the gray-world white-balance hypothesis, the mean gray values of the red, green and blue channels of a noise-free image are equal, and a large number of experiments and theoretical derivations show that the difference between the mean gray value of the compensated image and the mean gray value of the attenuated image is proportional to the mean gray value of the original image, so the attenuated image is multiplied by the difference between the mean gray value of the compensated image and the mean gray value of the attenuated image; meanwhile, the three color channels attenuate differently under water; if the channels were processed uniformly, the values of some channels would saturate and produce a large amount of white light, so the pigments with lower gray values in the image need to be compensated;
Based on the above analysis, a color recovery algorithm based on a bright channel is proposed, namely:
the compensated image IBC(x,y) is computed from the attenuated image Ic(x,y) by formula (6), in which C ∈ {R, G, B} denotes the red, green and blue channels, Ic(x,y) is the attenuated image and IBC(x,y) is the color-compensated image; the formula uses the mean gray value of the three color channels after bright-channel compensation and the mean gray value of the three channels of the attenuated image;

all image pixel values involved in formula (6) are normalized to [0,1] by dividing by 255; η adjusts the brightness of the compensated image and is taken as 3.5; ⌊·⌋ denotes the rounding-down operation, and N<·> indicates that the pixel values are linearly stretched to the interval [0,255];
step three: performing histogram equalization processing on the image subjected to the two-color compensation in the step
Under the complex underwater environment, the scattering effects caused by different water turbidity degrees are different, so that the pictures show different degrees of blur and need to be further processed by a histogram equalization algorithm;
in discrete form, the probability is replaced by the frequency; here rk denotes the discrete gray levels of IBC(x,y) and k indexes the gray levels, so that the following formula (7) holds:
P(rk) = nk / n    (7)

wherein 0 ≤ rk ≤ 1, k = 0, 1, 2, 3, …, n-1, nk is the number of pixels with gray level rk in the image, n is the total number of pixels in the image, and P(rk) is the frequency that replaces the probability;
the function expression of histogram equalization of the image is as follows:
Sk = T(rk) = Σ_{j=0}^{k} P(rj) = Σ_{j=0}^{k} nj/n    (8)

the transform (8) must satisfy two conditions: (a) T is a single-valued, monotonically increasing function; (b) 0 ≤ T(r) ≤ L-1 for 0 ≤ r ≤ L-1;

when the two conditions are met, the cumulative distribution function converts the original histogram with uneven gray-level distribution into a histogram with even gray-level distribution, and the corresponding inverse transform is:

ri = T^(-1)(Si)    (9)

using the transformed discrete gray levels ri, the equalized image is recovered as Iz = F(ri), where F is a mapping function;
step four: reprocessing the original image shot by UUV, improving the underwater image forming model, estimating the background light and the underwater transmissivity in the improved model, and further estimating the scene recovery Jc
Step 4.1: obtain the simplified underwater optical imaging model from the Jaffe-McGlamery underwater optical imaging model of formula (1):

Ic(x) = tc(x)Jc(x) + (1 - tc(x))·Ac    (10)

where x denotes a pixel in the image, Ic denotes the image captured by the UUV, tc denotes the underwater transmittance of color channel c, Jc denotes the clear original image, Ac is the background light, and c ∈ {R, G, B};

the underwater transmittance tc(x) depends on the distance z(x) between the object and the UUV and on the water attenuation coefficient βc of each channel, i.e.

tc(x) = exp(-βc z(x))    (11)
Step 4.2: estimate the background light in the underwater image formation model improved in step 4.1;

the background light Ac is estimated using linear contrast stretching and an edge detection model, according to the following principle: first, an edge-detection image Ie is obtained from Ic by linear contrast stretching and structured edge detection, and the pixels of Ie are sorted from large to small; second, the pixels are partitioned according to a decision threshold Th: if a pixel x is larger than the threshold, it belongs to the largest connected component Lc, otherwise it does not, specifically:

x ∈ Lc if Ie(x) > Th, otherwise x ∉ Lc    (12)

the sum Sum and the average Av of the pixels in Lc are then calculated, and Av is taken as the estimated background light;
step 4.3, the underwater transmittance estimation in the underwater image formation model improved in step 4.1 is combined with the formula (10) and the formula (11), and the increment of the blue channel is calculated:
Figure FDA0002046277920000042
the increment of the red channel is calculated:
Figure FDA0002046277920000043
taking formula (14) from both sides
Figure FDA0002046277920000044
The power is obtained:
Figure FDA0002046277920000045
the ratio between the attenuation coefficients is defined as:
βBR = βB/βR,  βBG = βB/βG    (16)

further, using the ratios of formula (16), the underwater transmittance yields a structure similar to formula (10):

Ic(x) = tB(x)^(βc/βB)·Jc(x) + [1 - tB(x)^(βc/βB)]·Ac    (17)
the boundary J is replaced in equation (10) to account for the different attenuation coefficients of the different color channelscNot less than 0 and obtaining lower limit t on the underwater transmissivity of the blue channelLB
Figure FDA0002046277920000051
The underwater transmittance of pixel x is initialized to the value tLB calculated by formula (18); since the binary classification of pixels into Lc often produces abrupt discontinuities, the underwater transmittance is refined by soft matting according to formula (19), whose terms are: the average Mahalanobis distance DM(I(x)) of the background-light pixels x ∈ Lc, the maximum Mahalanobis distance, the standard deviation σM, and the extinction coefficient α(x) of a pixel belonging to an object or to the water area;
step 4.4 estimation of scene recovery Jc
Color fading compensation is performed according to equations (12), (18) and (19), and a clear original image is estimated:
Figure FDA0002046277920000056
step five: for J recovered in step fourcSelecting the optimal underwater recovery image based on the white balance hypothesis
Automatically selecting the best result by the grey world assumption according to different water area types of Jerlov
Figure FDA0002046277920000057
Step six: pixel-level image fusion; specifically, the image recovered in step three is fused with the optimal scene image selected in step five;

Iz is the image after bright-channel color compensation and histogram processing, Ĵc is the image recovered by underwater 3D scene reconstruction, and J is the image after fusion processing:

J(x,y) = α·Iz(x,y) + β·Ĵc(x,y)    (26)

in formula (26), α and β are the respective weights and satisfy α + β = 1.
2. The UUV-oriented underwater image color compensation and recovery method according to claim 1, wherein the optimal result Ĵc is automatically selected through the gray-world assumption by the following specific process:

assume the number of water-body types is N, i = 1, 2, …, N, where pi corresponds to the i-th water-body type and Jc has the three color channels R, G and B; the specific gray-world algorithm steps are as follows:
(a) calculate the three-channel mean Grayi of R, G and B for the water-body-type image Jc:

Grayi = (J̄R + J̄G + J̄B) / 3    (21)

where J̄R, J̄G and J̄B are the mean values of the R, G and B channels of Jc;

(b) calculate the gain coefficients of the three channels R, G and B of the image Jc:

KR = Grayi / J̄R,  KG = Grayi / J̄G,  KB = Grayi / J̄B    (22)

(c) according to the Von Kries diagonal model, adjust the R, G and B components of each pixel x in the image to Cr, Cg and Cb:

Cr = KR·JR(x),  Cg = KG·JG(x),  Cb = KB·JB(x)    (23)

(d) with Cr, Cg and Cb obtained from formula (23) and M pixels in total, sum the pixels Crj contained in Cr, Cgj contained in Cg and Cbj contained in Cb, and take the averages avr:

avrr = (1/M)·Σj Crj,  avrg = (1/M)·Σj Cgj,  avrb = (1/M)·Σj Cbj    (24)

(e) take the difference of the channel averages avr and its absolute value pi:

pi = |avrr - avrg - avrb|    (25)
Comparing the differences pi calculated for the different water-body types, the image corresponding to the water-body type with the minimum difference is selected as the optimal recovered image Ĵc.
CN201910358840.8A 2019-04-30 2019-04-30 UUV-oriented underwater image color compensation and recovery method Pending CN111861896A (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN201910358840.8A CN111861896A (en) 2019-04-30 2019-04-30 UUV-oriented underwater image color compensation and recovery method


Publications (1)

Publication Number Publication Date
CN111861896A 2020-10-30

Family

ID=72965460

Family Applications (1)

Application Number Title Priority Date Filing Date
CN201910358840.8A Pending CN111861896A (en) 2019-04-30 2019-04-30 UUV-oriented underwater image color compensation and recovery method

Country Status (1)

Country Link
CN (1) CN111861896A (en)

Cited By (7)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN112132925A (en) * 2020-11-24 2020-12-25 上海彩虹鱼海洋科技股份有限公司 Method and device for reconstructing underwater image color
CN112446841A (en) * 2020-12-14 2021-03-05 中国科学院长春光学精密机械与物理研究所 Self-adaptive image recovery method
CN112804510A (en) * 2021-01-08 2021-05-14 海南省海洋与渔业科学院 Color fidelity processing method and device for deep water image, storage medium and camera
WO2023070958A1 (en) * 2021-10-28 2023-05-04 中国科学院沈阳自动化研究所 Image restoration method based on physical scattering model
CN114511462A (en) * 2022-02-11 2022-05-17 电子科技大学 Visual image enhancement method
CN114511462B (en) * 2022-02-11 2023-04-18 电子科技大学 Visual image enhancement method
CN114821021A (en) * 2022-04-28 2022-07-29 昆明理工大学 Underwater image enhancement method combining multichannel equalization and multi-scale fusion


Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination