CN113763267A - Image restoration method under strong scattering environment based on NSCT image fusion - Google Patents
- Publication number: CN113763267A (application CN202110987104.6A)
- Authority
- CN
- China
- Prior art keywords: image, frequency, coefficient, fusion, polarization
- Prior art date
- Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
- Pending
Classifications
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06T—IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
- G06T5/00—Image enhancement or restoration
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06F—ELECTRIC DIGITAL DATA PROCESSING
- G06F18/00—Pattern recognition
- G06F18/20—Analysing
- G06F18/29—Graphical models, e.g. Bayesian networks
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06T—IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
- G06T5/00—Image enhancement or restoration
- G06T5/70—Denoising; Smoothing
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06T—IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
- G06T7/00—Image analysis
- G06T7/10—Segmentation; Edge detection
- G06T7/13—Edge detection
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06T—IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
- G06T2207/00—Indexing scheme for image analysis or image enhancement
- G06T2207/20—Special algorithmic details
- G06T2207/20212—Image combination
- G06T2207/20221—Image fusion; Image merging
Abstract
The invention relates to a method for restoring polarization-imaging degraded images in a strong scattering environment based on NSCT image fusion, belonging to the field of optical image processing. The method comprises underwater image preprocessing, algorithm input, NSCT decomposition, Orthogonal Matching Pursuit (OMP) reconstruction, covering source-image edges on the initial fused image with an edge detection technology, and algorithm output. Information Entropy (EN), Average Gradient (AG), and standard deviation (STD) are selected as image evaluation indexes, and the image is also restored with a Wiener filtering method, a constrained least squares filtering method, and a blind deconvolution method for comparison with the patented algorithm. Experimental calculation shows that the proposed algorithm scores highest on all three indexes (EN, AG, and STD), i.e., it is the most effective: the Average Gradient (AG) is improved by a factor of 3.5 over intensity imaging and by 10% over polarization imaging, and the standard deviation (STD) is improved by 44% over polarization imaging.
Description
Technical Field
The invention belongs to the field of optical image processing, and particularly relates to a polarization imaging degraded image restoration method under a strong scattering environment based on NSCT image fusion.
Background
Oceans cover roughly 70% of the Earth's surface and hold a large amount of resources that can be developed and utilized by human beings. As over-exploitation of land resources has aggravated ecological imbalance, attention has shifted to the development of underwater resources. However, the underwater environment is complicated and changeable, and underwater resource development is not easy. Underwater polarization imaging and detection technologies have therefore become a focus of attention, with wide applications in underwater resource development and exploration, military reconnaissance, and other fields.
When images of an underwater target are acquired, the image quality is often unsatisfactory. The reason is that the large amounts of suspended particles and dissolved substances in natural water absorb and scatter light, so that the light received by the imaging system is severely attenuated; the acquired image is consequently blurred and its quality degraded.
Polarization is an important characteristic of electromagnetic waves beyond amplitude, wavelength, and phase. In polarization imaging, the light received by the imaging system carries not only the traditional optical characteristics but also a large amount of polarization vector information. Compared with traditional visible-light imaging, infrared imaging, spectral imaging, and the like, polarization imaging technology has unique advantages in detection, reconnaissance, and similar applications.
The main cause of image-quality degradation is scattering, which can be divided into backscattering and forward scattering. Backscattering occurs when light illuminating the target is deflected by suspended particles in the water, so that its propagation path deviates severely and image quality drops sharply. Forward scattering refers to light in the water deviating slightly, due to the same suspended particles, before it is received by the imaging system, which also reduces underwater image quality. The Contourlet transform can effectively describe the contour information of an image, but it lacks translation invariance, which causes distortion in image processing. Methods based on the dark channel prior, image fusion, and the Retinex algorithm can improve underwater imaging quality to a certain extent, but the color, detail, and definition of the image still need further improvement.
Disclosure of Invention
The invention aims to provide a method for restoring polarization-imaging degraded images in a strong scattering environment based on NSCT image fusion, so as to solve the technical problem noted in the background: although methods based on the dark channel prior, image fusion, and the Retinex algorithm can improve underwater imaging quality to a certain extent, the color, detail, and definition of the image still need further improvement.
In order to achieve the above purpose, the specific technical solution of the method for restoring a polarization imaging degraded image in a strong scattering environment based on NSCT image fusion of the present invention is as follows:
Required equipment: an underwater polarization imaging detection device, a computer, and VC++ programming software.
An image restoration method under a strong scattering environment based on NSCT image fusion specifically comprises the following steps, and the following steps are sequentially carried out:
step S1, underwater image preprocessing: compensating the red channel by using the information of the blue-green channel;
step S2, algorithm input: a degree-of-polarization image DOP, an intensity image S0, and a measurement matrix Φ;
step S3, carrying out intensity NSCT decomposition on the intensity image S0 serving as a source image to obtain an intensity low-frequency coefficient and an intensity high-frequency coefficient, and carrying out polarization NSCT decomposition on the polarization degree image DOP serving as a source image to obtain a polarization low-frequency coefficient and a polarization high-frequency coefficient;
step S4, measuring the intensity high-frequency coefficients with the Gaussian matrix Φ as the intensity CS measurement matrix to obtain intensity high-frequency measurement values, and measuring the polarization high-frequency coefficients with the polarization CS measurement matrix to obtain polarization high-frequency measurement values; adjusting the high-frequency subband coefficients of the two sets of measurement values through a high-frequency fusion rule that estimates a threshold by the Bayesian shrinkage (BayesShrink) method and applies a nonlinear transformation, thereby enhancing edge detail information and suppressing noise, to obtain fused CS-domain measurement values; performing OMP reconstruction on the CS-domain measurement values with the orthogonal matching pursuit method to obtain the recovered high-frequency coefficients; and fusing the intensity low-frequency coefficients and the polarization low-frequency coefficients with a low-frequency fusion rule to obtain the low-channel fusion coefficients;
step S5, reconstructing the recovered high-frequency coefficient and low-channel fusion coefficient by NSCT inverse transformation to obtain a final fusion image, and then covering an intensity edge image and a polarization edge image on the initial fusion image by an edge detection technology;
step S6, algorithm output: and finally obtaining the polarization fusion image.
Further, in step S1, the underwater image is preprocessed and the improved red channel is computed as:
where I_r(x), I_g(x), and I_b(x) are the pixel values of the red, green, and blue channels before compensation.
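A minimal sketch of the compensation idea follows. The patent's exact compensation formula appears only as an image in this copy, so the sketch uses a common form from the underwater-imaging literature, I_rc = I_r + α(mean(I_g) − mean(I_r))(1 − I_r)I_g; the gain α and the function name are illustrative assumptions, not the patented formula.

```python
import numpy as np

def compensate_red(img, alpha=1.0):
    """Compensate the attenuated red channel with blue-green information.

    img: float array in [0, 1], shape (H, W, 3) ordered (R, G, B).
    Assumed compensation rule (not the patent's exact formula):
        I_rc = I_r + alpha * (mean(I_g) - mean(I_r)) * (1 - I_r) * I_g
    which boosts red only where it is weak and green carries signal.
    """
    r, g = img[..., 0], img[..., 1]
    rc = r + alpha * (g.mean() - r.mean()) * (1.0 - r) * g
    out = img.copy()
    out[..., 0] = np.clip(rc, 0.0, 1.0)  # keep pixel values in [0, 1]
    return out
```

The green and blue channels are left untouched; only the red channel is lifted toward the blue-green statistics.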
Further, in step S3, a low-frequency fusion rule based on region variance and Laplacian energy is applied to the low-frequency subbands of the intensity and polarization low-frequency coefficients to obtain the final low-channel fusion coefficients; for the high-frequency subbands, a high-frequency fusion rule is selected that estimates a threshold by the Bayesian shrinkage method and adjusts the high-frequency subband coefficients by a nonlinear transformation, enhancing edge detail information and suppressing noise; the fused CS-domain measurement values are obtained and then reconstructed with the OMP algorithm to yield the recovered high-frequency coefficients.
Further, in step S3, NSCT decomposition is performed on the intensity image S0 and the degree-of-polarization image DOP as source images to obtain intensity low-frequency coefficients, intensity high-frequency coefficients, polarization low-frequency coefficients, and polarization high-frequency coefficients. For the low-frequency subbands, a fusion method combining region variance and the Laplacian energy sum is provided, where the region variance is defined as:
where L_M^{J,K}(x, y) denotes the low-frequency subband coefficient of image M (M ∈ {A, B}) in the K-th direction of the J-th decomposition scale, and σ_M^{J,K}(x, y) and μ_M^{J,K}(x, y) are the variance and mean in the neighborhood of point (x, y); the region size r is taken as 3. The modified weighted Laplacian energy sum is defined as:
where SML_M^{J,K}(x, y) is the improved Laplacian energy operator, which considers the correlation between the central pixel of the region and its surrounding neighborhood pixels; step is a variable step size, taken as 1; NSML_M^{J,K}(x, y) is the modified weighted Laplacian energy sum; and W is the weighting matrix with the following values:
A decision map is obtained by weighted decision from the variance and the improved NSML, and the low-frequency coefficient fusion expression is:
L_f = L_map * L_A + (1 - L_map) * L_B
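The decision-map fusion above can be sketched as follows. The 3 × 3 region variance and a sum-modified-Laplacian score follow the description, but the patent's weighting matrix W is not reproduced in this copy, so the two scores are simply added; treat the helper names and the unweighted combination as assumptions.

```python
import numpy as np

def local_variance(x, r=1):
    # region variance over a (2r+1) x (2r+1) window (edge-padded)
    pad = np.pad(x, r, mode="edge")
    win = np.lib.stride_tricks.sliding_window_view(pad, (2 * r + 1, 2 * r + 1))
    return win.var(axis=(-2, -1))

def nsml(x, step=1):
    # modified Laplacian |2f - f_shift_up - f_shift_down| in both axes,
    # then summed over a 3x3 window (a stand-in for the weighted NSML)
    pad = np.pad(x, step, mode="edge")
    ml = (np.abs(2 * x - pad[:-2 * step, step:-step] - pad[2 * step:, step:-step])
          + np.abs(2 * x - pad[step:-step, :-2 * step] - pad[step:-step, 2 * step:]))
    pad2 = np.pad(ml, 1, mode="edge")
    win = np.lib.stride_tricks.sliding_window_view(pad2, (3, 3))
    return win.sum(axis=(-2, -1))

def fuse_lowpass(LA, LB):
    # binary decision map L_map from the two activity scores, then
    # L_f = L_map * L_A + (1 - L_map) * L_B
    score_A = local_variance(LA) + nsml(LA)
    score_B = local_variance(LB) + nsml(LB)
    Lmap = (score_A >= score_B).astype(float)
    return Lmap * LA + (1.0 - Lmap) * LB
```

Wherever one low-pass band shows more local activity, its coefficient is copied into the fused band.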
For the high-frequency subbands, a threshold is estimated by the Bayesian shrinkage method and the high-frequency subband coefficients are adjusted by a nonlinear transformation, so that edge detail information is enhanced and noise is suppressed. The Bayesian shrinkage threshold is expressed as:
T = σ_n²(k, s) / σ(k, s)
where σ_n(k, s) and σ(k, s) are the noise standard deviation and signal standard deviation of the high-frequency subband at the k-th scale and s-th direction; the estimate of σ_n(k, s) is obtained by the median method:
σ_n(k, s) = median[|g_s^k(i, j)|] / 0.6745
where g_s^k(i, j) is the coefficient of the high-frequency subband at the k-th scale and s-th direction at position (i, j). From maximum-likelihood estimation one obtains:
where σ²(k, s) is the high-frequency subband coefficient variance. The threshold T considers only the correlation between subband coefficient scales and ignores the correlation between subbands in different directions within a scale; because the edge detail information of NSCT high-frequency subband images behaves differently across scale directions and the coefficient absolute values are smaller, the threshold is improved according to the average-value and minimum-value characteristics:
where T_s^k is the threshold of the high-frequency subband coefficients at the k-th scale and s-th direction; ḡ^k is the average of all high-frequency subband coefficients at the k-th scale in the NSCT domain; and ḡ_s^k and g_min^{k,s} are the average and minimum of the high-frequency subband coefficients at the k-th scale and s-th direction, respectively. The high-frequency subband coefficients are normalized by:
where g_max^{k,s} is the maximum of the corresponding high-frequency subband coefficients; the adjusted subband coefficients are then obtained.
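A hedged sketch of the BayesShrink threshold T = σ_n²/σ for one subband, with σ_n from the median rule above. The signal standard deviation is estimated as sqrt(max(var − σ_n², 0)), the standard BayesShrink choice; the improved mean/minimum threshold correction and the patent's exact nonlinear adjustment are not reproduced in this copy, so plain soft-thresholding stands in for the nonlinear transformation.

```python
import numpy as np

def bayes_shrink_threshold(coeffs):
    """BayesShrink threshold T = sigma_n^2 / sigma_x for one subband.

    sigma_n: robust median estimate of the noise standard deviation.
    sigma_x: signal standard deviation, estimated (standard BayesShrink)
             as sqrt(max(var(coeffs) - sigma_n^2, 0)).
    """
    c = np.asarray(coeffs, dtype=float).ravel()
    sigma_n = np.median(np.abs(c)) / 0.6745
    var_signal = max(c.var() - sigma_n ** 2, 1e-12)  # guard against negatives
    return sigma_n ** 2 / np.sqrt(var_signal)

def soft_shrink(coeffs, T):
    # soft-thresholding as one possible nonlinear coefficient adjustment
    c = np.asarray(coeffs, dtype=float)
    return np.sign(c) * np.maximum(np.abs(c) - T, 0.0)
```

Coefficients below the threshold (mostly noise) are zeroed; larger edge coefficients are kept, shrunk toward zero by T.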
Further, the orthogonal matching pursuit in step S4 is defined as follows:
First, the measurement value y of the target signal is input; Φ denotes the Gaussian measurement matrix of dimension P × Q, i denotes the iteration count, H denotes the sparsity, and Z denotes the index set. The output of the signal can be represented as an H-term sparse approximation of the signal x. The error margin is denoted by U. The inner product of the error margin U with each column of the measurement matrix Φ is computed, with initialization U_0 = y and i = 1; the inner-product expression is:
g_n = Φ^T U_{n-1}
Using the inner product g_n, the index of the maximum absolute value is found:
h = argmax_{j = 1, 2, ..., Q} |g_n[j]|
Then the index set is updated:
Z_n = Z_{n-1} ∪ {h}
The reconstruction set, i.e. the columns of the measurement matrix indexed by Z_n, is recorded:
An approximation of the signal is obtained over the reconstruction set:
Finally, the residual is updated:
U_n = y - Φ_{Z_n} x_n, i = i + 1
Iteration stops once the iteration count exceeds the sparsity H; otherwise, the procedure returns to the inner-product step.
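The steps above map directly onto a small OMP routine. This sketch uses a least-squares solve over the current support for the signal approximation, which is the standard realization of the reconstruction-set step; names and dimensions are illustrative.

```python
import numpy as np

def omp(Phi, y, H):
    """Orthogonal Matching Pursuit: recover an H-sparse x from y = Phi @ x.

    Phi: (P, Q) measurement matrix, y: (P,) measurements, H: sparsity.
    Returns the Q-dimensional sparse approximation x_hat.
    """
    P, Q = Phi.shape
    residual = y.copy()                # U_0 = y
    support = []                       # index set Z
    x_hat = np.zeros(Q)
    for _ in range(H):                 # stop after H iterations
        g = Phi.T @ residual           # inner products with every column
        h = int(np.argmax(np.abs(g)))  # best-matching atom
        if h not in support:
            support.append(h)          # Z_n = Z_{n-1} U {h}
        sub = Phi[:, support]          # reconstruction set
        coef, *_ = np.linalg.lstsq(sub, y, rcond=None)  # LS on the support
        x_hat = np.zeros(Q)
        x_hat[support] = coef
        residual = y - sub @ coef      # U_n = y - Phi_Zn x_n
    return x_hat
```

Because the residual is orthogonal to every selected column, no atom is picked twice, and the loop needs exactly H passes for an H-sparse signal.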
Further, in step S5, let the intensity image S0 be A and the degree-of-polarization image DOP be B. From the residual matrices R_A(i, j) and R_B(i, j) between the preliminary fused image and the source images A and B, the segmentation result for the sharp regions is defined as:
W_A(i, j) = 1 if R_A(i, j) < R_B(i, j); W_A(i, j) = 0 if R_A(i, j) ≥ R_B(i, j)
W_B = 1 - W_A
Elements equal to 1 in W_A and W_B correspond to the sharp regions of image A and image B, and elements equal to 0 to the blurred regions. {E_A1, ..., E_AJ} and {E_B1, ..., E_BJ} denote the edge detection maps obtained by edge detection on the first-layer high-frequency component coefficients of image A and image B; the edge detection maps of image A and image B are defined as:
According to W_A and W_B, the edges of image A and image B are covered onto the preliminary fusion map to obtain the final fused image F, with the calculation formulas:
F_A(i, j) = F_new(i, j)[1 - W_A(i, j)E_A(i, j)] + A(i, j)W_A(i, j)E_A(i, j)
F(i, j) = F_A(i, j)[1 - W_B(i, j)E_B(i, j)] + B(i, j)W_B(i, j)E_B(i, j)
where F_A(i, j) is the preliminary fused image after the edges of image A have been covered, A(i, j) and B(i, j) are the pixel values of source images A and B at point (i, j), and F is the final fused image.
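The two covering formulas can be checked with a short sketch. The masks and edge maps are taken as given binary arrays here; their construction via residual matrices and edge detection is outside this snippet.

```python
import numpy as np

def overlay_edges(F_new, A, B, WA, EA, EB):
    """Cover source-image edges onto the preliminary fused image.

    WA: sharp-region mask of A (1 = sharp), with WB = 1 - WA.
    EA, EB: binary edge maps of A and B.
    Implements F_A = F_new*(1 - WA*EA) + A*WA*EA, then
               F   = F_A*(1 - WB*EB) + B*WB*EB.
    """
    WB = 1.0 - WA
    FA = F_new * (1.0 - WA * EA) + A * (WA * EA)
    return FA * (1.0 - WB * EB) + B * (WB * EB)
```

At a pixel where both the sharp-region mask and the edge map of a source image are 1, that source's pixel replaces the preliminary fusion; everywhere else the preliminary fusion is kept.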
The NSCT-image-fusion-based method for restoring polarization-imaging degraded images in a strong scattering environment has the following advantages. Information Entropy (EN), Average Gradient (AG), and standard deviation (STD) are selected as image evaluation indexes, and the image is also restored with a Wiener filtering method, a constrained least squares filtering method, and a blind deconvolution method for comparison with the patented algorithm. Experimental calculation shows that the proposed algorithm scores highest on all three indexes (EN, AG, and STD), i.e., it is the most effective: the Average Gradient (AG) is improved by a factor of 3.5 over intensity imaging and by 10% over polarization imaging, and the standard deviation (STD) is improved by 44% over polarization imaging.
Drawings
FIG. 1 is a composition diagram of an image restoration method under a strong scattering environment based on NSCT image fusion.
The notation in the figure is: 1. performing intensity NSCT decomposition; 11. the intensity image S0; 111. intensity high frequency coefficient; 112. an intensity low frequency coefficient; 2. an intensity CS measurement matrix; 21. intensity high frequency measurements; 3. polarization NSCT decomposition; 31. degree of polarization image DOP; 311. a polarization high-frequency coefficient; 312. polarization low frequency coefficient; 4. a polarization CS measurement matrix; 41. a polarized high frequency measurement; 5. a high-frequency fusion rule; 51. a fused CS domain measurement value; 6. OMP reconstruction; 61. the recovered high-frequency coefficient; 7. a low frequency fusion rule; 71. a low channel fusion coefficient; 8. covering the edges; 81. an intensity edge map; 82. a polarization edge map; 9. and (4) polarization fusion images.
Detailed Description
In order to better understand the purpose, structure and function of the present invention, the method for restoring polarization degraded images under strong scattering environment based on NSCT image fusion according to the present invention is described in further detail below with reference to the accompanying drawings.
Example 1:
the invention discloses a method for restoring a polarization imaging degraded image in a strong scattering environment based on NSCT image fusion, which comprises the following steps:
(1) Preprocessing the underwater image: the value of the red channel of an underwater image is very small because of the absorption characteristics of the water body, so the information of the blue-green channels is fully used to compensate the red channel; the improved red channel is:
where I_r(x), I_g(x), and I_b(x) are the pixel values of the red, green, and blue channels before compensation. After compensation, a white-balance algorithm is applied; the color is then more balanced, and the color cast of the underwater image is effectively removed.
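The patent does not name the white-balance algorithm applied after compensation; a gray-world balance is one common choice and is sketched here as an assumption.

```python
import numpy as np

def gray_world_white_balance(img):
    """Gray-world white balance: scale each channel so its mean matches
    the global mean. One common choice; the specific white-balance
    algorithm used by the patent is not stated in this copy.
    """
    img = np.asarray(img, dtype=float)
    channel_means = img.reshape(-1, 3).mean(axis=0)
    gain = channel_means.mean() / np.maximum(channel_means, 1e-12)
    return np.clip(img * gain, 0.0, 1.0)
```

After balancing, the three channel means coincide, removing a uniform color cast.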
(2) Algorithm input: the degree-of-polarization image DOP (31) and the intensity image S0 (11); the measurement matrix is a Gaussian matrix.
(3) NSCT decomposition is performed on the intensity image S0 (11) and the degree-of-polarization image DOP (31) as source images to obtain low-frequency coefficients (112, 312) and high-frequency coefficients (111, 311). A fusion method combining region variance and the Laplacian energy sum is provided for the low-frequency subbands, where the region variance is defined as:
where L_M^{J,K}(x, y) denotes the low-frequency subband coefficient of image M (M ∈ {A, B}) in the K-th direction of the J-th decomposition scale, and σ_M^{J,K}(x, y) and μ_M^{J,K}(x, y) are the variance and mean in the neighborhood of point (x, y); the region size r is taken as 3. The modified weighted Laplacian energy sum is defined as:
where SML_M^{J,K}(x, y) is the improved Laplacian energy operator, which considers the correlation between the central pixel of the region and its surrounding neighborhood pixels; step is a variable step size, taken as 1; NSML_M^{J,K}(x, y) is the modified weighted Laplacian energy sum; and W is the weighting matrix with the following values:
A decision map is obtained by weighted decision from the variance and the improved NSML, and the low-frequency coefficient fusion expression is:
L_f = L_map * L_A + (1 - L_map) * L_B
For the high-frequency subbands, a threshold is estimated by the Bayesian shrinkage method and the high-frequency subband coefficients are adjusted by a nonlinear transformation, enhancing edge detail information and suppressing noise. The Bayesian shrinkage threshold is expressed as:
T = σ_n²(k, s) / σ(k, s)
where σ_n(k, s) and σ(k, s) are the noise standard deviation and signal standard deviation of the high-frequency subband at the k-th scale and s-th direction. The estimate of σ_n(k, s) is obtained by the median method:
σ_n(k, s) = median[|g_s^k(i, j)|] / 0.6745
where g_s^k(i, j) is the coefficient of the high-frequency subband at the k-th scale and s-th direction at position (i, j). From maximum-likelihood estimation one obtains:
where σ²(k, s) is the high-frequency subband coefficient variance. The threshold T considers only the correlation between subband coefficient scales and ignores the correlation between subbands in different directions within a scale; because the edge detail information of the NSCT high-frequency subband images behaves differently across scale directions and the coefficient absolute values are smaller, the threshold is improved according to the average and minimum values:
where T_s^k is the threshold of the high-frequency subband coefficients at the k-th scale and s-th direction; ḡ^k is the average of all high-frequency subband coefficients at the k-th scale in the NSCT domain; and ḡ_s^k and g_min^{k,s} are the average and minimum of the high-frequency subband coefficients at the k-th scale and s-th direction, respectively. The high-frequency subband coefficients are normalized by:
where g_max^{k,s} is the maximum of the corresponding high-frequency subband coefficients; the adjusted subband coefficients are then obtained.
(4) The high-frequency coefficients are then measured using the Gaussian matrix Φ as the measurement matrix. The obtained measurement values are reconstructed by the orthogonal matching pursuit method (OMP, 6) to obtain the recovered high-frequency coefficients (61). The orthogonal matching pursuit is defined as follows:
First, the measurement value y of the target signal is input; Φ denotes the Gaussian measurement matrix of dimension P × Q, i denotes the iteration count, H denotes the sparsity, and Z denotes the index set. The output of the signal can be represented as an H-term sparse approximation of the signal x. The error margin is denoted by U. The inner product of the error margin U with each column of the measurement matrix Φ is computed, with initialization U_0 = y and i = 1; the inner-product expression is:
g_n = Φ^T U_{n-1}
Using the inner product g_n, the index of the maximum absolute value is found:
h = argmax_{j = 1, 2, ..., Q} |g_n[j]|
Then the index set is updated:
Z_n = Z_{n-1} ∪ {h}
The reconstruction set, i.e. the columns of the measurement matrix indexed by Z_n, is recorded:
An approximation of the signal is obtained over the reconstruction set:
Finally, the residual is updated:
U_n = y - Φ_{Z_n} x_n, i = i + 1
Iteration stops once the iteration count exceeds the sparsity H; otherwise, the procedure returns to the inner-product step. The orthogonal matching pursuit method achieves the best restoration effect with the fewest iterations and greatly reduces unnecessary iterations.
(5) The recovered high-frequency coefficients (61) and the low-channel fusion coefficients (71) are reconstructed by the NSCT inverse transform to obtain the preliminary fused image. An edge detection technique is then used to overlay the source-image edges on the preliminary fused image. Let the intensity image S0 (11) be A and the degree-of-polarization image DOP (31) be B. The segmentation result for the sharp regions is defined from the residual matrices between the preliminary fused image and the source images A and B:
W_A(i, j) = 1 if R_A(i, j) < R_B(i, j); W_A(i, j) = 0 if R_A(i, j) ≥ R_B(i, j)
W_B = 1 - W_A
Elements equal to 1 in W_A and W_B correspond to the sharp regions of image A and image B, and elements equal to 0 to the blurred regions. {E_A1, ..., E_AJ} and {E_B1, ..., E_BJ} denote the edge detection maps obtained by edge detection on the first-layer high-frequency component coefficients of image A and image B; the edge detection maps of image A and image B are defined as:
According to W_A and W_B, the edges of image A and image B are covered onto the preliminary fusion map to obtain the final fused image F, with the calculation formulas:
F_A(i, j) = F_new(i, j)[1 - W_A(i, j)E_A(i, j)] + A(i, j)W_A(i, j)E_A(i, j)
F(i, j) = F_A(i, j)[1 - W_B(i, j)E_B(i, j)] + B(i, j)W_B(i, j)E_B(i, j)
where F_A(i, j) is the preliminary fused image after the edges of image A have been covered, A(i, j) and B(i, j) are the pixel values of source images A and B at point (i, j), and F is the final fused image.
(6) Algorithm output: the final polarization fused image (9) is obtained.
Information Entropy (EN), Average Gradient (AG), and standard deviation (STD) are selected as image evaluation indexes, and the image is also restored with a Wiener filtering method, a constrained least squares filtering method, and a blind deconvolution method for comparison with the patented algorithm. Experimental calculation shows that the proposed algorithm scores highest on all three indexes (EN, AG, and STD), i.e., it is the most effective: the Average Gradient (AG) is improved by a factor of 3.5 over intensity imaging and by 10% over polarization imaging, and the standard deviation (STD) is improved by 44% over polarization imaging.
The invention is further described with reference to the following figures and detailed description:
analysis of the data shows that the three traditional image restoration algorithms are general in restoration effect, the image evaluation index is slightly higher than the intensity, and the visual effect is general. However, the constraint least square filtering only needs the variance and the mean as conditions to achieve the aim of restoration, and the operation is relatively simple and convenient. The algorithm is that an intensity image S0 and a polarization degree image DOP are used as source images, the S0 image has a large amount of intensity information, and the DOP image has more detail information compared with a visible light image, so that a target can be highlighted from a complex background. Compared with the traditional restoration algorithm, the algorithm has the intensity information of the intensity image, the edge information and the texture information of the polarization image, and is far higher than the traditional restoration algorithm in subjective visual effect and image evaluation indexes.
The underwater image is preprocessed; then the intensity image S0 (11) and the degree-of-polarization image DOP (31) are taken as source images, and the original images are sparsely decomposed by the intensity NSCT decomposition (1) and the polarization NSCT decomposition (3) to obtain the intensity high-frequency coefficients (111), intensity low-frequency coefficients (112), polarization high-frequency coefficients (311), and polarization low-frequency coefficients (312). The transformed intensity high-frequency coefficients (111) and polarization high-frequency coefficients (311) are sufficiently sparse, while the intensity low-frequency coefficients (112) and polarization low-frequency coefficients (312) can be regarded as approximations of the original image and are not sparse. The intensity high-frequency coefficients (111) and polarization high-frequency coefficients (311) therefore satisfy the constraint condition of compressed sensing, so the intensity CS measurement matrix (2) and the polarization CS measurement matrix (4) (measurement matrices for short; Gaussian matrices are selected here) are applied to them respectively to obtain the intensity high-frequency measurement values (21) and polarization high-frequency measurement values (41). The high-frequency subband coefficients of the measurement values are adjusted through the high-frequency fusion rule (5), which estimates a threshold by the Bayesian shrinkage method and applies a nonlinear transformation, enhancing edge detail information and suppressing noise, to obtain the fused CS-domain measurement values (51); the orthogonal matching pursuit (OMP, 6) algorithm then reconstructs the fused high-frequency measurement values to obtain the recovered high-frequency coefficients (61).
The low-frequency coefficients are fused directly by the low-frequency fusion rule (7) based on region variance and the Laplacian energy sum, giving the low-channel fusion coefficient (71). Edge detection can effectively extract the edge contour information of a picture, so edge coverage (8) is applied to the preliminary fusion result, which effectively improves the quality of the fused image. Since the high-frequency components contain most of the edge contour information of an image, edge detection is performed on each first-layer high-frequency component of the source images, and the resulting edge detection maps of all directional coefficients are fused by weighted averaging to obtain the source-image intensity edge map (81) and polarization edge map (82). Finally, the final polarization fusion image (9) is obtained through the inverse NSCT.
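The weighted-average fusion of directional edge maps can be sketched as follows. It is a minimal illustration under stated assumptions: the text does not fix the edge operator, so a gradient-magnitude detector and equal weights stand in for it here.

```python
import numpy as np

def edge_map(band):
    """Gradient-magnitude edge map of one directional high-frequency
    subband (a stand-in for whichever edge detector is actually used)."""
    gy, gx = np.gradient(band.astype(float))
    return np.hypot(gx, gy)

def fuse_edge_maps(subbands, weights=None):
    """Weighted average of the edge maps of all first-layer directional
    coefficients, giving one edge image per source image."""
    maps = [edge_map(b) for b in subbands]
    if weights is None:
        weights = np.ones(len(maps)) / len(maps)   # equal weights assumed
    return sum(w * m for w, m in zip(weights, maps))

bands = [np.eye(6), np.fliplr(np.eye(6))]   # two toy directional subbands
E = fuse_edge_maps(bands)
print(E.shape)   # (6, 6)
```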
It is to be understood that the present invention has been described with reference to certain embodiments, and that various changes in the features and embodiments, or equivalent substitutions may be made therein by those skilled in the art without departing from the spirit and scope of the invention. In addition, many modifications may be made to adapt a particular situation or material to the teachings of the invention without departing from the essential scope thereof. Therefore, it is intended that the invention not be limited to the particular embodiment disclosed, but that the invention will include all embodiments falling within the scope of the appended claims.
Claims (6)
1. An image restoration method under a strong scattering environment based on NSCT image fusion is characterized by comprising the following steps which are sequentially carried out:
step S1, underwater image preprocessing: compensating the red channel by using the information of the blue-green channel;
step S2, algorithm input: the degree-of-polarization image DOP (31), the intensity image S0 (11), and a Gaussian matrix Φ as the measurement matrix;
step S3, intensity NSCT decomposition (1) is carried out by taking the intensity image S0(11) as a source image to obtain an intensity low-frequency coefficient (112) and an intensity high-frequency coefficient (111), and polarization NSCT decomposition (3) is carried out by taking the polarization degree image DOP (31) as a source image to obtain a polarization low-frequency coefficient (312) and a polarization high-frequency coefficient (311);
step S4, measuring the intensity high-frequency coefficient (111) with the Gaussian matrix Φ as the intensity CS measurement matrix (2) to obtain the intensity high-frequency measurement value (21), and measuring the polarization high-frequency coefficient (311) with the polarization CS measurement matrix (4) to obtain the polarization high-frequency measurement value (41); applying to the two measurement values the high-frequency fusion rule (5), which combines a threshold estimated by the Bayesian shrinkage method with a nonlinear transformation, to adjust the high-frequency subband coefficients, enhance edge detail information and suppress noise, obtaining the fused CS-domain measurement value (51); reconstructing (6) the CS-domain measurement value (51) with the orthogonal matching pursuit (OMP) method to obtain the recovered high-frequency coefficient (61); and fusing the intensity low-frequency coefficient (112) and the polarization low-frequency coefficient (312) with the low-frequency fusion rule (7) to obtain the low-channel fusion coefficient (71);
step S5, reconstructing the recovered high-frequency coefficient (61) and the low-channel fusion coefficient (71) by the inverse NSCT to obtain a preliminary fused image, and then covering (8) it with the intensity edge map (81) and the polarization edge map (82) obtained by the edge detection technology;
step S6, algorithm output: finally, a polarization fusion image (9) is obtained.
2. The NSCT image fusion-based image restoration method under the strong scattering environment according to claim 1, wherein in the step S1 of underwater image preprocessing, the improved red channel is:
where I_r(x), I_g(x) and I_b(x) are the pixel values of the red, green and blue channels before compensation.
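The claim's compensation formula itself is not reproduced in this text, so the step can only be sketched under an assumption. The widely used Ancuti-style rule below, which lifts the attenuated red channel using the green channel's mean, is a hypothetical stand-in consistent with the claim's wording:

```python
import numpy as np

def compensate_red(img, alpha=1.0):
    """Red-channel compensation from the green channel (assumed form):
    Irc = Ir + alpha * (mean(Ig) - mean(Ir)) * (1 - Ir) * Ig,
    with channels normalised to [0, 1].  This is NOT the claim's exact
    formula, which is elided in the source text."""
    img = img.astype(float)
    ir, ig = img[..., 0], img[..., 1]
    irc = ir + alpha * (ig.mean() - ir.mean()) * (1.0 - ir) * ig
    out = img.copy()
    out[..., 0] = np.clip(irc, 0.0, 1.0)
    return out

rgb = np.dstack([np.full((4, 4), 0.1),    # weak red, as underwater
                 np.full((4, 4), 0.6),    # strong green
                 np.full((4, 4), 0.5)])
print(compensate_red(rgb)[..., 0].mean() > 0.1)   # True: red is lifted
```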
3. The NSCT image fusion-based image restoration method under the strong scattering environment according to claim 1, wherein in step S3, a low-frequency fusion rule (7) combining region variance with the Laplacian energy sum is applied to the low-frequency subbands of the intensity low-frequency coefficient (112) and the polarization low-frequency coefficient (312) to obtain the final low-channel fusion coefficient (71); for the high-frequency subband part, the high-frequency fusion rule (5), which combines a threshold estimated by the Bayesian shrinkage method with a nonlinear transformation, is selected to adjust the high-frequency subband coefficients, enhance edge detail information and suppress noise, obtaining the fused CS-domain measurement value (51); the OMP reconstruction (6) algorithm then reconstructs this measurement value to obtain the recovered high-frequency coefficient (61).
4. The image restoration method in the strong scattering environment based on NSCT image fusion according to claim 3, wherein in said step S3, NSCT decomposition is performed on the intensity image S0 (11) and the degree-of-polarization image DOP (31) as source images to obtain the intensity low-frequency coefficient (112), the intensity high-frequency coefficient (111), the polarization low-frequency coefficient (312) and the polarization high-frequency coefficient (311); a fusion method combining the region variance and the Laplacian energy sum is proposed for the low-frequency subbands, the region variance being defined as:
where L^M_{J,K}(a, b) represents the low-frequency subband coefficient of image M in the Kth direction of the Jth decomposition scale, σ^M_{J,K}(x, y) and μ^M_{J,K}(x, y) are respectively the variance and the mean over the neighbourhood of the corresponding point (x, y), and the region size r is 3; the modified weighted Laplacian energy sum is defined as:
where SML^M_{J,K}(x, y) is the improved Laplacian energy operator, which takes into account the correlation between the central pixel of the region and its surrounding neighbourhood pixels; step is a variable step size, set to 1; NSML^M_{J,K}(x, y) is the modified weighted Laplacian energy sum; and W is the weighting matrix, valued as follows:
A decision map L_map is obtained from the weighted decision on the variance and the improved NSML, and the low-frequency coefficient fusion expression is:
L_f = L_map * L_A + (1 - L_map) * L_B
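A minimal sketch of the decision-map fusion L_f = L_map * L_A + (1 - L_map) * L_B follows. For brevity it drives the decision by region variance alone; the claim additionally weights in the improved Laplacian energy sum (NSML), which is omitted here, and the helper names are assumptions.

```python
import numpy as np

def region_stat(L, r=3):
    """Per-pixel variance over an r x r neighbourhood (r = 3 in the
    claim), computed with a simple sliding window over an edge-padded
    copy of the subband."""
    pad = r // 2
    Lp = np.pad(L, pad, mode="edge")
    win = np.lib.stride_tricks.sliding_window_view(Lp, (r, r))
    return win.var(axis=(-2, -1))

def fuse_low(LA, LB):
    """Binary decision map from region variance, then
    Lf = Lmap*LA + (1 - Lmap)*LB."""
    Lmap = (region_stat(LA) >= region_stat(LB)).astype(float)
    return Lmap * LA + (1.0 - Lmap) * LB

LA = np.zeros((6, 6)); LA[2:4, 2:4] = 1.0   # textured patch -> high variance
LB = np.full((6, 6), 0.5)                   # flat subband -> zero variance
Lf = fuse_low(LA, LB)
```

With a completely flat L_B, the decision map selects L_A everywhere; in practice the map varies pixel by pixel with the local statistics of both subbands.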
For the high-frequency subbands, a threshold estimated by the Bayesian shrinkage method and a nonlinear transformation are used to adjust the subband coefficients, so that edge detail information is enhanced and noise is suppressed; the Bayesian shrinkage threshold is expressed as:
T = σ_n²(k, s) / σ(k, s)
where σ_n(k, s) and σ(k, s) are respectively the noise standard deviation and the signal standard deviation of the high-frequency subband at the kth scale and the sth direction; the estimate of σ_n(k, s) is obtained with the median method, namely
σ_n(k, s) = median[|g_s^k(i, j)|] / 0.6745
where g_s^k(i, j) is the coefficient of the kth-scale, sth-direction high-frequency subband at position (i, j); σ(k, s) is then obtained according to the maximum-likelihood estimation method:
where σ²(k, s) is the high-frequency subband coefficient variance. The threshold T considers only the correlation between subband coefficients across scales and ignores the correlation among the directional subbands within a scale; since the edge detail information of the NSCT high-frequency subband images behaves differently across the scale directions and the absolute values of the coefficients are small, the threshold is improved according to the mean and minimum characteristics, namely:
where T_s^k is the threshold of the high-frequency subband coefficients at the kth scale and the sth direction; μ^k is the mean of all kth-scale high-frequency subband coefficients in the NSCT domain; μ_s^k and min_s^k are respectively the mean and the minimum of the kth-scale, sth-direction high-frequency subband coefficients; the high-frequency subband coefficients are normalised, with the expression:
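The core BayesShrink estimate T = σ_n²/σ with the median noise estimator can be sketched as follows. This is a minimal sketch under assumptions: the subband data are synthetic, σ is derived from the usual relation σ² = max(var − σ_n², 0), and the fallback for a noise-only band is a common convention rather than something the claim specifies.

```python
import numpy as np

def bayes_shrink_threshold(band):
    """BayesShrink ('Bayesian shrinkage') threshold T = sigma_n^2 / sigma
    for one high-frequency subband, with the robust noise estimate
    sigma_n = median(|g|) / 0.6745."""
    g = np.asarray(band, dtype=float).ravel()
    sigma_n = np.median(np.abs(g)) / 0.6745
    sigma2 = max(g.var() - sigma_n**2, 0.0)      # signal variance estimate
    if sigma2 == 0.0:
        return np.abs(g).max()                   # noise-only band: kill it all
    return sigma_n**2 / np.sqrt(sigma2)

rng = np.random.default_rng(1)
noise = rng.normal(0.0, 0.1, 2000)                         # sigma_n = 0.1
signal = noise + rng.choice([0.0, 2.0], 2000, p=[0.9, 0.1])  # sparse edges
t = bayes_shrink_threshold(signal)
print(t)   # small threshold: strong signal content is preserved
```

A small T on a signal-rich band means only weak coefficients are shrunk, which matches the claim's goal of keeping edge detail while suppressing noise.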
5. The NSCT image fusion-based image restoration method under the strong scattering environment according to claim 1, wherein the orthogonal matching pursuit algorithm in the step S4 is defined as follows:
First, the measured value y of the target signal is input; Φ is the Gaussian measurement matrix, P × Q is the matrix dimension, i is the iteration count, H is the sparsity, and Z is an index set. The output of the algorithm is an H-term sparse approximation x̂ of the signal x; the error margin (residual) is denoted here by U. The inner product of the residual U with each column of the measurement matrix Φ is computed, with initialisation U_0 = y and i = 1; the inner-product expression is:
g_n = Φ^T U_{n−1}
Using the inner product g_n, the index of the entry with maximum absolute value is found:
h = argmax_{j=1,2,...,N} |g_n[j]|
and then, updating the index set:
Z_n = Z_{n−1} ∪ {h}
The columns of the measurement matrix corresponding to the recorded indices form a reconstruction set:
An approximation of the signal is then obtained from the reconstruction set by least squares:
and finally, updating the residual value:
Iteration stops once the iteration count exceeds the sparsity H or another stopping condition is met; otherwise the algorithm returns to the inner-product step and iterates again.
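The OMP loop described above can be sketched directly. It is a minimal sketch, not the patented implementation; in particular, a square orthonormal Φ is used in the toy example only to make the recovery deterministic, whereas the claim's real setting uses a flat (m < n) Gaussian Φ.

```python
import numpy as np

def omp(phi, y, sparsity):
    """Orthogonal matching pursuit: compute inner products
    g_n = Phi^T U_{n-1}, pick h = argmax_j |g_n[j]|, update the index
    set Z_n = Z_{n-1} U {h}, least-squares fit on the chosen columns,
    update the residual, and stop after `sparsity` iterations."""
    n = phi.shape[1]
    residual = y.astype(float).copy()        # U_0 = y
    Z = []
    for _ in range(sparsity):
        g = phi.T @ residual                 # g_n = Phi^T U_{n-1}
        h = int(np.argmax(np.abs(g)))        # h = argmax_j |g_n[j]|
        if h not in Z:
            Z.append(h)                      # Z_n = Z_{n-1} U {h}
        coef, *_ = np.linalg.lstsq(phi[:, Z], y, rcond=None)
        residual = y - phi[:, Z] @ coef      # updated residual U_n
    x_hat = np.zeros(n)
    x_hat[Z] = coef
    return x_hat

rng = np.random.default_rng(0)
phi, _ = np.linalg.qr(rng.normal(size=(8, 8)))  # orthonormal for determinism
x_true = np.zeros(8); x_true[[2, 5]] = [1.0, -2.0]
x_hat = omp(phi, phi @ x_true, sparsity=2)
print(np.allclose(x_hat, x_true))   # True
```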
6. The image restoration method under the strong scattering environment based on the NSCT image fusion according to claim 1, wherein in the step S5 the intensity image S0 (11) is denoted A and the degree-of-polarization image DOP (31) is denoted B; from the residual matrices R_A(i, j) and R_B(i, j) between the preliminary fused image and the source images A and B, the segmentation result of the clear region is defined as:
W_A(i, j) = 1, R_A(i, j) < R_B(i, j);  W_A(i, j) = 0, R_A(i, j) ≥ R_B(i, j)
W_B = 1 − W_A
Elements equal to 1 in W_A and W_B correspond to the clear regions of image A and image B, and elements equal to 0 to the blurred regions. Edge detection is performed on the coefficients {E_A1, ..., E_AJ} and {E_B1, ..., E_BJ} of the first-layer high-frequency components of image A and image B to obtain the edge detection maps; the edge maps of image A and image B are defined as:
According to W_A and W_B, the edges of image A and image B are covered onto the preliminary fusion map to obtain the final fused image F, the calculation formulas being:
F_A(i, j) = F_new(i, j)[1 − W_A(i, j)E_A(i, j)] + A(i, j)W_A(i, j)E_A(i, j)
F(i, j) = F_A(i, j)[1 − W_B(i, j)E_B(i, j)] + B(i, j)W_B(i, j)E_B(i, j)
where F_A(i, j) is the preliminary fused image after the edges of image A have been covered, A(i, j) and B(i, j) are the pixel values of the source images A and B at point (i, j), and F is the final fused image.
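The two coverage formulas above translate directly into array operations; the toy inputs below are assumptions chosen so one pixel is an edge of the sharper source A.

```python
import numpy as np

def cover_edges(F_new, A, B, WA, EA, WB, EB):
    """Overlay source-image edges on the preliminary fusion F_new
    following the two claim formulas: first image A's edges, then B's."""
    FA = F_new * (1 - WA * EA) + A * WA * EA
    return FA * (1 - WB * EB) + B * WB * EB

F_new = np.full((4, 4), 0.5)             # preliminary fused image
A = np.ones((4, 4)); B = np.zeros((4, 4))
WA = np.zeros((4, 4)); WA[1, 1] = 1      # A is the clearer source at (1, 1)
EA = np.zeros((4, 4)); EA[1, 1] = 1      # an edge pixel of A
WB = 1 - WA
EB = np.zeros((4, 4))                    # B contributes no edges here
F = cover_edges(F_new, A, B, WA, EA, WB, EB)
print(F[1, 1], F[0, 0])   # 1.0 0.5 : edge pixel taken from A, rest kept
```

At pixels where W·E = 0 the preliminary fusion passes through unchanged, so the overlay only touches detected edges in the clear regions.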
Priority Applications (1)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
CN202110987104.6A CN113763267A (en) | 2021-08-26 | 2021-08-26 | Image restoration method under strong scattering environment based on NSCT image fusion |
Publications (1)
Publication Number | Publication Date |
---|---|
CN113763267A true CN113763267A (en) | 2021-12-07 |
Family
ID=78791367
Country Status (1)
Country | Link |
---|---|
CN (1) | CN113763267A (en) |
Cited By (2)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
CN115170498A (en) * | 2022-06-30 | 2022-10-11 | 江苏科技大学 | Underwater polarization imaging method based on multi-index optimization |
CN115797374A (en) * | 2023-02-03 | 2023-03-14 | 长春理工大学 | Airport runway extraction method based on image processing |
Citations (3)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
CN107784638A (en) * | 2017-10-27 | 2018-03-09 | An optimized image enhancement method for Dongba ancient books |
CN108830819A (en) * | 2018-05-23 | 2018-11-16 | An image fusion method and device for depth and infrared images |
CN111968054A (en) * | 2020-08-14 | 2020-11-20 | Underwater image color enhancement method based on latent low-rank representation and image fusion |
Non-Patent Citations (4)
Title |
---|
LIU, Jia et al.: "Image fusion technology combining feature matching degree with edge detection", Computer Technology and Development * |
ZHANG, Bing: "Research on restoration methods for degraded underwater images based on polarization detection", China Masters' Theses Full-text Database, Information Science and Technology * |
WANG, Sheng et al.: "Remote sensing image enhancement based on non-subsampled contourlet transform and weighted guided filtering", Laser & Optoelectronics Progress * |
LAN, Shanying et al.: "Underwater image sharpening algorithm based on NSCT", Microelectronics & Computer * |
Legal Events
Date | Code | Title | Description |
---|---|---|---|
PB01 | Publication | ||
SE01 | Entry into force of request for substantive examination | ||