CN115187471A - Semi-reference image defogging algorithm evaluation method - Google Patents

Semi-reference image defogging algorithm evaluation method


Publication number
CN115187471A
Authority
CN
China
Prior art keywords: image, dehaze, defogged, hazy, matrix
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Pending
Application number
CN202210701142.5A
Other languages
Chinese (zh)
Inventor
王正宁
奚伟航
刘晓宁
丁桢炎
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
University of Electronic Science and Technology of China
Original Assignee
University of Electronic Science and Technology of China
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by University of Electronic Science and Technology of China
Priority to CN202210701142.5A
Publication of CN115187471A
Legal status: Pending


Classifications

    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06T IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T5/00 Image enhancement or restoration
    • G06T5/50 Image enhancement or restoration by the use of more than one image, e.g. averaging, subtraction
    • G06T5/73
    • G06T2207/00 Indexing scheme for image analysis or image enhancement
    • G06T2207/10 Image acquisition modality
    • G06T2207/10024 Color image
    • G06T2207/20 Special algorithmic details
    • G06T2207/20021 Dividing image into blocks, subimages or windows
    • G06T2207/20024 Filtering details
    • G06T2207/20112 Image segmentation details
    • G06T2207/20132 Image cropping

Landscapes

  • Physics & Mathematics (AREA)
  • General Physics & Mathematics (AREA)
  • Engineering & Computer Science (AREA)
  • Theoretical Computer Science (AREA)
  • Image Analysis (AREA)

Abstract

The invention discloses a semi-reference image defogging algorithm evaluation method, belonging to the technical field of image processing. Evaluation scores for the defogged image are obtained in four ways: a dark channel prior; the local mean and contrast-normalized coefficients (MSCN) of natural images; a comparison of the gradient information of the defogged image and the original hazy image; and a comparison of the saturation information of the defogged image and the original hazy image. The four evaluation scores are then combined by weighted summation into a final evaluation score. The method combines no-reference and with-reference quality evaluation indexes: the no-reference indexes measure the degree of defogging of the image, while the with-reference indexes evaluate the degree of color cast and abnormal texture by comparing the gradient and saturation between the original hazy image and the defogged image. The likelihood that each pixel introduces color cast or abnormal texture is quantified through the degree of change in gradient and saturation of each pixel of the defogged image relative to the original hazy image.

Description

Semi-reference image defogging algorithm evaluation method
Technical Field
The invention belongs to the technical field of image processing, and particularly relates to a semi-reference image defogging algorithm evaluation method.
Background
Images taken in outdoor scenes are mostly degraded to varying degrees by airborne particles and water droplets, and in foggy scenes the degradation becomes more severe. Image defogging algorithms aim to recover a clear, fog-free image from the input captured by a camera in a foggy scene; such recovered images are of great significance to high-level vision tasks such as object detection and semantic image segmentation.
Image defogging algorithm evaluation methods aim to give fair scores, consistent with human subjective evaluation, to the defogging results produced by different image defogging algorithms. Evaluation methods are generally divided into with-reference and no-reference methods: a no-reference method gives an objective evaluation of the image to be evaluated without any auxiliary information, while a with-reference method requires auxiliary information (such as a gradient map or a depth map) to evaluate the image. In practical applications, however, it is difficult to conveniently acquire a hazy image together with a strictly corresponding clear image, so most mainstream image defogging algorithms are currently evaluated on synthesized hazy data sets using the two with-reference indexes SSIM and PSNR. In recent years, several new evaluation methods have been proposed. L. K. Choi, J. You and A. C. Bovik, "Referenceless Prediction of Perceptual Fog Density and Perceptual Image Defogging," IEEE Transactions on Image Processing, vol. 24, no. 11, pp. 3888-3901, Nov. 2015, discloses a no-reference index named FADE, which extracts 12 sub-feature indexes for each fixed-size block of the image to be evaluated and finally obtains a score for the whole image by averaging. This index reflects reasonably well how image defogging algorithms handle real-scene images, but the large number of sub-indexes makes the algorithm complex. Liu W., Zhou F., Lu T., Duan J. and Qiu G., "Image Defogging Quality Assessment: Real-World Database and Method," IEEE Transactions on Image Processing, 2021, is another recently proposed evaluation method.
The Chinese patent application No. 202010364743.2, "Image defogging algorithm evaluation method," acquires foggy/fog-free image pairs of the same scenes in different cities over different time periods and on this basis proposes a full-reference evaluation index built on the VSI algorithm. Based on visual saliency, it can focus in a targeted way on the quality of the image regions that human eyes attend to most, but the index cannot be applied to general multi-scene, large-scale data sets. The Chinese patent application No. 201810359895.6, "No-reference objective defogging effect evaluation method," proposes a no-reference image quality index that, starting from the characteristics of the defogged image, comprehensively considers image sharpness (based on the degree of contrast enhancement and the degree of residual fog of the defogged image) and color fidelity (based on the degree of hue shift and of supersaturation of the defogged image). It can effectively evaluate the overall quality of a defogged image, but suffers a certain degree of distortion when facing phenomena such as color cast and local blocking artifacts. The Chinese patent application No. 201710072123.X, "Method and device for evaluating the defogging effect of haze images," proposes four feature indexes based on residual-fog measurement, contrast measurement and saturation measurement related to dark channel prior information; the four feature values of the image to be evaluated are first obtained and then weighted to produce the quality score of the defogged image.
Disclosure of Invention
The invention aims to solve the technical problems that existing evaluation indexes for image defogging algorithms cannot give an accurate score to the defogging effect of an image, and that no effective evaluation index exists for real-scene hazy images after defogging.
The invention provides a semi-reference image defogging algorithm evaluation method, which comprises the following steps:
Step 1: input a defogged image I_dehaze and the corresponding original hazy image I_hazy.
Step 2: obtain evaluation scores for the defogged image in four ways:
according to the DCP (dark channel prior) method, obtain a no-reference evaluation score S_11 of the defogged image; S_11 estimates the degree of residual fog;
according to the MSCN index (local mean and contrast-normalized coefficients of natural images), obtain a no-reference evaluation score S_12 of the defogged image; S_12 likewise estimates the degree of residual fog;
by comparing the gradient information of the defogged image and the original hazy image, obtain a with-reference evaluation score S_21 of the defogged image; S_21 estimates whether the defogged image contains abnormal texture;
by comparing the saturation information of the defogged image and the original hazy image, obtain a with-reference evaluation score S_22 of the defogged image; S_22 estimates whether the defogged image exhibits color cast.
Step 3: perform a weighted summation of the no-reference evaluation scores S_11 and S_12 and the with-reference evaluation scores S_21 and S_22, and output the image quality evaluation score of the defogged image.
Preferably, in step 2, the no-reference evaluation score S_11 of the defogged image is obtained as follows:
If the color space of image I_dehaze is not RGB, convert it to the RGB color space and denote the result as image I; if the color space of I_dehaze is RGB, image I is I_dehaze itself.
Divide image I into several non-overlapping image blocks P of size p × p.
In this step, the uniform blocking of the image can be done in either of two ways:
Mode 1: first crop image I so that the width and height of the cropped image are integral multiples of p; then uniformly partition the cropped image into non-overlapping p × p image blocks P.
Mode 2: directly partition image I uniformly and without overlap into p × p blocks, usually proceeding from left to right and from top to bottom (or the reverse); after the partition is completed, discard any blocks smaller than p × p.
For any image block P, first compute the minimum over the pixels of P in each of the three color channels R (red), G (green) and B (blue), obtaining R_min, G_min, B_min:

R_min = min_{x∈P} C_R(P(x)),  G_min = min_{x∈P} C_G(P(x)),  B_min = min_{x∈P} C_B(P(x))   (1)

where P(x) denotes a single pixel in P, and C_R(P(x)), C_G(P(x)), C_B(P(x)) denote the R, G and B channel values of pixel P(x), respectively.
Take the minimum of R_min, G_min and B_min as the score s of image block P:

s = P_min = min(R_min, G_min, B_min)   (2)
based on S normalization corresponding to all image blocks P, then averaging to obtain S 11
Figure BDA0003703965590000032
Where i denotes the number of the image block P, s i Denotes the score of the image block P numbered i, N denotes the number of image blocks P, s min ,s max Respectively representing the minimum value and the maximum value of the scores s corresponding to the N image blocks P.
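As an illustrative sketch (not taken from the patent text), the block-wise dark-channel score S_11 of equations (1)-(3) can be computed as follows; the function name, the demo image, and the 10 × 10 block size of the embodiment are assumptions for illustration:

```python
# Illustrative sketch of the S_11 computation (dark channel prior):
# per-block minimum over pixels and channels, then 0-100 normalisation.
import numpy as np

def dcp_score(img_rgb, p=10):
    """Block-wise dark-channel score: for each non-overlapping p x p block,
    s = min over all pixels and all of R, G, B; then normalise the block
    scores to 0-100 and return their mean (S_11)."""
    h, w = img_rgb.shape[:2]
    scores = []
    for top in range(0, h - h % p, p):          # discard partial edge blocks
        for left in range(0, w - w % p, p):
            block = img_rgb[top:top + p, left:left + p, :]
            scores.append(block.min())          # min over pixels and channels
    s = np.asarray(scores, dtype=float)
    if s.max() == s.min():                      # guard: flat image
        return 0.0
    return float(np.mean(100.0 * (s - s.min()) / (s.max() - s.min())))

# Demo: four 10x10 blocks whose dark-channel minima are 0, 10, 20, 30.
demo = np.zeros((20, 20, 3))
demo[:10, 10:] = 10
demo[10:, :10] = 20
demo[10:, 10:] = 30
s11 = dcp_score(demo)   # normalised scores 0, 33.3, 66.7, 100 -> mean 50
```

A lower dark-channel minimum in a block indicates less residual fog, so darker blocks pull the score down.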
Preferably, in step 2, the no-reference evaluation score S_12 of the defogged image is obtained as follows:
Obtain the grayscale image I_gray of image I, where the preferred grayscale conversion is

I_gray = 0.3·I_R + 0.59·I_G + 0.11·I_B   (4)

where I_R, I_G and I_B denote the R, G and B channel values of image I, respectively.
Then pad the boundary of the grayscale image I_gray and filter it with a selected Gaussian-like filter kernel ω to obtain a first response map μ of the same size as image I:

μ(i, j) = Σ_{k=−K..K} Σ_{l=−L..L} ω_{k,l} · I_gray(i+k, j+l)   (5)

where k and l are the spatial indices within the kernel, ω_{k,l} denotes the value of ω at the given spatial index position, and i and j denote the spatial indices of a pixel in image I.
Obtain a second response map σ of the same size as image I:

σ(i, j) = sqrt( Σ_{k=−K..K} Σ_{l=−L..L} ω_{k,l} · (I_gray(i+k, j+l) − μ(i, j))² )   (6)

Compute the no-reference evaluation score S_12:

S_12 = (1/(W·H)) Σ_{i=1..H} Σ_{j=1..W} (I_gray(i, j) − μ(i, j)) / (σ(i, j) + 1)   (7)

where W and H denote the width and height of image I, respectively.
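A minimal sketch of the MSCN-based score follows. The Gaussian kernel values and the aggregation in equation (7) are assumptions for illustration (the standard MSCN coefficient (I − μ)/(σ + 1), averaged over the image), since the patent gives the kernel only as an image:

```python
# Illustrative sketch of the MSCN-based score S_12; kernel values and the
# final aggregation are assumed, not taken from the patent.
import numpy as np

def gaussian_kernel(k=3, sigma=1.5):
    """(2k+1) x (2k+1) Gaussian-like kernel omega, normalised to sum to 1."""
    ax = np.arange(-k, k + 1)
    xx, yy = np.meshgrid(ax, ax)
    w = np.exp(-(xx ** 2 + yy ** 2) / (2 * sigma ** 2))
    return w / w.sum()

def mscn_score(gray, k=3, sigma=1.5):
    """Local mean mu (eq. (5)) and deviation sigma (eq. (6)) via the kernel,
    then the mean MSCN coefficient over the image (assumed form of eq. (7))."""
    w = gaussian_kernel(k, sigma)
    pad = np.pad(gray.astype(float), k, mode='edge')   # boundary filling
    h, wd = gray.shape
    mu = np.empty((h, wd))
    var = np.empty((h, wd))
    for i in range(h):
        for j in range(wd):
            win = pad[i:i + 2 * k + 1, j:j + 2 * k + 1]
            mu[i, j] = (w * win).sum()
            var[i, j] = (w * (win - mu[i, j]) ** 2).sum()
    return float(np.mean((gray - mu) / (np.sqrt(var) + 1.0)))

const_score = mscn_score(np.full((8, 8), 5.0))   # flat image -> coefficient 0
```

A fog-free natural image yields MSCN coefficients with a characteristic spread, while residual fog flattens the local contrast, which is what this score probes.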
Preferably, in step 2, the with-reference evaluation score S_21 of the defogged image is obtained as follows:
Obtain the gradient value of each pixel of images I_dehaze and I_hazy:

G_dehaze(i) = ∂²f(i)/∂x² + ∂²f(i)/∂y²,  G_hazy(j) = ∂²f(j)/∂x² + ∂²f(j)/∂y²   (8)

where ∂²/∂x² and ∂²/∂y² denote the second partial derivatives in the x and y directions, respectively, f(i) denotes the value of the i-th pixel of image I_dehaze, and f(j) denotes the value of the j-th pixel of image I_hazy; G_dehaze(i) and G_hazy(j) denote the gradient magnitudes of the i-th pixel of I_dehaze and the j-th pixel of I_hazy, respectively.
From all G_dehaze(i) form a gradient matrix G_dehaze, and from all G_hazy(j) form a gradient matrix G_hazy. Transform and compare the gradient matrices G_dehaze and G_hazy to obtain a comparison metric at each position, denoted Num_i, where i = 2, 3, ..., W×H.
Average the Num_i values to obtain S_21:

S_21 = (1/(W×H − 1)) Σ_{i=2..W×H} Num_i   (9)

The transformation and comparison of the gradient matrices G_dehaze and G_hazy, and the computation of the comparison metric, proceed as follows:
Flatten the gradient matrices G_dehaze and G_hazy of size W × H into one-dimensional matrices M_dehaze and M_hazy of size 1 × (W×H), and stack M_dehaze and M_hazy row-wise to obtain a matrix M of size 2 × (W×H).
Sort the columns of M (first-row and second-row elements together) in descending or ascending order of the second-row elements to obtain a matrix M'; the second row of M' is then fully ordered, and the first row of M' is taken out as the 1 × (W×H) matrix M_compare.
For the i-th element of M_compare (i ∈ {2, 3, ..., W×H}), compare it in turn with each of the preceding i−1 elements (i−1 comparisons in total), and record the number of those elements that are smaller (or larger) than the i-th element as Num_i.
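The transformation-and-comparison procedure above can be sketched as follows. This is an illustrative sketch: ascending sort order, the "smaller than" counting convention, and the plain mean for the final score are assumptions (the patent allows either sort direction):

```python
# Illustrative sketch of the Num_i comparison metric for S_21.
import numpy as np

def compare_counts(g_dehaze, g_hazy):
    """Flatten both gradient matrices, sort the dehazed values by the
    corresponding hazy values (the M_compare construction), then for each
    position i count how many preceding elements are smaller (Num_i)."""
    m_dehaze = np.ravel(g_dehaze)
    m_hazy = np.ravel(g_hazy)
    m_compare = m_dehaze[np.argsort(m_hazy, kind='stable')]
    return [int(np.sum(m_compare[:i] < m_compare[i]))
            for i in range(1, len(m_compare))]   # i = 2 .. W*H in 1-based terms

def s21_score(g_dehaze, g_hazy):
    """Average of the Num_i values (plain mean assumed for equation (9))."""
    return float(np.mean(compare_counts(g_dehaze, g_hazy)))

# Demo: identical gradient orderings give Num_i = i - 1 at every position.
nums_demo = compare_counts(np.array([1, 2, 3, 4]), np.array([1, 2, 3, 4]))
s21_demo = s21_score(np.array([1, 2, 3, 4]), np.array([1, 2, 3, 4]))
```

Intuitively, when defogging preserves the relative ordering of gradient magnitudes, each Num_i approaches its maximum i − 1; disturbed orderings, which suggest newly introduced texture, lower the counts.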
In computing S_21, the invention incorporates the gradient information of the original hazy image, providing a with-reference evaluation index that can effectively judge whether abnormal texture appears in the defogged image, without relying on generic features learned from large numbers of clear, fog-free images.
Preferably, in step 2, the with-reference evaluation score S_22 of the defogged image is obtained as follows:
If the color space of I_dehaze and I_hazy is not HSV, perform color space conversion and extract the values of the S (saturation) channel of HSV space, obtaining I_dehaze_s and I_hazy_s; if the color space of I_dehaze and I_hazy is HSV, directly extract the S-channel values to obtain images I_dehaze_s and I_hazy_s.
Obtain the gradients of images I_dehaze_s and I_hazy_s separately, forming gradient matrices G_dehaze_s and G_hazy_s.
Transform and compare the gradient matrices G_dehaze_s and G_hazy_s to obtain comparison metrics (computed in the same way as Num_i), denoted Num_s_i, i = 2, 3, ..., W×H.
Average the Num_s_i values to obtain S_22:

S_22 = (1/(W×H − 1)) Σ_{i=2..W×H} Num_s_i   (10)
Preferably, if the color space of I_dehaze and I_hazy is RGB, the value of the S channel can be obtained as:

V(x) = max(R(x), G(x), B(x))   (11)

S(x) = (V(x) − min(R(x), G(x), B(x))) / V(x), with S(x) = 0 when V(x) = 0   (12)

where x denotes a pixel position in the image, R(x), G(x) and B(x) denote the pixel values in the R, G and B color channels, respectively, and S(x) denotes the pixel value of the S channel.
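A minimal sketch of the S-channel extraction from RGB follows, assuming the standard HSV saturation definition S = (max − min)/max (the patent's own formulas (11)-(12) are not legible in the source):

```python
# Illustrative sketch of extracting the saturation channel from an RGB image.
import numpy as np

def saturation_channel(img_rgb):
    """Per-pixel HSV-style saturation over the last (channel) axis."""
    img = img_rgb.astype(float)
    cmax = img.max(axis=-1)
    cmin = img.min(axis=-1)
    # define S = 0 where the pixel is black (cmax == 0)
    return np.where(cmax > 0, (cmax - cmin) / np.maximum(cmax, 1e-12), 0.0)

# Demo: a pure-red pixel is fully saturated; a gray pixel has zero saturation.
sat_demo = saturation_channel(np.array([[[255, 0, 0], [100, 100, 100]]]))
```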
Likewise, in computing S_22 the invention incorporates the information of the original hazy image, providing a with-reference evaluation index that can effectively judge whether the image exhibits color cast after defogging, without relying on generic features learned from large numbers of clear, fog-free images.
The technical scheme provided by the invention has at least the following beneficial effects:
(1) The method combines no-reference and with-reference quality evaluation indexes: the no-reference indexes measure the degree of defogging of the image, while the with-reference indexes evaluate the degree of color cast and abnormal texture by comparing the gradient and saturation between the original hazy image and the defogged image.
(2) The likelihood that each pixel introduces color cast or abnormal texture is quantified through the degree of change in gradient and saturation of each pixel of the defogged image relative to the original hazy image.
Drawings
In order to more clearly illustrate the technical solutions in the embodiments of the present invention, the drawings needed to be used in the description of the embodiments will be briefly introduced below, and it is obvious that the drawings in the following description are only some embodiments of the present invention, and it is obvious for those skilled in the art to obtain other drawings based on these drawings without creative efforts.
Fig. 1 is a processing flow chart of a semi-reference image defogging algorithm evaluation method according to an embodiment of the present invention.
FIG. 2 is a schematic diagram of the construction process of the M_compare matrix in an embodiment of the present invention.
Detailed Description
To make the objects, technical solutions and advantages of the present invention more apparent, embodiments of the present invention will be described in detail with reference to the accompanying drawings.
The embodiment of the invention provides a semi-reference image defogging algorithm evaluation method that scores the output images produced when different image defogging algorithms process hazy images, giving each output a score consistent with human subjective evaluation by combining the information of the corresponding input hazy image. Through an evaluation index that combines with-reference and no-reference evaluation, the embodiment can provide reliable evaluation scores for the outputs of different defogging algorithms on real-scene hazy images. Two aspects are considered: the degree of residual fog in the image, and whether the defogged image exhibits color cast or abnormal texture. The degree of residual fog in the output image is estimated with two no-reference indexes, the dark channel prior and MSCN; whether the output image exhibits color cast or abnormal texture is judged with two with-reference indexes, gradient and saturation, in combination with the original hazy image.
As shown in FIG. 1, the specific implementation steps of the semi-reference image defogging algorithm evaluation method provided by the embodiment of the present invention are as follows:
Step S1: input the defogged image I_dehaze and the corresponding original hazy image I_hazy.
Step S2: obtain the no-reference evaluation score S_11 of the defogged image according to the DCP method.
Step S201: for an output image I_dehaze produced by any image defogging algorithm (in this embodiment, I_dehaze is in RGB), denote it as image I, with width W and height H. Divide image I horizontally and vertically into non-overlapping 10 × 10 image blocks P, obtaining N blocks; after the division, discard any blocks smaller than 10 × 10.
For a given image block P, first compute the minimum value of each pixel of P over the three color channels R, G and B, obtaining R_min, G_min and B_min according to formula (1).
Step S202: take the minimum of R_min, G_min and B_min as the score s of image block P.
Step S203: normalize the scores s of the N image blocks P to the range 0-100 and compute their mean to obtain S_11, as given by formula (3).
Step S3: obtain the no-reference evaluation score S_12 of the defogged image according to the MSCN index.
Step S301: obtain the grayscale image I_gray of image I according to formula (4).
Step S302: pad the boundary of the grayscale image I_gray; in this embodiment, the padding width is set to 3.
Step S303: select a Gaussian-like filter kernel ω and filter the grayscale image I_gray with ω using a stride of 1 (the numeric values of ω are given as an image in the original and are not reproduced here); according to formula (5), obtain a response map μ of the same size as image I, where in this embodiment the values of K and L are both 3.
Step S304: similarly to step S303, obtain a response map σ of the same size as image I according to formula (6).
Step S305: compute S_12 according to formula (7).
Step S4: obtain the with-reference evaluation score S_21 of the defogged image by comparing the gradient information of the defogged image and the original hazy image.
Step S401: compute the gradient value of each pixel of images I_dehaze and I_hazy according to formula (8), obtaining G_dehaze(i) and G_hazy(j); from all G_dehaze(i) form a gradient matrix G_dehaze of size W × H, and from all G_hazy(j) form a gradient matrix G_hazy of size W × H.
Step S402: as shown in FIG. 2, flatten the gradient matrices G_dehaze and G_hazy of size W × H into one-dimensional matrices M_dehaze and M_hazy of size 1 × (W×H), then stack M_dehaze and M_hazy to obtain a new matrix M of size 2 × (W×H). Sort the columns of M (first-row and second-row elements together) in descending (or ascending) order of the second-row elements to obtain a matrix M'; the second row of M' is then fully ordered, and the elements in the same column of the first and second rows still represent G_dehaze(i) and G_hazy(i) of the same pixel. Take out the first row of M' to obtain the 1 × (W×H) matrix M_compare.
Then, starting from the second element of M_compare, compare the i-th element (i ∈ {2, 3, ..., W×H}) in turn with each of the preceding i−1 elements (i−1 comparisons), and record the number of the preceding i−1 elements that are smaller than the i-th element as Num_i.
Step S403: from the Num_i obtained by comparison, compute S_21 according to formula (9).
Step S5: obtain the with-reference evaluation score S_22 of the defogged image by comparing the saturation information of the defogged image and the original hazy image.
Step S501: convert I_dehaze and I_hazy to HSV space according to formulas (11) and (12), and obtain the S-channel values I_dehaze_s and I_hazy_s.
Step S502: compute the gradients of I_dehaze_s and I_hazy_s according to formula (8); from the gradient values of I_dehaze_s form a gradient matrix G_dehaze_s of size W × H, and from the gradient values of I_hazy_s form a gradient matrix G_hazy_s of size W × H.
Step S503: following the same procedure as step S402, transform and compare the gradient matrices G_dehaze_s and G_hazy_s to obtain the comparison metrics, denoted Num_s_i, i = 2, 3, ..., W×H.
Step S504: from the Num_s_i obtained by comparison, compute S_22 according to formula (10).
Step S6: compute the final score Score by weighted summation:

Score = α_1·S_11 + α_2·S_12 + α_3·S_21 + α_4·S_22   (14)

where α_1, α_2, α_3 and α_4 are the weight coefficients of the four evaluation scores; in this embodiment they are 0.2, 0.3, 0.25 and 0.25, respectively.
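The weighted summation of step S6 can be sketched as follows; the function name is an assumption for illustration, and the weights are the example values of this embodiment:

```python
# Illustrative sketch of the final weighted summation (formula (14)).
def final_score(s11, s12, s21, s22, weights=(0.2, 0.3, 0.25, 0.25)):
    a1, a2, a3, a4 = weights
    return a1 * s11 + a2 * s12 + a3 * s21 + a4 * s22

score = final_score(50.0, 40.0, 30.0, 20.0)   # 10 + 12 + 7.5 + 5 = 34.5
```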
The foregoing is merely a detailed description of specific embodiments of the invention and is not intended to limit the invention. Various alterations, modifications and improvements will occur to those skilled in the art without departing from the spirit and scope of the invention.
Finally, it should be noted that: the above examples are only intended to illustrate the technical solution of the present invention, but not to limit it; although the present invention has been described in detail with reference to the foregoing embodiments, it will be understood by those of ordinary skill in the art that: the technical solutions described in the foregoing embodiments may still be modified, or some technical features may be equivalently replaced; and such modifications or substitutions do not depart from the spirit and scope of the corresponding technical solutions of the embodiments of the present invention.

Claims (8)

1. A semi-reference image defogging algorithm evaluation method, characterized by comprising the following steps:
Step 1: input a defogged image I_dehaze and the corresponding original hazy image I_hazy.
Step 2: obtain evaluation scores for the defogged image in four ways:
obtain a no-reference evaluation score S_11 of the defogged image according to the dark channel prior method;
obtain a no-reference evaluation score S_12 of the defogged image according to the local mean and contrast-normalized coefficients of natural images;
obtain a with-reference evaluation score S_21 of the defogged image by comparing the gradient information of the defogged image and the original hazy image;
obtain a with-reference evaluation score S_22 of the defogged image by comparing the saturation information of the defogged image and the original hazy image.
Step 3: perform a weighted summation of the no-reference evaluation scores S_11 and S_12 and the with-reference evaluation scores S_21 and S_22, and output the image quality evaluation score of image I_dehaze.
2. The method according to claim 1, wherein in step 2 the no-reference evaluation score S_11 of the defogged image is obtained as follows:
if the color space of image I_dehaze is not RGB, convert it to the RGB color space and denote the result as image I; if image I_dehaze is in RGB, take I_dehaze directly as image I;
uniformly partition image I into several non-overlapping image blocks P of size p × p;
compute the score s of each image block P: compute the minimum of each pixel of P over the three color channels R, G and B, obtaining R_min, G_min, B_min, and take the minimum of R_min, G_min and B_min as the score s of the current block P;
normalize the scores s of all image blocks P of image I to obtain normalized scores s', and obtain the no-reference evaluation score S_11 as the mean of the normalized scores s' over all image blocks P.
3. The method according to claim 2, wherein normalizing the scores s of all image blocks P is specifically:

s' = 100 · (s − s_min) / (s_max − s_min)

where s_min and s_max denote the minimum and maximum of the scores s over all image blocks P of image I.
4. The method according to claim 1, wherein in step 2 the no-reference evaluation score S_12 of the defogged image is obtained as follows:
if the color space of image I_dehaze is not RGB, convert it to the RGB color space and denote the result as image I; if image I_dehaze is in RGB, take I_dehaze directly as image I;
obtain the grayscale image I_gray of image I;
pad the boundary of the grayscale image I_gray, then filter it with a selected Gaussian-like filter kernel ω to obtain a first response map μ:

μ(i, j) = Σ_{k=−K..K} Σ_{l=−L..L} ω_{k,l} · I_gray(i+k, j+l)

where μ(i, j) denotes a pixel value of the first response map μ, i and j denote the spatial indices of an image pixel, k and l denote spatial index positions within the Gaussian-like filter kernel ω, ω_{k,l} denotes the value of ω at the given spatial index, and K and L denote the half-width and half-height of ω, respectively;
compute a second response map σ:

σ(i, j) = sqrt( Σ_{k=−K..K} Σ_{l=−L..L} ω_{k,l} · (I_gray(i+k, j+l) − μ(i, j))² )

compute the no-reference evaluation score S_12:

S_12 = (1/(W·H)) Σ_{i=1..H} Σ_{j=1..W} (I_gray(i, j) − μ(i, j)) / (σ(i, j) + 1)

where W and H denote the width and height of image I, respectively.
5. The method according to claim 4, wherein the grayscale image I_gray of image I is obtained as:

I_gray = 0.3·I_R + 0.59·I_G + 0.11·I_B

where I_R, I_G and I_B denote the R, G and B channel values of image I, respectively.
6. The method of claim 1, wherein in step 2, the defogged image is acquired with a reference evaluation score S 21 The method specifically comprises the following steps:
obtaining the gradient value of each pixel point of the images I_dehaze and I_hazy to obtain gradient matrices G_dehaze and G_hazy;
performing transformation and comparison processing on the gradient matrices G_dehaze and G_hazy to obtain the comparison metric value at each position, recorded as Num_i, wherein i = 2, 3, ..., W × H, and W and H represent the width and height of the image I_dehaze;
averaging the statistics Num_i to obtain S_21:

S_21 = (1 / (W × H − 1)) · Σ_{i=2}^{W×H} Num_i
wherein performing transformation and comparison processing on the two gradient matrices to obtain the comparison metric specifically comprises:
flattening the two gradient matrices into one-dimensional matrices, recorded as M1 and M2, wherein the matrix M1 corresponds to the image I_dehaze and M2 corresponds to the image I_hazy;
stacking the matrices M1 and M2 row by row to obtain a two-dimensional matrix M;
sorting the columns of the matrix M in descending or ascending order of the second-row elements, so that the first-row and second-row elements move together, to obtain a matrix M'; taking out the first row of the matrix M' to obtain a 1 × (W × H) matrix M_compare;
sequentially comparing the i-th element of the matrix M_compare with the preceding i − 1 elements, and recording the number of values among the preceding i − 1 elements that are smaller (or larger) than the i-th element as Num_i, wherein i ∈ {2, 3, ..., W × H}.
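The transform-and-compare procedure of claim 6 can be sketched as follows. The 2×2 gradient matrices are hypothetical toy inputs, and taking S_21 as the plain mean of Num_i is an assumed reading of "averaging the statistics":

```python
import numpy as np

def compare_metrics(g_dehaze, g_hazy, count_larger=False):
    # Flatten the two gradient matrices into 1-D arrays M1 (dehazed) and M2 (hazy).
    m1, m2 = g_dehaze.ravel(), g_hazy.ravel()
    # Stack into a 2-row matrix M and sort its columns by the second row (M2),
    # then keep the first row of M': the 1×(W·H) matrix M_compare.
    order = np.argsort(m2)        # ascending order of the hazy gradients
    m_compare = m1[order]
    # Num_i: among the preceding i-1 elements, how many are smaller
    # (or larger) than the i-th element, for i = 2..W·H.
    nums = []
    for i in range(1, m_compare.size):
        prev = m_compare[:i]
        nums.append(int((prev > m_compare[i]).sum() if count_larger
                        else (prev < m_compare[i]).sum()))
    return nums

nums = compare_metrics(np.array([[1.0, 2.0], [3.0, 4.0]]),
                       np.array([[0.1, 0.2], [0.3, 0.4]]))
s21 = sum(nums) / len(nums)  # assumed averaging of Num_i into S_21
```

Intuitively, Num_i measures how well the rank order of dehazed gradients agrees with that of the hazy gradients: perfectly monotone agreement, as in the toy inputs above, maximizes each count.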
7. The method according to claim 6, wherein in step 2, obtaining the semi-reference evaluation score S_22 of the defogged image specifically comprises the following steps:
if I_dehaze and I_hazy are not in the HSV color space, performing color space conversion and extracting the values of the S channel of the HSV space to obtain I_dehaze_s and I_hazy_s; if I_dehaze and I_hazy are in the HSV color space, directly extracting the values of the S channel to obtain the images I_dehaze_s and I_hazy_s;
separately obtaining the gradients of the images I_dehaze_s and I_hazy_s to obtain gradient matrices G_dehaze_s and G_hazy_s;
performing transformation and comparison processing on the gradient matrices G_dehaze_s and G_hazy_s to obtain comparison metric values, recorded as Num_s_i, i = 2, 3, ..., W × H;
averaging the statistics Num_s_i to obtain S_22:

S_22 = (1 / (W × H − 1)) · Σ_{i=2}^{W×H} Num_s_i
8. The method according to claim 7, wherein if I_dehaze and I_hazy are in the RGB color space, the values of the S channel extracted from the HSV space are:
R'(x)=R(x)/255,G'(x)=G(x)/255,B'(x)=B(x)/255
Cmax(x)=max(R'(x),G'(x),B'(x))
Cmin(x)=min(R'(x),G'(x),B'(x))
Δ(x)=Cmax(x)-Cmin(x)
S(x) = 0 if Cmax(x) = 0, and S(x) = Δ(x) / Cmax(x) otherwise
where x represents the pixel point position of the image, R (x), G (x), and B (x) represent the pixel values of the image in the three color channels of R, G, and B, respectively, and S (x) represents the pixel value of the S channel.
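The S-channel extraction of claim 8 can be sketched as below; the 1×2 test image (one pure-red pixel, one black pixel) is a hypothetical example:

```python
import numpy as np

def s_channel(rgb):
    # rgb: H×W×3 array with values in [0, 255]; per claim 8.
    p = rgb.astype(float) / 255.0          # R'(x), G'(x), B'(x)
    cmax = p.max(axis=-1)                  # Cmax(x)
    cmin = p.min(axis=-1)                  # Cmin(x)
    delta = cmax - cmin                    # Δ(x)
    # S(x) = 0 where Cmax(x) = 0, else Δ(x) / Cmax(x);
    # the inner np.where avoids a divide-by-zero warning.
    return np.where(cmax == 0, 0.0,
                    delta / np.where(cmax == 0, 1.0, cmax))

s = s_channel(np.array([[[255, 0, 0], [0, 0, 0]]]))
```

A fully saturated color gives S = 1, while black (Cmax = 0) is defined to have S = 0, matching the standard RGB-to-HSV saturation formula.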
CN202210701142.5A 2022-06-20 2022-06-20 Semi-reference image defogging algorithm evaluation method Pending CN115187471A (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN202210701142.5A CN115187471A (en) 2022-06-20 2022-06-20 Semi-reference image defogging algorithm evaluation method

Publications (1)

Publication Number Publication Date
CN115187471A true CN115187471A (en) 2022-10-14

Family

ID=83515745

Family Applications (1)

Application Number Title Priority Date Filing Date
CN202210701142.5A Pending CN115187471A (en) 2022-06-20 2022-06-20 Semi-reference image defogging algorithm evaluation method

Country Status (1)

Country Link
CN (1) CN115187471A (en)

Similar Documents

Publication Publication Date Title
CN111292257B (en) Retinex-based image enhancement method in scotopic vision environment
Zhang et al. A no-reference evaluation metric for low-light image enhancement
US10145790B2 (en) Image processing apparatus, image processing method, image capturing device and storage medium
CN109218716B (en) No-reference tone mapping image quality evaluation method based on color statistics and information entropy
CN112488948B (en) Underwater image restoration method based on black pixel point estimation back scattering
WO2020223963A1 (en) Computer-implemented method of detecting foreign object on background object in image, apparatus for detecting foreign object on background object in image, and computer-program product
CN113284061B (en) Underwater image enhancement method based on gradient network
CN111242878A (en) Mine underground image enhancement method based on cuckoo search
CN116757988B (en) Infrared and visible light image fusion method based on semantic enrichment and segmentation tasks
CN110706196B (en) Clustering perception-based no-reference tone mapping image quality evaluation algorithm
CN111415304A (en) Underwater vision enhancement method and device based on cascade deep network
CN110910347B (en) Tone mapping image non-reference quality evaluation method based on image segmentation
CN114841846A (en) Self-coding color image robust watermark processing method based on visual perception
CN111462002B (en) Underwater image enhancement and restoration method based on convolutional neural network
CN114881905A (en) Processing method for fusing infrared image and visible light image based on wavelet transformation
Hashim et al. No reference Image Quality Measure for Hazy Images.
CN111476739B (en) Underwater image enhancement method, system and storage medium
CN115187471A (en) Semi-reference image defogging algorithm evaluation method
Yuan et al. Color image quality assessment with multi deep convolutional networks
CN112950592B (en) Non-reference light field image quality evaluation method based on high-dimensional discrete cosine transform
CN111354048B (en) Quality evaluation method and device for obtaining pictures by facing camera
CN114549386A (en) Multi-exposure image fusion method based on self-adaptive illumination consistency
CN112508847A (en) Image quality evaluation method based on depth feature and structure weighted LBP feature
CN112581453B (en) Depth, structure and angle-based non-reference light field image quality evaluation method
CN117575953B (en) Detail enhancement method for high-resolution forestry remote sensing image

Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination