CN110996096B - Tone mapping image quality evaluation method based on structural similarity difference - Google Patents


Info

Publication number
CN110996096B
CN110996096B (application CN201911349334.9A)
Authority
CN
China
Prior art keywords
image
neighborhood
quality evaluation
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Active
Application number
CN201911349334.9A
Other languages
Chinese (zh)
Other versions
CN110996096A
Inventor
徐翘楚
汪斌
陈淑聪
姜飞龙
朱海滨
毛凌航
张奥
李兴隆
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Jiaxing University
Original Assignee
Jiaxing University
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Application filed by Jiaxing University filed Critical Jiaxing University
Priority to CN201911349334.9A priority Critical patent/CN110996096B/en
Publication of CN110996096A publication Critical patent/CN110996096A/en
Application granted granted Critical
Publication of CN110996096B publication Critical patent/CN110996096B/en
Active legal-status Critical Current
Anticipated expiration legal-status Critical

Classifications

    • H ELECTRICITY
    • H04 ELECTRIC COMMUNICATION TECHNIQUE
    • H04N PICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N 17/00 Diagnosis, testing or measuring for television systems or their details

Abstract

The invention discloses a tone mapping image quality evaluation method based on structural similarity difference. The method first extracts the mean and variance of a gradient binarization matrix of the image as local gradient features; it then extracts the mean, variance, kurtosis and skewness of the minimum directional curvature as local structure features; it next uses the uniform local binary pattern histogram of the phase image as a global phase feature; finally, it takes the mean, variance, kurtosis and skewness of the local structural similarity (SSIM) difference between neighborhood pixel blocks and the central pixel block as local neighborhood structural similarity difference features. These are fused into an overall image quality evaluation feature, which is sent to a support vector regression machine for training and testing to obtain an objective image quality evaluation result. The method avoids the low evaluation precision that results from using a single global or local feature, and improves image quality evaluation precision.

Description

Tone mapping image quality evaluation method based on structural similarity difference
Technical Field
The invention belongs to the field of image processing, and particularly relates to a tone mapping image quality evaluation method based on structural similarity difference.
Background
Image quality evaluation is a key problem in the field of image processing. Image quality evaluation methods can be divided into subjective and objective methods according to whether a human participates. In subjective image quality evaluation, people score the images; the result is accurate, but the process is complex and time-consuming, making real-time application difficult. Objective image quality evaluation automatically predicts image quality through a computer algorithm without human participation, and can be divided into full-reference, semi-reference and no-reference methods according to whether the original undistorted image is used as a reference. A full-reference algorithm predicts image quality using all information of the reference image, a semi-reference method uses partial information of the reference image, and a no-reference method evaluates image quality without any information from a reference image.
A High Dynamic Range (HDR) image has a large display brightness range and brings a better visual experience to the viewer, but most existing graphic display devices only support displaying 8-bit Low Dynamic Range (LDR) images, so a tone mapping algorithm is needed to convert the HDR image into an LDR image; the LDR image converted from an HDR image is called a tone-mapped image. In the tone mapping process, distortions of contrast, naturalness and color saturation arise, while distortions such as blocking artifacts, blur and white noise are rare. Conventional no-reference image quality evaluation methods, however, mainly target natural images and distortions such as blocking artifacts, blur and white noise, and perform poorly on tone-mapped images, so no-reference quality evaluation of tone-mapped distorted images is more challenging. For no-reference quality evaluation of tone-mapped images, Guanghui Yue [Guanghui Yue, Chunping Hou, and Tianwei Zhou, Blind Quality Assessment of Tone-Mapped Images Considering Colorfulness, Naturalness and Structure, IEEE Transactions on Industrial Electronics, 2018] extracts colorfulness, naturalness and structural features of the tone-mapped distorted image to evaluate its quality; Gangyi Jiang [blind tone-mapped image quality assessment based on bright/dark regions, naturalness and aesthetics, IEEE Access, vol. 6, pp. 2231-2240, 2018] uses natural statistical and aesthetic characteristics of the image for quality evaluation. However, these methods mainly examine natural statistics and global features of the image, and mostly adopt traditional feature extraction methods that are effective for natural images, so their results deviate from subjective evaluation.
In the process of image tone mapping, the local details and the global phase of the image change greatly, so the method extracts the image gradient, the minimum curvature and the Structural Similarity Index (SSIM) difference of the image block neighborhood to evaluate local tone mapping distortion, and adopts phase texture features to evaluate global tone mapping distortion, thereby improving the quality evaluation precision of tone-mapped images.
Disclosure of Invention
The invention aims to provide a tone mapping image quality evaluation method based on structural similarity difference aiming at the defects of the prior art. The method simultaneously extracts the local features and the global features of the image to evaluate the image quality, fully considers the influence of image distortion on the local edge, the local minimum curvature, the local SSIM difference degree and the global phase texture, and has more accurate prediction effect compared with the prior art.
The purpose of the invention is realized by the following technical scheme: a tone mapping image quality evaluation method based on structural similarity difference degree comprises the following steps:
(1) inputting a tone mapping color distortion image and converting the tone mapping color distortion image into a tone mapping gray level distortion image D;
(2) construct the gradient feature F1, comprising the following substeps:
(2.1) processing the tone mapping gray level distortion image D obtained in the step (1) by adopting a Canny operator to obtain a gradient binarization image G;
(2.2) calculating the total number X (i, j) of neighborhood pixels with the value of 1 in the NxN neighborhood of the pixel point G (i, j);
(2.3) calculate the mean μ_X and variance σ_X of the neighborhood pixel totals X(i, j), and construct the gradient feature F1 = [μ_X, σ_X];
(3) construct the minimum directional curvature feature F2, comprising the following substeps:
(3.1) construct first derivative filters h0, h1, h2, h3, h4 and h5 for six directions indexed by n = 0-5 (the angles, beginning at 0, are given as formula images in the original), and convolve each with the tone-mapped grayscale distorted image D obtained in step (1) to obtain the corresponding first derivatives d0(i, j), d1(i, j), d2(i, j), d3(i, j), d4(i, j) and d5(i, j);
(3.2) construct second derivative filters g0, g1, g2, g3, g4 and g5 for the same six directions, and convolve each with the tone-mapped grayscale distorted image D obtained in step (1) to obtain the corresponding second derivatives t0(i, j), t1(i, j), t2(i, j), t3(i, j), t4(i, j) and t5(i, j);
(3.3) calculate the directional curvature K_n(i, j) of the tone-mapped grayscale distorted image D in each of the six directions, using the formula:
K_n(i, j) = |t_n(i, j)| / (1 + d_n(i, j)^2)^(3/2)
where n = 0-5; |·| is the absolute value operation;
(3.4) calculate the minimum directional curvature K(i, j) = min_n K_n(i, j);
(3.5) calculate the mean μ_K, variance σ_K, kurtosis β_K and skewness γ_K of the minimum directional curvature K(i, j), and construct the minimum directional curvature feature F2 = [μ_K, σ_K, β_K, γ_K];
(4) construct the uniform local binary pattern histogram feature F3, comprising the following substeps:
(4.1) carrying out two-dimensional discrete Fourier transform on the tone mapping gray level distortion image D obtained in the step (1) to obtain Fourier transform coefficients Y (u, v), wherein u is a horizontal index of Y (u, v), and v is a vertical index of Y (u, v);
(4.2) obtain the phase image Φ from the Fourier transform coefficients Y(u, v):
Φ(u, v) = arctan(im(Y(u, v)) / re(Y(u, v)))
where re(Y(u, v)) is the real part of Y(u, v), im(Y(u, v)) is the imaginary part of Y(u, v), and arctan(·) is the arctangent operation;
(4.3) calculate the uniform local binary pattern feature ULBP_B of the phase image Φ:
LBP_B(u, v) = Σ_{b=0}^{B-1} T[Φ_b - Φ(u, v)] · 2^b
ULBP_B(u, v) = Σ_{b=0}^{B-1} T[Φ_b - Φ(u, v)] if the B-bit neighborhood pattern contains at most two 0/1 transitions, and B + 1 otherwise
where LBP_B(u, v) is the local binary pattern feature of the phase image Φ; Φ_b ∈ {Φ_0, Φ_1, ..., Φ_{B-1}} are the B neighborhood points of Φ(u, v), with B = 8 and b = 0-B-1; T[Φ_b - Φ(u, v)] = 1 when Φ_b ≥ Φ(u, v), and T[Φ_b - Φ(u, v)] = 0 when Φ_b < Φ(u, v);
(4.4) from the uniform local binary pattern feature ULBP_B obtained in step (4.3), construct the uniform local binary pattern histogram feature F3:
F3 = hist[ULBP_B(u, v)] = [f_0, f_1, f_2, ..., f_{B+1}]
where hist[·] is the histogram operation and f_k is the number of elements in histogram bin k, k = 0-B+1;
(5) construct the neighborhood structural similarity difference feature F4, comprising the following substeps:
(5.1) divide the tone-mapped grayscale distorted image D obtained in step (1) into a matrix of image blocks A(ε, η) of size W×W;
(5.2) sequentially calculate the structural similarity between image block A and each of its eight neighborhood image blocks (upper, upper-right, right, lower-right, lower, lower-left, left and upper-left) to obtain the neighborhood structural similarity values, denoted S1(ε, η), S2(ε, η), S3(ε, η), S4(ε, η), S5(ε, η), S6(ε, η), S7(ε, η) and S8(ε, η), and average them to obtain the neighborhood structural similarity mean SA(ε, η);
(5.3) sequentially compare the neighborhood structural similarity values obtained in step (5.2) with the mean SA(ε, η) to obtain the corresponding neighborhood structural similarity contrast values L1(ε, η), L2(ε, η), L3(ε, η), L4(ε, η), L5(ε, η), L6(ε, η), L7(ε, η) and L8(ε, η); when a neighborhood structural similarity value is greater than or equal to the mean, its contrast value is 1, otherwise it is 0;
(5.4) arrange the neighborhood structural similarity contrast values obtained in step (5.3) in the order L1(ε, η) to L8(ε, η) to form a binary sequence, and convert it into a decimal integer, which serves as the neighborhood structural similarity difference Q(ε, η) of image block A;
(5.5) calculate the mean μ_Q, variance σ_Q, kurtosis β_Q and skewness γ_Q of the neighborhood structural similarity difference Q(ε, η), and construct the neighborhood structural similarity difference feature F4 = [μ_Q, σ_Q, β_Q, γ_Q];
(6) from the gradient feature F1 obtained in step (2), the minimum directional curvature feature F2 obtained in step (3), the uniform local binary pattern histogram feature F3 obtained in step (4) and the neighborhood structural similarity difference feature F4 obtained in step (5), construct the image quality evaluation feature F = [F1, F2, F3, F4];
(7) send the image quality evaluation features F obtained in step (6), together with the corresponding mean opinion scores, to a support vector regression machine for training;
(8) extract the image quality evaluation feature F of the image to be evaluated according to steps (1) to (6), and input it into the support vector regression machine trained in step (7) to obtain the image quality evaluation result.
Further, in the step (2), the standard deviation of the gaussian filter in the Canny operator is 1.5, and the dual thresholds are 0.04 and 0.1, respectively.
Further, in step (3.1), the first derivative filters for the six directions are given as coefficient matrices (reproduced as formula images in the original).
Further, in step (3.2), the second derivative filters for the six directions are given as coefficient matrices (reproduced as formula images in the original).
further, the kernel function of the support vector regression machine is a radial basis function.
The invention has the beneficial effects that: the method first extracts the mean and variance of the gradient binarization matrix of the image as local gradient features, then extracts the mean, variance, kurtosis and skewness of the minimum directional curvature as local structure features, then uses the uniform local binary pattern histogram of the phase image as global phase features, and finally uses the mean, variance, kurtosis and skewness of the local structural similarity difference between neighborhood pixel blocks and the central pixel block as local neighborhood structural similarity difference features; these four groups of features are fused to obtain the total features for image quality evaluation.
Drawings
FIG. 1 is a flow chart of a tone mapping image quality evaluation method based on structural similarity difference;
FIG. 2 is a schematic diagram of an image block A at a (ε, η) position in a tone-mapped grayscale distorted image D and its neighborhood image blocks;
FIG. 3 is a schematic diagram of structural similarity values between an (ε, η) position image block A and a neighborhood image block in a tone-mapped grayscale distorted image D;
fig. 4 is a schematic diagram of the structural similarity contrast values between the image block a at the (epsilon, eta) position in the tone-mapped gray-scale distorted image D and the image blocks in the neighborhood thereof.
Detailed Description
The invention is described in detail below with reference to the accompanying drawings and examples.
The flow of the tone mapping image quality evaluation method based on the structural similarity difference degree is shown in fig. 1, and specifically comprises the following steps:
Step (1): take the 1811 tone-mapped distorted images in the ESPL-LIVE HDR image database of the University of Texas at Austin, which provides the subjective MOS score of each image, as the input image set; randomly divide the input image set into a training image set (80% of the images) and a test image set (20% of the images); extract the tone-mapped color distorted images from the training image set, and convert them into tone-mapped grayscale distorted images D;
step (2): processing the tone mapping gray level distortion image D by adopting a Canny operator to obtain a gradient binarization image G; wherein, the standard deviation of a Gaussian filter in the Canny operator is 1.5, and the dual thresholds are 0.04 and 0.1 respectively;
Step (3): for each pixel (i, j) in the gradient binarization image G obtained in step (2), take the total number X(i, j) of neighborhood pixels with value 1 in the N×N neighborhood centered on (i, j), where N takes the value 3;
Step (4): calculate the mean μ_X and variance σ_X of the neighborhood pixel totals X(i, j) obtained in step (3) over the whole gradient binarization image G:
μ_X = (1 / (W_X × H_X)) Σ_i Σ_j X(i, j)
σ_X = (1 / (W_X × H_X)) Σ_i Σ_j (X(i, j) - μ_X)^2
where W_X is the width and H_X the height of the neighborhood pixel total matrix X;
Step (5): combine the mean μ_X and variance σ_X obtained in step (4) into the gradient feature F1, using the combination formula:
F1 = [μ_X, σ_X]
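Steps (3)-(5) can be sketched in a few lines of NumPy. The sketch assumes the gradient binarization image G from step (2) has already been produced (e.g. by a Canny edge detector), and counts the 3×3 window totals with plain shift-and-add:

```python
import numpy as np

def gradient_feature(G, N=3):
    """Mean and variance of the count of 1-valued pixels in each
    N x N neighborhood of a binary gradient image G (steps (3)-(5))."""
    G = np.asarray(G, dtype=np.float64)
    pad = N // 2
    Gp = np.pad(G, pad, mode="constant")
    # X(i, j): total number of 1-valued pixels in the N x N window at (i, j)
    X = np.zeros_like(G)
    for di in range(N):
        for dj in range(N):
            X += Gp[di:di + G.shape[0], dj:dj + G.shape[1]]
    mu_X = X.mean()
    sigma_X = X.var()  # the patent's sigma_X is the variance
    return np.array([mu_X, sigma_X])  # F1 = [mu_X, sigma_X]

# toy binary edge map standing in for a real Canny output
G = np.array([[0, 1, 0],
              [1, 1, 0],
              [0, 0, 0]])
F1 = gradient_feature(G)
```

The same window counting can of course be done with a box-filter convolution; the explicit loop keeps the sketch dependency-free.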
Step (6): calculate the mean, variance, kurtosis and skewness of the minimum directional curvature of the tone-mapped grayscale distorted image D obtained in step (1), comprising the following substeps:
Step (6.1): construct the first derivative filters h0, h1, h2, h3, h4 and h5 for the six directions (the angles, beginning at 0, and the filter coefficient matrices are given as formula images in the original);
Step (6.2): construct the second derivative filters g0, g1, g2, g3, g4 and g5 for the six directions (coefficient matrices likewise given as formula images in the original);
Step (6.3): convolve each pixel (i, j) of the tone-mapped grayscale distorted image D with the first derivative filters h0, h1, h2, h3, h4 and h5 constructed in step (6.1) to obtain the corresponding first derivatives d0(i, j), d1(i, j), d2(i, j), d3(i, j), d4(i, j) and d5(i, j);
Step (6.4): convolve each pixel (i, j) of the tone-mapped grayscale distorted image D with the second derivative filters g0, g1, g2, g3, g4 and g5 constructed in step (6.2) to obtain the corresponding second derivatives t0(i, j), t1(i, j), t2(i, j), t3(i, j), t4(i, j) and t5(i, j);
Step (6.5): calculating the pixel at (i, j) of the tone-mapped gray-scale distorted image D at an angle of 0,
Figure BDA0002334280260000081
Figure BDA0002334280260000082
And
Figure BDA0002334280260000083
direction of curvature Kn(i, j), the calculation formula is as follows:
Figure BDA0002334280260000084
wherein, tn(i, j) is the second derivative of the pixel at (i, j) in the nth direction, dn(i, j) is the first derivative of the pixel at the position (i, j) in the nth direction, and n is 0-5, which corresponds to 0,
Figure BDA0002334280260000085
And
Figure BDA0002334280260000086
these 6 directions; i | is an absolute value solving operation;
Step (6.6): calculate the minimum directional curvature over the n directions:
K(i, j) = min_n K_n(i, j)
where min(·) takes the minimum directional curvature over the 6 directions, n = 0-5, and (i, j) are the image pixel coordinates;
Step (6.7): calculate the mean μ_K, variance σ_K, kurtosis β_K and skewness γ_K of the minimum directional curvature K(i, j) obtained in step (6.6) over the whole image:
μ_K = (1/L) Σ_i Σ_j K(i, j)
σ_K = (1/L) Σ_i Σ_j (K(i, j) - μ_K)^2
β_K = (1/L) Σ_i Σ_j [(K(i, j) - μ_K) / √σ_K]^4
γ_K = (1/L) Σ_i Σ_j [(K(i, j) - μ_K) / √σ_K]^3
where W_K is the maximum horizontal index and H_K the maximum vertical index of pixel (i, j) in the image, and L = W_K × H_K is the total number of pixels;
And (7): adopting the mean value mu of the minimum direction curvature obtained in the step (6.7)KVariance σKKurtosis betaKDegree of sum deviation gammaKConstructing a minimum directional curvature feature F2The combination formula is adopted as follows:
F2=[μKKKK]
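The curvature computation of steps (6.3)-(6.7) can be illustrated as follows. The patent's six directional kernels appear only as figures, so this sketch substitutes hypothetical central-difference kernels for the 0° and 90° directions; the curvature formula and the four statistics follow the text:

```python
import numpy as np

def conv2_same(img, k):
    """Tiny same-size 2-D correlation with zero padding (no SciPy needed)."""
    kh, kw = k.shape
    ph, pw = kh // 2, kw // 2
    p = np.pad(img, ((ph, ph), (pw, pw)))
    out = np.zeros_like(img, dtype=np.float64)
    for i in range(kh):
        for j in range(kw):
            out += k[i, j] * p[i:i + img.shape[0], j:j + img.shape[1]]
    return out

def min_curvature_feature(D):
    # Hypothetical central-difference kernels for two directions only;
    # the patent uses six directional kernels given as figures.
    d1 = np.array([[0.0, 0.0, 0.0], [-0.5, 0.0, 0.5], [0.0, 0.0, 0.0]])
    d2 = np.array([[0.0, 0.0, 0.0], [1.0, -2.0, 1.0], [0.0, 0.0, 0.0]])
    firsts = [conv2_same(D, d1), conv2_same(D, d1.T)]
    seconds = [conv2_same(D, d2), conv2_same(D, d2.T)]
    # K_n = |t_n| / (1 + d_n^2)^(3/2); K = minimum over directions
    Ks = [np.abs(t) / (1.0 + d ** 2) ** 1.5 for d, t in zip(firsts, seconds)]
    K = np.minimum.reduce(Ks)
    mu, var = K.mean(), K.var()
    std = np.sqrt(var) if var > 0 else 1.0
    beta = np.mean(((K - mu) / std) ** 4)   # kurtosis
    gamma = np.mean(((K - mu) / std) ** 3)  # skewness
    return np.array([mu, var, beta, gamma])  # F2 = [mu_K, sigma_K, beta_K, gamma_K]

D = np.arange(25, dtype=np.float64).reshape(5, 5)  # toy grayscale image
F2 = min_curvature_feature(D)
```

Extending the sketch to six directions only requires adding the remaining directional kernel pairs to the two lists.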
Step (8): perform a two-dimensional discrete Fourier transform on the tone-mapped grayscale distorted image D obtained in step (1) to obtain its Fourier transform coefficients Y(u, v), where u is the horizontal index and v the vertical index of Y(u, v); the two-dimensional discrete Fourier transform is a standard algorithm in the field;
Step (9): obtain the phase Φ(u, v) from the Fourier transform coefficients Y(u, v), using the formula:
Φ(u, v) = arctan(im(Y(u, v)) / re(Y(u, v)))
where re(Y(u, v)) is the real part of Y(u, v), im(Y(u, v)) is the imaginary part of Y(u, v), and arctan(·) is the arctangent operation;
Step (10): assemble the phase values Φ(u, v) obtained in step (9) into the phase image Φ;
Step (11): calculate the uniform local binary pattern feature ULBP_B of the phase image Φ obtained in step (10) at (u, v), as follows:
LBP_B(u, v) = Σ_{b=0}^{B-1} T[Φ_b - Φ(u, v)] · 2^b
ULBP_B(u, v) = Σ_{b=0}^{B-1} T[Φ_b - Φ(u, v)] if the B-bit neighborhood pattern contains at most two 0/1 transitions, and B + 1 otherwise
where LBP_B(u, v) is the local binary pattern feature and ULBP_B(u, v) the uniform local binary pattern feature of the phase image Φ at (u, v); Φ(u, v) is the value of the phase image Φ at (u, v); Φ_b ∈ {Φ_0, Φ_1, ..., Φ_{B-1}}, b = 0-B-1, are the values of the B neighborhood points of Φ at (u, v); with B = 8, Φ_0, Φ_1, ..., Φ_7 are the eight neighboring phase values of Φ(u, v), beginning with Φ(u+1, v) and Φ(u+1, v+1); T[Φ_b - Φ(u, v)] = 1 when Φ_b ≥ Φ(u, v), and T[Φ_b - Φ(u, v)] = 0 when Φ_b < Φ(u, v);
Step (12): utilizing the uniform local binary pattern characteristic ULBP obtained in the step (11)BCalculating a uniform local binary pattern histogram, combining into a uniform local binary pattern histogram feature F3The calculation method is as follows:
F3=hist[ULBPB(u,v)]=[f0,f1,f2,...,fB+1]
wherein, ULBPB(u, v) is a uniform local binary pattern feature of the phase image phi at (u, v), and the number of groups of the uniform local binary pattern histogram is B +2, hist [ ·]For operation of taking a histogram, fkTaking the number of elements in a histogram group with the value of k in the histogram, wherein k is 0-B + 1;
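Steps (8)-(12) can be sketched with NumPy's FFT. The exact neighbor ordering is only partially legible in the source, so the ordering below is an assumption; the uniform-pattern rule (at most two 0/1 transitions) and the B + 2-bin histogram follow the text:

```python
import numpy as np

def ulbp_histogram(D, B=8):
    """Steps (8)-(12): phase image via 2-D DFT, then a uniform LBP
    histogram with B + 2 bins over the interior pixels."""
    Y = np.fft.fft2(D)
    phi = np.arctan2(Y.imag, Y.real)  # phase image (arctan of im/re)
    H, W = phi.shape
    # offsets of the eight neighbors (one assumed circular ordering)
    offs = [(0, 1), (1, 1), (1, 0), (1, -1),
            (0, -1), (-1, -1), (-1, 0), (-1, 1)]
    hist = np.zeros(B + 2, dtype=np.int64)
    for u in range(1, H - 1):
        for v in range(1, W - 1):
            # T[phi_b - phi(u, v)]: 1 when the neighbor is >= the center
            bits = [1 if phi[u + du, v + dv] >= phi[u, v] else 0
                    for du, dv in offs]
            # uniform if at most two 0/1 transitions around the circle
            trans = sum(bits[b] != bits[(b + 1) % B] for b in range(B))
            label = sum(bits) if trans <= 2 else B + 1
            hist[label] += 1
    return hist  # F3 = [f_0, ..., f_{B+1}]

D = np.random.RandomState(0).rand(16, 16)  # toy grayscale image
F3 = ulbp_histogram(D)
```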
Step (13): calculate the mean, variance, kurtosis and skewness of the neighborhood SSIM (structural similarity) difference of the tone-mapped grayscale distorted image D obtained in step (1), comprising the following substeps:
Step (13.1): divide the tone-mapped grayscale distorted image D into non-overlapping image blocks A of size W×W, A ∈ R^(W×W), where R is the set of real numbers and W is the width and height of block A; the position of block A in image D is denoted (ε, η), where ε is the horizontal index and η the vertical index of block A in D;
Step (13.2): calculate the neighborhood structural similarity values between image block A and its eight neighborhood image blocks. The eight neighborhood blocks of A (as shown in fig. 2) are denoted A1(ε, η), A2(ε, η), A3(ε, η), A4(ε, η), A5(ε, η), A6(ε, η), A7(ε, η) and A8(ε, η), and the corresponding neighborhood structural similarity values (as shown in fig. 3) are denoted S1(ε, η), S2(ε, η), S3(ε, η), S4(ε, η), S5(ε, η), S6(ε, η), S7(ε, η) and S8(ε, η). Taking S1(ε, η) as an example, the calculation formula is:
S1(ε,η)=SSIM[A1(ε,η),A(ε,η)]
where A(ε, η) is the image block at position (ε, η), and A1(ε, η) is the image block in the upper neighborhood position of (ε, η); SSIM[·] is the function computing the structural similarity between two image blocks, implemented by calling the SSIM subfunction of MATLAB, using the algorithm of Zhou Wang [Zhou Wang, Alan Conrad Bovik, Hamid Rahim Sheikh, and Eero P. Simoncelli, Image Quality Assessment: From Error Visibility to Structural Similarity, IEEE Transactions on Image Processing, vol. 13, pp. 600-612, 2004]; the other neighborhood structural similarity values are obtained in the same way;
Step (13.3): calculate the average of the eight neighborhood structural similarity values S1(ε, η), S2(ε, η), S3(ε, η), S4(ε, η), S5(ε, η), S6(ε, η), S7(ε, η) and S8(ε, η) of image block A obtained in step (13.2), and denote the average SA(ε, η);
Step (13.4): compare each of the eight neighborhood structural similarity values S1(ε, η), S2(ε, η), S3(ε, η), S4(ε, η), S5(ε, η), S6(ε, η), S7(ε, η) and S8(ε, η) with SA(ε, η) to obtain eight neighborhood structural similarity contrast values (as shown in fig. 4) L1(ε, η), L2(ε, η), L3(ε, η), L4(ε, η), L5(ε, η), L6(ε, η), L7(ε, η) and L8(ε, η). Taking L1(ε, η) as an example, the calculation formula is:
L1(ε, η) = 1 if S1(ε, η) ≥ SA(ε, η), and L1(ε, η) = 0 otherwise;
step (13.5) forming a 0-1 binary sequence by the eight neighborhood structure similarity contrast values obtained in step (13.4) according to the sequence of L1 (epsilon, eta), L2 (epsilon, eta), L3 (epsilon, eta), L4 (epsilon, eta), L5 (epsilon, eta), L6 (epsilon, eta), L7 (epsilon, eta) and L8 (epsilon, eta), and converting the binary sequence into decimal integers; the decimal integer is the similarity difference of the neighborhood structure of the image block A and is marked as Q (epsilon, eta);
Step (13.6): calculate the mean, variance, kurtosis and skewness of the neighborhood structural similarity difference Q(ε, η) obtained in step (13.5) over the image D, denoted μ_Q, σ_Q, β_Q and γ_Q, using the formulas:
μ_Q = (1/Λ) Σ_ε Σ_η Q(ε, η)
σ_Q = (1/Λ) Σ_ε Σ_η (Q(ε, η) - μ_Q)^2
β_Q = (1/Λ) Σ_ε Σ_η [(Q(ε, η) - μ_Q) / √σ_Q]^4
γ_Q = (1/Λ) Σ_ε Σ_η [(Q(ε, η) - μ_Q) / √σ_Q]^3
where W_Q is the maximum horizontal index and H_Q the maximum vertical index of the neighborhood structural similarity difference Q(ε, η), and Λ = W_Q × H_Q is the total number of values Q(ε, η);
Step (14): combine the mean μ_Q, variance σ_Q, kurtosis β_Q and skewness γ_Q obtained in step (13.6) into the neighborhood structural similarity difference feature F4, using the combination formula:
F4 = [μ_Q, σ_Q, β_Q, γ_Q]
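Steps (13)-(14) can be sketched as follows. Instead of MATLAB's SSIM subfunction, a plain single-window SSIM between equal-size blocks is used (constants from Wang et al. for 8-bit data), and the neighbor ordering is an assumption:

```python
import numpy as np

def ssim_block(a, b, C1=6.5025, C2=58.5225):
    """Plain SSIM between two equal-size blocks (Wang et al., 2004);
    constants assume an 8-bit dynamic range."""
    mu_a, mu_b = a.mean(), b.mean()
    va, vb = a.var(), b.var()
    cov = ((a - mu_a) * (b - mu_b)).mean()
    return ((2 * mu_a * mu_b + C1) * (2 * cov + C2)) / \
           ((mu_a ** 2 + mu_b ** 2 + C1) * (va + vb + C2))

def neighborhood_ssim_difference(D, W=8):
    """Steps (13)-(14): per block, SSIM against its 8 neighbors,
    binarize against the neighborhood mean, pack to a decimal code Q."""
    Hb, Wb = D.shape[0] // W, D.shape[1] // W
    blocks = D[:Hb * W, :Wb * W].reshape(Hb, W, Wb, W).swapaxes(1, 2)
    offs = [(-1, 0), (-1, 1), (0, 1), (1, 1),
            (1, 0), (1, -1), (0, -1), (-1, -1)]  # one assumed L1-L8 order
    Q = []
    for e in range(1, Hb - 1):
        for h in range(1, Wb - 1):
            S = [ssim_block(blocks[e + de, h + dh], blocks[e, h])
                 for de, dh in offs]
            SA = np.mean(S)
            bits = [1 if s >= SA else 0 for s in S]  # contrast values L1-L8
            Q.append(int("".join(map(str, bits)), 2))  # binary -> decimal
    Q = np.asarray(Q, dtype=np.float64)
    mu, var = Q.mean(), Q.var()
    std = np.sqrt(var) if var > 0 else 1.0
    return np.array([mu, var,
                     np.mean(((Q - mu) / std) ** 4),   # kurtosis
                     np.mean(((Q - mu) / std) ** 3)])  # skewness

D = (np.random.RandomState(1).rand(32, 32) * 255).round()  # toy 8-bit image
F4 = neighborhood_ssim_difference(D)
```

Each Q value is an 8-bit code in the range 0-255, so the four statistics summarize its distribution over all interior blocks.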
Step (15): combine the gradient feature F1 obtained in step (5), the minimum directional curvature feature F2 obtained in step (7), the uniform local binary pattern histogram feature F3 obtained in step (12) and the neighborhood structural similarity difference feature F4 obtained in step (14) into the total image quality evaluation feature F, using the combination formula:
F = [F1, F2, F3, F4]
Step (16): send the image quality evaluation features F obtained in step (15), together with the corresponding Mean Opinion Scores (MOS) in the ESPL-LIVE HDR image database, to a support vector regression machine for training, obtaining a trained support vector regression machine;
Step (17): extract the feature vector F from each test set image following the flow of steps (1) to (15), and send it to the support vector regression machine trained in step (16) for testing, obtaining the objective image quality evaluation result; in the above steps, the support vector regression machine is trained and tested using the Libsvm support vector machine toolkit developed at National Taiwan University, with a radial basis function as the kernel function.
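Steps (16)-(17) can be sketched with scikit-learn's SVR as a stand-in for the Libsvm toolkit named above; the feature matrix and MOS scores below are random placeholders, with a 20-dimensional fused feature (2 + 4 + 10 + 4 entries for F1-F4):

```python
import numpy as np
from sklearn.svm import SVR  # stand-in for the Libsvm toolkit

# Hypothetical data: one fused feature vector F per image plus its MOS
# score; random placeholders stand in for real ESPL-LIVE HDR features.
rng = np.random.RandomState(0)
F_train = rng.rand(80, 20)        # e.g. the 80% training split
mos_train = rng.rand(80) * 100    # mean opinion scores
F_test = rng.rand(20, 20)         # the 20% test split

# Radial basis kernel, as specified in the patent; C and epsilon are
# assumed defaults, not values given in the text.
model = SVR(kernel="rbf", C=1.0, epsilon=0.1)
model.fit(F_train, mos_train)
predicted_quality = model.predict(F_test)  # objective quality scores
```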
In summary, the method first extracts the mean and variance of the total number of edge pixels of the gradient image in a local neighborhood as the gradient feature, which fully considers the influence of image distortion on the local edge information of a tone-mapped image; it then extracts the mean, variance, kurtosis and skewness of the minimum directional curvature as the minimum directional curvature feature, which considers the influence of image distortion on local curvature; it extracts the mean, variance, kurtosis and skewness of the structural similarity difference of the image block neighborhood as the neighborhood structural similarity difference feature, which considers the influence of image distortion on the structural similarity difference between an image block and its neighborhood blocks; finally, it extracts the uniform local binary pattern histogram of the phase image as the histogram feature, which considers the influence of image distortion on the overall texture characteristics of the phase image. The method thus uses the statistics of the gradient neighborhood pixel totals, the minimum directional curvature and the structural similarity difference as local features, and the uniform local binary pattern histogram of the phase image as a global feature; measuring image distortion both globally and locally improves the accuracy of tone mapping image quality evaluation.

Claims (5)

1. A tone mapping image quality evaluation method based on structural similarity difference degree is characterized by comprising the following steps:
(1) inputting a tone mapping color distortion image and converting the tone mapping color distortion image into a tone mapping gray level distortion image D;
(2) constructing a gradient feature F1, comprising the following substeps:
(2.1) processing the tone-mapped grayscale distortion image D obtained in step (1) with a Canny operator to obtain a gradient binarization image G;
(2.2) calculating the total number X(i, j) of neighborhood pixels with value 1 in the N × N neighborhood of each pixel G(i, j);
(2.3) calculating the mean μX and variance σX of the neighborhood pixel totals X(i, j), and constructing the gradient feature F1 = [μX, σX];
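As a rough illustration of step (2), the gradient feature can be sketched in numpy. This is a sketch under assumptions, not the patented implementation: a plain Sobel-magnitude binarization stands in for the Canny operator of step (2.1), and the neighborhood size n and threshold are hypothetical choices.

```python
import numpy as np

def gradient_feature(gray, n=7, thresh=0.2):
    """F1 sketch: edge-density statistics over an n x n neighborhood.
    A Sobel-magnitude binarization stands in for the Canny operator;
    n and thresh are hypothetical parameters."""
    f = np.asarray(gray, dtype=float)
    h, w = f.shape
    kx = np.array([[-1.0, 0.0, 1.0], [-2.0, 0.0, 2.0], [-1.0, 0.0, 1.0]])
    ky = kx.T
    pad = np.pad(f, 1, mode="edge")
    gx = np.zeros((h, w))
    gy = np.zeros((h, w))
    for i in range(3):
        for j in range(3):
            patch = pad[i:i + h, j:j + w]
            gx += kx[i, j] * patch
            gy += ky[i, j] * patch
    mag = np.hypot(gx, gy)
    G = (mag >= thresh * mag.max()).astype(float)  # step (2.1): binary edge map
    # step (2.2): X(i, j) = number of 1-pixels in the n x n neighborhood
    r = n // 2
    padG = np.pad(G, r, mode="constant")
    X = np.zeros((h, w))
    for di in range(n):
        for dj in range(n):
            X += padG[di:di + h, dj:dj + w]
    # step (2.3): F1 = [mean, variance]
    return np.array([X.mean(), X.var()])
```

In practice a library Canny implementation (with the σ = 1.5 and dual thresholds of claim 2) would replace the Sobel stand-in.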
(3) constructing a minimum directional curvature feature F2, comprising the following substeps:
(3.1) constructing first derivative filters h0, h1, h2, h3, h4 and h5 at the angles 0, π/6, π/3, π/2, 2π/3 and 5π/6, and convolving each filter with the tone-mapped grayscale distortion image D obtained in step (1) to obtain the corresponding first derivatives d0(i, j), d1(i, j), d2(i, j), d3(i, j), d4(i, j) and d5(i, j);
(3.2) constructing second derivative filters g0, g1, g2, g3, g4 and g5 at the same angles, and convolving each filter with D to obtain the corresponding second derivatives t0(i, j), t1(i, j), t2(i, j), t3(i, j), t4(i, j) and t5(i, j);
(3.3) calculating the directional curvature Kn(i, j) of the tone-mapped grayscale distortion image D at each of the six angles using the formula:

Kn(i, j) = |tn(i, j)| / (1 + dn(i, j)²)^(3/2)

wherein n = 0 to 5 and |·| denotes the absolute value operation;
(3.4) calculating the minimum directional curvature K(i, j) = min_n Kn(i, j);
(3.5) calculating the mean μK, variance σK, kurtosis βK and skewness γK of the minimum directional curvature K(i, j), and constructing the minimum directional curvature feature F2 = [μK, σK, βK, γK];
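Steps (3.3) to (3.5) can be sketched as follows. For brevity this sketch uses only the horizontal and vertical directions (two of the six angles) and central-difference derivatives in place of the patent's specific filter banks; it is an illustration of the curvature statistic, not the claimed filters.

```python
import numpy as np

def min_curvature_feature(gray):
    """F2 sketch: directional curvature Kn = |tn| / (1 + dn^2)^(3/2),
    computed here only along rows and columns via central differences."""
    f = np.asarray(gray, dtype=float)
    curvatures = []
    for axis in (0, 1):
        dn = np.gradient(f, axis=axis)       # first derivative d_n
        tn = np.gradient(dn, axis=axis)      # second derivative t_n
        curvatures.append(np.abs(tn) / (1.0 + dn ** 2) ** 1.5)
    K = np.minimum.reduce(curvatures)        # step (3.4): pointwise minimum
    mu, var = K.mean(), K.var()
    sd = np.sqrt(var) + 1e-12                # guard against zero variance
    beta = ((K - mu) ** 4).mean() / sd ** 4  # kurtosis
    gamma = ((K - mu) ** 3).mean() / sd ** 3 # skewness
    return np.array([mu, var, beta, gamma])  # F2 = [mu_K, sigma_K, beta_K, gamma_K]
```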
(4) constructing a uniform local binary pattern histogram feature F3, comprising the following substeps:
(4.1) performing a two-dimensional discrete Fourier transform on the tone-mapped grayscale distortion image D obtained in step (1) to obtain Fourier transform coefficients Y(u, v);
(4.2) obtaining the phase image Φ from the Fourier transform coefficients Y(u, v):

Φ(u, v) = arctan(im(Y(u, v)) / re(Y(u, v)))

wherein re(Y(u, v)) is the real part of Y(u, v), im(Y(u, v)) is the imaginary part of Y(u, v), and arctan(·) is the arctangent operation;
(4.3) calculating the uniform local binary pattern feature ULBPB of the phase image Φ:

LBPB(u, v) = Σ (b = 0 to B−1) T[Φb − Φ(u, v)] · 2^b

ULBPB(u, v) = Σ (b = 0 to B−1) T[Φb − Φ(u, v)] when the binary pattern of LBPB(u, v) contains at most two 0/1 transitions (a uniform pattern), and ULBPB(u, v) = B + 1 otherwise;

wherein LBPB(u, v) is the local binary pattern feature of the phase image Φ; Φb ∈ {Φ0, Φ1, ..., ΦB−1} are the B neighborhood points of Φ(u, v), with B = 8 and b = 0 to B−1; T[Φb − Φ(u, v)] = 1 when Φb ≥ Φ(u, v), and T[Φb − Φ(u, v)] = 0 when Φb < Φ(u, v);
(4.4) constructing the uniform local binary pattern histogram feature F3 from the ULBPB obtained in step (4.3):

F3 = hist[ULBPB(u, v)] = [f0, f1, f2, ..., fB+1]

wherein hist[·] is the histogram-taking operation, and fk is the number of elements in the histogram bin with value k, k = 0 to B+1;
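A minimal numpy sketch of step (4), assuming the standard uniform-LBP labeling (uniform patterns labeled by their bit count 0..B, all non-uniform patterns sharing bin B+1, which yields exactly the B+2 histogram bins f0..fB+1); the choice of neighbor ordering is a hypothetical detail.

```python
import numpy as np

def ulbp_phase_histogram(gray, B=8):
    """F3 sketch: uniform LBP histogram of the Fourier phase image.
    Uniform patterns (at most two 0/1 transitions) are labeled by their
    bit count (0..B); all others share bin B+1, giving B+2 bins."""
    Y = np.fft.fft2(np.asarray(gray, dtype=float))
    phi = np.arctan2(Y.imag, Y.real)          # step (4.2): phase image
    h, w = phi.shape
    pad = np.pad(phi, 1, mode="edge")
    center = pad[1:-1, 1:-1]
    # 8 neighbors, clockwise from top-left (ordering is an assumption)
    offs = [(-1, -1), (-1, 0), (-1, 1), (0, 1),
            (1, 1), (1, 0), (1, -1), (0, -1)]
    bits = np.stack(
        [(pad[1 + di:1 + di + h, 1 + dj:1 + dj + w] >= center) for di, dj in offs],
        axis=-1,
    ).astype(int)                             # T[phi_b - phi(u, v)]
    ring = np.concatenate([bits, bits[..., :1]], axis=-1)
    transitions = np.abs(np.diff(ring, axis=-1)).sum(axis=-1)
    labels = np.where(transitions <= 2, bits.sum(axis=-1), B + 1)
    return np.bincount(labels.ravel(), minlength=B + 2)  # F3 = [f0..f_{B+1}]
```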
(5) constructing a neighborhood structural similarity difference feature F4, comprising the following substeps:
(5.1) dividing the tone-mapped grayscale distortion image D obtained in step (1) into a matrix of image blocks A(ε, η) of size W × W;
(5.2) sequentially calculating the structural similarity (SSIM) between the image block A and its upper, upper-right, right, lower-right, lower, lower-left, left and upper-left neighborhood image blocks to obtain the neighborhood structural similarity values S1(ε, η), S2(ε, η), S3(ε, η), S4(ε, η), S5(ε, η), S6(ε, η), S7(ε, η) and S8(ε, η), and averaging them to obtain the neighborhood structural similarity mean SA(ε, η);
(5.3) sequentially comparing each neighborhood structural similarity value obtained in step (5.2) with the mean SA(ε, η) to obtain the corresponding neighborhood structural similarity contrast values L1(ε, η), L2(ε, η), L3(ε, η), L4(ε, η), L5(ε, η), L6(ε, η), L7(ε, η) and L8(ε, η): when a neighborhood structural similarity value is greater than or equal to the mean, its contrast value is 1; otherwise it is 0;
(5.4) forming a binary sequence from the contrast values obtained in step (5.3) in the order L1(ε, η) to L8(ε, η), and converting it into a decimal integer serving as the neighborhood structural similarity difference Q(ε, η) of the image block A;
(5.5) calculating the mean μQ, variance σQ, kurtosis βQ and skewness γQ of the neighborhood structural similarity difference Q(ε, η), and constructing the neighborhood structural similarity difference feature F4 = [μQ, σQ, βQ, γQ];
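Step (5) can be sketched as follows. The single-window SSIM form, its constants (which assume an 8-bit dynamic range), and the mapping of L1..L8 to specific neighbor positions are assumptions made for illustration; border blocks are simply skipped.

```python
import numpy as np

def block_ssim(a, b, C1=6.5025, C2=58.5225):
    """Single-window SSIM between two equal-size blocks
    (constants assume an 8-bit dynamic range)."""
    ma, mb = a.mean(), b.mean()
    va, vb = a.var(), b.var()
    cov = ((a - ma) * (b - mb)).mean()
    return ((2 * ma * mb + C1) * (2 * cov + C2)) / \
           ((ma ** 2 + mb ** 2 + C1) * (va + vb + C2))

def neighborhood_ssim_feature(gray, W=8):
    """F4 sketch: per-block 8-neighbor SSIM comparison code Q,
    steps (5.2)-(5.5); the L1..L8 neighbor order is an assumption."""
    f = np.asarray(gray, dtype=float)
    rows, cols = f.shape[0] // W, f.shape[1] // W
    blocks = f[:rows * W, :cols * W].reshape(rows, W, cols, W).swapaxes(1, 2)
    # clockwise from the upper neighbor
    offs = [(-1, 0), (-1, 1), (0, 1), (1, 1),
            (1, 0), (1, -1), (0, -1), (-1, -1)]
    Q = []
    for r in range(1, rows - 1):
        for c in range(1, cols - 1):
            s = [block_ssim(blocks[r, c], blocks[r + dr, c + dc])
                 for dr, dc in offs]
            sa = sum(s) / len(s)                       # mean SA
            bits = "".join("1" if v >= sa else "0" for v in s)
            Q.append(int(bits, 2))                     # binary code -> decimal
    Q = np.asarray(Q, dtype=float)
    mu, var = Q.mean(), Q.var()
    sd = np.sqrt(var) + 1e-12
    beta = ((Q - mu) ** 4).mean() / sd ** 4            # kurtosis
    gamma = ((Q - mu) ** 3).mean() / sd ** 3           # skewness
    return np.array([mu, var, beta, gamma])            # F4
```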
(6) constructing the image quality evaluation feature F = [F1, F2, F3, F4] from the gradient feature F1 obtained in step (2), the minimum directional curvature feature F2 obtained in step (3), the uniform local binary pattern histogram feature F3 obtained in step (4), and the neighborhood structural similarity difference feature F4 obtained in step (5);
(7) sending the image quality evaluation feature F obtained in step (6), together with the corresponding mean subjective opinion scores, to a support vector regression machine for training;
(8) extracting the image quality evaluation feature F of the image to be evaluated according to steps (1) to (6), and inputting it into the support vector regression machine trained in step (7) to obtain the image quality evaluation result.
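The training and prediction loop of steps (6) to (8) might look like the sketch below; scikit-learn's SVR with an RBF kernel stands in for the LIBSVM toolkit named in the description, and the feature vectors and opinion scores here are synthetic placeholders, not real subjective data.

```python
import numpy as np
from sklearn.svm import SVR  # stand-in for the LIBSVM toolkit

# Hypothetical data: 50 training images, each with a fused feature
# vector F = [F1, F2, F3, F4] (length 20 here is arbitrary) and a
# placeholder mean opinion score (MOS).
rng = np.random.RandomState(0)
F_train = rng.rand(50, 20)
mos_train = F_train[:, 0] * 5.0            # synthetic scores in [0, 5]

# Step (7): train an RBF-kernel support vector regressor (claim 5).
model = SVR(kernel="rbf", C=1.0, epsilon=0.1)
model.fit(F_train, mos_train)

# Step (8): predict objective quality for new feature vectors.
F_test = rng.rand(3, 20)
quality = model.predict(F_test)
```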
2. The method according to claim 1, wherein the standard deviation of the gaussian filter in the Canny operator in step (2) is 1.5, and the dual thresholds are 0.04 and 0.1, respectively.
3. The tone-mapped image quality evaluation method based on structural similarity difference according to claim 1, wherein the first derivative filters at the angles 0, π/6, π/3, π/2, 2π/3 and 5π/6 in step (3.1) are given by fixed coefficient matrices (presented as formula images in the original document).
4. The tone-mapped image quality evaluation method based on structural similarity difference according to claim 1, wherein the second derivative filters at the angles 0, π/6, π/3, π/2, 2π/3 and 5π/6 in step (3.2) are given by fixed coefficient matrices (presented as formula images in the original document).
5. the method for evaluating the quality of a tone-mapped image based on the difference degree of structural similarity according to claim 1, wherein the kernel function of the support vector regression is a radial basis function.
CN201911349334.9A 2019-12-24 2019-12-24 Tone mapping image quality evaluation method based on structural similarity difference Active CN110996096B (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN201911349334.9A CN110996096B (en) 2019-12-24 2019-12-24 Tone mapping image quality evaluation method based on structural similarity difference


Publications (2)

Publication Number Publication Date
CN110996096A CN110996096A (en) 2020-04-10
CN110996096B true CN110996096B (en) 2021-05-25

Family

ID=70076291


Families Citing this family (4)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN112085668B (en) * 2020-08-14 2023-10-10 深圳大学 Image tone mapping method based on region self-adaptive self-supervision learning
CN113014918B (en) * 2021-03-03 2022-09-02 重庆理工大学 Virtual viewpoint image quality evaluation method based on skewness and structural features
CN116703739A (en) * 2022-02-25 2023-09-05 索尼集团公司 Image enhancement method and device
CN114842000A (en) * 2022-07-01 2022-08-02 杭州同花顺数据开发有限公司 Endoscope image quality evaluation method and system

Citations (4)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN106683079A (en) * 2016-12-14 2017-05-17 浙江科技学院 No-reference image objective quality evaluation method based on structural distortion
KR101846743B1 (en) * 2016-11-28 2018-04-09 연세대학교 산학협력단 Objective quality assessment method and apparatus for tone mapped images
CN109003265A (en) * 2018-07-09 2018-12-14 嘉兴学院 A kind of non-reference picture assessment method for encoding quality based on Bayes's compressed sensing
CN110046673A (en) * 2019-04-25 2019-07-23 上海大学 No reference tone mapping graph image quality evaluation method based on multi-feature fusion




Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination
CB03 Change of inventor or designer information

Inventor after: Xu Qiaochu

Inventor after: Wang Bin

Inventor after: Chen Shucong

Inventor after: Jiang Feilong

Inventor after: Zhu Haibin

Inventor after: Mao Linghang

Inventor after: Zhang Ao

Inventor after: Li Xinglong

Inventor before: Wang Bin

Inventor before: Chen Shucong

Inventor before: Jiang Feilong

Inventor before: Zhu Haibin

Inventor before: Mao Linghang

Inventor before: Xu Qiaochu

Inventor before: Zhang Ao

Inventor before: Li Xinglong

GR01 Patent grant