CN112184672A - No-reference image quality evaluation method and system - Google Patents
No-reference image quality evaluation method and system
- Publication number: CN112184672A
- Application number: CN202011059158.8A
- Authority
- CN
- China
- Prior art keywords: image, training, test, model, quality evaluation
- Prior art date
- Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
- Pending
Classifications
- G06T7/0002—Inspection of images, e.g. flaw detection
- G06F18/214—Generating training patterns; Bootstrap methods, e.g. bagging or boosting
- G06F18/253—Fusion techniques of extracted features
- G06T2207/10004—Still image; Photographic image
- G06T2207/20081—Training; Learning
- G06T2207/30168—Image quality inspection
Abstract
The invention discloses a no-reference image quality evaluation method and system. The method comprises the following steps: determining features of a training image; fitting the features of the training image with a multivariate Gaussian model to obtain a training model; determining features of a test image; fitting the features of the test image with a multivariate Gaussian model to obtain a test model; and comparing the training model with the test model to determine the quality evaluation of the test image and to measure the accuracy of the quality evaluation. The system comprises a training-image feature extraction unit, a training-model fitting unit, a test-image feature extraction unit, a test-model fitting unit, and a quality evaluation unit. The method and system have the advantages of a simple process, accurate evaluation results, and good generalization performance.
Description
Technical Field
The invention relates to the technical field of image quality evaluation, and in particular to a no-reference image quality evaluation method and system.
Background
With the rapid development of internet, digital media, and communication technologies, digital images, as an important information source perceived by the human eye, play an important role in the exchange of information. In addition, the popularity of electronic devices such as smartphones and tablet computers provides great convenience for image acquisition and transmission. However, during acquisition, compression, processing, transmission, and display, images are easily distorted to varying degrees. Distorted images not only impair the viewing experience but may also cause loss of information. To measure whether image quality meets the requirements of a specific application, an effective image quality evaluation mechanism needs to be established.
Image quality evaluation is divided into subjective and objective evaluation. In subjective evaluation, different observers score the same picture subjectively; after removing extreme values, the average is taken as the subjective quality score of the image. Objective evaluation is divided into three modes: full-reference, reduced-reference, and no-reference evaluation. No-reference image quality evaluation does not need the original lossless image of the distorted image as a reference, and thus has higher practical value than full-reference and reduced-reference methods. In recent years, the field of no-reference image quality evaluation has developed rapidly, and several excellent methods have appeared: [1] a supervised method (BRISQUE) that evaluates the distortion degree of a natural image by computing natural-image statistical features and then fitting a model with support vector regression; [2] an unsupervised method carried out in two steps: the first step extracts features on image blocks of undistorted images and fits them with a multivariate Gaussian model; the second step takes a distorted image as input, extracts the same block-wise features as in the first step, computes a covariance matrix, compares it with the coefficients of the multivariate Gaussian model from the first step, and finally takes the average of the block scores as the quality score of the image.
Existing no-reference image quality evaluation has the following drawbacks. Supervised no-reference algorithms require distorted images with subjective scores for model training; the accuracy of the quality evaluation drops when a model trained on a single database is tested on other databases, so these methods generalize poorly and their application scenarios are limited. In contrast, unsupervised no-reference image quality evaluation trains the model on undistorted images, whose training data are easy to obtain; its accuracy remains high when evaluating quality across different image quality evaluation databases, and the model generalizes well.
However, to obtain better performance, existing unsupervised no-reference image quality evaluation methods must extract features of very high dimensionality, which makes the model complex and the algorithmic complexity high.
Disclosure of Invention
Aiming at the problems in the prior art, the invention provides a no-reference image quality evaluation method and system, which have the advantages of a simple process, accurate evaluation results, and good generalization performance.
In order to solve the technical problems, the invention is realized by the following technical scheme:
the invention provides a no-reference image quality evaluation method, which comprises the following steps:
S11: determining features of a training image;
S12: fitting the features of the training image in S11 with a multivariate Gaussian model to obtain a training model;
S13: determining features of the test image;
S14: fitting the features of the test image in S13 with a multivariate Gaussian model to obtain a test model;
S15: comparing the training model with the test model to determine the quality evaluation of the test image, and measuring the accuracy of the quality evaluation.
Preferably, S11 includes:
S111: extracting natural-image statistical features of the training image;
S112: extracting phase consistency features of the training image;
S113: extracting gradient features of the training image;
S114: extracting Karhunen-Loève transform features of the training image.
Further, S13 includes:
S131: extracting natural-image statistical features of the test image;
S132: extracting phase consistency features of the test image;
S133: extracting gradient features of the test image;
S134: extracting Karhunen-Loève transform features of the test image.
Preferably, before S111, the method further comprises:
S110: carrying out multi-scale blocking on the training image.
Further, before S131, the method further comprises:
S130: carrying out multi-scale blocking on the test image.
Preferably, the natural-image statistical feature extraction method in S111 and S131 is as follows.

A two-dimensional map of natural-image statistical coefficients I_N, of the same size as the input image, is calculated, where the input image is first converted to gray scale:

I_N(x, y) = (I(x, y) − μ(x, y)) / (σ(x, y) + 1)

where I denotes the input image, x and y denote the image coordinates, μ denotes the local image mean, and σ denotes the local image standard deviation, calculated as:

μ(x, y) = Σ_k Σ_l w(k, l) I(x + k, y + l)
σ(x, y) = sqrt( Σ_k Σ_l w(k, l) (I(x + k, y + l) − μ(x, y))² )

where w is a two-dimensional circularly symmetric Gaussian weighting function sampled out to three standard deviations and normalized into a 7×7 filter, with which the image is filtered to obtain the mean μ. The obtained I_N is fitted with a zero-mean generalized Gaussian distribution:

f(x; α, σ²) = (α / (2 β Γ(1/α))) exp(−(|x| / β)^α)

where

β = σ sqrt( Γ(1/α) / Γ(3/α) ),  Γ(a) = ∫₀^∞ t^(a−1) e^(−t) dt, a > 0

in which α controls the shape (degree of spread) of the function and σ is a scale parameter; this yields the two-dimensional natural-image statistical features.

Then the products of I_N(x, y) with its neighbors in the four directions (horizontal, vertical, main diagonal and secondary diagonal) are computed: I(i, j)I(i, j+1), I(i, j)I(i+1, j), I(i, j)I(i+1, j+1) and I(i, j)I(i+1, j−1). After reshaping each of the four product matrices to one dimension, each is fitted with an asymmetric generalized Gaussian distribution (AGGD):

f(x; γ, β_l, β_r) = (γ / ((β_l + β_r) Γ(1/γ))) exp(−(−x / β_l)^γ), x < 0
f(x; γ, β_l, β_r) = (γ / ((β_l + β_r) Γ(1/γ))) exp(−(x / β_r)^γ), x ≥ 0

where β_l and β_r describe the left and right spreads of the asymmetric Gaussian distribution, γ is a shape parameter, and η = (β_r − β_l) Γ(2/γ) / Γ(1/γ) describes the difference between the left and right distributions. This yields sixteen fitted AGGD features and, in total, the eighteen-dimensional natural-image statistical feature F_N.
Preferably, the phase consistency feature extraction method in S112 and S132 is as follows.

The input image is first transformed from the RGB color space to the O_1, O_2 and O_3 opponent color space by a linear transform.

The phase congruency of two-dimensional signals is generalized from the one-dimensional case. The image is filtered at orientation θ_j and scale n to obtain the filter response vector [e_{n,θj}(x), o_{n,θj}(x)] at point x, from which the magnitude of the response vector at orientation θ_j and scale n is obtained:

A_{n,θj}(x) = sqrt( e_{n,θj}(x)² + o_{n,θj}(x)² )

and the local energy function:

E_{θj}(x) = sqrt( F_{θj}(x)² + H_{θj}(x)² )

where F_{θj}(x) = Σ_n e_{n,θj}(x) and H_{θj}(x) = Σ_n o_{n,θj}(x).

The two-dimensional phase congruency at point x is defined as:

PC(x) = Σ_j E_{θj}(x) / (ε + Σ_j Σ_n A_{n,θj}(x))

where ε is a small constant that avoids division by zero. The phase congruency result calculated on each channel is fitted with a Weibull distribution:

f(x; λ, k) = (k / λ) (x / λ)^(k−1) e^(−(x/λ)^k), x ≥ 0

where x is a random variable, λ is a scale parameter, and k is a shape parameter. Each channel of data is characterized by the two values λ and k, giving six features F_PC over the O_1, O_2 and O_3 color space.
Preferably, the gradient feature extraction method in S113 and S133 is as follows.

Gradient maps of the input image in the horizontal and vertical directions are calculated:

G_h = I * D_h
G_v = I * D_v

where D_h = [1, −1] and D_v = [1, −1]^T are the horizontal and vertical difference operators. G_h and G_v are each fitted with a generalized Gaussian distribution model to obtain the four-dimensional feature F_G.
Preferably, the Karhunen-Loève transform feature extraction method in S114 and S134 is as follows.

S71: calculating the transform kernel:

The coefficient map I_N obtained in the natural-image statistical feature extraction is cropped to the largest resolution whose width and height are both divisible by 2, and then divided into m 2×2 blocks; subtracting the mean of each block yields an m×4 matrix. The standard deviation of each row of the m×4 matrix is calculated, and the covariance matrix is computed from the rows whose standard deviation is greater than zero, giving the Karhunen-Loève transform kernel.

S72: extracting image features with the transform kernel:

The product of the Karhunen-Loève transform kernel and the image blocks is computed:

P_{m×4} = I_{m×4} · k_{4×4}

where I_{m×4} is the input image I divided into m 2×2 blocks. Each of the four dimensions of the coefficients in P is fitted with a generalized Gaussian distribution, giving the eight-dimensional feature F_KLT.
Preferably, S15 further includes:
S151: calculating the difference between the mean vector of the training model and the mean vector of the test model, and the difference between the covariance matrix of the training model and the covariance matrix of the test model, to represent the quality evaluation score of the test image;
S152: measuring the accuracy of the quality evaluation with the Spearman rank correlation coefficient.
Preferably, when the image comprises a plurality of image blocks, the difference in S151 is calculated per image block, and the differences over the image blocks are averaged to represent the quality evaluation score of the test image.
The invention also provides a no-reference image quality evaluation system for implementing the above no-reference image quality evaluation method, comprising: a training-image feature extraction unit, a training-model fitting unit, a test-image feature extraction unit, a test-model fitting unit, and a quality evaluation unit, wherein:
the feature extraction unit of the training image is used for determining the features of the training image;
the training model fitting unit is used for fitting the characteristics of the training images in the training image characteristic extraction unit by adopting a multivariate Gaussian model to obtain a training model;
the feature extraction unit of the test image is used for determining the features of the test image;
the test model fitting unit is used for fitting the characteristics of the test image in the characteristic extraction unit of the test image by adopting a multivariate Gaussian model to obtain a test model;
the quality evaluation unit is used for comparing the training model and the test model, determining the quality evaluation of the test image and measuring the accuracy of the quality evaluation.
Compared with the prior art, the invention has the following advantages:
(1) in the no-reference image quality evaluation method and system, a multivariate Gaussian model is used to fit the features and predict image quality, which facilitates accurate estimation of the quality of images with distortions of different types and degrees, and even of images with mixed distortions; experimental results show that the model generalizes well, and the process is simple;
(2) in the no-reference image quality evaluation method and system, the natural-image statistical features describe natural attributes of the image such as locally normalized contrast; the phase consistency features and gradient features describe the structural information of the image; and the Karhunen-Loève transform features, learned in a data-driven manner from undistorted images, capture the sparse characteristics of the image, so that the no-reference scores are strongly consistent with human subjective scores and the evaluation results are accurate;
(3) by extracting natural-image statistical features, phase consistency features, gradient features, and Karhunen-Loève transform features, both the statistical characteristics and the structural information of natural images are fully considered; the multiple features are fused and computed over multiple dimensions, so that the fused features better match the visual characteristics of the human eye;
(4) in the no-reference image quality evaluation method and system, the accuracy of the quality evaluation is measured with the Spearman rank correlation coefficient, giving better generalization performance;
(5) extracting the natural-image statistical features, phase consistency features, gradient features, and Karhunen-Loève transform features yields 72-dimensional features, a moderate dimensionality among evaluation methods of the same type, so the model and algorithm are simple while the accuracy is higher and the generalization performance of the model is better.
Of course, it is not necessary for any product in which the invention is practiced to achieve all of the above-described advantages at the same time.
Drawings
Embodiments of the invention are further described below with reference to the accompanying drawings:
FIG. 1 is a flow chart of a no-reference image quality evaluation method according to an embodiment of the present invention;
FIG. 2 is a flowchart of S11 according to the preferred embodiment of the present invention;
FIG. 3 is a flowchart of S13 according to the preferred embodiment of the present invention;
fig. 4 is a schematic structural diagram of a no-reference image quality evaluation system according to an embodiment of the present invention.
Description of reference numerals: the method comprises the steps of 1-training image feature extraction unit, 2-training model fitting unit, 3-testing image feature extraction unit, 4-testing model fitting unit and 5-quality evaluation unit.
Detailed Description
The following examples are given for the detailed implementation and specific operation of the present invention, but the scope of the present invention is not limited to the following examples.
Fig. 1 is a flowchart illustrating a no-reference image quality evaluation method according to an embodiment of the present invention.
Referring to fig. 1, the no-reference image quality evaluation method of this embodiment includes:
S11: determining features of a training image;
S12: fitting the features of the training image in S11 with a multivariate Gaussian model to obtain a training model;
S13: determining features of the test image;
S14: fitting the features of the test image in S13 with a multivariate Gaussian model to obtain a test model;
S15: comparing the training model with the test model to determine the quality evaluation of the test image and measure the accuracy of the quality evaluation.
In a preferred embodiment, S11 includes:
S111: extracting natural-image statistical features of the training image;
S112: extracting phase consistency features of the training image;
S113: extracting gradient features of the training image;
S114: extracting Karhunen-Loève transform features of the training image, as shown in fig. 2.
Further, S13 includes:
S131: extracting natural-image statistical features of the test image;
S132: extracting phase consistency features of the test image;
S133: extracting gradient features of the test image;
S134: extracting Karhunen-Loève transform features of the test image, as shown in fig. 3.
In a preferred embodiment, before S111 the method further includes:
S110: carrying out multi-scale blocking on the training image.
Further, before S131 the method further includes:
S130: carrying out multi-scale blocking on the test image.
In a preferred embodiment, the natural-image statistical feature extraction method in S111 and S131 is as follows.

A two-dimensional map of natural-image statistical coefficients I_N, of the same size as the input image, is calculated, where the input image is first converted to gray scale:

I_N(x, y) = (I(x, y) − μ(x, y)) / (σ(x, y) + 1)

where I denotes the input image, x and y denote the image coordinates, μ denotes the local image mean, and σ denotes the local image standard deviation, calculated as:

μ(x, y) = Σ_k Σ_l w(k, l) I(x + k, y + l)
σ(x, y) = sqrt( Σ_k Σ_l w(k, l) (I(x + k, y + l) − μ(x, y))² )

where w is a two-dimensional circularly symmetric Gaussian weighting function sampled out to three standard deviations and normalized into a 7×7 filter, with which the image is filtered to obtain the mean μ. The obtained I_N is fitted with a zero-mean generalized Gaussian distribution:

f(x; α, σ²) = (α / (2 β Γ(1/α))) exp(−(|x| / β)^α),  β = σ sqrt( Γ(1/α) / Γ(3/α) )

where Γ(a) = ∫₀^∞ t^(a−1) e^(−t) dt, a > 0; α controls the shape (degree of spread) of the function and σ is a scale parameter. This yields the two-dimensional natural-image statistical features.

Then the products of I_N(x, y) with its neighbors in the four directions (horizontal, vertical, main diagonal and secondary diagonal) are computed: I(i, j)I(i, j+1), I(i, j)I(i+1, j), I(i, j)I(i+1, j+1) and I(i, j)I(i+1, j−1). After reshaping each of the four product matrices to one dimension, each is fitted with an asymmetric generalized Gaussian distribution (AGGD):

f(x; γ, β_l, β_r) = (γ / ((β_l + β_r) Γ(1/γ))) exp(−(−x / β_l)^γ), x < 0
f(x; γ, β_l, β_r) = (γ / ((β_l + β_r) Γ(1/γ))) exp(−(x / β_r)^γ), x ≥ 0

where β_l and β_r describe the left and right spreads of the asymmetric Gaussian distribution, γ is a shape parameter, and η = (β_r − β_l) Γ(2/γ) / Γ(1/γ) describes the difference between the left and right distributions. Four features (γ, β_l, β_r, η) are obtained in each direction, giving sixteen AGGD features and, in total, (2 + 4 × 4) = 18 natural-image statistical features F_N.
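The MSCN normalization and the generalized-Gaussian fit described above can be sketched in Python. This is a non-authoritative illustration: the Gaussian window width, the +1 stabilizing constant in the denominator, and the use of scipy's `gennorm` for the fit are assumptions, since the patent text does not reproduce its exact constants.

```python
import numpy as np
from scipy.ndimage import gaussian_filter
from scipy.stats import gennorm

def mscn_coefficients(img, sigma=7.0 / 6.0, c=1.0):
    """MSCN coefficients I_N = (I - mu) / (sd + C), with mu and sd taken
    under a circularly symmetric Gaussian window (roughly 7x7 here)."""
    img = img.astype(np.float64)
    mu = gaussian_filter(img, sigma, truncate=3.0)
    var = gaussian_filter(img * img, sigma, truncate=3.0) - mu * mu
    sd = np.sqrt(np.maximum(var, 0.0))
    return (img - mu) / (sd + c)

def ggd_shape(coeffs):
    """Fit a zero-mean generalized Gaussian; returns (shape alpha, scale)."""
    alpha, _, scale = gennorm.fit(coeffs.ravel(), floc=0)
    return float(alpha), float(scale)

def neighbor_products(i_n):
    """The four directional products that are each fitted with an AGGD."""
    return (i_n[:, :-1] * i_n[:, 1:],     # horizontal
            i_n[:-1, :] * i_n[1:, :],     # vertical
            i_n[:-1, :-1] * i_n[1:, 1:],  # main diagonal
            i_n[:-1, 1:] * i_n[1:, :-1])  # secondary diagonal

rng = np.random.default_rng(1)
img = rng.normal(128.0, 30.0, size=(64, 64))  # stand-in for a gray image
i_n = mscn_coefficients(img)
```

For a distortion-free natural image the MSCN map is roughly zero-mean and unit-variance, which is why a zero-mean generalized Gaussian is an appropriate fit.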
In a preferred embodiment, the phase consistency feature extraction method in S112 and S132 is as follows.

The input image is first transformed from the RGB color space to the O_1, O_2 and O_3 opponent color space by a linear transform.

The phase congruency of two-dimensional signals is generalized from the one-dimensional case. The image is filtered at orientation θ_j and scale n to obtain the filter response vector [e_{n,θj}(x), o_{n,θj}(x)] at point x, from which the magnitude of the response vector at orientation θ_j and scale n is obtained:

A_{n,θj}(x) = sqrt( e_{n,θj}(x)² + o_{n,θj}(x)² )

and the local energy function:

E_{θj}(x) = sqrt( F_{θj}(x)² + H_{θj}(x)² )

where F_{θj}(x) = Σ_n e_{n,θj}(x) and H_{θj}(x) = Σ_n o_{n,θj}(x).

The two-dimensional phase congruency at point x is defined as:

PC(x) = Σ_j E_{θj}(x) / (ε + Σ_j Σ_n A_{n,θj}(x))

where ε is a small constant that avoids division by zero. The phase congruency result calculated on each channel is fitted with a Weibull distribution:

f(x; λ, k) = (k / λ) (x / λ)^(k−1) e^(−(x/λ)^k), x ≥ 0

where x is a random variable, λ is a scale parameter, and k is a shape parameter. Each channel of data is characterized by the two values λ and k, giving six features F_PC over the O_1, O_2 and O_3 color space.
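Once a phase congruency map has been computed for a channel (e.g. with a log-Gabor filter bank, not reproduced here), the Weibull fit that turns each map into the two features (k, λ) can be sketched as follows. Treating the map as strictly positive data and fixing the location parameter at zero are assumptions made for the sketch:

```python
import numpy as np
from scipy.stats import weibull_min

def weibull_features(pc_map):
    """Fit a Weibull distribution f(x; lambda, k) to a phase-congruency map;
    returns (k, lambda), i.e. shape and scale, as the channel's two features."""
    data = pc_map.ravel()
    data = data[data > 0]  # the Weibull support is x > 0
    k, _, lam = weibull_min.fit(data, floc=0)
    return float(k), float(lam)

rng = np.random.default_rng(2)
pc = 0.2 * rng.weibull(1.5, size=(32, 32))  # stand-in for a real PC map in [0, 1]
k, lam = weibull_features(pc)
```

Repeating this on the O_1, O_2 and O_3 channels yields the six-dimensional feature F_PC.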
In a preferred embodiment, the gradient feature extraction method in S113 and S133 is as follows.

Gradient maps of the input image in the horizontal and vertical directions are calculated:

G_h = I * D_h
G_v = I * D_v

where D_h = [1, −1] and D_v = [1, −1]^T are the horizontal and vertical difference operators. G_h and G_v are each fitted with a generalized Gaussian distribution model to obtain the four-dimensional feature F_G.
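A minimal sketch of the gradient features, again with scipy's `gennorm` standing in for the generalized-Gaussian fit (an assumption, as the patent does not name a fitting routine):

```python
import numpy as np
from scipy.stats import gennorm

def gradient_features(img):
    """G_h = I * D_h and G_v = I * D_v with D_h = [1, -1], D_v = [1, -1]^T;
    each gradient map is fitted with a zero-mean generalized Gaussian,
    giving F_G = (alpha_h, scale_h, alpha_v, scale_v)."""
    img = img.astype(np.float64)
    g_h = img[:, :-1] - img[:, 1:]  # horizontal difference (filter [1, -1])
    g_v = img[:-1, :] - img[1:, :]  # vertical difference (filter [1, -1]^T)
    feats = []
    for g in (g_h, g_v):
        alpha, _, scale = gennorm.fit(g.ravel(), floc=0)
        feats.extend([float(alpha), float(scale)])
    return feats

rng = np.random.default_rng(3)
feats = gradient_features(rng.normal(0.0, 1.0, size=(32, 32)))
```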
In a preferred embodiment, the Karhunen-Loève transform feature extraction method in S114 and S134 is as follows.

S71: calculating the transform kernel:

The coefficient map I_N obtained in the natural-image statistical feature extraction is cropped to the largest resolution whose width and height are both divisible by 2, and then divided into m 2×2 blocks; subtracting the mean of each block yields an m×4 matrix. The standard deviation of each row of the m×4 matrix is calculated, and the covariance matrix is computed from the rows whose standard deviation is greater than zero, giving the Karhunen-Loève transform kernel.

S72: extracting image features with the transform kernel:

The product of the Karhunen-Loève transform kernel and the image blocks is computed as the Karhunen-Loève transform characteristic of the input image:

P_{m×4} = I_{m×4} · k_{4×4}

where I_{m×4} is the input image I divided into m 2×2 blocks. Each of the four dimensions of the coefficients in P is fitted with a generalized Gaussian distribution, giving the eight-dimensional feature F_KLT.
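Steps S71/S72 can be sketched as below. The patent says the covariance matrix "gives" the kernel without spelling out the decomposition; taking the eigenvectors of that covariance matrix as the 4×4 kernel is an assumption (it is the standard KLT construction):

```python
import numpy as np

def _blocks_2x2(i_n):
    """Crop to even width/height and reshape into an (m, 4) matrix of 2x2 blocks."""
    h, w = (i_n.shape[0] // 2) * 2, (i_n.shape[1] // 2) * 2
    return (i_n[:h, :w]
            .reshape(h // 2, 2, w // 2, 2)
            .transpose(0, 2, 1, 3)
            .reshape(-1, 4))

def klt_kernel(i_n):
    """S71: subtract each block's mean, drop zero-variance blocks, and
    eigendecompose the covariance of the remaining rows (assumed step)."""
    blocks = _blocks_2x2(i_n)
    blocks = blocks - blocks.mean(axis=1, keepdims=True)
    blocks = blocks[blocks.std(axis=1) > 0]
    cov = np.cov(blocks, rowvar=False)
    _, vecs = np.linalg.eigh(cov)  # columns form the 4x4 transform kernel
    return vecs

def klt_coefficients(i_n, kernel):
    """S72: P_{m x 4} = I_{m x 4} . k_{4 x 4}."""
    return _blocks_2x2(i_n) @ kernel

rng = np.random.default_rng(4)
i_n = rng.normal(size=(33, 33))  # odd size on purpose: it gets cropped to 32x32
k = klt_kernel(i_n)
p = klt_coefficients(i_n, k)
```

Each of the four columns of `p` would then be fitted with a generalized Gaussian to produce F_KLT.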
In a preferred embodiment, S15 further includes:
s151: calculating a difference value between the mean vector of the training model and the mean vector of the test model, and a difference value between the covariance matrix of the training model and the covariance matrix of the test model, and representing the quality evaluation score of the test image;
S152: measuring the accuracy of the quality evaluation with the Spearman rank correlation coefficient.
In a preferred embodiment, when the image comprises a plurality of image blocks, the difference in S151 is the difference of each image block, and the differences of the plurality of image blocks are averaged to represent the quality evaluation score of the test image.
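Steps S151/S152 compare the fitted multivariate Gaussians. The patent describes the score as differences between the mean vectors and covariance matrices without giving a closed form, so the sketch below uses the NIQE-style distance sqrt((mu1 - mu2)^T ((C1 + C2)/2)^(-1) (mu1 - mu2)) as an assumed concrete instance, averaged over blocks:

```python
import numpy as np

def fit_mvg(feats):
    """Multivariate Gaussian fit: mean vector and covariance matrix."""
    return feats.mean(axis=0), np.cov(feats, rowvar=False)

def block_score(mu_t, cov_t, mu_b, cov_b):
    """Distance between the training MVG and one test block's MVG
    (NIQE-style form, used here as an assumption)."""
    d = mu_t - mu_b
    return float(np.sqrt(d @ np.linalg.pinv((cov_t + cov_b) / 2.0) @ d))

def image_score(train_model, block_feature_sets):
    """Average the per-block distances to get the image's quality score."""
    mu_t, cov_t = train_model
    return float(np.mean([block_score(mu_t, cov_t, *fit_mvg(f))
                          for f in block_feature_sets]))

# toy demo: pristine-like blocks vs. blocks whose statistics have drifted
rng = np.random.default_rng(5)
train = fit_mvg(rng.normal(0.0, 1.0, size=(1000, 6)))
blocks_clean = [rng.normal(0.0, 1.0, size=(200, 6)) for _ in range(4)]
blocks_dist = [rng.normal(1.0, 1.0, size=(200, 6)) for _ in range(4)]
s_clean = image_score(train, blocks_clean)
s_dist = image_score(train, blocks_dist)
```

A larger score means the test statistics lie farther from the undistorted training model, i.e. lower predicted quality.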
In a preferred embodiment, when the training model and the test model are established, the multi-scale characteristic of the human visual system is taken into account: the image is downsampled on the basis of image blocks, the side length of a downsampled image block being reduced to half of the original. The feature extraction process is carried out on the image at the two scales, giving a 72-dimensional feature vector (18 + 6 + 4 + 8 = 36 features per scale). For training, undistorted natural images are used; high-contrast image blocks in them are taken to calculate the Karhunen-Loève transform kernel, and features are extracted to fit the multivariate Gaussian model. Here, the high-contrast image blocks are those in the top 25% after the image-block means are sorted in descending order. For testing, all image blocks of the test image are used for feature extraction.
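The two-scale blocking and the top-25% high-contrast selection can be sketched as follows (2×2 average pooling stands in for whatever resampling filter the patent intends, which the text does not specify):

```python
import numpy as np

def downsample(img):
    """Halve each side via 2x2 average pooling (assumed resampling filter)."""
    h, w = (img.shape[0] // 2) * 2, (img.shape[1] // 2) * 2
    return img[:h, :w].reshape(h // 2, 2, w // 2, 2).mean(axis=(1, 3))

def select_high_contrast(blocks, keep=0.25):
    """Sort block means in descending order and keep the top 25% of blocks."""
    means = np.array([b.mean() for b in blocks])
    order = np.argsort(means)[::-1]
    n = max(1, int(len(blocks) * keep))
    return [blocks[i] for i in order[:n]]

img = np.arange(64.0 * 64.0).reshape(64, 64)
small = downsample(img)  # features are then extracted at both scales
blocks = [np.full((8, 8), float(v)) for v in range(8)]
selected = select_high_contrast(blocks)
```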
To illustrate the superiority of the evaluation method of the present invention, the performance of the proposed method was compared on different databases. Six databases commonly used for image quality evaluation were used in the experiments: LIVE [3], CSIQ [4], TID2013 [5], MICT [6], LIVE-Challenge [7] (LIVE-C for short), and CID2013 [8]. The distortion types differ between databases, as shown in Table 1, where distortion type A indicates that the images contain at least two kinds of mixed distortion; for a more objective comparison, common distortion types were selected. The LIVE database is widely used in image quality evaluation, and all of its distortion types were selected. In CSIQ, four distortion types (JPEG, JPEG2000, white noise, and Gaussian blur) were selected for testing. TID2013 has 24 distortion types in total, and the same four types as in CSIQ were selected. MICT has only the JPEG and JPEG2000 distortion types, and all of its content was selected. The images in LIVE-C and CID2013 are all of mixed distortion type, so all of them were selected.
Table 2 reports the absolute values of the Spearman rank correlation coefficients obtained with the different features, together with a weighted average in which the weights are assigned according to the number of test images. The algorithm obtains good results on the different databases, indicating that its generalization performance is good and superior to most existing mainstream algorithms.
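The per-database SRCC and the image-count-weighted average can be computed as below; the toy scores are illustrative only, not the experimental data:

```python
import numpy as np
from scipy.stats import spearmanr

def weighted_srcc(scores_per_db, mos_per_db):
    """|SRCC| per database, plus an average weighted by test-image count."""
    srccs, weights = [], []
    for pred, mos in zip(scores_per_db, mos_per_db):
        rho, _ = spearmanr(pred, mos)
        srccs.append(abs(float(rho)))
        weights.append(len(pred))
    return srccs, float(np.average(srccs, weights=weights))

# toy: two "databases" where predicted quality tracks subjective scores
pred1, mos1 = [1, 2, 3, 4, 5], [1.1, 1.9, 3.2, 3.9, 5.0]  # rank-consistent
pred2, mos2 = [1, 2, 3, 4], [2, 1, 3, 4]                  # one rank swap
srccs, avg = weighted_srcc([pred1, pred2], [mos1, mos2])
```

A weighted average rewards consistency on the larger databases, which is why it is used as the overall accuracy figure.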
TABLE 1
| Database name | Distortion types | Reference images | Distorted images |
| ------------- | ---------------- | ---------------- | ---------------- |
| LIVE          | 5                | 29               | 779              |
| CSIQ          | 6                | 30               | 886              |
| TID2013       | 24               | 25               | 3000             |
| MICT          | 2                | 14               | 168              |
| LIVE-C        | A                | 0                | 1169             |
| CID2013       | A                | 0                | 474              |
TABLE 2
| Features (Spearman rank correlation coefficient) | LIVE | TID2013 | CSIQ | MICT | CID2013 | LIVE-C | Weighted average |
| --- | --- | --- | --- | --- | --- | --- | --- |
| Natural-image statistical features | 0.9076 | 0.8314 | 0.8892 | 0.8687 | 0.6737 | 0.4597 | 0.7206 |
| Phase consistency features | 0.6844 | 0.5362 | 0.5363 | 0.5822 | 0.5981 | 0.2776 | 0.4956 |
| Gradient features | 0.5682 | 0.4641 | 0.6073 | 0.2765 | 0.6797 | 0.4372 | 0.5200 |
| Karhunen-Loève transform features | 0.8516 | 0.7864 | 0.8532 | 0.62 | 0.6574 | 0.4426 | 0.6780 |
| Proposed method | 0.9162 | 0.8768 | 0.9043 | 0.8753 | 0.7659 | 0.5021 | 0.7566 |
Fig. 4 is a schematic structural diagram of a non-reference image quality evaluation system according to an embodiment of the present invention.
Referring to fig. 4, the no-reference image quality evaluation system of this embodiment, used to implement the no-reference image quality evaluation method of the above embodiment, includes: a training-image feature extraction unit 1, a training-model fitting unit 2, a test-image feature extraction unit 3, a test-model fitting unit 4, and a quality evaluation unit 5, wherein:
the feature extraction unit 1 of the training image is used for determining the features of the training image;
the training model fitting unit 2 is used for fitting the characteristics of the training images in the training image characteristic extraction unit 1 by adopting a multivariate Gaussian model to obtain a training model;
the feature extraction unit 3 of the test image is used for determining the features of the test image;
the test model fitting unit 4 is used for fitting the characteristics of the test image in the characteristic extraction unit of the test image by adopting a multivariate Gaussian model to obtain a test model;
the quality evaluation unit 5 is used for comparing the training model and the test model, determining the quality evaluation of the test image, and measuring the accuracy of the quality evaluation.
The embodiments were chosen and described in order to best explain the principles of the invention and the practical application, and not to limit the invention. Any modifications and variations within the scope of the description, which may occur to those skilled in the art, are intended to be within the scope of the invention.
Claims (10)
1. A no-reference image quality evaluation method is characterized by comprising the following steps:
S11: determining features of a training image;
S12: fitting the features of the training image in S11 with a multivariate Gaussian model to obtain a training model;
S13: determining features of the test image;
S14: fitting the features of the test image in S13 with a multivariate Gaussian model to obtain a test model;
S15: comparing the training model with the test model to determine the quality evaluation of the test image, and measuring the accuracy of the quality evaluation.
2. The no-reference image quality evaluation method according to claim 1, wherein said S11 comprises:
S111: extracting natural image statistical features of the training image;
S112: extracting phase consistency features of the training image;
S113: extracting gradient features of the training image;
S114: extracting Karhunen-Loève transform features of the training image;
further, said S13 comprises:
S131: extracting natural image statistical features of the test image;
S132: extracting phase consistency features of the test image;
S133: extracting gradient features of the test image;
S134: extracting Karhunen-Loève transform features of the test image.
3. The no-reference image quality evaluation method according to claim 2, wherein S111 is preceded by:
S110: performing multi-scale blocking of the training image;
further, S131 is preceded by:
S130: performing multi-scale blocking of the test image.
4. The no-reference image quality evaluation method according to claim 2, wherein the natural image statistical feature extraction method in S111 and S131 is:
calculating a two-dimensional natural image statistical coefficient map I_N of the same size as the input image, the input image being converted to gray space for the calculation:
I_N(x, y) = ( I(x, y) − μ(x, y) ) / ( σ(x, y) + 1 )
wherein I represents the input image, (x, y) are the image coordinates, μ represents the local image mean and σ the local image standard deviation, calculated as follows:
μ(x, y) = Σ_k Σ_l w(k, l) I(x + k, y + l)
σ(x, y) = sqrt( Σ_k Σ_l w(k, l) ( I(x + k, y + l) − μ(x, y) )² )
wherein w is a two-dimensional circularly symmetric Gaussian weighting function sampled out to 3 standard deviations and normalized into a 7 × 7 filter, with which the image is filtered to obtain the mean μ; a zero-mean generalized Gaussian function is then used to fit I_N:
f(x; α, σ²) = α / ( 2 β Γ(1/α) ) · exp( −( |x| / β )^α ), with β = σ · sqrt( Γ(1/α) / Γ(3/α) )
wherein Γ is the gamma function, Γ(a) = ∫₀^∞ t^(a−1) e^(−t) dt, a > 0; α controls the spread of the function and σ is the scale parameter; this yields the two-dimensional natural image statistical features;
then the products of I_N(x, y) with its neighboring coefficients in the four adjacent directions (horizontal, vertical, main diagonal and secondary diagonal) are calculated: I(i, j)I(i, j+1), I(i, j)I(i+1, j), I(i, j)I(i+1, j+1) and I(i, j)I(i+1, j−1); after reshaping each of the four product matrices to one dimension, each is fitted with an asymmetric generalized Gaussian distribution function:
f(x; γ, β_l, β_r) = γ / ( (β_l + β_r) Γ(1/γ) ) · exp( −( −x / β_l )^γ ), x < 0
f(x; γ, β_l, β_r) = γ / ( (β_l + β_r) Γ(1/γ) ) · exp( −( x / β_r )^γ ), x ≥ 0
η = ( β_r − β_l ) · Γ(2/γ) / Γ(1/γ)
wherein β_l and β_r describe the left and right sides of the asymmetric Gaussian distribution respectively, γ is a shape parameter, and η describes the difference between the left and right distributions; this yields the sixteen-dimensional asymmetric generalized Gaussian fitting features and, in total, the eighteen-dimensional natural image statistical feature F_N.
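The local normalization and the generalized Gaussian fit of claim 4 can be sketched in Python as below. The 7 × 7 window, the valid-region boundary cropping, and the grid-searched moment-matching shape estimate are implementation choices of this sketch, not specified by the patent:

```python
import numpy as np
from math import gamma
from numpy.lib.stride_tricks import sliding_window_view

def gaussian_window(size=7, sigma=7 / 6):
    """2D circularly symmetric Gaussian weights normalized to sum 1."""
    ax = np.arange(size) - size // 2
    g = np.exp(-(ax ** 2) / (2 * sigma ** 2))
    w = np.outer(g, g)
    return w / w.sum()

def mscn(image, size=7, sigma=7 / 6):
    """Locally normalized coefficients I_N = (I - mu) / (std + 1).
    Boundaries are cropped here for brevity; the patent keeps the
    coefficient map at the full input size."""
    image = np.asarray(image, dtype=float)
    w = gaussian_window(size, sigma)
    patches = sliding_window_view(image, w.shape)
    mu = (patches * w).sum(axis=(-1, -2))
    var = (patches ** 2 * w).sum(axis=(-1, -2)) - mu ** 2
    std = np.sqrt(np.clip(var, 0.0, None))
    half = size // 2
    inner = image[half:-half, half:-half]
    return (inner - mu) / (std + 1.0)

def fit_ggd(x, grid=np.arange(0.2, 6.0, 0.01)):
    """Moment-matching generalized Gaussian fit: choose the shape
    alpha whose theoretical ratio Gamma(2/a)^2 / (Gamma(1/a)Gamma(3/a))
    matches the empirical (E|x|)^2 / E[x^2]."""
    x = np.asarray(x, dtype=float).ravel()
    ratio = np.abs(x).mean() ** 2 / (x ** 2).mean()
    theo = np.array([gamma(2 / a) ** 2 / (gamma(1 / a) * gamma(3 / a))
                     for a in grid])
    alpha = grid[np.argmin(np.abs(theo - ratio))]
    return alpha, x.std()

rng = np.random.default_rng(1)
img = rng.uniform(0, 255, size=(64, 64))
coeffs = mscn(img)
alpha, sigma_hat = fit_ggd(rng.normal(size=20000))  # Gaussian data -> alpha near 2
```

The same `fit_ggd` applies to the four neighboring-product maps once extended to the asymmetric (left/right) form of claim 4.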
5. The no-reference image quality evaluation method according to claim 4, wherein the phase consistency feature extraction method in S112 and S132 comprises:
transforming the input image from the RGB color space into the O_1, O_2 and O_3 opponent color space for the calculation;
the phase consistency of a two-dimensional signal is generalized from the one-dimensional case: the two-dimensional signal is filtered in direction θ_j at scale n to obtain the filter response vector [ e_{n,θ_j}(x), o_{n,θ_j}(x) ] at point x, from which the amplitude of the response vector in direction θ_j at scale n is obtained:
A_{n,θ_j}(x) = sqrt( e_{n,θ_j}(x)² + o_{n,θ_j}(x)² )
and the local energy function:
E_{θ_j}(x) = sqrt( F_{θ_j}(x)² + H_{θ_j}(x)² )
wherein F_{θ_j}(x) = Σ_n e_{n,θ_j}(x) and H_{θ_j}(x) = Σ_n o_{n,θ_j}(x);
the two-dimensional phase consistency at point x is defined as:
PC(x) = Σ_j E_{θ_j}(x) / ( ε + Σ_j Σ_n A_{n,θ_j}(x) )
wherein ε is a small positive constant; the phase consistency result calculated on each dimension of the color space is fitted with a Weibull distribution:
f(x; λ, k) = (k/λ) · (x/λ)^(k−1) · e^(−(x/λ)^k), x ≥ 0
wherein x is the random variable, λ is a scale parameter and k is a shape parameter; each dimension of data can be characterized by the two values λ and k, giving six-dimensional features F_PC over the O_1, O_2 and O_3 color space in total.
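The per-channel Weibull fitting of claim 5 can be sketched as follows. The opponent transform below (O_1 = (R − G)/√2, O_2 = (R + G − 2B)/√6, O_3 = (R + G + B)/√3) is a commonly used definition assumed here, since the patent text does not reproduce its exact coefficients, and the phase consistency map itself (normally a log-Gabor quadrature filter bank) is taken as given:

```python
import numpy as np

def rgb_to_opponent(rgb):
    """RGB -> an assumed O1, O2, O3 opponent color space."""
    r, g, b = rgb[..., 0], rgb[..., 1], rgb[..., 2]
    o1 = (r - g) / np.sqrt(2.0)
    o2 = (r + g - 2.0 * b) / np.sqrt(6.0)
    o3 = (r + g + b) / np.sqrt(3.0)
    return np.stack([o1, o2, o3], axis=-1)

def fit_weibull(x, iters=100):
    """Maximum-likelihood Weibull (lambda, k) fit of positive data,
    using the standard fixed-point iteration for the shape k."""
    x = np.asarray(x, dtype=float)
    x = x[x > 0]
    lx = np.log(x)
    k = 1.0
    for _ in range(iters):
        xk = x ** k
        k = 1.0 / ((xk * lx).sum() / xk.sum() - lx.mean())
    lam = ((x ** k).mean()) ** (1.0 / k)
    return lam, k

rng = np.random.default_rng(2)
# Stand-in for a phase consistency map on one opponent channel:
pc_values = 1.5 * rng.weibull(2.0, size=5000)
lam, k = fit_weibull(pc_values)
```

Fitting (λ, k) on each of the three opponent channels yields the six-dimensional F_PC.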
6. The no-reference image quality evaluation method according to claim 5, wherein the gradient feature extraction method in S113 and S133 is:
calculating the gradient features of the input image in the horizontal and vertical directions,
G_h = I ∗ D_h
G_v = I ∗ D_v
wherein D_h = [1, −1] and D_v = [1, −1]^T are the horizontal and vertical operators respectively;
fitting G_h and G_v each with a generalized Gaussian distribution model to obtain the four-dimensional feature F_G.
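The gradient maps of claim 6 are plain first differences; a minimal sketch follows (the sign convention of the convolution and the valid-region boundary handling are choices of this sketch):

```python
import numpy as np

def gradient_maps(image):
    """G_h = I * D_h and G_v = I * D_v with D_h = [1, -1] and
    D_v = [1, -1]^T, computed as first differences over the
    valid region of the image."""
    image = np.asarray(image, dtype=float)
    gh = image[:, :-1] - image[:, 1:]   # horizontal operator [1, -1]
    gv = image[:-1, :] - image[1:, :]   # vertical operator [1, -1]^T
    return gh, gv

ramp = np.tile(np.arange(5.0), (4, 1))  # rows increase left to right
gh, gv = gradient_maps(ramp)
```

Fitting each of G_h and G_v with a generalized Gaussian (two parameters per map) yields the four-dimensional F_G.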
7. The no-reference image quality evaluation method according to claim 2, wherein the Karhunen-Loève transform feature extraction method in S114 and S134 comprises:
S71: calculating the transform kernel:
the coefficient map I_N obtained in the natural image statistical feature extraction is cropped to the largest resolution whose width and height are both divisible by 2, and I_N is divided into m 2 × 2 small blocks; the mean of each small block is subtracted from it to form an m × 4 matrix; the standard deviation of each row of the m × 4 matrix is calculated, the rows whose standard deviation is greater than zero are retained, and the covariance matrix of the retained matrix is calculated to obtain the Karhunen-Loève transform kernel;
S72: extracting image features according to the transform kernel:
computing the product of the Karhunen-Loève transform kernel and the image blocks,
P_{m×4} = I_{m×4} ∗ k_{4×4}
wherein I_{m×4} is the input image I divided into m 2 × 2 small blocks; each dimension of the four-dimensional coefficients in the calculated P is fitted with a generalized Gaussian distribution to obtain the eight-dimensional feature F_KLT.
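The kernel construction of S71/S72 can be sketched as follows. Interpreting the transform kernel as the eigenvector matrix of the 2 × 2 block covariance is the standard Karhunen-Loève construction and an assumption of this sketch, since the claim text does not spell out the eigendecomposition step:

```python
import numpy as np

def blocks_2x2(coeff_map):
    """Crop to even width/height and arrange non-overlapping 2x2
    blocks as rows of an (m, 4) matrix, each row mean-removed."""
    h, w = coeff_map.shape
    c = np.asarray(coeff_map, dtype=float)[: h - h % 2, : w - w % 2]
    b = c.reshape(c.shape[0] // 2, 2, c.shape[1] // 2, 2)
    b = b.transpose(0, 2, 1, 3).reshape(-1, 4)
    return b - b.mean(axis=1, keepdims=True)

def klt_kernel(coeff_map):
    """Keep the rows (blocks) with nonzero spread, take their 4x4
    covariance matrix, and return its orthonormal eigenvectors as
    the Karhunen-Loeve transform kernel."""
    b = blocks_2x2(coeff_map)
    b = b[b.std(axis=1) > 0]
    cov = np.cov(b, rowvar=False)
    _, vecs = np.linalg.eigh(cov)
    return vecs

rng = np.random.default_rng(3)
coeff_map = rng.normal(size=(32, 32))       # stand-in for I_N
k4 = klt_kernel(coeff_map)
proj = blocks_2x2(coeff_map) @ k4           # P_{m x 4} = I_{m x 4} * k_{4 x 4}
```

Fitting each of the four columns of `proj` with a generalized Gaussian (two parameters each) gives the eight-dimensional F_KLT.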
8. The no-reference image quality evaluation method according to claim 1, wherein said S15 further comprises:
S151: calculating the difference between the mean vector of the training model and the mean vector of the test model, and the difference between the covariance matrix of the training model and the covariance matrix of the test model, to represent the quality evaluation score of the test image;
S152: measuring the accuracy of the quality evaluation with the Spearman rank correlation coefficient.
9. The no-reference image quality evaluation method according to claim 8, wherein, when the image comprises a plurality of image blocks, the difference in S151 is calculated for each image block, and the differences of the plurality of image blocks are averaged to represent the quality evaluation score of the test image.
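The accuracy measure of S152, the Spearman rank correlation between predicted scores and subjective scores, can be sketched as below (a minimal version without tie averaging; the example score values are illustrative):

```python
import numpy as np

def spearman(a, b):
    """Spearman rank correlation: the Pearson correlation of the
    ranks. Ties are not averaged in this minimal version."""
    ra = np.argsort(np.argsort(np.asarray(a)))
    rb = np.argsort(np.argsort(np.asarray(b)))
    return float(np.corrcoef(ra, rb)[0, 1])

predicted = [0.8, 2.4, 1.1, 3.0]        # model quality scores
subjective = [10.0, 35.0, 20.0, 50.0]   # subjective quality scores
rho = spearman(predicted, subjective)   # monotonically related -> 1.0
```

A rank correlation close to 1 (or −1, depending on the score polarity) indicates that the predicted scores order the images the same way the subjective scores do.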
10. A no-reference image quality evaluation system for implementing the no-reference image quality evaluation method according to any one of claims 1 to 9, comprising: a training image feature extraction unit, a training model fitting unit, a test image feature extraction unit, a test model fitting unit and a quality evaluation unit; wherein:
the training image feature extraction unit is used for determining the features of the training image;
the training model fitting unit is used for fitting the features determined by the training image feature extraction unit with a multivariate Gaussian model to obtain a training model;
the test image feature extraction unit is used for determining the features of the test image;
the test model fitting unit is used for fitting the features determined by the test image feature extraction unit with a multivariate Gaussian model to obtain a test model;
the quality evaluation unit is used for comparing the training model with the test model, determining the quality evaluation of the test image, and measuring the accuracy of the quality evaluation.
Priority Applications (1)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
CN202011059158.8A CN112184672A (en) | 2020-09-30 | 2020-09-30 | No-reference image quality evaluation method and system |
Publications (1)
Publication Number | Publication Date |
---|---|
CN112184672A true CN112184672A (en) | 2021-01-05 |
Family
ID=73945546
Family Applications (1)
Application Number | Title | Priority Date | Filing Date |
---|---|---|---|
CN202011059158.8A Pending CN112184672A (en) | 2020-09-30 | 2020-09-30 | No-reference image quality evaluation method and system |
Country Status (1)
Country | Link |
---|---|
CN (1) | CN112184672A (en) |
Citations (4)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
CN103996192A (en) * | 2014-05-12 | 2014-08-20 | 同济大学 | Non-reference image quality evaluation method based on high-quality natural image statistical magnitude model |
CN104851098A (en) * | 2015-05-22 | 2015-08-19 | 天津大学 | Objective evaluation method for quality of three-dimensional image based on improved structural similarity |
US20190362484A1 (en) * | 2018-05-24 | 2019-11-28 | Tfi Digital Media Limited | Patch selection for neural network based no-reference image quality assessment |
CN111311594A (en) * | 2020-03-16 | 2020-06-19 | 清华大学深圳国际研究生院 | No-reference image quality evaluation method |
Non-Patent Citations (2)
Title |
---|
ANISH MITTAL et al.: ""Completely Blind" Image Quality Analyzer", IEEE Signal Processing Letters *
元婴老怪: "KLT Dimensionality Reduction and Image Compression (with MATLAB code)", CSDN *
Cited By (5)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
CN112766419A (en) * | 2021-03-09 | 2021-05-07 | 东华理工大学 | Image quality evaluation method and device based on multitask learning |
CN113240668A (en) * | 2021-06-08 | 2021-08-10 | 南京师范大学 | Weld pool image quality evaluation method based on image digital feature distribution |
CN113240668B (en) * | 2021-06-08 | 2024-04-16 | 南京师范大学 | Image digital feature distribution-based generated molten pool image quality evaluation method |
CN113570596A (en) * | 2021-08-13 | 2021-10-29 | 云南北方光学科技有限公司 | No-reference structure definition evaluation method based on human visual system |
CN114782422A (en) * | 2022-06-17 | 2022-07-22 | 电子科技大学 | SVR feature fusion non-reference JPEG image quality evaluation method |
Similar Documents
Publication | Publication Date | Title |
---|---|---|
CN109325550B (en) | No-reference image quality evaluation method based on image entropy | |
CN112184672A (en) | No-reference image quality evaluation method and system | |
Qureshi et al. | Towards the design of a consistent image contrast enhancement evaluation measure | |
Manap et al. | Non-distortion-specific no-reference image quality assessment: A survey | |
Shen et al. | Hybrid no-reference natural image quality assessment of noisy, blurry, JPEG2000, and JPEG images | |
CN105049851B (en) | General non-reference picture quality appraisement method based on Color perception | |
George et al. | A survey on different approaches used in image quality assessment | |
Wee et al. | Image quality assessment by discrete orthogonal moments | |
CN109978854B (en) | Screen content image quality evaluation method based on edge and structural features | |
CN109255358B (en) | 3D image quality evaluation method based on visual saliency and depth map | |
CN107481236A (en) | A kind of quality evaluating method of screen picture | |
CN108053396B (en) | No-reference evaluation method for multi-distortion image quality | |
CN105007488A (en) | Universal no-reference image quality evaluation method based on transformation domain and spatial domain | |
Wang et al. | A new blind image quality framework based on natural color statistic | |
CN111160284A (en) | Method, system, equipment and storage medium for evaluating quality of face photo | |
CN111047618B (en) | Multi-scale-based non-reference screen content image quality evaluation method | |
CN110415207A (en) | A method of the image quality measure based on image fault type | |
Morzelona | Human visual system quality assessment in the images using the IQA model integrated with automated machine learning model | |
CN107018410B (en) | A kind of non-reference picture quality appraisement method based on pre- attention mechanism and spatial dependence | |
Ahmed et al. | PIQI: perceptual image quality index based on ensemble of Gaussian process regression | |
CN108682005B (en) | Semi-reference 3D synthetic image quality evaluation method based on covariance matrix characteristics | |
CN112132774A (en) | Quality evaluation method of tone mapping image | |
CN109754390A (en) | A kind of non-reference picture quality appraisement method based on mixing visual signature | |
Zhou et al. | No-reference image quality assessment based on neighborhood co-occurrence matrix | |
CN104835172A (en) | No-reference image quality evaluation method based on phase consistency and frequency domain entropy |
Legal Events
Date | Code | Title | Description |
---|---|---|---|
PB01 | Publication | ||
SE01 | Entry into force of request for substantive examination | ||