CN104023230B - A kind of non-reference picture quality appraisement method based on gradient relevance - Google Patents

A kind of non-reference picture quality appraisement method based on gradient relevance

Info

Publication number
CN104023230B
CN104023230B (application CN201410284237.7A; published as CN104023230A)
Authority
CN
China
Prior art keywords
image
gradient
property
quality
variance
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Expired - Fee Related
Application number
CN201410284237.7A
Other languages
Chinese (zh)
Other versions
CN104023230A (en)
Inventor
刘利雄
化毅
赵清杰
黄华
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Beijing Institute of Technology BIT
Original Assignee
Beijing Institute of Technology BIT
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Beijing Institute of Technology BIT filed Critical Beijing Institute of Technology BIT
Priority to CN201410284237.7A priority Critical patent/CN104023230B/en
Publication of CN104023230A publication Critical patent/CN104023230A/en
Application granted granted Critical
Publication of CN104023230B publication Critical patent/CN104023230B/en
Expired - Fee Related legal-status Critical Current
Anticipated expiration legal-status Critical

Links

Landscapes

  • Image Analysis (AREA)

Abstract

The present invention proposes a no-reference image quality evaluation method based on gradient relevance, belonging to the field of computer image analysis. The method first obtains the three sub-properties of the image gradient that are affected by distortion, namely the image gradient magnitude, the change of image gradient direction, and the change of image gradient magnitude. These three properties are divided into image blocks of size M × M, the statistical variance of each property is computed within each block, and the mean of the statistical variances of all blocks in the whole image is taken as the image feature. Finally, the image quality is obtained by combining a support vector machine with the two-step framework of image quality evaluation. The method has the advantages of low time complexity, high subjective consistency, low feature dimensionality, and good generality; it can be applied to small computing devices or to applications related to image quality, and has good practical value.

Description

Non-reference image quality evaluation method based on gradient relevance
Technical Field
The invention relates to an image quality evaluation method, in particular to a non-reference image quality evaluation method based on gradient relevance, and belongs to the technical field of image analysis.
Background
Vision is one of the most basic and effective ways for humans to perceive the world, and images are the basis of vision. Image information can accurately and directly convey the environment and meaning of the information to be expressed, in a way that speech, text, and other media cannot match. The field of image analysis is therefore of great importance in computer research. An image records the state of an environment at a certain moment, and errors may occur during this recording, such as the influence of camera shake during photographing or of X-ray intensity and film during imaging. These effects directly degrade the sharpness of the image, thereby reducing the information it contains and introducing distortions.
Image quality evaluation methods are generally classified into subjective evaluation methods and objective evaluation methods. In the subjective evaluation method, individuals score the quality of an image based on their visual perception of it, and the average of the scores given by multiple people to the same image is taken as its final score. Although this approach is the most accurate, it consumes considerable human resources and time and is easily influenced by factors such as the knowledge and viewpoints of the evaluators. The objective evaluation method uses a computer to evaluate image quality; it removes the factor of human participation, so the evaluation efficiency is greatly improved.
Generally speaking, an objective quality evaluation method applies computer technology to replace the human factors of the subjective evaluation method, so that its results approximate those of subjective evaluation; ultimately, the computer simulates the process by which humans perceive image signals. Objective quality assessment methods have applications in many areas: 1. as an evaluation algorithm for the effect of algorithms such as image fusion and segmentation; 2. as a front-end of an image processing algorithm, providing initial values for subsequent algorithms; 3. measuring the quality of a communication channel; 4. embedded in small image acquisition devices for application; and so on.
Objective quality evaluation methods can be further classified into three categories: full-reference, partial-reference, and no-reference image quality evaluation methods. As the name implies, a full-reference method requires not only the distorted image but also the corresponding undistorted image. A partial-reference method compares the distorted image with partial information of the corresponding undistorted image, such as extracted features, to evaluate the quality of the distorted image. A no-reference method evaluates the quality of a distorted image using only the information of the distorted image itself. In practical applications, no-reference evaluation is the most useful, because the original image corresponding to a distorted image is difficult to obtain in practice.
In conclusion, research on objective quality evaluation methods has broad theoretical significance and important application value. Moorthy et al., in the document "A two-step framework for constructing blind image quality indices", propose a two-step framework for no-reference image quality evaluation; the related basic background technology mainly comprises image gradient properties and Sobel operator properties.
(I) Two-step framework for no-reference image quality evaluation
Moorthy et al. propose a two-step framework for no-reference image quality evaluation. In this framework, the input distorted image is first classified by distortion type, and then the scores predicted for it under each type of distortion are weighted and summed.
Given an image training set containing n distortion types, a mapping between image features and distortion classes must first be established: the correct distortion classes and the image features are fed into the model during training to obtain the distortion classification model, after which the model yields the distortion class of an image from its features.
It should be noted that what is needed in the classification model is not a hard classifier, but a method that describes the probability that the image suffers from each type of distortion. This yields an n-dimensional vector p, each component of which represents the probability that the input image suffers from the corresponding distortion type.
Then, a regression model is trained for each distortion type, i.e. a mapping between image features and image quality is established. During training, the image training set is divided into n subsets, each containing only images of one distortion type; therefore, n regression models need to be trained, each predicting the image quality score for one particular type of distortion present in the input image. This greatly improves the accuracy of the mapping.
The test image is input into the n regression models to obtain n quality scores, which are arranged, in the order corresponding to the classification vector p, into an n-dimensional quality vector q.
Finally, the quality scores are weighted by the distortion classification vector of the image and summed, giving the objective predicted score:
$$ Q = \sum_{i=1}^{n} p_i q_i \qquad (1) $$
where $p_i$ denotes the i-th component of the vector p, $q_i$ denotes the i-th component of the vector q, and n denotes the number of distortion types.
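For illustration only (not part of the patent text), the weighted fusion of Equation 1 can be written as the following Python sketch; the probability vector p and the per-type scores q are assumed to come from a previously trained classifier and regressors.

```python
import numpy as np

def two_step_score(p, q):
    """Weighted fusion of per-distortion quality scores, Q = sum_i p_i * q_i (Eq. 1).

    p : length-n vector of distortion-class probabilities for the image
    q : length-n vector of quality scores predicted by the per-class regressors
    """
    return float(np.dot(np.asarray(p, dtype=float), np.asarray(q, dtype=float)))

# Hypothetical example with n = 3 distortion types
print(two_step_score([0.7, 0.2, 0.1], [62.0, 55.0, 40.0]))  # -> 58.4
```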
(II) image gradient information
The gradient information of an image contains a large amount of image structure information. The gradient of an image generally marks the places where the gray value changes drastically; these places are usually the edges of the image and the regions to which the human visual system is sensitive.
For discrete digital images, the gradient magnitude is generally defined as:
$$ \mathrm{Gradient}(i,j) = \sqrt{\mathrm{Gradient\_x}(i,j)^2 + \mathrm{Gradient\_y}(i,j)^2} $$
where Gradient_x(i, j) and Gradient_y(i, j) are the partial derivatives at point (i, j) in the two orthogonal directions X and Y, respectively, computed using various approximating discrete operators, such as the Sobel, Prewitt, and Canny operators.
The gradient direction is the direction in which the gray value changes the fastest, and the direction of the image gradient is defined as:
$$ \mathrm{orientation}(i,j) = \arctan\!\left(\frac{\mathrm{Gradient\_y}(i,j)}{\mathrm{Gradient\_x}(i,j)}\right) $$
In summary, where there are edges in the image, there must be large gradient values; in the smoother parts of the image, the gray value changes little and the gradient is generally small. In image processing, the modulus of the gradient is often simply called the gradient, and an image composed of the image gradients is called a gradient image.
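For reference, a minimal sketch of the two definitions above, using the standard 3 × 3 Sobel kernels (the 5 × 5 operators actually used by the invention are given in the Disclosure below); the scipy-based convolution and the boundary handling are implementation assumptions.

```python
import numpy as np
from scipy.ndimage import convolve

# Standard 3x3 Sobel kernels (illustration only; the invention's 5x5 variants appear later)
SOBEL_X = np.array([[-1, 0, 1],
                    [-2, 0, 2],
                    [-1, 0, 1]], dtype=float)
SOBEL_Y = SOBEL_X.T  # responds to vertical gray-value changes

def gradient_magnitude_and_orientation(gray):
    """Per-pixel gradient magnitude and orientation of a grayscale image."""
    gray = np.asarray(gray, dtype=float)
    gx = convolve(gray, SOBEL_X, mode='nearest')
    gy = convolve(gray, SOBEL_Y, mode='nearest')
    magnitude = np.sqrt(gx ** 2 + gy ** 2)
    orientation = np.arctan2(gy, gx)      # quadrant-aware arctan(Gy / Gx)
    return magnitude, orientation
```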
Disclosure of Invention
The invention aims to solve the problems of high time and space complexity and low performance in current no-reference image quality evaluation methods, and establishes a no-reference natural image quality evaluation method with high performance, low complexity, and high consistency with subjective evaluation results.
The method of the invention is realized by the following technical scheme.
A no-reference image quality evaluation method based on gradient relevance comprises the following steps:
step one, performing feature extraction on an input distorted image.
Firstly, three different sub-properties of the image gradient are obtained for each image, namely the gradient magnitude property GM, the gradient direction change property CO, and the gradient magnitude change property CM. The three gradient properties are defined by Equations 1, 2, and 3, respectively:
$$ GM(i,j) = \sqrt{Gx(i,j)^2 + Gy(i,j)^2} \qquad (1) $$
$$ CO(i,j) = \mathrm{orientation}(i,j) - \mathrm{orientation}_{avg}(i,j) \qquad (2) $$
$$ CM(i,j) = \sqrt{\bigl(Gx(i,j) - Gx_{avg}(i,j)\bigr)^2 + \bigl(Gy(i,j) - Gy_{avg}(i,j)\bigr)^2} \qquad (3) $$
wherein,
$$ \mathrm{orientation}(i,j) = \arctan\!\left(\frac{Gy(i,j)}{Gx(i,j)}\right) $$
$$ \mathrm{orientation}_{avg}(i,j) = \arctan\!\left(\frac{Gy_{avg}(i,j)}{Gx_{avg}(i,j)}\right) $$
$$ Gx_{avg} = \frac{\sum_{i}^{M}\sum_{j}^{N} Gx(i,j)}{M \times N} $$
$$ Gy_{avg} = \frac{\sum_{i}^{M}\sum_{j}^{N} Gy(i,j)}{M \times N} $$
where orientation(i, j) denotes the direction of the image gradient, Gx and Gy are the derivatives of the discrete digital image in the two orthogonal directions X and Y, and M, N is the size of the window set to describe the change within the region. When computing the gradients Gx and Gy, the Sobel gradient operator is used and is extended from one pair of orthogonal directions to two pairs: one pair at 0 degrees and 90 degrees, and one pair at -45 degrees and 45 degrees.
The Sobel operators for the orthogonal directions of 0 degrees and 90 degrees are:
$$ S_y = \begin{pmatrix} 0 & 0 & 0 & 0 & 0 \\ 0 & 1 & 0 & -1 & 0 \\ 0 & 2 & 0 & -2 & 0 \\ 0 & 1 & 0 & -1 & 0 \\ 0 & 0 & 0 & 0 & 0 \end{pmatrix} \quad\text{and}\quad S_x = \begin{pmatrix} 0 & 0 & 0 & 0 & 0 \\ 0 & 1 & 2 & 1 & 0 \\ 0 & 0 & 0 & 0 & 0 \\ 0 & -1 & -2 & -1 & 0 \\ 0 & 0 & 0 & 0 & 0 \end{pmatrix} $$
The Sobel operators for the orthogonal directions of -45 degrees and 45 degrees are:
$$ S_y = \begin{pmatrix} 0 & 0 & 0 & 0 & 0 \\ 1 & 0 & -1 & 0 & 0 \\ 0 & 2 & 0 & -2 & 0 \\ 0 & 0 & 1 & 0 & -1 \\ 0 & 0 & 0 & 0 & 0 \end{pmatrix} \quad\text{and}\quad S_x = \begin{pmatrix} 0 & 0 & 0 & 0 & 0 \\ 0 & 0 & 1 & 0 & -1 \\ 0 & 2 & 0 & -2 & 0 \\ 1 & 0 & -1 & 0 & 0 \\ 0 & 0 & 0 & 0 & 0 \end{pmatrix} $$
where the operator denoted y is the operator computing Gy, and the operator denoted x is the operator computing Gx.
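As a hedged sketch (not the patent's own code), the three gradient properties of Equations 1-3 can be computed for one pair of orthogonal directions as follows. The 5 × 5 kernels are transcribed from the matrices above; the concrete size of the M × N averaging window and the scipy-based convolution are assumptions of this illustration.

```python
import numpy as np
from scipy.ndimage import convolve, uniform_filter

# 5x5 Sobel operators transcribed from the matrices above
S_Y_0_90 = np.array([[0, 0, 0, 0, 0],
                     [0, 1, 0, -1, 0],
                     [0, 2, 0, -2, 0],
                     [0, 1, 0, -1, 0],
                     [0, 0, 0, 0, 0]], dtype=float)
S_X_0_90 = np.array([[0, 0, 0, 0, 0],
                     [0, 1, 2, 1, 0],
                     [0, 0, 0, 0, 0],
                     [0, -1, -2, -1, 0],
                     [0, 0, 0, 0, 0]], dtype=float)
S_Y_45 = np.array([[0, 0, 0, 0, 0],
                   [1, 0, -1, 0, 0],
                   [0, 2, 0, -2, 0],
                   [0, 0, 1, 0, -1],
                   [0, 0, 0, 0, 0]], dtype=float)
S_X_45 = np.array([[0, 0, 0, 0, 0],
                   [0, 0, 1, 0, -1],
                   [0, 2, 0, -2, 0],
                   [1, 0, -1, 0, 0],
                   [0, 0, 0, 0, 0]], dtype=float)

def gradient_properties(gray, sx, sy, win=3):
    """GM, CO and CM maps (Equations 1-3) for one pair of orthogonal directions.

    `win` is the M x N averaging window; its concrete size is an assumption,
    since this passage of the patent does not fix a value.
    """
    gray = np.asarray(gray, dtype=float)
    gx = convolve(gray, sx, mode='nearest')                 # Gx (operator denoted x)
    gy = convolve(gray, sy, mode='nearest')                 # Gy (operator denoted y)
    gx_avg = uniform_filter(gx, size=win)                   # Gx_avg over the window
    gy_avg = uniform_filter(gy, size=win)                   # Gy_avg over the window
    gm = np.sqrt(gx ** 2 + gy ** 2)                                    # Eq. (1)
    co = np.arctan2(gy, gx) - np.arctan2(gy_avg, gx_avg)               # Eq. (2)
    cm = np.sqrt((gx - gx_avg) ** 2 + (gy - gy_avg) ** 2)              # Eq. (3)
    return gm, co, cm
```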
Then, the six image gradient property maps obtained above are divided into blocks; a statistical variance is computed on each block of each gradient property, the variance is normalized with a log function, and the per-block statistical variances of each gradient property are fused into one variance of the whole image gradient property by taking their mean. The specific steps are as follows:
Each image gradient property map is divided into image blocks of size 128 × 128, and the statistical variance of each image block is calculated, namely the variance d_n of the n-th image block. The calculation formula is as follows:
$$ d_n = \sum \bigl(h(x) - E(h(x))\bigr)^2 \qquad (4) $$
wherein,
h(x)=pdf(θ)
where θ is the parameter in each property map, pdf is the statistical distribution of that parameter, h(x) represents the statistical probability representation obtained after quantizing and counting θ, and E(h(x)) represents the expectation of h(x).
Then, each calculated variance d_n is normalized with a log function, and the statistical variances obtained from the image blocks of each image gradient property are fused by mean aggregation into the final feature f, as shown in Equation 5:
$$ f = \frac{1}{N} \times \log\!\left(\prod_{n=1}^{N} d_n\right), \quad n = 1, 2, \ldots, N \qquad (5) $$
Finally, the input distorted image is down-sampled to obtain an image at a second scale, and the above process is repeated, finally yielding a 12-dimensional feature vector:
$$ \mathrm{Feature} = [f_{GM}, f_{CO}, f_{CM}] \times 2_{\mathrm{orientation}} \times 2_{\mathrm{scale}} $$
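A minimal sketch of the block-variance pooling (Equations 4 and 5) and the 12-dimensional feature assembly is given below; it reuses gradient_properties and the four Sobel kernels from the previous sketch. The 64-bin histogram used to estimate pdf(θ) and the simple factor-of-2 down-sampling are assumptions not fixed by the description.

```python
import numpy as np

def block_variance(prop_map, block=128, bins=64):
    """Variance d_n of each 128 x 128 block of a gradient-property map (Eq. 4).

    h(x) is approximated by a normalized histogram of the block; the bin count
    is an assumption (the patent only says theta is quantized and counted).
    """
    rows, cols = prop_map.shape
    variances = []
    for r in range(0, rows - block + 1, block):
        for c in range(0, cols - block + 1, block):
            theta = prop_map[r:r + block, c:c + block].ravel()
            hist, _ = np.histogram(theta, bins=bins)
            hx = hist / max(hist.sum(), 1)                    # h(x) = pdf(theta)
            variances.append(np.sum((hx - hx.mean()) ** 2))   # Eq. (4)
    return np.asarray(variances)

def pooled_feature(prop_map):
    """f = (1/N) * log(prod_n d_n), computed as the mean of log d_n (Eq. 5)."""
    d = block_variance(prop_map)
    return float(np.mean(np.log(d + 1e-12)))

def extract_features(gray):
    """Assemble the 12-D feature: 3 properties x 2 direction pairs x 2 scales."""
    pairs = [(S_X_0_90, S_Y_0_90), (S_X_45, S_Y_45)]
    feats = []
    for scale in range(2):
        img = gray if scale == 0 else gray[::2, ::2]   # second scale (assumed 2x decimation)
        for sx, sy in pairs:
            for prop in gradient_properties(img, sx, sy):
                feats.append(pooled_feature(prop))
    return np.asarray(feats)   # [f_GM, f_CO, f_CM] x 2 orientations x 2 scales
```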
Step two: perform feature mapping. The 12-dimensional feature vector obtained in step one is taken as the final image quality feature, and the mapping relation between the image features and the image score is established accordingly.
First, the image library is divided into a training set and a test set. The training set is used to establish the mapping relation between image features and image quality, and the test set is used to test the established mapping relation. Using the support vector machine method, a distortion classification model and a quality evaluation model for each corresponding distortion class are trained on the image features of the images in the training set.
Then, following the flow of the two-step framework of no-reference image quality evaluation, score prediction is tested on the test set: the distortion classification model first classifies the distortion of the test image, and the quality evaluation model of that distortion class then predicts the quality of the test image, yielding its quality score, which is further evaluated with existing algorithm performance criteria.
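The feature mapping itself is only specified as a support vector machine; the following sketch uses scikit-learn's SVC/SVR as one possible realization, with the RBF kernel and default hyper-parameters as assumptions of this illustration.

```python
import numpy as np
from sklearn.svm import SVC, SVR

def train_two_step(features, labels, scores, n_types):
    """Train the distortion classifier plus one quality regressor per distortion type.

    `labels` are assumed to be integer distortion classes 0 .. n_types-1 so that
    the probability columns of the classifier line up with the regressor list.
    """
    clf = SVC(kernel='rbf', probability=True).fit(features, labels)
    regressors = []
    for t in range(n_types):
        mask = labels == t
        regressors.append(SVR(kernel='rbf').fit(features[mask], scores[mask]))
    return clf, regressors

def predict_quality(clf, regressors, feature):
    """Two-step prediction: Q = sum_i p_i * q_i over the distortion types."""
    feature = np.asarray(feature, dtype=float).reshape(1, -1)
    p = clf.predict_proba(feature)[0]                       # ordered by clf.classes_
    q = np.array([reg.predict(feature)[0] for reg in regressors])
    return float(np.dot(p, q))
```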
Advantageous effects
Compared with prior art of the same category, the no-reference image quality evaluation method based on image gradient correlation has high subjective consistency and low time and space complexity; it can be applied in small systems or embedded into algorithms and devices related to image quality, and has high application value.
Drawings
FIG. 1 is a flow chart of the method of the present invention.
FIG. 2 is a box plot of subjective consistency comparisons of the method of the invention with several other full-reference, no-reference algorithms in accordance with example 1 of the present invention.
Detailed Description
The method of the present invention is further described in detail below with reference to the figures and the specific examples.
Example 1:
As shown in FIG. 1, a no-reference image quality evaluation method based on gradient correlation includes the following steps:
step one, performing feature extraction on an input image.
First, the three gradient properties defined in the present invention are computed for the input image: magnitude, direction-change, and magnitude-change information are obtained from the image gradients in two directions.
Then, the three properties in the two directions, six gradient property images in total, are divided into blocks, and the corresponding statistical variance is calculated for each block.
Then, the block statistical variances of the six gradient property images are combined: for each property, the average of all its block statistical variances is computed by mean aggregation and used as the final overall statistical variance of that property.
Finally, the input image is down-sampled into an image of a second scale and the process is repeated, expanding the image features from 6 dimensions to 12 dimensions.
Step two: perform feature mapping. The 12-dimensional feature vector obtained in step one is taken as the final image quality feature, and the mapping relation between the image features and the image score is established accordingly.
First, using the support vector machine (SVM) method, a distortion classification model and a quality evaluation model for each corresponding distortion class are trained on the image features of the images in the training set.
Then, score prediction is tested on the test set following the flow of the two-step framework of no-reference image quality evaluation: the distortion classification model classifies the distortion of the test image, the quality evaluation model of that distortion class predicts its quality to obtain the quality score of the test image, and the existing algorithm performance index (SROCC) is used to evaluate the quality of the algorithm.
In this embodiment, the LIVE image database is used to test the efficiency and performance of the present invention. For comparison, this embodiment uses several well-known full-reference and no-reference image quality evaluation methods. During testing, the database is divided into a test set and a training set, set to 20% and 80% respectively following the five-fold cross-validation convention. The two-step prediction framework is then applied to predict the scores of the test set, with the required classification and regression models trained from the image features of the training set. Finally, the SROCC index between the predicted scores and the actual scores is computed as the basis for evaluating the invention. Meanwhile, to reduce the influence of accidental factors, the above process is repeated 1000 times, each time randomly dividing the training set and the test set, and the median of the SROCC (Spearman rank-order correlation coefficient) over the repetitions is taken as the final evaluation score of the algorithm (see Table 1). An SROCC value closer to 1 indicates that the algorithm correlates better with human perception. To display the relative merits of the algorithms more intuitively, box plots of the SROCC values of the algorithms are also drawn, as shown in FIG. 2.
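The evaluation protocol above can be sketched as follows, reusing train_two_step and predict_quality from the previous sketch; the reduced number of repetitions and the scipy-based SROCC computation are choices of this illustration rather than of the patent.

```python
import numpy as np
from scipy.stats import spearmanr
from sklearn.model_selection import train_test_split

def median_srocc(features, labels, scores, n_types, repeats=100, seed=0):
    """Median SROCC over repeated random 80/20 train/test splits.

    The patent repeats the split 1000 times; the count is reduced here to keep
    the example quick.
    """
    rng = np.random.RandomState(seed)
    sroccs = []
    for _ in range(repeats):
        train_idx, test_idx = train_test_split(
            np.arange(len(scores)), test_size=0.2, random_state=rng)
        clf, regs = train_two_step(features[train_idx], labels[train_idx],
                                   scores[train_idx], n_types)
        predicted = [predict_quality(clf, regs, features[i]) for i in test_idx]
        sroccs.append(spearmanr(predicted, scores[test_idx])[0])
    return float(np.median(sroccs))
```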
As can be seen from Table 1, the performance of the present invention is the best overall among the five no-reference algorithms (BIQI, DIIVINE, BLIINDS-II, BRISQUE, and the present invention). The method shows good subjective consistency for images of all distortion categories, so it has good universality and a clear advantage in performance. Compared with the three full-reference image quality evaluation methods, namely peak signal-to-noise ratio (PSNR), the structural similarity algorithm (SSIM), and the visual information fidelity algorithm (VIF), the proposed algorithm is inferior only to the VIF algorithm.
TABLE 1 comparison of subjective consistency index (SROCC) for each algorithm in LIVE library
Algorithm     JP2K    JPEG    NOISE   BLUR    FF      ALL
PSNR          0.8990  0.8484  0.9835  0.8076  0.8986  0.8293
SSIM          0.9510  0.9173  0.9697  0.9513  0.9555  0.8996
VIF           0.9515  0.9104  0.9844  0.9722  0.9631  0.9521
BIQI          0.8551  0.7767  0.9764  0.9258  0.7695  0.7599
DIIVINE       0.9352  0.8921  0.9828  0.9551  0.9096  0.9174
BLIINDS-II    0.9462  0.9350  0.9634  0.9336  0.8992  0.9329
BRISQUE       0.9445  0.9221  0.9889  0.9578  0.9173  0.9432
Proposed      0.9531  0.9412  0.9858  0.9689  0.9079  0.9476
To demonstrate the good performance of the present invention in terms of time complexity, the invention was compared with three well-known no-reference evaluation algorithms (DIIVINE, BLIINDS-II, and BRISQUE). Table 2 lists the total time the four algorithms (the three above plus the proposed method) required to compute the 982 images of the LIVE IQA database and the average time per image. It can be seen that the time efficiency of the present invention is much higher than that of the two no-reference algorithms DIIVINE and BLIINDS-II, and slightly lower than that of BRISQUE.
TABLE 2 comparison of time complexity for the reference-free method
Algorithm       Time taken for 982 images   Average time per image
DIIVINE         2.9519 × 10^4 s             30.5294 s
BLIINDS-II      1.3112 × 10^5 s             133.5213 s
BRISQUE         109.3859 s                  0.111391 s
The invention   275.7670 s                  0.280822 s

Claims (1)

1. A no-reference image quality evaluation method based on gradient relevance is characterized by comprising the following steps:
firstly, extracting characteristics of an input distorted image;
firstly, three different sub-properties of each image in terms of image gradient are obtained, namely a gradient amplitude property GM, a gradient direction change property CO and a gradient amplitude change property CM, wherein the three gradient properties are respectively defined as follows:
$$ GM(i,j) = \sqrt{Gx(i,j)^2 + Gy(i,j)^2} \qquad (1) $$
$$ CO(i,j) = \mathrm{orientation}(i,j) - \mathrm{orientation}_{avg}(i,j) \qquad (2) $$
$$ CM(i,j) = \sqrt{\bigl(Gx(i,j) - Gx_{avg}(i,j)\bigr)^2 + \bigl(Gy(i,j) - Gy_{avg}(i,j)\bigr)^2} \qquad (3) $$
wherein,
$$ \mathrm{orientation}(i,j) = \arctan\!\left(\frac{Gy(i,j)}{Gx(i,j)}\right) $$
$$ \mathrm{orientation}_{avg}(i,j) = \arctan\!\left(\frac{Gy_{avg}(i,j)}{Gx_{avg}(i,j)}\right) $$
$$ Gx_{avg} = \frac{\sum_{i}^{M}\sum_{j}^{N} Gx(i,j)}{M \times N} $$
$$ Gy_{avg} = \frac{\sum_{i}^{M}\sum_{j}^{N} Gy(i,j)}{M \times N} $$
where orientation(i, j) denotes the direction of the image gradient, Gx and Gy are the derivatives of the discrete digital image in the two orthogonal directions X and Y, and M, N is the size of the window set to describe the change within the region; in calculating the gradients Gx and Gy, the Sobel gradient operator is used and is extended from one pair of orthogonal directions to two pairs: one pair of orthogonal directions at 0 degrees and 90 degrees, and one pair at -45 degrees and 45 degrees;
the Sobel operators for the orthogonal directions of 0 degrees and 90 degrees are:
$$ S_y = \begin{pmatrix} 0 & 0 & 0 & 0 & 0 \\ 0 & 1 & 0 & -1 & 0 \\ 0 & 2 & 0 & -2 & 0 \\ 0 & 1 & 0 & -1 & 0 \\ 0 & 0 & 0 & 0 & 0 \end{pmatrix} \quad\text{and}\quad S_x = \begin{pmatrix} 0 & 0 & 0 & 0 & 0 \\ 0 & 1 & 2 & 1 & 0 \\ 0 & 0 & 0 & 0 & 0 \\ 0 & -1 & -2 & -1 & 0 \\ 0 & 0 & 0 & 0 & 0 \end{pmatrix} $$
the Sobel operators for the orthogonal directions of -45 degrees and 45 degrees are:
$$ S_y = \begin{pmatrix} 0 & 0 & 0 & 0 & 0 \\ 1 & 0 & -1 & 0 & 0 \\ 0 & 2 & 0 & -2 & 0 \\ 0 & 0 & 1 & 0 & -1 \\ 0 & 0 & 0 & 0 & 0 \end{pmatrix} \quad\text{and}\quad S_x = \begin{pmatrix} 0 & 0 & 0 & 0 & 0 \\ 0 & 0 & 1 & 0 & -1 \\ 0 & 2 & 0 & -2 & 0 \\ 1 & 0 & -1 & 0 & 0 \\ 0 & 0 & 0 & 0 & 0 \end{pmatrix} $$
wherein the operator marked with y is the operator for computing Gy, and the operator marked with x is the operator for computing Gx;
then, the image gradient property maps obtained above are divided into blocks, a statistical variance is obtained on each block of each gradient property, and after the variance is normalized with a log function, the per-block statistical variances of each gradient property are fused into one variance of the whole image gradient property by taking their mean, as follows:
each image gradient property map is divided into image blocks of size 128 × 128, and the statistical variance of each image block, namely the variance d_n of the n-th image block, is calculated by the following formula:
$$ d_n = \sum \bigl(h(x) - E(h(x))\bigr)^2 \qquad (4) $$
wherein,
h(x)=pdf(θ)
where θ is the parameter in each property map, pdf is the statistical distribution of that parameter, h(x) represents the statistical probability representation obtained after quantizing and counting θ, and E(h(x)) represents the expectation of h(x);
then, each calculated variance d_n is normalized with a log function, and the statistical variances obtained from the image blocks of each image gradient property are fused by mean aggregation into the final feature f, as shown in Equation 5;
$$ f = \frac{1}{H} \times \log\!\left(\prod_{n=1}^{H} d_n\right), \quad n = 1, 2, \ldots, H \qquad (5) $$
Finally, the input distorted image is subjected to down sampling to be changed into an image with a second scale, and the process is repeated to finally obtain a group of 12-dimensional feature vectors;
step two, performing feature mapping, and taking the 12-dimensional feature vector obtained in the step one as a final image quality feature, thereby establishing a mapping relation between the image feature and the image score;
firstly, the image library is divided into a training set and a test set; the training set is used to establish the mapping relation between image features and image quality, and the test set is used to test the established mapping relation; using the support vector machine method, a distortion classification model and a quality evaluation model for each corresponding distortion class are trained on the image features of the images in the training set;
then, following the flow of the two-step framework of no-reference image quality evaluation, score prediction is tested on the test set: the distortion classification model classifies the distortion of the test image, and the quality evaluation model of that distortion class then predicts the quality of the test image, yielding its quality score, which can be evaluated with existing algorithm performance criteria.
CN201410284237.7A 2014-06-23 2014-06-23 A kind of non-reference picture quality appraisement method based on gradient relevance Expired - Fee Related CN104023230B (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN201410284237.7A CN104023230B (en) 2014-06-23 2014-06-23 A kind of non-reference picture quality appraisement method based on gradient relevance

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
CN201410284237.7A CN104023230B (en) 2014-06-23 2014-06-23 A kind of non-reference picture quality appraisement method based on gradient relevance

Publications (2)

Publication Number Publication Date
CN104023230A CN104023230A (en) 2014-09-03
CN104023230B true CN104023230B (en) 2016-04-13

Family

ID=51439772

Family Applications (1)

Application Number Title Priority Date Filing Date
CN201410284237.7A Expired - Fee Related CN104023230B (en) 2014-06-23 2014-06-23 A kind of non-reference picture quality appraisement method based on gradient relevance

Country Status (1)

Country Link
CN (1) CN104023230B (en)

Families Citing this family (17)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN104915945A (en) * 2015-02-04 2015-09-16 中国人民解放军海军装备研究院信息工程技术研究所 Quality evaluation method without reference image based on regional mutual information
CN104902267B (en) * 2015-06-08 2017-02-01 浙江科技学院 No-reference image quality evaluation method based on gradient information
CN104899893B (en) * 2015-07-01 2019-03-19 电子科技大学 The picture quality detection method of view-based access control model attention
CN105007488A (en) * 2015-07-06 2015-10-28 浙江理工大学 Universal no-reference image quality evaluation method based on transformation domain and spatial domain
CN105338343B (en) * 2015-10-20 2017-05-31 北京理工大学 It is a kind of based on binocular perceive without refer to stereo image quality evaluation method
CN105491371A (en) * 2015-11-19 2016-04-13 国家新闻出版广电总局广播科学研究院 Tone mapping image quality evaluation method based on gradient magnitude similarity
CN105528791B (en) * 2015-12-17 2019-08-30 广东工业大学 A kind of quality evaluation device and its evaluation method towards touch screen hand-drawing image
CN105844640A (en) * 2016-03-24 2016-08-10 西安电子科技大学 Color image quality evaluation method based on gradient
CN105976361B (en) * 2016-04-28 2019-03-26 西安电子科技大学 Non-reference picture quality appraisement method based on multistage wordbook
CN106204548B (en) * 2016-06-30 2021-09-28 上海联影医疗科技股份有限公司 Image distinguishing method and device
CN107316323B (en) * 2017-06-28 2020-09-25 北京工业大学 No-reference image quality evaluation method established based on multi-scale analysis method
CN108717694B (en) * 2018-04-24 2021-04-02 天津大学 Electrical impedance tomography image quality evaluation method based on fuzzy C-means clustering
CN109345502B (en) * 2018-08-06 2021-03-26 浙江大学 Stereo image quality evaluation method based on disparity map stereo structure information extraction
CN108600745B (en) * 2018-08-06 2020-02-18 北京理工大学 Video quality evaluation method based on time-space domain slice multi-map configuration
CN111524110B (en) * 2020-04-16 2023-06-09 北京微吼时代科技有限公司 Video quality evaluation model construction method, evaluation method and device
CN113658130B (en) * 2021-08-16 2023-07-28 福州大学 Dual-twin-network-based reference-free screen content image quality evaluation method
CN117876321B (en) * 2024-01-10 2024-07-30 中国人民解放军91977部队 Image quality evaluation method and device

Citations (3)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
KR20100102077A (en) * 2010-08-30 2010-09-20 연세대학교 산학협력단 Method and system for video quality measurement
CN103200421A (en) * 2013-04-07 2013-07-10 北京理工大学 No-reference image quality evaluation method based on Curvelet transformation and phase coincidence
CN103475898A (en) * 2013-09-16 2013-12-25 北京理工大学 Non-reference image quality assessment method based on information entropy characters

Also Published As

Publication number Publication date
CN104023230A (en) 2014-09-03

Similar Documents

Publication Publication Date Title
CN104023230B (en) A kind of non-reference picture quality appraisement method based on gradient relevance
CN108428227B (en) No-reference image quality evaluation method based on full convolution neural network
CN103475898B (en) Non-reference image quality assessment method based on information entropy characters
CN109325550B (en) No-reference image quality evaluation method based on image entropy
US7545985B2 (en) Method and system for learning-based quality assessment of images
CN110570435B (en) Method and device for carrying out damage segmentation on vehicle damage image
CN106127741B (en) Non-reference picture quality appraisement method based on improvement natural scene statistical model
CN109978854B (en) Screen content image quality evaluation method based on edge and structural features
CN110400293B (en) No-reference image quality evaluation method based on deep forest classification
CN108053396B (en) No-reference evaluation method for multi-distortion image quality
CN107146220B (en) A kind of universal non-reference picture quality appraisement method
CN111709914B (en) Non-reference image quality evaluation method based on HVS characteristics
CN103841410B (en) Based on half reference video QoE objective evaluation method of image feature information
CN105894507B (en) Image quality evaluating method based on amount of image information natural scene statistical nature
Li et al. Recent advances and challenges in video quality assessment
CN104915945A (en) Quality evaluation method without reference image based on regional mutual information
CN107590804A (en) Screen picture quality evaluating method based on channel characteristics and convolutional neural networks
CN103914835B (en) A kind of reference-free quality evaluation method for fuzzy distortion stereo-picture
Morzelona Human visual system quality assessment in the images using the IQA model integrated with automated machine learning model
CN104394405B (en) A kind of method for evaluating objective quality based on full reference picture
CN104835172A (en) No-reference image quality evaluation method based on phase consistency and frequency domain entropy
CN107578406A (en) Based on grid with Wei pool statistical property without with reference to stereo image quality evaluation method
Gavrovska et al. No-reference local image quality evaluation
CN112950479B (en) Image gray level region stretching algorithm
Lu et al. Automatic region selection for objective sharpness assessment of mobile device photos

Legal Events

Date Code Title Description
C06 Publication
PB01 Publication
C10 Entry into substantive examination
SE01 Entry into force of request for substantive examination
C14 Grant of patent or utility model
GR01 Patent grant
CF01 Termination of patent right due to non-payment of annual fee
CF01 Termination of patent right due to non-payment of annual fee

Granted publication date: 20160413

Termination date: 20210623