CN111968073B - No-reference image quality evaluation method based on texture information statistics - Google Patents


Publication number
CN111968073B
CN111968073B (application CN202010644429.XA)
Authority
CN
China
Prior art keywords
image
gclbp
clbp
gradient
amplitude
Prior art date
Legal status: Active
Application number
CN202010644429.XA
Other languages: Chinese (zh)
Other versions: CN111968073A
Inventor
李春泉
肖典
Current Assignee
Nanchang University
Original Assignee
Nanchang University
Application filed by Nanchang University
Priority to CN202010644429.XA
Publication of CN111968073A
Application granted
Publication of CN111968073B

Classifications

    • G06T 7/0002 — Image analysis; inspection of images, e.g. flaw detection
    • G06N 3/045 — Neural networks; architecture; combinations of networks
    • G06T 5/40 — Image enhancement or restoration by the use of histogram techniques
    • G06T 7/13 — Segmentation; edge detection
    • G06T 2207/10024 — Image acquisition modality: color image
    • G06T 2207/20081 — Special algorithmic details: training; learning
    • G06T 2207/30168 — Subject of image: image quality inspection

Abstract

The invention discloses a no-reference image quality evaluation method based on texture information statistics, which is a universal no-reference image quality evaluation method. First, the image is filtered with the Scharr operator to extract edge information and obtain a gradient image. The gradient image is then processed with the Complete Local Binary Pattern (CLBP) to obtain a gradient CLBP (GCLBP) map. Next, the gradient magnitudes are taken as weights, and a gradient-magnitude-weighted GCLBP histogram is extracted from the GCLBP map. Finally, the normalized gradient-magnitude-weighted GCLBP histogram is taken as the feature and mapped to an image quality score with support vector regression (SVR). The predictions of the method are highly consistent with human subjective perception, its time complexity is low, and it has good universality and robustness.

Description

No-reference image quality evaluation method based on texture information statistics
Technical Field
The invention belongs to the technical field of image processing and image quality evaluation, and relates to a no-reference image quality evaluation method based on texture information statistics, with which the quality of various natural images can be objectively quantified.
Background
With the large-scale application of the mobile internet and the arrival of the big-data and 5G eras, massive numbers of images and videos permeate medicine, aviation, transportation, business, agriculture and other fields, for example: medical imaging, satellite remote sensing, intelligent traffic monitoring systems, virtual reality, and so on. In addition, the widespread popularity of smart devices has made people enthusiastic about sharing their lives and obtaining information through pictures or short videos. The growing demand for image services keeps raising the demands on image quality, so new imaging technologies continuously emerge; for example, High Dynamic Range (HDR) images and 4K ultra-high-definition images have entered daily life. However, even with continuous innovation in imaging technology, distortion still occurs in the image acquisition stage. Moreover, in a communication system, as an image travels from the transmitting end to the receiving end it inevitably suffers various distortions during compression, transmission, reconstruction, and so on. For example: in the capture stage, motion blur, contrast distortion, etc. may be caused by the capture device and environmental conditions; in the compression coding stage, coding techniques based on the Discrete Cosine Transform (DCT) mostly cause blocking artifacts and blurring, while the wavelet-based JPEG2000 compression technique mostly causes blurring and ringing; in the transmission stage, interference from channel noise and the like is unavoidable. Therefore, image quality assessment is needed to ensure the end-user experience and to guide and supervise the processes of image acquisition, storage, compression, transmission and reproduction.
Image quality evaluation is broadly divided into subjective and objective image quality evaluation. Subjective evaluation cannot be applied on a large scale because the task is laborious, time-consuming and costly, the results cannot be reproduced, and the evaluating subject is a human. Objective image quality evaluation uses a computational model to quantify image quality; it requires no human participation, is data-driven, and is an effective substitute for the subjective approach, with very wide application value in fields related to digital images. According to how much information of the original (reference) image is used in the evaluation process, objective methods can be divided into: full-reference, reduced-reference, and no-reference (blind) image quality evaluation. Since no-reference evaluation requires no information about the reference image, it is currently the most widely studied and used class of image quality evaluation methods.
In constructing an objective image quality evaluation method, studying and simulating the characteristics of the human visual system is the basis for building an evaluation method that conforms to human visual perception, and a powerful means of obtaining objective results that are more consistent with human subjective perception. Many studies have shown that the human visual system is very sensitive to high-frequency information such as image edges and textures. Meanwhile, image edges and textures represent the image structure, and distortion destroys this structure. Therefore, many studies perform image quality evaluation using edge and texture information. Among existing methods, Local Binary Patterns (LBP) have been widely used for texture information extraction. For example: Li et al., in "Blind image quality assessment using statistical structural and luminance features" (IEEE Transactions on Multimedia, 2016, 18(12)), extract structural features of an image in the spatial domain using LBP; in the same year, in "No-reference quality assessment for multiply-distorted images in gradient domain" (IEEE Signal Processing Letters, 2016, 23(4): 541-545), they filter the image with the Prewitt operator and use LBP to extract features in the gradient domain; Rezaie et al., in "No-reference image quality assessment using local binary patterns in the wavelet domain" (Multimedia Tools and Applications, 2018, 77(2)), extract features with LBP in the wavelet domain. All three methods use LBP to extract image texture information, and all obtain good results.
As noted above, based on the sensitivity of the human visual system to high-frequency information such as texture, LBP is often used as a simple, easy-to-use texture descriptor in constructing image quality evaluation methods. However, LBP still has drawbacks: it is insensitive to illumination variation but very sensitive to noise, and it only describes the relationship between the central pixel and its surrounding pixels while ignoring the magnitude information of the image. Therefore, the invention aims to extract more complete texture and edge information on the basis of the characteristics of the human visual system, and to construct a new universal no-reference image quality evaluation method.
Disclosure of Invention
The invention discloses a no-reference image quality evaluation method based on texture information statistics. As a universal no-reference method, it can objectively quantify image quality without being restricted to particular distortion types or image content, so that the evaluation is truly reference-free and universal, and the results are highly consistent with human subjective evaluation.
The invention is realized by the following technical scheme:
a no-reference image quality evaluation method based on texture information statistics comprises the following steps:
step 1: filtering the image by using a Scharr operator, and extracting image edge information to obtain a gradient image;
step 2: two operators in CLBP are used: processing the gradient image obtained in the step 1 by using a CLBP sign (CLBP _ sign, CLBP _ S) and a CLBP amplitude (CLBP _ Magnitude, CLBP _ M) to respectively obtain a gradient CLBP _ S (GCLBP _ S) graph and a gradient CLBP _ M (GCLBP _ M) graph;
and step 3: taking the amplitude of the gradient image obtained in the step 1 as a weight, and extracting a gradient amplitude weight GCLBP _ S histogram and a gradient amplitude weight GCLBP _ M histogram by combining the GCLBP _ S graph and the GCLBP _ M graph obtained in the step 2;
and 4, step 4: performing joint statistics on the two histograms obtained in the step 3, and taking the histograms as feature vectors for expressing image information;
and 5: downsampling the original image, and repeating the steps 1 to 4 until the set downsampling times are finished;
and 6: and sequentially cascading the feature vectors obtained in the steps to form final image features, and mapping the image features into quality scores by using a support vector regression machine.
Preferably, the processing of the image with the Scharr operator in step 1 is expressed as:

$$G(i,j)=\sqrt{(I\otimes H_x)^2(i,j)+(I\otimes H_y)^2(i,j)}\tag{1}$$

where (i,j) denotes the position index; I denotes the image to be operated on; G denotes the gradient image obtained by the operation; ⊗ is the convolution operator; and H_x, H_y respectively denote the horizontal and vertical components of the Scharr filter, of the form:

$$H_x=\frac{1}{16}\begin{bmatrix}3&0&-3\\10&0&-10\\3&0&-3\end{bmatrix},\qquad H_y=\frac{1}{16}\begin{bmatrix}3&10&3\\0&0&0\\-3&-10&-3\end{bmatrix}\tag{2}$$
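As a concrete illustration of step 1, the following is a minimal NumPy sketch of Scharr filtering. The 1/16 kernel normalization and the edge-replication padding are assumptions of this sketch; the normalization only rescales the gradient magnitudes uniformly and does not move the detected edges.

```python
import numpy as np

# Scharr kernels; the 1/16 normalization is an assumption of this sketch
# (it rescales the magnitudes uniformly and does not affect edge locations).
HX = np.array([[3, 0, -3], [10, 0, -10], [3, 0, -3]]) / 16.0
HY = HX.T

def conv3x3(img, k):
    """3x3 'same' convolution with edge-replication padding, pure NumPy."""
    p = np.pad(img, 1, mode="edge")
    out = np.zeros(img.shape, dtype=float)
    h, w = img.shape
    for di in range(3):
        for dj in range(3):
            # true convolution flips the kernel relative to correlation
            out += k[2 - di, 2 - dj] * p[di:di + h, dj:dj + w]
    return out

def scharr_gradient(img):
    """Gradient magnitude map G(i,j) = sqrt((I*Hx)^2 + (I*Hy)^2)."""
    img = np.asarray(img, dtype=float)
    gx = conv3x3(img, HX)
    gy = conv3x3(img, HY)
    return np.sqrt(gx ** 2 + gy ** 2)
```

On a vertical step edge this yields a unit response along the boundary column; production code could equally use `scipy.ndimage` or OpenCV's Scharr filter.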
preferably, the CLBP sign (CLBP _ S) and CLBP magnitude (CLBP _ M) operator used in step 2 are defined as follows:
calculating the difference between the gray values of the surrounding pixels and the central pixel, namely: { d)i=ai-acI | ═ 0,1, …, P-1}, where acRepresenting the gray value of the central pixel; a is aiExpressing the gray value of the pixel points on the circumference, thereby obtaining the local difference information of the surrounding pixel points and the central pixel point by using a vector [ d ]0,d1,…,dP-1]Representing, then, d is converted using a partial differential sign-to-amplitude conversion (LDSMT)iThe decomposition is divided into a sign component and an amplitude component, and the LDSMT is expressed as:
di=si*mi (3)
Figure GDA0003623916810000032
wherein s isiDenotes diThe symbol of (a); m isiDenotes diThe amplitude of (d);
by the sign component siConstructing CLBP _ S, namely: changing-1 to 0 constitutes CLBP _ S, which is consistent with the construction principle of the original LBP, defined as:
Figure GDA0003623916810000033
Figure GDA0003623916810000034
wherein λ (x) is a sign function; r is the radius of the circular field during sampling, P is the number of the pixel points uniformly distributed on the circumference,
Figure GDA0003623916810000035
ri in (b) represents rotation invariance, u2 represents that the number of times of changing from 0 jump to 1 or from 1 jump to 0 in binary coding is less than or equal to 2, and is defined as:
Figure GDA0003623916810000036
rotation-invariant uniformity pattern described by equation (5)
Figure GDA0003623916810000037
The number of patterns of the most primitive LBP is 2PThe number of the seeds is reduced to P +2, and the description of the image texture information is completed with a small data volume;
by the amplitude component miConstructing CLBP _ M, which is defined as:
Figure GDA0003623916810000038
wherein m isPIs the amplitude component; μ is a newly set threshold, typically set to the whole image mPThe mean value of (a); the remaining variables all have the same meaning as in LBP;
CLBP _ S and CLBP _ M also have AND
Figure GDA0003623916810000039
Rotationally invariant homogeneous patterns with the same meaning, are respectively expressed as:
Figure GDA00036239168100000310
and
Figure GDA00036239168100000311
this is two operators of CLBP in step 2.
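The two operators can be sketched in NumPy for P = 8, R = 1, using the eight direct neighbors as the usual approximation of the unit circle. This is an illustrative sketch (function and variable names are not from the patent), and the riu2 mapping is omitted here for brevity:

```python
import numpy as np

def clbp_sm(img, mu=None):
    """CLBP_S and CLBP_M code maps for the interior pixels of `img`
    (P=8, R=1, with the 3x3 neighborhood approximating the circle)."""
    img = np.asarray(img, dtype=float)
    c = img[1:-1, 1:-1]                          # central pixels a_c
    # neighbors a_0..a_7 in a fixed circular order
    offs = [(-1, -1), (-1, 0), (-1, 1), (0, 1),
            (1, 1), (1, 0), (1, -1), (0, -1)]
    h, w = img.shape
    d = np.stack([img[1 + di:h - 1 + di, 1 + dj:w - 1 + dj] - c
                  for di, dj in offs])           # local differences d_i
    s = (d >= 0).astype(int)                     # sign component, -1 mapped to 0
    m = np.abs(d)                                # magnitude component
    if mu is None:
        mu = m.mean()                            # threshold: mean magnitude
    weights = 2 ** np.arange(8)
    clbp_s = np.tensordot(weights, s, axes=1)    # sum_i lambda(d_i) * 2^i
    clbp_m = np.tensordot(weights, (m >= mu).astype(int), axes=1)
    return clbp_s, clbp_m
```

A flat patch yields the all-ones code (255) for both operators, while a bright central spike yields code 0 for CLBP_S and 255 for CLBP_M, illustrating how the two components carry complementary information.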
Preferably, the gradient-magnitude-weighted GCLBP_S histogram and the gradient-magnitude-weighted GCLBP_M histogram in step 3 are extracted in the same manner; for GCLBP_S the extraction is expressed as:

$$h_{GCLBP\_S}(k)=\sum_{i,j}v_{i,j}\,f\big(GCLBP\_S(i,j),k\big),\qquad f(x,k)=\begin{cases}1,&x=k\\0,&\text{otherwise}\end{cases}$$

where (i,j) denotes the position index; $v_{i,j}$ denotes the magnitude at position (i,j) of the gradient image; and $k\in[0,K]$ indexes the possible patterns in GCLBP_S. The histogram is then normalized.
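The weighted histogram itself is a few lines in NumPy: `np.bincount` with a `weights` argument accumulates the gradient magnitudes of all pixels sharing a pattern, after which the histogram is normalized to unit sum. A minimal sketch:

```python
import numpy as np

def gm_weighted_hist(codes, grad_mag, n_bins):
    """Gradient-magnitude-weighted histogram of a GCLBP code map:
    h(k) = sum of gradient magnitudes over pixels whose code equals k,
    normalized to unit sum."""
    h = np.bincount(np.asarray(codes).ravel(),
                    weights=np.asarray(grad_mag, dtype=float).ravel(),
                    minlength=n_bins)
    total = h.sum()
    return h / total if total > 0 else h
```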
Preferably, step 4 jointly represents the two image features as:

$$feature=[h_{GCLBP\_S},\,h_{GCLBP\_M}]\tag{11}$$
the invention has the following advantages:
1. the dimension of the constructed image features is low, the calculation complexity is low while a good result is obtained, and the real-time requirement can be met;
2. based on the characteristic that the human visual system is very sensitive to high-frequency information such as image edges, textures and the like, the condition that the observation distance and the image resolution change are simulated by utilizing down sampling for many times is utilized, the human visual system is well considered and simulated, and the consistency of a prediction result and a human subjective perception result is good;
3. the gradient image is described by two operators of CLBP _ S and CLBP _ M in CLBP respectively, so that the structural information of the image can be well described, and the two characteristics supplement each other in the process of predicting the image score, so that the image structure can be effectively distinguished.
Drawings
FIG. 1 is a flow chart of the algorithm of the present invention;
FIG. 2 is an example of the CLBP operators CLBP_S and CLBP_M;
FIG. 3 is a schematic diagram of the feature extraction process; in FIG. 3: (a) and (b) are distorted images; (c) and (d) are respectively the gradient CLBP_S (GCLBP_S) map and gradient CLBP_M (GCLBP_M) map corresponding to (a); (g) and (h) are the GCLBP_S and GCLBP_M histograms extracted from (c) and (d) respectively, using the gradient magnitudes as weights; the same relationships hold among graphs (b), (e), (f), (i) and (j).
Detailed Description
The invention will be further illustrated by the following examples in conjunction with the accompanying drawings.
101: Detailed description of the embodiment
Step 1: convert the original color image into a grayscale image, filter it with the Scharr operator, and extract the edge information of the image to obtain the gradient image. The process is described as:
$$G(i,j)=\sqrt{(I\otimes H_x)^2(i,j)+(I\otimes H_y)^2(i,j)}\tag{1}$$

where (i,j) denotes the position index; I denotes the image to be operated on; G denotes the calculated gradient image; ⊗ is the convolution operator; and H_x, H_y respectively denote the horizontal and vertical components of the Scharr filter, of the form:

$$H_x=\frac{1}{16}\begin{bmatrix}3&0&-3\\10&0&-10\\3&0&-3\end{bmatrix},\qquad H_y=\frac{1}{16}\begin{bmatrix}3&10&3\\0&0&0\\-3&-10&-3\end{bmatrix}\tag{2}$$
step 2: two operators in CLBP are used: the CLBP sign (CLBP _ sign, CLBP _ S) and the CLBP amplitude (CLBP _ magnetic, CLBP _ M) process the gradient image obtained in step 1 to obtain a gradient CLBP _ S (GCLBP _ S) map and a gradient CLBP _ M (GCLBP _ M) map, respectively (if the original image is (a) in fig. 3, the GCLBP _ S image after processing is (c) in fig. 3, and the GCLBP _ M image is (d) in fig. 3). The definition of the operator CLBP is as follows (fig. 2 shows the process of constructing CLBP _ S and CLBP _ M in a local image block of 3 × 3 size):
First, the definition of LBP is given:

LBP is generally based on a circular neighborhood. For any given position (the central pixel), its gray value is compared with the gray values of P pixels uniformly distributed on a circle of radius R (a comparison pixel whose gray value is below that of the central pixel is coded 0, otherwise 1). Encoding these comparisons yields a string of binary digits, and converting this binary string into a decimal number (the LBP code) gives the new gray value of the central pixel. Mathematically:

$$LBP_{P,R}=\sum_{i=0}^{P-1}\lambda(a_i-a_c)\,2^i\tag{3}$$

$$\lambda(x)=\begin{cases}1,&x\ge 0\\0,&x<0\end{cases}\tag{4}$$

where $a_c$ denotes the gray value of the central pixel; $a_i$ denotes the gray value of a pixel on the circle; λ(x) is the sign (threshold) function; R is the radius of the circular neighborhood used for sampling; and P is the number of pixels uniformly distributed on the circle.

On the basis of the most primitive LBP, the rotation-invariant uniform pattern of LBP is defined as:

$$LBP^{riu2}_{P,R}=\begin{cases}\sum_{i=0}^{P-1}\lambda(a_i-a_c),&U(LBP_{P,R})\le 2\\P+1,&\text{otherwise}\end{cases}\tag{5}$$

where ri denotes rotation invariance and u2 denotes that the number of 0→1 or 1→0 transitions in the binary code is at most 2, with

$$U(LBP_{P,R})=|\lambda(a_{P-1}-a_c)-\lambda(a_0-a_c)|+\sum_{i=1}^{P-1}|\lambda(a_i-a_c)-\lambda(a_{i-1}-a_c)|\tag{6}$$

The rotation-invariant uniform pattern described by equation (5) reduces the number of patterns from the $2^P$ of the most original LBP to $P+2$, completing the description of the image texture information with a smaller amount of data.
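The riu2 mapping for a single P-bit code can be sketched as follows: count the circular 0/1 transitions (the uniformity measure U) and collapse uniform codes to their number of 1 bits, assigning all non-uniform codes the single label P + 1:

```python
def lbp_riu2(code, P=8):
    """Map a P-bit LBP code to its rotation-invariant uniform (riu2)
    pattern: the number of 1 bits if the circular code has at most two
    0->1 / 1->0 transitions, otherwise the single non-uniform label P+1."""
    bits = [(code >> i) & 1 for i in range(P)]
    u = sum(bits[i] != bits[(i + 1) % P] for i in range(P))  # circular transitions
    return sum(bits) if u <= 2 else P + 1
```

For P = 8 this maps the 256 raw codes onto the P + 2 = 10 values 0..9.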
Next, the two CLBP operators are defined:

Compute the differences between the gray values of the surrounding pixels and the central pixel, namely $\{d_i=a_i-a_c \mid i=0,1,\dots,P-1\}$ ($a_c$ denotes the gray value of the central pixel; $a_i$ the gray value of a pixel on the circle). The local difference information between the surrounding pixels and the central pixel is thus obtained, represented by the vector $[d_0,d_1,\dots,d_{P-1}]$, as shown in (b) of FIG. 2. The Local Difference Sign-Magnitude Transform (LDSMT) then decomposes $d_i$ into two parts, a sign component and a magnitude component (shown in (c) and (d) of FIG. 2, respectively). The LDSMT is expressed as:

$$d_i=s_i\cdot m_i\tag{7}$$

$$s_i=\begin{cases}1,&d_i\ge 0\\-1,&d_i<0\end{cases},\qquad m_i=|d_i|\tag{8}$$

where $s_i$ denotes the sign of $d_i$ and $m_i$ denotes the magnitude of $d_i$.

CLBP_S is constructed from the sign components $s_i$, namely by changing −1 to 0, which is consistent with the construction principle of the original LBP.

CLBP_M is constructed from the magnitude components $m_i$ and is defined as:

$$CLBP\_M_{P,R}=\sum_{i=0}^{P-1}t(m_i,\mu)\,2^i,\qquad t(x,c)=\begin{cases}1,&x\ge c\\0,&x<c\end{cases}\tag{9}$$

where $m_i$ is the magnitude component; μ is a newly introduced threshold, typically set to the mean of $m_i$ over the whole image; the remaining variables have the same meaning as in LBP.

In FIG. 2, (e) and (f) show the results obtained by processing (a) of FIG. 2 with CLBP_S and CLBP_M. CLBP_S and CLBP_M also have rotation-invariant uniform patterns with the same meaning as $LBP^{riu2}_{P,R}$, denoted $CLBP\_S^{riu2}_{P,R}$ and $CLBP\_M^{riu2}_{P,R}$ respectively.
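The LDSMT decomposition is essentially an elementwise sign and absolute value; the sketch below also checks that the components reconstruct the local differences, d_i = s_i · m_i:

```python
import numpy as np

def ldsmt(d):
    """Local Difference Sign-Magnitude Transform: split the local
    differences d_i into sign s_i in {1, -1} and magnitude m_i = |d_i|."""
    d = np.asarray(d)
    s = np.where(d >= 0, 1, -1)
    m = np.abs(d)
    return s, m
```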
Step 3: the magnitudes of the gradient image obtained in step 1 are used as weights and combined with the GCLBP_S and GCLBP_M maps obtained in step 2 to extract a gradient-magnitude-weighted GCLBP_S histogram and a gradient-magnitude-weighted GCLBP_M histogram (if the original image is (a) in FIG. 3, the resulting gradient-magnitude-weighted GCLBP_S histogram is (g) in FIG. 3 and the gradient-magnitude-weighted GCLBP_M histogram is (h) in FIG. 3). The gradient-magnitude-weighted histogram extraction for GCLBP_S (the method for GCLBP_M is the same) can be expressed as follows:
$$h_{GCLBP\_S}(k)=\sum_{i,j}v_{i,j}\,f\big(GCLBP\_S(i,j),k\big)\tag{10}$$

$$f(x,k)=\begin{cases}1,&x=k\\0,&\text{otherwise}\end{cases}\tag{11}$$

where (i,j) denotes the position index; $v_{i,j}$ denotes the magnitude at position (i,j) of the gradient image; and $k\in[0,K]$ indexes the possible patterns in GCLBP_S.
Step 4: perform joint statistics on the two histograms obtained in step 3 and take them as the feature vector expressing the image information. The joint statistic of the two image features can be expressed as:
$$feature=[h_{GCLBP\_S},\,h_{GCLBP\_M}]\tag{12}$$
and 5: the original image is down-sampled and steps 1 to 4 are repeated until the set number of down-samplings is completed.
Step 6: cascade the feature vectors obtained at each scale in sequence to form the final image feature, namely $feature=[feature_1,feature_2,\dots,feature_5]$, and map the image feature to a quality score using a support vector regression machine.
For the sampling radius and the number of sampling points in the CLBP operator, considering computational efficiency and keeping the dimensionality of the image feature vector as low as possible, it is recommended to set R = 1 and P = 8.
For the number of downsamplings: the more downsampling operations, the higher the dimensionality of the resulting image feature, so a balance between feature dimensionality and prediction accuracy is usually required. The recommended number of downsamplings is 5.
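Steps 5 and 6 can be sketched as a small multiscale loop. The 2×2 block-average downsampling and the placeholder per-scale extractor below are assumptions for illustration; in the method itself the extractor would produce the two gradient-magnitude-weighted GCLBP histograms, and the concatenated vector would be regressed to a score with support vector regression (e.g. an RBF-kernel SVR such as scikit-learn's `sklearn.svm.SVR`):

```python
import numpy as np

def downsample2(img):
    """Halve the resolution by 2x2 block averaging (one assumed choice
    of low-pass downsampling; the patent does not fix the filter)."""
    h, w = img.shape[0] // 2 * 2, img.shape[1] // 2 * 2
    img = img[:h, :w]
    return (img[0::2, 0::2] + img[0::2, 1::2] +
            img[1::2, 0::2] + img[1::2, 1::2]) / 4.0

def multiscale_features(img, extract, n_scales=5):
    """Run a per-scale feature extractor at n_scales scales and
    concatenate: feature = [feature_1, ..., feature_n]."""
    feats = []
    img = np.asarray(img, dtype=float)
    for _ in range(n_scales):
        feats.append(np.asarray(extract(img), dtype=float))
        img = downsample2(img)
    return np.concatenate(feats)
```

With a 10-bin histogram per operator, per scale, 5 scales yield a 2 × 10 × 5 = 100-dimensional feature vector, which keeps the SVR training and prediction fast.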
102: Experimental setup and results
To verify the proposed scheme, this embodiment uses the widely used LIVE image database as the experimental database. The distortion types contained in the LIVE database are: Gaussian Blur (GB), White Gaussian Noise (WN), JPEG2000 Compression (JP2K), JPEG Compression (JPEG) and a Simulated Fast-Fading Rayleigh Channel (FF). 779 distorted images are generated from 29 original images through these 5 distortion types, and the database contains the human subjective scores for all images. The LIVE database is available at: http://live.ece.utexas.edu/research/Quality/subjective.htm. The Spearman Rank-order Correlation Coefficient (SRCC), Pearson Linear Correlation Coefficient (PLCC) and Root Mean Square Error (RMSE), which are commonly used in the field of image quality evaluation, are used as indexes of algorithm performance (the closer SRCC and PLCC are to 1, and the smaller the RMSE, the better). Before computing PLCC and RMSE, to eliminate differences in scale and units between the IQA algorithm scores and the subjective scores, the algorithm's predicted scores are nonlinearly mapped with a logistic function, making the performance evaluation more accurate. The logistic mapping function is defined as:

$$\psi(q)=\beta_1\left(\frac{1}{2}-\frac{1}{1+e^{\beta_2(q-\beta_3)}}\right)+\beta_4 q+\beta_5\tag{13}$$

where q denotes the predicted score of the IQA algorithm; ψ(q) denotes the mapped predicted score; and β₁, β₂, β₃, β₄, β₅ are the fitting parameters of the nonlinear regression.
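The mapping can be sketched as follows; the functional form is the five-parameter logistic standardly used in IQA performance evaluation (assumed here to match the patent's formula). In practice the β parameters are fitted to the subjective scores by nonlinear least squares, e.g. with `scipy.optimize.curve_fit`:

```python
import numpy as np

def logistic_map(q, b1, b2, b3, b4, b5):
    """Five-parameter logistic mapping of objective scores:
    psi(q) = b1*(1/2 - 1/(1 + exp(b2*(q - b3)))) + b4*q + b5."""
    q = np.asarray(q, dtype=float)
    return b1 * (0.5 - 1.0 / (1.0 + np.exp(b2 * (q - b3)))) + b4 * q + b5
```

With positive b1, b2 and b4 the mapping is monotonically increasing, so it changes the scale of the scores without changing their rank order (SRCC is unaffected by the mapping).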
In the experiment, the reference images in the LIVE database are randomly split 80%/20%, and the distorted images corresponding to these reference images are used as the training set and the test set respectively, which guarantees no content overlap between the training and test sets. Meanwhile, to reduce experimental error and eliminate implementation bias, the whole train-test procedure is repeated 1000 times, and the median values of the metrics over the 1000 runs are taken as the final experimental results.
In this embodiment, to verify the performance of the proposed algorithm more fully, 8 image quality algorithms (2 classical full-reference methods and 6 classical no-reference methods) are selected for performance comparison, 7 of which come from the literature: "Image quality assessment: from error visibility to structural similarity" (IEEE Transactions on Image Processing, 2004, 13(4): 600-612), abbreviated SSIM; "Making a 'completely blind' image quality analyzer" (IEEE Signal Processing Letters, 2012, 20(3): 209-212), abbreviated NIQE; "A feature-enriched completely blind image quality evaluator" (IEEE Transactions on Image Processing, 2015, 24(8): 2579-2591), abbreviated IL-NIQE; "No-reference image quality assessment in the spatial domain" (IEEE Transactions on Image Processing, 2012, 21(12): 4695-4708), abbreviated BRISQUE; "Blind image quality assessment using joint statistics of gradient magnitude and Laplacian features" (IEEE Transactions on Image Processing, 2014, 23(11): 4850-4862), abbreviated GMLOG; "Blind image quality assessment using statistical structural and luminance features" (IEEE Transactions on Multimedia, 2016, 18(12)), abbreviated NRSL; "No-reference quality assessment for multiply-distorted images in gradient domain" (IEEE Signal Processing Letters, 2016, 23(4): 541-545), abbreviated GWH-GLBP; the parameters used are kept consistent with the respective papers. PSNR (Peak Signal-to-Noise Ratio) is computed directly by a simple mathematical operation and is therefore reproduced directly. The results of this implementation example and of the 8 algorithms above are shown in Table 1:
TABLE 1 comparison of Performance of different methods on LIVE database
As can be seen from Table 1, on the LIVE image database the performance indexes SRCC, PLCC and RMSE of the tested algorithm are 0.9571, 0.9768 and 5.8163 respectively; all three are the best among the algorithms in Table 1 and exceed the performance of several currently popular universal no-reference image quality evaluation algorithms. In addition, the invention extracts the features of an image of resolution 512×512 in 0.3956 s (on a notebook computer with an Intel(R) Core(TM) i5-4200U CPU @ 1.60 GHz and 8 GB RAM, software platform MATLAB 2016a), which meets real-time requirements. From the above comparison and analysis it can be seen that the algorithm's predictions agree well with human subjective perception, its time complexity is low, and it has good universality and robustness.
The foregoing merely represents preferred embodiments of the invention, described in some detail, and should not therefore be construed as limiting the scope of the invention. It should be noted that those skilled in the art can make various changes, modifications and substitutions without departing from the spirit of the present invention, and these all fall within the protection scope of the present invention. Therefore, the protection scope of this patent shall be subject to the appended claims.

Claims (5)

1. A no-reference image quality evaluation method based on texture information statistics is characterized in that: the method comprises the following steps:
step 1: filtering the image by using a Scharr operator, and extracting image edge information to obtain a gradient image;
step 2: processing the gradient image obtained in the step 1 by using two operational characters CLBP _ S and CLBP _ M in the CLBP to respectively obtain a GCLBP _ S image and a GCLBP _ M image;
and step 3: taking the amplitude of the gradient image obtained in the step 1 as a weight, and extracting a gradient amplitude weight GCLBP _ S histogram and a gradient amplitude weight GCLBP _ M histogram by combining the GCLBP _ S graph and the GCLBP _ M graph obtained in the step 2;
and 4, step 4: performing joint statistics on the two histograms obtained in the step 3, and taking the two histograms as a feature vector for expressing image information;
and 5: down-sampling the original image and repeating the steps 1 to 4 until the set down-sampling number is completed;
and 6: and sequentially cascading the feature vectors obtained in the steps to form final image features, and mapping the image features into quality scores by using a support vector regression machine.
2. The no-reference image quality evaluation method based on texture information statistics of claim 1, wherein the processing of the image with the Scharr operator in step 1 is expressed as:

$$G(i,j)=\sqrt{(I\otimes H_x)^2(i,j)+(I\otimes H_y)^2(i,j)}\tag{1}$$

where (i,j) denotes the position index; I denotes the image to be operated on; G denotes the calculated gradient image; ⊗ is the convolution operator; and H_x, H_y respectively denote the horizontal and vertical components of the Scharr filter, of the form:

$$H_x=\frac{1}{16}\begin{bmatrix}3&0&-3\\10&0&-10\\3&0&-3\end{bmatrix},\qquad H_y=\frac{1}{16}\begin{bmatrix}3&10&3\\0&0&0\\-3&-10&-3\end{bmatrix}\tag{2}$$
3. The no-reference image quality evaluation method based on texture information statistics according to claim 1, wherein the CLBP_S and CLBP_M operators used in step 2 are defined as follows:
calculating the difference between the gray values of the surrounding pixels and the central pixel, namely {d_i = a_i − a_c | i = 0, 1, …, P−1}, wherein a_c denotes the gray value of the central pixel and a_i denotes the gray values of the pixel points on the circle, so that the local difference information between the surrounding pixels and the central pixel is represented by the vector [d_0, d_1, …, d_{P−1}]; d_i is then decomposed into a sign component and a magnitude component by the local difference sign-magnitude transform (LDSMT), expressed as:

d_i = s_i · m_i    (3)

s_i = { 1, d_i ≥ 0 ; −1, d_i < 0 },    m_i = |d_i|    (4)

wherein s_i denotes the sign of d_i and m_i denotes the magnitude of d_i;
CLBP_S is constructed from the sign component s_i by mapping −1 to 0, which is consistent with the construction principle of the original LBP; it is defined as:

CLBP_S_{P,R}^{riu2} = { Σ_{i=0}^{P−1} λ(s_i), if U(CLBP_S) ≤ 2 ; P+1, otherwise }    (5)

λ(x) = { 1, x ≥ 0 ; 0, x < 0 }

wherein λ(x) is the sign (thresholding) function; R is the radius of the circular neighborhood used for sampling; P is the number of pixel points uniformly distributed on the circle; in the superscript riu2, ri denotes rotation invariance and u2 denotes that the number of 0-to-1 or 1-to-0 transitions in the binary code is at most 2, with U defined as:

U(CLBP_S) = |λ(s_{P−1}) − λ(s_0)| + Σ_{i=1}^{P−1} |λ(s_i) − λ(s_{i−1})|    (6)

The rotation-invariant uniform patterns described by equation (5) reduce the number of patterns from the 2^P of the most primitive LBP to P+2, so that the image texture information is described with a smaller amount of data;
CLBP_M is constructed from the magnitude component m_i and is defined as:

CLBP_M_{P,R}^{riu2} = { Σ_{i=0}^{P−1} t(m_i, μ), if U(CLBP_M) ≤ 2 ; P+1, otherwise },    t(x, c) = { 1, x ≥ c ; 0, x < c }    (7)

wherein m_i is the magnitude component; μ is a newly set threshold, taken as the mean value of m_i over the whole image; the remaining variables have the same meaning as in LBP; CLBP_S and CLBP_M likewise have rotation-invariant uniform patterns, denoted CLBP_S_{P,R}^{riu2} and CLBP_M_{P,R}^{riu2}, respectively; these are the two CLBP operators of step 2.
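The CLBP_S^{riu2} and CLBP_M^{riu2} maps of equations (3)-(7) can be sketched as below. Using P = 8, R = 1 with the 8-connected neighbours directly (no sub-pixel interpolation) is a simplifying assumption:

```python
import numpy as np

def clbp_sm_riu2(img):
    # CLBP_S^{riu2} and CLBP_M^{riu2} for P = 8, R = 1, taking the
    # 8-connected neighbours directly (no sub-pixel interpolation).
    offs = [(-1, 0), (-1, 1), (0, 1), (1, 1),
            (1, 0), (1, -1), (0, -1), (-1, -1)]
    H, W = img.shape
    c = img[1:-1, 1:-1].astype(float)
    # d_i = a_i - a_c for every interior pixel, stacked over neighbours
    d = np.stack([img[1 + dy:H - 1 + dy, 1 + dx:W - 1 + dx].astype(float) - c
                  for dy, dx in offs])
    s = np.where(d >= 0, 1, 0)         # sign component, -1 already mapped to 0
    m = np.abs(d)                      # magnitude component
    t = np.where(m >= m.mean(), 1, 0)  # threshold mu = image mean of m

    def riu2(bits):
        # U = number of 0/1 transitions around the circular code (eq. 6)
        u = np.abs(bits - np.roll(bits, 1, axis=0)).sum(axis=0)
        return np.where(u <= 2, bits.sum(axis=0), 9)  # P + 1 = 9 if non-uniform
    return riu2(s), riu2(t)
```

On a flat patch every d_i is zero, so both maps take the uniform all-ones code P = 8; around an isolated bright pixel the sign map drops to 0 while the magnitude map stays at 8.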
4. The no-reference image quality evaluation method based on texture information statistics according to claim 1, wherein the gradient-magnitude-weighted GCLBP_S histogram and the gradient-magnitude-weighted GCLBP_M histogram are extracted in the same manner in step 3; for GCLBP_S, the gradient-magnitude-weighted histogram extraction is expressed as:

h_{GCLBP_S}(k) = Σ_{i=1}^{M} Σ_{j=1}^{N} v_{i,j} · f(GCLBP_S(i,j), k)    (9)

f(x, y) = { 1, x = y ; 0, otherwise }    (10)

wherein (i,j) denotes a position index; M and N are the image dimensions; v_{i,j} denotes the magnitude at position (i,j) of the gradient image; k ∈ [0, K] enumerates the possible patterns in GCLBP_S.
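The weighted histogram of equations (9)-(10) amounts to each pixel voting into the bin of its GCLBP code with the gradient magnitude as the vote weight, which can be sketched as:

```python
import numpy as np

def weighted_histogram(code_map, grad_mag, n_bins):
    # equations (9)-(10): accumulate, for each code k, the gradient
    # magnitudes of all pixels whose GCLBP code equals k
    h = np.zeros(n_bins)
    for k in range(n_bins):
        h[k] = grad_mag[code_map == k].sum()
    return h
```

For P = 8 riu2 codes the number of bins would be P + 2 = 10 (codes 0..9).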
5. The no-reference image quality evaluation method based on texture information statistics according to claim 1, wherein the joint statistics of the two image features in step 4 is performed as:

feature = [h_{GCLBP_S}, h_{GCLBP_M}]    (11).
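The concatenation of equation (11) and the final regression of step 6 can be sketched as follows; scikit-learn's SVR with an RBF kernel is an assumed implementation choice (the claim only names a support vector regression machine), and the training data below are placeholders:

```python
import numpy as np
from sklearn.svm import SVR  # epsilon-SVR; kernel choice is an assumption

def joint_feature(h_s, h_m):
    # equation (11): cascade the two weighted histograms
    return np.concatenate([h_s, h_m])

# hypothetical training data: per-image joint features and subjective scores
rng = np.random.default_rng(0)
X = np.stack([joint_feature(rng.random(10), rng.random(10)) for _ in range(20)])
y = rng.random(20) * 100  # placeholder quality scores (e.g. MOS values)

model = SVR(kernel='rbf', C=1.0).fit(X, y)
score = model.predict(X[:1])[0]  # predicted quality score for one image
```

In practice X would hold the cascaded multi-scale features of claim 1 and y the subjective scores of a training database.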
CN202010644429.XA 2020-07-07 2020-07-07 No-reference image quality evaluation method based on texture information statistics Active CN111968073B (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN202010644429.XA CN111968073B (en) 2020-07-07 2020-07-07 No-reference image quality evaluation method based on texture information statistics

Publications (2)

Publication Number Publication Date
CN111968073A CN111968073A (en) 2020-11-20
CN111968073B true CN111968073B (en) 2022-07-12

Family

ID=73361103

Family Applications (1)

Application Number Title Priority Date Filing Date
CN202010644429.XA Active CN111968073B (en) 2020-07-07 2020-07-07 No-reference image quality evaluation method based on texture information statistics

Country Status (1)

Country Link
CN (1) CN111968073B (en)

Families Citing this family (2)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN113920065B (en) * 2021-09-18 2023-04-28 天津大学 Imaging quality evaluation method for visual detection system of industrial site
CN116402791A (en) * 2023-04-07 2023-07-07 西安电子科技大学 Image quality evaluation method based on wavelet domain gradient LBP weighted histogram

Citations (5)

Publication number Priority date Publication date Assignee Title
CN106920232A (en) * 2017-02-22 2017-07-04 武汉大学 Gradient similarity graph image quality evaluation method and system based on conspicuousness detection
CN109344860A (en) * 2018-08-19 2019-02-15 天津大学 A kind of non-reference picture quality appraisement method based on LBP
CN109523542A (en) * 2018-11-23 2019-03-26 嘉兴学院 It is a kind of based on color vectors angle LBP operator without reference color image quality evaluation method
CN110415223A (en) * 2019-07-17 2019-11-05 西安邮电大学 A kind of the stitching image quality evaluating method and system of no reference
CN111127387A (en) * 2019-07-11 2020-05-08 宁夏大学 Method for evaluating quality of non-reference image


Non-Patent Citations (3)

Title
"A Completed Modeling of Local Binary Pattern Operator for Texture Classification"; Zhenhua Guo et al.; IEEE Transactions on Image Processing; 2010-03-08; vol. 19, no. 6; 1-7 *
"A Referenceless Image Quality Assessment Based on BSIF, CLBP, LPQ, and LCP Texture Descriptors"; Freitas, Pedro Garcia et al.; Electronic Imaging, Image Quality and System Performance XVI; 2019-01-13; 1-7 *
"Texture target classification fusing CLBP and local ensemble features"; Kou Qiqi et al.; Opto-Electronic Engineering; 2019-12-09; vol. 46, no. 11; 66-73 *



Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination
GR01 Patent grant