CN111354048A - Quality evaluation method and device for camera-oriented acquired pictures
- Publication number
- CN111354048A (application CN202010112925.0A)
- Authority
- CN
- China
- Prior art keywords
- image
- picture
- quality
- extracting
- variance
- Prior art date
- Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
- Granted
Classifications
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06T—IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
- G06T7/00—Image analysis
- G06T7/90—Determination of colour characteristics
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06F—ELECTRIC DIGITAL DATA PROCESSING
- G06F18/00—Pattern recognition
- G06F18/20—Analysing
- G06F18/21—Design or setup of recognition systems or techniques; Extraction of features in feature space; Blind source separation
- G06F18/214—Generating training patterns; Bootstrap methods, e.g. bagging or boosting
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06T—IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
- G06T7/00—Image analysis
- G06T7/0002—Inspection of images, e.g. flaw detection
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06V—IMAGE OR VIDEO RECOGNITION OR UNDERSTANDING
- G06V10/00—Arrangements for image or video recognition or understanding
- G06V10/40—Extraction of image or video features
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06T—IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
- G06T2207/00—Indexing scheme for image analysis or image enhancement
- G06T2207/20—Special algorithmic details
- G06T2207/20081—Training; Learning
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06T—IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
- G06T2207/00—Indexing scheme for image analysis or image enhancement
- G06T2207/30—Subject of image; Context of image processing
- G06T2207/30168—Image quality inspection
Landscapes
- Engineering & Computer Science (AREA)
- Theoretical Computer Science (AREA)
- General Physics & Mathematics (AREA)
- Physics & Mathematics (AREA)
- Computer Vision & Pattern Recognition (AREA)
- Data Mining & Analysis (AREA)
- Bioinformatics & Cheminformatics (AREA)
- Evolutionary Biology (AREA)
- Evolutionary Computation (AREA)
- Bioinformatics & Computational Biology (AREA)
- General Engineering & Computer Science (AREA)
- Artificial Intelligence (AREA)
- Life Sciences & Earth Sciences (AREA)
- Multimedia (AREA)
- Quality & Reliability (AREA)
- Image Analysis (AREA)
Abstract
The invention discloses a quality evaluation method and device for pictures acquired by a camera. The method comprises the following steps: S1, extracting brightness and chroma features, and estimating the brightness and chroma of the input picture; S2, extracting picture noise features, and estimating the noise level of the input picture; S3, extracting structural features, and estimating the blur degree of the input picture; S4, extracting contrast features, and estimating the contrast of the input picture; S5, extracting the statistical features of the picture normalization coefficients, and estimating the naturalness of the picture; S6, extracting visual perception features of the picture, and estimating the change in visual perception of the picture; S7, learning, on a training set and with a support vector regression method, a model that maps the image features extracted in steps S1-S6 to image quality, and using the model to predict image quality. The method needs no reference to the original image, achieves high prediction performance, and has wide application value.
Description
Technical Field
The invention relates to the technical field of image processing, and in particular to a quality evaluation method and device for pictures acquired by a camera.
Background
With the popularization of mobile terminal devices, capturing images with a camera has become increasingly convenient. However, during image acquisition various factors can degrade the quality of the captured picture and affect its subsequent use. Digital images have become an important support for the information industry, and image quality evaluation technology therefore has wide application value. For example, during image acquisition, quality evaluation can be used to monitor the quality of captured images in real time, to prompt on low-quality images and to reject them; in image compression, it can measure the effectiveness of a compression algorithm and guide the algorithm to minimize the loss of image quality after compression; it can also be used to judge the merits of image processing algorithms.
Image quality evaluation can be broadly divided into subjective and objective quality evaluation. Subjective quality evaluation means that observers score the quality of an image according to their own experience; objective quality evaluation means that the quality of the image is evaluated by a designed objective algorithm. A good objective quality assessment method should be consistent with the results of subjective quality assessment. Since the final recipient of an image is a person, subjective quality evaluation is the most reliable and accurate approach; however, it consumes a large amount of manpower, material resources and time, and becomes harder, or even infeasible, as the number of images grows. Objective quality evaluation methods are therefore particularly important.
In terms of objective quality assessment, mean square error (MSE) and peak signal-to-noise ratio (PSNR) are currently the most popular criteria owing to their simplicity, although they sometimes correlate poorly with the subjective scores of observers. The structural similarity method (SSIM) proposed by Wang, Z. et al. in "Image quality assessment: from error visibility to structural similarity" (IEEE Trans. Image Process., vol. 13, no. 4, pp. 600-612) assesses the quality of a distorted image by comparing the structural similarity between the original image and the distorted image. Sheikh, H.R. et al. propose a visual information fidelity (VIF) method in "Image information and visual quality" (IEEE Trans. Image Process., vol. 15, no. 2, pp. 430-444) that evaluates image quality by quantifying the information loss in the image. Zhang, L. et al., in "FSIM: A Feature Similarity Index for Image Quality Assessment" (IEEE Trans. Image Process., vol. 20, no. 8, pp. 2378-2386), use gradient features and phase consistency features to assess image quality. Xue, W. et al. perform quality assessment using gradient information in "Gradient Magnitude Similarity Deviation: A Highly Efficient Perceptual Image Quality Index" (IEEE Trans. Image Process., vol. 23, no. 2, pp. 684-695). Gao, X. et al., in "Image Quality Assessment Based on Multiscale Geometric Analysis" (IEEE Trans. Image Process., vol. 18, no. 7, pp. 1409-1423), perform a multiscale decomposition of the image, weight the decomposition coefficients with a human-eye contrast sensitivity function, process the coefficients with a just noticeable difference (JND) model, and extract histogram features to predict image quality. The blind image quality index (BIQI) model proposed by Liu, H. et al. (IEEE Transactions on Circuits and Systems for Video Technology, vol. 21, no. 7, pp. 971-982) first classifies the image distortion type and then performs quality evaluation.
Disclosure of Invention
The main purpose of the present invention is to overcome at least one of the above technical drawbacks, and to provide a method and an apparatus for evaluating quality of a picture obtained by a camera.
In order to achieve the above object, the present invention provides a quality evaluation method for a camera-oriented acquired picture, the method comprising the steps of:
S1, extracting brightness and chroma features, and estimating the brightness and chroma of the input picture;
S2, extracting picture noise features, and estimating the noise level of the input picture;
S3, extracting structural features, and estimating the blur degree of the input picture;
S4, extracting contrast features, and estimating the contrast of the input picture;
S5, extracting the statistical features of the picture normalization coefficients, and estimating the naturalness of the picture;
S6, extracting visual perception features of the picture, and estimating the change in visual perception of the picture;
S7, learning, on a training set and with a support vector regression method, a model that maps the image features extracted in steps S1-S6 to image quality, for predicting the image quality.
Preferably, in step S1, the image is converted from RGB space to HSI space, and the mean value of each channel of the image is extracted as the luminance feature and the chrominance feature of the image.
Preferably, in step S2, the relationship between the kurtosis of the clean image and the corresponding noise image is modeled by using the principle of scale invariance of the natural image, and the variance of the image noise is solved.
Preferably, in the step S3, the structural features of the image, including the gradient strength and phase consistency of the image, are extracted, and then the two features are fused to describe the local structural features of the image, so as to quantify the blurring degree of the image.
Preferably, the gradient strength is computed as follows: the image is convolved with the Sobel operator to obtain gradient maps in the horizontal and vertical directions, and the arithmetic square root of the sum of their squares is taken as the gradient strength map.
Preferably, extracting the phase consistency feature of the image means calculating the phase consistency feature of the image with Kovesi's method.
Preferably, in step S4, the difference between the central pixel and its surrounding upper, lower, left and right pixels is calculated to measure the contrast of the image, and the larger the difference, the higher the contrast.
Preferably, in step S5, the image is locally normalized using its local mean and variance, the obtained normalization coefficients are then fitted with a zero-mean generalized Gaussian distribution, and the fitting parameters are extracted to estimate the naturalness of the image.
Preferably, in step S6, the image is sparsely represented in units of blocks, the difference between the image and its sparse representation is then calculated, and the mean, variance, kurtosis, skewness and information entropy of the representation residual are computed.
Preferably, in step S7, a set of distorted images is used: the features described above (luminance, chrominance, noise and so on) are extracted from each image, the extracted features and the corresponding subjective scores are input into a support vector regression model, a model mapping image features to image quality is learned, and this model is used to predict image quality.
A quality evaluation device for a camera-oriented picture comprises a computer-readable storage medium and a processor, wherein the computer-readable storage medium stores an executable program, and the executable program is executed by the processor to realize the quality evaluation method for the camera-oriented picture.
A computer-readable storage medium, storing an executable program, which when executed by a processor, implements the method for evaluating quality of a picture taken with a camera.
The invention has the beneficial effects that:
the invention provides a quality evaluation method and a quality evaluation device for a camera-oriented acquired picture, which are used for extracting features sensitive to image quality change, representing the change of image quality, and utilizing a support vector regression model to learn the mapping from image features to image quality so as to judge the quality of an image. The method does not need to refer to the original image, can obtain higher prediction performance, and has wide application value.
Drawings
Other features, objects and advantages of the invention will become more apparent upon reading of the detailed description of non-limiting embodiments with reference to the following drawings:
fig. 1 is a schematic diagram of an embodiment of a quality evaluation method for a camera-oriented image according to the present invention.
Detailed Description
The present invention will be described in detail with reference to specific examples. The following examples will assist those skilled in the art in further understanding the invention, but are not intended to limit the invention in any way. It should be noted that variations and modifications can be made by persons skilled in the art without departing from the spirit of the invention, all of which fall within the scope of the present invention.
Images shot by a camera can be degraded for many reasons. The embodiment of the invention focuses on the main factors influencing image quality, including brightness, chroma, contrast, noisiness and blur, models each of them separately, extracts corresponding features to describe these quality-affecting factors, and learns the mapping from image features to image quality with a support vector regression model.
Fig. 1 is a schematic diagram of an embodiment of the quality evaluation method for camera-acquired pictures according to the present invention. As shown in Fig. 1, an embodiment of the present invention provides a quality evaluation method for camera-acquired pictures, the method comprising: S1, extracting brightness and chroma features, and estimating the brightness and chroma of the input picture; S2, extracting picture noise features, and estimating the noise level of the input picture; S3, extracting structural features, and estimating the blur degree of the input picture; S4, extracting contrast features, and estimating the contrast of the input picture; S5, extracting the statistical features of the picture normalization coefficients, and estimating the naturalness of the picture; S6, extracting visual perception features of the picture, and estimating the change in visual perception of the picture; S7, learning, on a training set and with a support vector regression method, a model that maps the image features extracted in steps S1-S6 to image quality, for predicting image quality. The method characterizes changes in image quality by extracting features sensitive to such changes, including brightness, chroma, noisiness, blur, contrast, naturalness and visual perception statistics, and learns the mapping from image features to image quality with a support vector regression model so as to judge the quality of an image. The method needs no reference to the original image and achieves high prediction performance.
In an embodiment, the specific implementation process and details of the quality evaluation method for camera-acquired pictures are as follows.
the input picture is first converted from RGB space to HSI space:
wherein, I is an input image,for the transformed image, T (-) is the chrominance space transfer function. Then, the luminance and chrominance characteristics are calculated separately:
wherein ,F1Representing a luminance characteristic, F2 and F3The characteristic of the chromaticity is represented,is the luminance channel of the image and,andfor both chrominance channels of the image, N represents the number of image pixels.
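A minimal sketch of step S1, assuming scikit-image's HSV transform as a stand-in for the HSI transform T(·) (the choice of HSV and the channel ordering are assumptions of the sketch):

```python
import numpy as np
from skimage.color import rgb2hsv

def luminance_chrominance_features(rgb_img):
    hsv = rgb2hsv(rgb_img)          # transformed image, playing the role of T(I)
    f1 = hsv[..., 2].mean()         # F1: luminance (intensity/value) channel mean
    f2 = hsv[..., 0].mean()         # F2: first chrominance (hue) channel mean
    f3 = hsv[..., 1].mean()         # F3: second chrominance (saturation) channel mean
    return f1, f2, f3
```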
Next, the variance of the noise in the image is estimated in order to estimate the noise level of the image. Let the clean, noise-free image be x and the corresponding noisy image be y. The relationship between the kurtosis of y and the kurtosis of x, the variance of x and the variance of the noise can be expressed as:

κ_y = (κ_x·σ_x⁴ + 6σ_x²σ_n² + 3σ_n⁴) / (σ_x² + σ_n²)²

where κ_y denotes the kurtosis of y, κ_x denotes the kurtosis of x (determined by the shape parameter α of the distribution of x), σ_x² is the variance of x, and σ_n² is the variance of the noise. The noise variance and the kurtosis of x are then estimated by minimizing, over a set of DCT filter channels,

(κ̂_x, σ̂_n²) = argmin Σ_i [ κ_{y_i} − ( κ_x·(σ_{y_i}² − σ_n²)² + 6(σ_{y_i}² − σ_n²)·σ_n² + 3σ_n⁴ ) / σ_{y_i}⁴ ]²

where κ̂_x is the estimate of the kurtosis of x, σ̂_n² is the estimate of the noise variance, σ_{y_i}² and κ_{y_i} are the variance and kurtosis of y_i, and y_i is the image obtained by filtering y with the i-th DCT filter. σ̂_n² is used to indicate the image noise level.
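A minimal sketch of the kurtosis-based noise estimate of step S2; the 8×8 separable DCT filter bank, the optimization bounds and the exact least-squares objective are assumptions of the sketch, since the text only states that per-band variances and kurtoses are matched against the kurtosis relation above:

```python
import numpy as np
from scipy.ndimage import convolve
from scipy.optimize import minimize
from scipy.stats import kurtosis

def dct_filter_bank(k=8):
    """Separable 2-D DCT-II basis filters, excluding the DC filter."""
    n = np.arange(k)
    basis = np.cos(np.pi * (2 * n[None, :] + 1) * n[:, None] / (2 * k))
    basis[0, :] /= np.sqrt(2)
    basis *= np.sqrt(2.0 / k)
    return [np.outer(basis[u], basis[v])
            for u in range(k) for v in range(k) if (u, v) != (0, 0)]

def estimate_noise_variance(img, k=8):
    """Estimate the noise variance by fitting the kurtosis relation across DCT bands."""
    var_y, kur_y = [], []
    for f in dct_filter_bank(k):
        band = convolve(img.astype(np.float64), f, mode='reflect')
        var_y.append(band.var())
        kur_y.append(kurtosis(band.ravel(), fisher=False))  # Pearson kurtosis
    var_y, kur_y = np.array(var_y), np.array(kur_y)

    def objective(params):
        kappa_x, sig_n2 = params
        sig_x2 = np.maximum(var_y - sig_n2, 1e-12)           # per-band signal variance
        pred = (kappa_x * sig_x2 ** 2 + 6 * sig_x2 * sig_n2
                + 3 * sig_n2 ** 2) / var_y ** 2              # predicted band kurtosis
        return np.sum((kur_y - pred) ** 2)

    res = minimize(objective,
                   x0=[np.median(kur_y), 0.5 * var_y.min()],
                   bounds=[(1.0, None), (0.0, var_y.min())])
    return res.x[1]  # estimated noise variance sigma_n^2
```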
The structural features of the image are extracted next, and the blur degree of the image is estimated from them; the gradient strength and the phase consistency of the image are computed separately to capture the internal structure of the image. The gradient strength is computed as follows: the image is convolved with the Sobel operator to obtain gradient maps in the horizontal and vertical directions:

Gx = Sx * I,  Gy = Sy * I

where Gx denotes the gradient in the horizontal direction, Gy denotes the gradient in the vertical direction, and Sx and Sy are the horizontal and vertical Sobel kernels. The gradient strength map is then computed as

GM = sqrt(Gx² + Gy²)

where GM denotes the gradient strength map.
The phase consistency of the image is extracted with Kovesi's method. Given a one-dimensional signal s, let M_n^e and M_n^o denote the even-symmetric and odd-symmetric filters at scale n, which form a quadrature (orthogonal) pair and are approximated here by log-Gabor filters. Filtering the signal with this pair gives the responses e_n(j) and o_n(j) at position j, and the amplitude is defined as A_n(j) = sqrt(e_n(j)² + o_n(j)²). Let F(j) = Σ_n e_n(j) and H(j) = Σ_n o_n(j); the phase consistency PC is then computed as

PC(j) = E(j) / (ε + Σ_n A_n(j)),  where E(j) = sqrt(F(j)² + H(j)²)

and ε is a small positive number that prevents the denominator from being zero. The computation of PC for a one-dimensional signal is generalized to two-dimensional signals and defined as

PC_2D(j) = Σ_o E_o(j) / (ε + Σ_o Σ_n A_{n,o}(j))

where o is the index over filter orientations.
The gradient strength map and the phase consistency map are fused to obtain the local structure map of the image:

LS(i,j) = max{GM(i,j), PC(i,j)}

where LS denotes the local structure map, (i,j) denotes the pixel position, and max denotes the maximum-value operation. The local structure map is pooled to obtain an estimate of the blur degree of the image:

s = (1/M) Σ_{(i,j)∈Ω} LS(i,j)

where s represents the blur degree of the image, Ω is the set of positions of the top 20% largest values in LS, and M is the number of elements in Ω.
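A minimal sketch of the structure-based blur feature of step S3, assuming the phase consistency map pc has already been computed with Kovesi's algorithm and is passed in; only the Sobel gradient strength, the max-fusion and the top-20% pooling described above are shown:

```python
import numpy as np
from scipy.ndimage import sobel

def blur_feature(img, pc, top_fraction=0.20):
    gx = sobel(img.astype(np.float64), axis=1)   # horizontal gradient Gx
    gy = sobel(img.astype(np.float64), axis=0)   # vertical gradient Gy
    gm = np.sqrt(gx ** 2 + gy ** 2)              # gradient strength map GM
    ls = np.maximum(gm, pc)                      # local structure map LS
    flat = np.sort(ls.ravel())[::-1]             # keep the top 20% largest LS values
    m = max(1, int(top_fraction * flat.size))
    return flat[:m].mean()                       # blur-degree estimate s
```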
The contrast of the image is estimated by computing, pixel by pixel, the difference between the central pixel and its neighbouring pixels; the larger the difference, the higher the contrast of the image. Let the current pixel value be a, the pixel above it a1, the pixel below it a2, the pixel to its left a3 and the pixel to its right a4; the difference d between the current pixel and its surrounding pixel values is then computed from a and a1, a2, a3, a4 and used as the contrast feature.
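A minimal sketch of the contrast feature of step S4; because the exact combination of the four neighbour differences is not reproduced in the text, the sum of absolute differences and the global mean pooling used here are assumptions:

```python
import numpy as np

def contrast_feature(img):
    a = img.astype(np.float64)
    up, down = np.roll(a, 1, axis=0), np.roll(a, -1, axis=0)
    left, right = np.roll(a, 1, axis=1), np.roll(a, -1, axis=1)
    # Per-pixel difference d between the centre pixel and its four neighbours.
    d = (np.abs(a - up) + np.abs(a - down)
         + np.abs(a - left) + np.abs(a - right))
    return d[1:-1, 1:-1].mean()   # drop the wrapped border rows/columns
```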
The naturalness of the image is estimated by locally normalizing the image with its local mean and variance. The normalization coefficients of the image are computed as:

Î(x,y) = (I(x,y) − μ(x,y)) / (σ(x,y) + C)

where I is the input image, (x,y) denotes the pixel position, Î denotes the normalized-coefficient image, μ(x,y) and σ(x,y) are the local mean and local standard deviation around (x,y), and C is a small positive constant that keeps the denominator away from zero. The normalized coefficients are then fitted with a zero-mean generalized Gaussian distribution whose probability density is defined as

f(x; α, β) = α / (2βΓ(1/α)) · exp(−(|x|/β)^α)

where Γ(·) is the gamma function, defined as

Γ(x) = ∫₀^∞ t^(x−1) e^(−t) dt,  x > 0,

α is a shape parameter describing the shape of the distribution and β is a scale parameter that plays the role of the standard deviation. These two parameters describe the distribution and are extracted to characterize the naturalness of the image.
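A minimal sketch of the naturalness feature of step S5: local normalization followed by a fit of the zero-mean generalized Gaussian. The Gaussian weighting window, the constant C and the moment-matching estimator (rather than, say, maximum likelihood) are assumptions of the sketch:

```python
import numpy as np
from scipy.ndimage import gaussian_filter
from scipy.special import gamma

def naturalness_features(img, sigma=7.0 / 6.0, C=1.0):
    x = img.astype(np.float64)
    mu = gaussian_filter(x, sigma)                       # local mean
    var = gaussian_filter(x * x, sigma) - mu * mu        # local variance
    sd = np.sqrt(np.maximum(var, 0.0))
    coeffs = (x - mu) / (sd + C)                         # normalization coefficients

    # Moment matching: E[x^2] / (E|x|)^2 = Gamma(1/a)*Gamma(3/a) / Gamma(2/a)^2 for a GGD.
    rho = coeffs.var() / (np.abs(coeffs).mean() ** 2 + 1e-12)
    alphas = np.arange(0.2, 10.0, 0.001)
    r = gamma(1.0 / alphas) * gamma(3.0 / alphas) / gamma(2.0 / alphas) ** 2
    alpha = alphas[np.argmin((r - rho) ** 2)]            # shape parameter alpha
    beta = np.sqrt(coeffs.var() * gamma(1.0 / alpha) / gamma(3.0 / alpha))  # scale beta
    return alpha, beta
```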
Next, the visual perception statistical features of the image are extracted. For an image I, image blocks are first extracted and sparsely represented. Let x_k denote an extracted block; this process can be expressed as:

x_k = R_k(I)

where R_k(·) is the block-extraction operator that extracts the image block at position k, with k = 1, 2, 3, …, K.

For an image block x_k, its sparse representation over a dictionary D consists in finding a sparse vector α_k (most of whose elements are zero or close to zero) that satisfies:

α̂_k = argmin_α ||x_k − Dα||₂² + λ||α||_p

The first term is a fidelity term and the second a sparsity constraint; λ is a constant that balances the weight of the two terms, and p is 0 or 1. With p = 0 the sparsity term counts the number of non-zero coefficients, which matches the desired notion of sparsity; however, the resulting 0-norm optimization problem is non-convex and hard to solve, and the usual alternative is to set p = 1, which turns the problem into a convex optimization. The problem is solved with the orthogonal matching pursuit (OMP) algorithm to obtain the sparse representation coefficients α̂_k of block x_k, so that x_k is sparsely represented as Dα̂_k. The sparse representation of the entire image I is obtained by assembling the reconstructed blocks Dα̂_k back into an image, denoted I′.
Here I′ represents the sparse representation of image I. Image distortion changes the way the brain understands, i.e. sparsely represents, the image, so the difference between the image and its sparse representation is used to describe the change in image quality. The residual between the input image and its sparse representation is first calculated:
PR(x,y)=I(x,y)-I′(x,y)
where PR is the representation residual, I is the input image (or image block), and I′ is the sparse representation of the input image. Statistical features of the representation residual are then extracted to pool it: its mean, variance, kurtosis, skewness and information entropy. Let ε(·) denote the mean operation; the mean, variance, skewness and kurtosis of the representation residual are computed as:

m_PR = ε(PR)
v_PR = ε((PR − m_PR)²)
s_PR = ε((PR − m_PR)³) / v_PR^(3/2)
k_PR = ε((PR − m_PR)⁴) / v_PR²

The information entropy is calculated as

E_PR = −Σ_i p_i log p_i

where p_i is the probability of the i-th gray level.
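A minimal sketch of the visual-perception feature of step S6: blocks are sparsely coded with OMP over a dictionary D (atoms stored as rows), the representation residual is formed, and its statistics are pooled. The non-overlapping 8×8 blocks, the sparsity level and the 64-bin histogram used for the entropy are assumptions of the sketch, and the dictionary itself is assumed to be given (e.g. learned offline):

```python
import numpy as np
from scipy.stats import entropy, kurtosis, skew
from sklearn.decomposition import sparse_encode

def perception_features(img, D, block=8, n_nonzero=8):
    h, w = img.shape
    h, w = h - h % block, w - w % block
    x = img[:h, :w].astype(np.float64)
    # Collect non-overlapping blocks as rows of a matrix (one block per row).
    patches = (x.reshape(h // block, block, w // block, block)
                 .swapaxes(1, 2).reshape(-1, block * block))
    codes = sparse_encode(patches, D, algorithm='omp',
                          n_nonzero_coefs=n_nonzero)     # sparse coefficients alpha_k
    residual = (patches - codes @ D).ravel()             # representation residual PR
    hist, _ = np.histogram(residual, bins=64, density=True)
    hist = hist[hist > 0]
    return [residual.mean(), residual.var(),
            kurtosis(residual), skew(residual),
            entropy(hist)]   # mean, variance, kurtosis, skewness, entropy
```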
Finally, the quality prediction model is trained: for a set of distorted images, the features described above are extracted from each image, the extracted features and the corresponding subjective scores are input into a support vector regression model, a quality model is trained, and the model is then used to predict the quality of other images.
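A minimal sketch of step S7, learning the feature-to-quality mapping with support vector regression; the RBF kernel, its hyper-parameters and the feature standardization are assumptions, with features holding one row of S1-S6 features per training image and mos the corresponding subjective scores:

```python
from sklearn.pipeline import make_pipeline
from sklearn.preprocessing import StandardScaler
from sklearn.svm import SVR

def train_quality_model(features, mos):
    # Standardize the features, then fit an epsilon-SVR on the subjective scores.
    model = make_pipeline(StandardScaler(),
                          SVR(kernel='rbf', C=100.0, gamma='scale', epsilon=0.1))
    model.fit(features, mos)
    return model

# Usage: quality = train_quality_model(train_X, train_mos).predict(test_X)
```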
The foregoing description of specific embodiments of the present invention has been presented. It is to be understood that the present invention is not limited to the specific embodiments described above, and that various changes and modifications may be made by one skilled in the art within the scope of the appended claims without departing from the spirit of the invention.
Claims (10)
1. A quality evaluation method for a camera-oriented acquired picture is characterized by comprising the following steps:
S1, extracting brightness and chroma features, and estimating the brightness and chroma of the input picture;
S2, extracting picture noise features, and estimating the noise level of the input picture;
S3, extracting structural features, and estimating the blur degree of the input picture;
S4, extracting contrast features, and estimating the contrast of the input picture;
S5, extracting the statistical features of the picture normalization coefficients, and estimating the naturalness of the picture;
S6, extracting visual perception features of the picture, and estimating the change in visual perception of the picture;
S7, learning, on a training set and with a support vector regression method, a model that maps the image features extracted in steps S1-S6 to image quality, for predicting the image quality.
2. The method for evaluating the quality of pictures acquired by a camera according to claim 1, wherein in step S1, the input picture is converted from RGB space to HSI space, and the mean value of each channel of the image is extracted as the luminance characteristic and the chrominance characteristic of the image;
preferably, step S1 specifically comprises: first converting the input picture from RGB space to HSI space:

Ĩ = T(I)

where I is the input image, Ĩ is the transformed image, and T(·) is the color space transform function; and then computing the luminance and chrominance features separately as the mean of each channel:

F1 = (1/N) Σ_i Ĩ_I(i),  F2 = (1/N) Σ_i Ĩ_H(i),  F3 = (1/N) Σ_i Ĩ_S(i)

where F1 is the luminance feature, F2 and F3 are the chrominance features, Ĩ_I is the luminance channel, Ĩ_H and Ĩ_S are the two chrominance channels, and N is the number of image pixels.
3. The method for evaluating the quality of the pictures acquired by the camera according to the claim 1, wherein in the step S2, the relationship between the kurtosis of the clean image and the corresponding noise image is modeled by using the principle of scale invariance of the natural image, and the variance of the image noise is solved;
preferably, step S2 specifically comprises: estimating the variance of the noise in the image in order to estimate the noise level of the image, wherein, if the clean noise-free image is x and the corresponding noisy image is y, the relationship between the kurtosis of y, the kurtosis of x, the variance of x and the variance of the noise is expressed as:

κ_y = (κ_x·σ_x⁴ + 6σ_x²σ_n² + 3σ_n⁴) / (σ_x² + σ_n²)²

where κ_y denotes the kurtosis of y, κ_x denotes the kurtosis of x, α is the shape parameter of the distribution of x, σ_x² is the variance of x, and σ_n² is the variance of the noise; the noise variance and the estimate of the kurtosis of x are solved by minimizing, over a set of DCT filter channels, the squared discrepancy between the measured kurtosis of each filtered image and the kurtosis predicted by the above relationship.
4. The method for evaluating the quality of the image obtained by the camera according to claim 1, wherein in the step S3, the structural features of the image, including the gradient strength and phase consistency of the image, are extracted, and then the two features are fused to describe the local structural features of the image, so as to quantify the blurring degree of the image; preferably, the calculation of the gradient strength uses a Sobel operator to perform convolution on the image to obtain gradient maps in the horizontal direction and the vertical direction, and then takes the arithmetic square root of the square sum of the gradient maps in the horizontal direction and the vertical direction as a gradient strength map; preferably, a phase consistency characteristic of the image is calculated by adopting a calculation method of Kovesi;
preferably, step S3 specifically comprises: extracting the structural features of the image, computing the gradient strength and the phase consistency of the image separately to capture the internal structure of the image, and estimating the blur degree of the image from the structural features; the gradient strength is computed as follows: the image is convolved with the Sobel operator to obtain gradient maps in the horizontal and vertical directions:

Gx = Sx * I,  Gy = Sy * I

where Gx denotes the gradient in the horizontal direction, Gy denotes the gradient in the vertical direction, and Sx and Sy are the horizontal and vertical Sobel kernels; the gradient strength map is then computed as

GM = sqrt(Gx² + Gy²)

where GM denotes the gradient strength map;
the phase consistency of the image is extracted with Kovesi's method: given a one-dimensional signal s, let M_n^e and M_n^o denote the even-symmetric and odd-symmetric filters at scale n, which form a quadrature (orthogonal) pair, preferably approximated by log-Gabor filters; filtering the signal with this pair gives the responses e_n(j) and o_n(j) at position j, and the amplitude is defined as A_n(j) = sqrt(e_n(j)² + o_n(j)²); let F(j) = Σ_n e_n(j) and H(j) = Σ_n o_n(j); the phase consistency PC is then computed as

PC(j) = E(j) / (ε + Σ_n A_n(j)),  where E(j) = sqrt(F(j)² + H(j)²)

and ε is a small positive number that prevents the denominator from being zero; the computation of PC for a one-dimensional signal is generalized to two-dimensional signals and defined as

PC_2D(j) = Σ_o E_o(j) / (ε + Σ_o Σ_n A_{n,o}(j))

where o is the index over filter orientations;
the gradient strength map and the phase consistency map are fused to obtain the local structure map of the image:

LS(i,j) = max{GM(i,j), PC(i,j)}

where LS denotes the local structure map, (i,j) denotes the pixel position, and max denotes the maximum-value operation; the local structure map is pooled to obtain an estimate of the blur degree of the image:

s = (1/M) Σ_{(i,j)∈Ω} LS(i,j)

where s represents the blur degree of the image, Ω is the set of positions of the top 20% largest values in LS, and M is the number of elements in Ω.
5. The method for evaluating the quality of pictures acquired by a camera according to claim 1, wherein in step S4, the difference between the central pixel and its surrounding upper, lower, left and right pixels is calculated to measure the contrast of the image, and the larger the difference, the higher the contrast;
preferably, step S4 specifically comprises: computing, pixel by pixel, the difference between the central pixel and its neighbouring pixels, the larger the difference, the higher the contrast of the image; with the current pixel value denoted a, the pixel above it a1, the pixel below it a2, the pixel to its left a3 and the pixel to its right a4, the difference d between the current pixel and its surrounding pixel values is computed from a and a1, a2, a3, a4 and used as the contrast feature.
6. The method for evaluating the quality of a picture obtained by a camera according to claim 1, wherein in step S5, the image is locally normalized using its local mean and variance, the obtained normalization coefficients are then fitted with a zero-mean generalized Gaussian distribution, and the fitting parameters are extracted to estimate the naturalness of the image;
preferably, step S5 specifically comprises: locally normalizing the image with its local mean and variance to obtain a normalized image, where normalization means computing the normalization coefficients of the image:

Î(x,y) = (I(x,y) − μ(x,y)) / (σ(x,y) + C)

where I is the input image, (x,y) denotes the pixel position, Î denotes the normalized-coefficient image, μ(x,y) and σ(x,y) are the local mean and local standard deviation around (x,y), and C is a small positive constant that keeps the denominator away from zero; the locally normalized image is then fitted with a zero-mean generalized Gaussian distribution whose probability density is defined as

f(x; α, β) = α / (2βΓ(1/α)) · exp(−(|x|/β)^α)

where Γ(·) is the gamma function, defined as

Γ(x) = ∫₀^∞ t^(x−1) e^(−t) dt,  x > 0,

α is a shape parameter describing the shape of the distribution and β is a scale parameter that plays the role of the standard deviation; these two parameters describe the distribution and are extracted to characterize the naturalness of the image.
7. The method for evaluating the quality of pictures acquired by a camera according to claim 1, wherein in step S6, the image is sparsely represented in units of blocks, the difference between the image and its sparse representation is then calculated, and the mean, variance, kurtosis, skewness and information entropy of the representation residual are computed;
preferably, step S6 specifically comprises: for an image I, first extracting image blocks for sparse representation, where x_k denotes an extracted block; this process is expressed as:

x_k = R_k(I)

where R_k(·) is the block-extraction operator that extracts the image block at position k, with k = 1, 2, 3, …, K;

for an image block x_k, its sparse representation over a dictionary D consists in finding a sparse vector α_k (most of whose elements are zero or close to zero) that satisfies:

α̂_k = argmin_α ||x_k − Dα||₂² + λ||α||_p

where the first term is a fidelity term, the second term is a sparsity constraint, λ is a constant that balances the weight of the two terms, and p is 0 or 1; preferably, p is set to 1, which turns the above problem into a convex optimization; the problem is solved with the orthogonal matching pursuit (OMP) algorithm to obtain the sparse representation coefficients α̂_k of block x_k, so that x_k is sparsely represented as Dα̂_k, and the sparse representation of the entire image I, obtained by assembling the reconstructed blocks, is denoted I′;

the difference between the image and its sparse representation is used to describe the change in image quality; the residual between the input image and its sparse representation is first calculated:

PR(x,y) = I(x,y) − I′(x,y)

where PR is the representation residual, I is the input image (or image block), and I′ is the sparse representation of the input image; statistical features of the representation residual are extracted to pool it; with ε(·) denoting the mean operation, the mean, variance, skewness and kurtosis of the representation residual are computed as:

m_PR = ε(PR)
v_PR = ε((PR − m_PR)²)
s_PR = ε((PR − m_PR)³) / v_PR^(3/2)
k_PR = ε((PR − m_PR)⁴) / v_PR²

and the information entropy is calculated as

E_PR = −Σ_i p_i log p_i

where p_i is the probability of the i-th gray level.
8. The method for evaluating the quality of pictures acquired by a camera according to claim 1, wherein step S7 specifically comprises: using a set of distorted images, performing steps S1-S6 on each image to extract the corresponding features, inputting the extracted features and the corresponding subjective scores into a support vector regression model, training a model that maps image features to image quality, and then using the model to predict image quality.
9. A camera-oriented picture quality evaluation device comprising a computer-readable storage medium and a processor, wherein the computer-readable storage medium stores an executable program, and wherein the executable program, when executed by the processor, implements the camera-oriented picture quality evaluation method according to any one of claims 1 to 8.
10. A computer-readable storage medium, in which an executable program is stored, which, when executed by a processor, implements the method for quality assessment of pictures acquired by a camera according to any one of claims 1 to 8.
Priority Applications (1)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
CN202010112925.0A CN111354048B (en) | 2020-02-24 | 2020-02-24 | Quality evaluation method and device for obtaining pictures by facing camera |
Applications Claiming Priority (1)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
CN202010112925.0A CN111354048B (en) | 2020-02-24 | 2020-02-24 | Quality evaluation method and device for obtaining pictures by facing camera |
Publications (2)
Publication Number | Publication Date |
---|---|
CN111354048A true CN111354048A (en) | 2020-06-30 |
CN111354048B CN111354048B (en) | 2023-06-20 |
Family
ID=71197156
Family Applications (1)
Application Number | Title | Priority Date | Filing Date |
---|---|---|---|
CN202010112925.0A Active CN111354048B (en) | 2020-02-24 | 2020-02-24 | Quality evaluation method and device for obtaining pictures by facing camera |
Country Status (1)
Country | Link |
---|---|
CN (1) | CN111354048B (en) |
Cited By (1)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
CN112419300A (en) * | 2020-12-04 | 2021-02-26 | 清华大学深圳国际研究生院 | Underwater image quality evaluation method and system |
Citations (4)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
CN102750695A (en) * | 2012-06-04 | 2012-10-24 | 清华大学 | Machine learning-based stereoscopic image quality objective assessment method |
US20170286798A1 (en) * | 2016-03-31 | 2017-10-05 | Ningbo University | Objective assessment method for color image quality based on online manifold learning |
CN107371015A (en) * | 2017-07-21 | 2017-11-21 | 华侨大学 | One kind is without with reference to contrast modified-image quality evaluating method |
CN109255358A (en) * | 2018-08-06 | 2019-01-22 | 浙江大学 | A kind of 3D rendering quality evaluating method of view-based access control model conspicuousness and depth map |
-
2020
- 2020-02-24 CN CN202010112925.0A patent/CN111354048B/en active Active
Patent Citations (4)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
CN102750695A (en) * | 2012-06-04 | 2012-10-24 | 清华大学 | Machine learning-based stereoscopic image quality objective assessment method |
US20170286798A1 (en) * | 2016-03-31 | 2017-10-05 | Ningbo University | Objective assessment method for color image quality based on online manifold learning |
CN107371015A (en) * | 2017-07-21 | 2017-11-21 | 华侨大学 | One kind is without with reference to contrast modified-image quality evaluating method |
CN109255358A (en) * | 2018-08-06 | 2019-01-22 | 浙江大学 | A kind of 3D rendering quality evaluating method of view-based access control model conspicuousness and depth map |
Non-Patent Citations (1)
Title |
---|
刘玉涛 (Liu Yutao), 《基于视觉感知与统计的图像质量评价方法研究》 (Research on Image Quality Assessment Methods Based on Visual Perception and Statistics) *
Cited By (1)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
CN112419300A (en) * | 2020-12-04 | 2021-02-26 | 清华大学深圳国际研究生院 | Underwater image quality evaluation method and system |
Also Published As
Publication number | Publication date |
---|---|
CN111354048B (en) | 2023-06-20 |
Similar Documents
Publication | Publication Date | Title |
---|---|---|
Wang et al. | Structural similarity based image quality assessment | |
Shao et al. | Full-reference quality assessment of stereoscopic images by learning binocular receptive field properties | |
CN108932697B (en) | Distortion removing method and device for distorted image and electronic equipment | |
CN111932532B (en) | Method for evaluating capsule endoscope without reference image, electronic device, and medium | |
Tian et al. | Light field image quality assessment via the light field coherence | |
CN106127741B (en) | Non-reference picture quality appraisement method based on improvement natural scene statistical model | |
CN104023230B (en) | A kind of non-reference picture quality appraisement method based on gradient relevance | |
CN108830823B (en) | Full-reference image quality evaluation method based on spatial domain combined frequency domain analysis | |
CN110232670B (en) | Method for enhancing visual effect of image based on high-low frequency separation | |
CN109978854B (en) | Screen content image quality evaluation method based on edge and structural features | |
CN106127234B (en) | Non-reference picture quality appraisement method based on characteristics dictionary | |
CN108074241B (en) | Quality scoring method and device for target image, terminal and storage medium | |
KR20200140713A (en) | Method and apparatus for training neural network model for enhancing image detail | |
CN105007488A (en) | Universal no-reference image quality evaluation method based on transformation domain and spatial domain | |
CN110910347B (en) | Tone mapping image non-reference quality evaluation method based on image segmentation | |
CN110111347B (en) | Image sign extraction method, device and storage medium | |
CN111105357A (en) | Distortion removing method and device for distorted image and electronic equipment | |
Shi et al. | SISRSet: Single image super-resolution subjective evaluation test and objective quality assessment | |
CN105894507A (en) | Image quality evaluation method based on image information content natural scenario statistical characteristics | |
Wang et al. | New insights into multi-focus image fusion: A fusion method based on multi-dictionary linear sparse representation and region fusion model | |
CN107590804A (en) | Screen picture quality evaluating method based on channel characteristics and convolutional neural networks | |
Morzelona | Human visual system quality assessment in the images using the IQA model integrated with automated machine learning model | |
CN108257117B (en) | Image exposure evaluation method and device | |
Lévêque et al. | CUID: A new study of perceived image quality and its subjective assessment | |
CN111354048B (en) | Quality evaluation method and device for obtaining pictures by facing camera |
Legal Events
Date | Code | Title | Description |
---|---|---|---|
PB01 | Publication | ||
SE01 | Entry into force of request for substantive examination | ||
GR01 | Patent grant | ||