CN104361593A - Color image quality evaluation method based on HVSs and quaternions - Google Patents
- Publication number: CN104361593A (application No. CN201410650245.9)
- Authority: CN (China)
- Legal status: Granted (the legal status is an assumption and is not a legal conclusion; Google has not performed a legal analysis)
Classifications
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06T—IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
- G06T2207/00—Indexing scheme for image analysis or image enhancement
- G06T2207/30—Subject of image; Context of image processing
- G06T2207/30168—Image quality inspection
Abstract
The invention discloses a color image quality evaluation method based on the human visual system (HVS) and quaternions, belonging to the technical field of image processing and computer vision. The method comprises the following steps. First, mathematical evaluation models of the original reference image and the distorted image to be evaluated are established by analyzing human visual characteristics; the models comprise the spatial position function QL, the local variance QV, the texture edge complexity function QTE and the color function QC of the images. Second, quaternion matrices of the original reference image and the distorted image to be evaluated are constructed and singular value decomposition is performed on them, yielding the singular value feature vectors of the images. Third, the degree of image distortion is measured by the Euclidean distance between the singular value feature vectors of the original reference image and the distorted image to be evaluated. By combining human visual characteristics with quaternions, the method extracts the luminance and chrominance information of the images and constructs the spatial position function, the texture edge complexity function and the local variance from human visual characteristics, so that the evaluation result accords with the way images are perceived by the human eye.
Description
Technical Field
The invention relates to the technical field of image processing and computer vision, and in particular to a method for evaluating the quality of a color image by using the characteristics of the human visual system to construct a mathematical model consistent with how the human eye observes an image, combined with quaternion singular value decomposition.
Background
Image quality is one of the important parameters in the field of image processing and computer vision. With the development of computer science and technology, the requirements on image quality in printing, ceramic tiles, imaging, image retrieval and other applications keep rising; however, varying degrees of image distortion and degradation arise during image acquisition, processing, compression, transmission and display.
Humans are the ultimate recipients of images, so their subjective quality assessment (commonly reported as a differential mean opinion score, DMOS) is considered the most reliable. In subjective quality evaluation, observers score the visual perception of the target image according to their own subjective experience or to evaluation criteria specified in advance; the scores of all observers are then averaged with weights, and the result is the subjective quality score of the image. However, subjective image quality evaluation is time-consuming and labor-intensive, is strongly affected by the observer, the image type and the surrounding environment, and offers poor real-time performance. Researchers have therefore long been dedicated to objective image quality evaluation methods that correctly, promptly and effectively reflect human subjective visual perception. Objective image quality evaluation uses algorithms, mathematical models and the like to give timely, rapid feedback on image quality and obtain an evaluation result consistent with human subjective perception. The methods are diverse, and classification schemes differ with the starting point and basic idea. According to the availability of the original image as a reference, objective quality evaluation methods fall into three types: full-reference, reduced-reference and no-reference. Full-reference methods suit encoder design and performance comparison between encoders, while reduced-reference and no-reference methods suit bandwidth-limited multimedia applications. Because a full-reference method can use all the information of the original image, its evaluation results agree best with human subjective evaluation.
Peak Signal-to-Noise Ratio (PSNR) and Mean Squared Error (MSE), discussed in "Image quality assessment based on gradient similarity" (Liu A. et al., IEEE Transactions on Image Processing, 2012), are the most classical full-reference objective image quality evaluation methods. PSNR reflects the fidelity of the image to be evaluated, while MSE reflects its dissimilarity from the original image. Both methods are theoretically simple and clear, easy to understand and convenient to compute, but they only compare the image pixel by pixel and ignore the possible structural relationships among pixels, so their results deviate from what the human eye actually sees.
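As a reference point, the two classical metrics can be sketched in a few lines of numpy; the function names and the 8-bit peak value are our choices, not taken from the patent:

```python
import numpy as np

def mse(ref, dist):
    """Mean squared error over all pixels (and channels)."""
    ref = ref.astype(np.float64)
    dist = dist.astype(np.float64)
    return np.mean((ref - dist) ** 2)

def psnr(ref, dist, peak=255.0):
    """Peak signal-to-noise ratio in dB; higher means higher fidelity."""
    err = mse(ref, dist)
    if err == 0:
        return float("inf")  # identical images
    return 10.0 * np.log10(peak ** 2 / err)
```

Both compare the images strictly pixel by pixel, which is exactly the limitation the text describes.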
The SSIM algorithm, proposed by Z. Wang et al. in IEEE Transactions on Image Processing in 2004, comprehensively compares three kinds of information (luminance, contrast and structural similarity) between the original undistorted image and the image to be evaluated, and does account for the structural relationship between pixels; however, it grasps detail poorly under severe blurring, and its index parameters are difficult to determine.
The gradient magnitude similarity deviation algorithm GMSD, proposed in "Gradient magnitude similarity deviation: a highly efficient perceptual image quality index" (W. Xue et al., IEEE Transactions on Image Processing, 2013), exploits the high sensitivity of gradients to image distortion, but it must process color images in the grayscale domain: a color image has to be converted to a grayscale image before evaluation, so the evaluation result deviates from what the human eye actually sees.
A search reveals Chinese patent application No. 200610027433.1, filed on 8 June 2006 and entitled "An image quality evaluation method based on hypercomplex singular value decomposition". That method models the color image directly with hypercomplex numbers (quaternions), extracts the inherent energy characteristics of the color image by hypercomplex singular value decomposition, constructs a distortion mapping matrix from the distance between the singular values of the original image and the distorted image, and evaluates color image quality with this matrix. Chinese patent application No. 201210438606.4, filed on 6 November 2012 and entitled "Application of a color image quality evaluation algorithm", takes the chrominance, luminance and saturation of an image as the imaginary parts of a quaternion, constructs quaternion matrices of the reference image and the image to be evaluated, performs singular value decomposition on each to obtain singular value feature vectors, and finally applies grey relational analysis to compute the correlation between the two feature vectors; the greater the correlation, the better the quality of the image to be evaluated. However, the evaluation results of both applications still deviate considerably from what the human eye actually sees, and color image quality evaluation still needs further optimization.
Disclosure of Invention
1. Technical problem to be solved by the invention
The invention provides a color image quality evaluation method based on the HVS and quaternions, aiming to overcome the problem that traditional evaluation methods must convert a color image to a grayscale image when constructing the evaluation model, which makes the evaluation result deviate considerably from what the human eye actually sees. The invention combines human visual characteristics with quaternions: it extracts the luminance and chrominance information of the image, constructs the spatial position function, the texture edge complexity function and the local variance from human visual characteristics, and, improving on the traditional approach of splitting the R, G, B channels, extracts the energy characteristics of the image by quaternion singular value decomposition, so that the evaluation result better matches human perception of the image.
2. Technical scheme
In order to achieve the purpose, the technical scheme provided by the invention is as follows:
the invention relates to a color image quality evaluation method based on HVS and quaternion, which comprises the following steps:
step one, constructing a mathematical evaluation model of an original reference image and a distorted image to be evaluated by analyzing human visual characteristics, wherein the mathematical evaluation model comprises the spatial position function QL, the local variance QV, the texture edge complexity function QTE and the color function QC of the image;
step two, taking QL, QV and QTE as the imaginary parts of a quaternion and QC as the real part of the quaternion, respectively constructing quaternion matrices of the original reference image and the distorted image to be evaluated, and performing singular value decomposition on the quaternion matrices to obtain the singular value feature vectors of the images;
and step three, measuring the image distortion degree by utilizing Euclidean distances of singular value feature vectors of the original reference image and the distorted image to be evaluated.
Furthermore, the specific process of constructing the mathematical evaluation model in the step one is as follows:
(1) acquiring the RGB tristimulus values of the original reference image and the distorted image to be evaluated;
(2) extracting the spatial position information of the original reference image and the distorted image to be evaluated, and constructing the spatial position function QL and the texture edge complexity function QTE;
(3) converting the original reference image and the distorted image to be evaluated from RGB space to YUV color space, extracting the image luminance information to construct the local variance QV, and extracting the image luminance and chrominance information to construct the color function QC.
Further, step one constructs the spatial position function QL using the foveal property of the human visual system. In the formula for QL, eL is the quotient of the distance from the observed pixel point (i, j) to the image center pixel point (M/2, N/2) and the distance from the corner pixel point (0, 0) to the image center; ec is a constant.
Further, step one constructs the texture edge complexity function QTE using the masking effect of the human visual system, said texture edge complexity function being
QTE = QT × QE
where QT is the texture complexity function of pixel point (i, j) and QE is the edge complexity function of pixel point (i, j).
Further, step one constructs the local variance QV using the multi-channel property of the human visual system. Non-overlapping blocks Ii,j are obtained by partitioning the luminance component of the image; L is the number of pixel points ηp contained in an image block Ii,j, and the block mean is

Ī(i,j) = (1/L) Σ_{p=1}^{L} η_p.
Further, the color function is
QC = αQL + βQU
where QL is the luminance information of the image, QU is the chrominance information of the image, and α and β are the weights of luminance and chrominance respectively.
Further, the Euclidean distance in step three is

D = ( Σ_{i=1}^{K} (λi − λ̂i)² )^{1/2}

where λi is the i-th element of the singular value feature vector of the original reference image, λ̂i is the i-th element of the singular value feature vector of the distorted image to be evaluated, and K is the smaller of the lengths of the two singular value feature vectors, i.e., the minimum of the ranks of the two quaternion matrices.
3. advantageous effects
Compared with the prior art, the technical scheme provided by the invention has the following remarkable effects:
(1) The color image quality evaluation method based on HVS and quaternions of the invention constructs, by analyzing human visual characteristics, the spatial position function QL, the local variance QV, the texture edge complexity function QTE and the color function QC of the original reference image and of the distorted image to be evaluated; the four kinds of image information are integrated through a quaternion, and the energy characteristics of the image are obtained through singular value decomposition. This improves on the traditional method of splitting the R, G, B channels, preserves the integrity of the color information well, and makes the extracted image information contain both global and local information, so that the evaluation result represents the information of the image more completely;
(2) according to the color image quality evaluation method based on the HVS and the quaternion, the visual characteristics of human eyes and the quaternion are combined, the evaluation result is more consistent with the effect of human eyes on perceiving images, and the color image quality evaluation method is superior to the traditional SSIM and other typical image quality evaluation algorithms.
Drawings
FIG. 1 is a flow chart of an algorithm of a color image quality evaluation method based on HVS and quaternion according to the present invention;
FIG. 2 is an equivalent diagram of the foveal characteristic of the human visual system of the present invention;
FIG. 3 shows nonlinear fits of image subjective quality against the present quality evaluation method and against conventional methods; in FIG. 3, (a) is the nonlinear fitting curve of PSNR against DMOS values, (b) that of SSIM, (c) that of MS-SSIM, (d) that of SVD, (e) that of GMSD, and (f) that of the quality evaluation method of the present invention;
fig. 4 (a) to (e) are graphs comparing non-linear fitting curves of HVS-QSVD, GMSD, SSIM and DMOS values of five sets of images with different distortion types.
Detailed Description
For a further understanding of the invention, reference should be made to the following detailed description taken in conjunction with the accompanying drawings and examples.
Example 1
With reference to fig. 1, the present embodiment provides a color image quality evaluation method based on HVS and quaternion, aiming at the problem that when an evaluation model is constructed by a conventional evaluation method, a color image needs to be converted into a grayscale image for processing, so that an evaluation result does not conform to human eye perception. In the embodiment, the image energy characteristics conforming to the human eye perception are obtained by combining the human eye visual characteristics with the quaternion. Experiments show that the evaluation of the color image is superior to other methods, and the evaluation result is more consistent with the image perceived by human eyes. The image quality evaluation method of this embodiment will be described in detail below with reference to the experimental results:
the method comprises the following steps of firstly, constructing a mathematical evaluation model of an original reference image and a distorted image to be evaluated by analyzing human eye visual characteristics:
the visual characteristics of human eyes, which are provided on the basis of understanding the physiological structure of the human visual system, are closely related to how human eyes observe the external environment and the image. Since the person is the final recipient of the image, the evaluation result is consistent with what the human eye actually sees, and the present embodiment constructs a corresponding mathematical model by analyzing the visual characteristics of the human eye.
The human eye is similar to a variable-focal-length convex lens, but, influenced by the complex structure of the human brain, it behaves unlike an ordinary convex lens. In general, human visual characteristics include the foveal characteristic, the visual multi-channel characteristic, visual nonlinearity, contrast sensitivity and masking effects. This embodiment builds its mathematical evaluation models from the foveal characteristic, the multi-channel characteristic and the masking effect.
(1) Function of spatial position
The foveal characteristic of the human visual system means that when an image appears, the human eye first notices information at the central position of the image; in particular, changes in the position of texture edges near the image center are easily perceived.
The human eye sees the center of the image first and then spreads outward, and points at the same distance from the center should be treated equally. As shown in FIG. 2, assuming O is the center of the image, the distances from points on a circle to the center are equal, so the four points A, B, C, D have the same probability of being observed by the human eye, as do E and F.
Following the literature (CHEN T, WU H R. Space variant median filters for the restoration of impulse noise corrupted images [J]. IEEE Transactions on Circuits and Systems II: Analog and Digital Signal Processing, 2001, 48(8): 784-789.), this embodiment expresses with an explicit formula how, owing to the foveal property of the human visual system, spatial position affects the human eye's view of the image. The specific process is: first obtain the RGB tristimulus values of the original reference image and the distorted image to be evaluated, then extract their spatial position information and respectively construct the spatial position function QL of each:
In the formula, eL is the distance from the observed pixel point (i, j) to the image center pixel point (M/2, N/2), divided by the distance from the first pixel point (0, 0) of the image to the image center pixel point; ec is a constant determined from test results, and after testing this embodiment sets ec to 0.6.
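A minimal numpy sketch of the eL quotient just described; the closed form of QL itself (where the constant ec = 0.6 enters) appears only as a formula figure in the original, so only eL is computed here:

```python
import numpy as np

def normalized_center_distance(M, N):
    """e_L for every pixel: distance to the image centre (M/2, N/2),
    divided by the distance from the first pixel (0, 0) to the centre,
    so e_L is 0 at the centre and 1 at the (0, 0) corner."""
    i, j = np.mgrid[0:M, 0:N]
    d = np.hypot(i - M / 2.0, j - N / 2.0)
    d_max = np.hypot(M / 2.0, N / 2.0)  # distance from (0, 0) to the centre
    return d / d_max
```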
(2) Texture edge complexity function
The masking effect of the human visual system means that phenomena that could otherwise be noticed are neglected due to the presence of other phenomena. In different regions, the masking effect of the human visual system can be reflected by the respective weight ratios, which is more consistent with the characteristics of the image observed by the human eye.
This embodiment extracts the texture feature information and edge feature information of the image to obtain its texture edge complexity function QTE. The larger QTE is, the simpler the texture, the more the region attracts the human eye, and the greater its influence on perceived image quality; conversely, a smaller QTE indicates a more complex texture that is more easily ignored by the human eye. The specific calculation process is as follows:
the gradient direction is first calculated:
θ(i, j) = arctan( ∇v(i, j) / ∇h(i, j) )   (2)
where θ(i, j) denotes the gradient direction of pixel point (i, j), and ∇h(i, j) and ∇v(i, j) denote its horizontal and vertical gradient values. The corresponding image edge feature information is computed with the Sobel edge detection operator; the edge information is normalized and denoted E(i, j). Over the range [0°, 360°), the gradient direction is divided into the following regions, as shown in the following equation:
θ(i,j)∈{0°,180°,45°,225°,90°,270°,135°,315°} (3)
where 0 ° and 180 °,45 ° and 225 °,90 ° and 270 °,135 ° and 315 ° are symmetric about the origin, respectively, i.e. there are 4 different directional regions.
Calculating the complexity of the texture:
let a1The number of direction types, i.e., the number of types of θ (i, j), a2The number of edge points, i.e., the number of pixels having an E (i, j) ═ 1 is calculated. When a is2Smaller than a set threshold value, a20, otherwise a2The threshold value was set to 40 by experimental testing as 1. Therefore, we apply the texture complexity function Q of a certain pixel point (i, j) in the imageTIs defined as follows:
edge complexity:
First, three vectors P = [-1, 0, 2, 0, -1], L = [1, 4, 6, 4, 1] and E = [-1, -2, 0, 2, 1] are defined, where P is a "point" feature descriptor, L a "line" feature descriptor and E an "edge" feature descriptor. Six masks are obtained from these three operators: L^T×E, L^T×P, E^T×L, E^T×P, P^T×L, P^T×E. Let the responses of the six masks at a pixel point (i, j) in the image be f_{i,j}(L^T×E), f_{i,j}(L^T×P), f_{i,j}(E^T×L), f_{i,j}(E^T×P), f_{i,j}(P^T×L), f_{i,j}(P^T×E); the edge complexity of pixel point (i, j) is then:
texture edge complexity function for pixel (i, j):
QTE=QT×QE (6)
the larger the calculation result is, the weaker the masking effect is, the simpler the texture is, and the human eye can see clearly, namely the visual effect on the human eye is stronger.
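The ingredients named in this subsection can be sketched as follows. The signs of the P and E descriptors and the combination rule used for the placeholder QE are our assumptions, since equations (4) and (5) appear only as formula figures in the original:

```python
import numpy as np

def conv2_same(img, k):
    """'Same'-size cross-correlation with zero padding (kernels applied unflipped)."""
    M, N = img.shape
    kh, kw = k.shape
    ph, pw = kh // 2, kw // 2
    padded = np.pad(img.astype(np.float64), ((ph, ph), (pw, pw)))
    out = np.zeros((M, N))
    for di in range(kh):
        for dj in range(kw):
            out += k[di, dj] * padded[di:di + M, dj:dj + N]
    return out

def direction_classes(img):
    """Quantise Sobel gradient directions into the 4 symmetric classes
    (0/180, 45/225, 90/270, 135/315 degrees) used for the direction count a1."""
    sx = np.array([[-1, 0, 1], [-2, 0, 2], [-1, 0, 1]], float)
    gx = conv2_same(img, sx)        # horizontal gradient
    gy = conv2_same(img, sx.T)      # vertical gradient
    theta = np.degrees(np.arctan2(gy, gx)) % 180.0  # fold opposite directions together
    return np.digitize(theta, [22.5, 67.5, 112.5, 157.5]) % 4

# Laws-style 1-D descriptors; the signs of P and E are assumed,
# since they were lost in the source text.
P5 = np.array([-1.0, 0.0, 2.0, 0.0, -1.0])  # "point"
L5 = np.array([1.0, 4.0, 6.0, 4.0, 1.0])    # "line"
E5 = np.array([-1.0, -2.0, 0.0, 2.0, 1.0])  # "edge"
MASKS = [np.outer(a, b) for a, b in
         [(L5, E5), (L5, P5), (E5, L5), (E5, P5), (P5, L5), (P5, E5)]]

def edge_complexity(img):
    """Placeholder Q_E: mean absolute response of the six masks, scaled to [0, 1].
    The exact combination rule of Eq. (5) is not reproduced in this text."""
    resp = np.mean([np.abs(conv2_same(img, m)) for m in MASKS], axis=0)
    return resp / (resp.max() + 1e-12)
```

A flat image produces zero edge complexity and a single direction class, consistent with the text's claim that simpler textures give weaker masking.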
(3) Local variance
The multi-channel nature of the human visual system means that the human eye observes images in different channels, and only general outlines can be distinguished when the resolution is low, and detailed information can be distinguished when the resolution is high. The detail information of the image can be represented by the local variance of the image, so the local variance of the image is used as a means for describing and analyzing the content information of the image, and some important structural information of the image can be summarized by the local variance distribution of the image.
QV denotes the variance of the local region of image I centered on pixel point (i, j), i.e., the local variance. This embodiment first converts the original reference image and the distorted image to be evaluated from RGB space to YUV color space, and uses Y (the luminance) to compute the local variance. The Y component of the image is partitioned into non-overlapping blocks with a sliding window, and the variance of each block is taken as the local variance of the image. For an image block Ii,j containing L pixel points, denoting the pixels inside it by ηp, the local variance can be expressed as shown in formulas (8) and (9) below,
where Ī(i,j) is the mean of block Ii,j.
Since the size and sampling of each sub-block affect the image structure it captures, the Gaussian weighting method of the literature (Z. Wang, A. C. Bovik, et al. Image quality assessment: from error visibility to structural similarity [J]. IEEE Transactions on Image Processing, 2004, 13(4): 600-612.) is applied to the pixels ηp within a block Ii,j:
Block mean: Ī(i,j) = ( Σ_{p=1}^{L} X_p η_p ) / ( Σ_{p=1}^{L} η_p )   (8)
Block local variance: QV(Ii,j) = ( Σ_{p=1}^{L} X_p (η_p − Ī(i,j))² ) / ( Σ_{p=1}^{L} X_p )   (9)
In the formulas, X_p is the Gaussian weighting coefficient of pixel point η_p.
(4) Color information
Hue, saturation, and brightness are three attributes of color, also referred to as three elements of color. They are inherent characteristics of color and are different from each other. Hue and saturation may be represented by chroma. The only feature of a grayscale image is luminance, while a color image also has chrominance features.
Brightness is a physical quantity: it is the sensation of light intensity and reflects the intensity of light emitted (or reflected) from the surface of a luminous body (or reflector). Hue refers to the general tendency of the colors in a picture, its overall color effect. Saturation, also called color purity, refers to the vividness of a color and represents the proportion of chromatic components it contains; it increases with that proportion and is directly related to the illumination and to the surface structure of the photographed object. Since hue and saturation can both be represented by chrominance, this embodiment uses luminance and chrominance to represent the essential attributes of color.
The sensitivity of the human visual system to brightness is higher than that to chroma, and the present embodiment uses a weighting method to represent the color information of an image, that is, for different color images, the weight proportion of brightness and chroma is different, and the specific calculation relationship is as follows:
QC=αQL+βQU (10)
where QL is the luminance information of the image and QU is the chrominance information; α and β are the weights of luminance and chrominance respectively, set to the optimal values 1.063 and 0.937 according to experimental tests.
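A sketch of Eq. (10) under the assumption that RGB is converted to YUV with the ITU-R BT.601 analogue matrix (the patent does not state which conversion is used, and taking Q_Y and Q_U directly as the Y and U planes is likewise an assumption):

```python
import numpy as np

ALPHA, BETA = 1.063, 0.937  # luminance/chrominance weights from the patent

def rgb_to_yuv(rgb):
    """RGB (float, in [0, 1]) to YUV, assuming the ITU-R BT.601 analogue matrix."""
    m = np.array([[ 0.299,  0.587,  0.114],
                  [-0.147, -0.289,  0.436],
                  [ 0.615, -0.515, -0.100]])
    return rgb @ m.T

def color_function(rgb):
    """Q_C = alpha*Q_L + beta*Q_U per Eq. (10), with the per-pixel
    Y and U planes standing in for the luminance and chrominance terms."""
    yuv = rgb_to_yuv(rgb)
    return ALPHA * yuv[..., 0] + BETA * yuv[..., 1]
```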
Step two, respectively constructing quaternion matrixes of the original reference image and the distorted image to be evaluated, and performing singular value decomposition on the quaternion matrixes to obtain singular value eigenvectors of the image:
(a) quaternion
In 1843, hamilton (w.r. hamilton) a british mathematician created quaternions. A quaternionContains 4 parts, 1 real part plus 3 imaginary parts, the basic form of which is:
wherein q isr,qi,qj,qkFor four real numbers, primitivesSatisfies the following conditions:
Over the real number field, a quaternion matrix can be decomposed into an equivalent real matrix form.
The singular value decomposition theorem for quaternion matrices can be stated as follows: for any quaternion matrix Q^(q) with rank(Q^(q)) = r, there exist quaternion unitary matrices U^(q) and V^(q) such that

Q^(q) = U^(q) Λ V^(q)^H

where Λ = diag(λ_1, λ_2, ..., λ_r, 0, ..., 0), V^(q)^H denotes the conjugate transpose of V^(q), λ_i ∈ R, |λ_1| ≥ |λ_2| ≥ ... ≥ |λ_r| > 0, and the λ_i are the non-zero singular values.
(b) Quaternion representation
In this embodiment, the four pieces of feature information of the color image obtained by the above analysis are integrated into a quaternion form, as follows:
Q = Q_C + Q_L·i + Q_TE·j + Q_V·k (11)
where Q_C is the color information of the image, Q_L the spatial position information of the image, Q_TE the texture edge information of the image, and Q_V the local variance of the image.
In this way, an M × N color image can be regarded as a quaternion matrix. The singular value eigenvector of a quaternion matrix represents its energy characteristics, so the quaternion matrix obtained from a color image can likewise be used to represent the energy characteristics of that color image.
Since a quaternion matrix Q = Q_r + Q_i·i + Q_j·j + Q_k·k can be represented by an equivalent real matrix, this embodiment converts Q to its corresponding real matrix form for Singular Value Decomposition (SVD). Each quaternion matrix yields, through singular value decomposition, a singular value eigenvector whose elements are real numbers greater than 0. Since the theory of quaternion matrix singular value decomposition is well established, it is not repeated here for brevity.
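A minimal sketch of this conversion, assuming the standard 4M×4N real block representation of a quaternion matrix (the choice of block layout is one of several equivalent conventions). In the real SVD each quaternion singular value appears four times, so keeping every fourth value recovers the quaternion singular value vector:

```python
import numpy as np

def quaternion_singular_values(A, B, C, D):
    """Singular values of the quaternion matrix Q = A + B*i + C*j + D*k,
    computed through a real 4Mx4N block representation.  Each quaternion
    singular value appears with multiplicity 4 in the real SVD, so every
    fourth value of the (descending) spectrum is kept."""
    R = np.block([
        [A, -B, -C, -D],
        [B,  A, -D,  C],
        [C,  D,  A, -B],
        [D, -C,  B,  A],
    ])
    s = np.linalg.svd(R, compute_uv=False)  # sorted in descending order
    return s[::4]                           # one copy per quaternion singular value
```

Note that the real representation is 4x larger in each dimension, so this route costs roughly 64x the flops of a same-size real SVD; dedicated quaternion SVD algorithms avoid that overhead.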
Step three, measuring the distortion degree of the image using the Euclidean distance between the singular value feature vectors of the original reference image and the distorted image to be evaluated:

This embodiment measures image distortion by the Euclidean distance between the singular value feature vectors of the original reference image and the distorted image to be evaluated, i.e.

D = sqrt( Σ_{i=1}^{K} (λ_i - λ̂_i)^2 )

where λ_i and λ̂_i are the components of the computed singular value eigenvectors of the original reference image and the distorted image to be evaluated, and K is the smaller of the two eigenvector lengths, i.e., the minimum of the ranks of the two quaternion matrices.
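The distance measure above can be sketched directly, assuming (as stated) truncation to the smaller of the two singular value vector lengths:

```python
import numpy as np

def svd_distance(sv_ref, sv_dist):
    """Euclidean distance between the singular-value feature vectors of the
    reference and distorted images, truncated to the smaller length K."""
    k = min(len(sv_ref), len(sv_dist))
    diff = np.asarray(sv_ref[:k], dtype=float) - np.asarray(sv_dist[:k], dtype=float)
    return float(np.sqrt(np.sum(diff ** 2)))
```

A larger distance indicates a larger deviation of the distorted image's energy characteristics from those of the reference, i.e., stronger distortion.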
The color image quality evaluation method based on the HVS and quaternions combines the visual characteristics of the human eye with quaternions, so the evaluation result agrees more closely with how the human eye perceives an image. It improves on the traditional approach of splitting the R, G, B channels, preserving the integrity of the color information, and the extracted image information includes both global and local information, so the evaluation result represents the image more completely. The evaluation result is superior to SSIM and other typical image quality evaluation algorithms. The experimental results of this embodiment are analyzed from two aspects:
1) nonlinear fitting of the proposed quality evaluation method and of PSNR, SSIM, MS-SSIM, Y-SVD, and GMSD against the DMOS values; 2) performance comparison of the proposed quality evaluation method with PSNR, SSIM, MS-SSIM, Y-SVD, and GMSD.
The quality evaluation pictures come from the Image Quality Assessment Database (Release 2) provided by the Laboratory for Image and Video Engineering (LIVE) at the University of Texas at Austin; its 982 images cover five distortion types: JPEG2000, JPEG, white Gaussian noise, Gaussian blur, and fast-fading Rayleigh channel distortion. When comparing algorithms, the scores produced by different algorithms differ in scale and units, so the raw objective quality scores produced by each algorithm under evaluation are mapped by nonlinear regression, using a logistic function as the nonlinear mapping function.
In the logistic function, x is the raw quality score produced by the algorithm to be evaluated, and α_1, α_2, α_3, α_4 are parameters adaptively adjusted during the nonlinear regression. The indexes used to quantify the evaluation results are the widely recognized and frequently cited MAE, RMSE, CC, SROCC, and OR.
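Assuming the common four-parameter logistic mapping used in image quality assessment studies (an assumption; the patent's exact expression is not reproduced in this excerpt), the mapping can be sketched as:

```python
import math

def logistic4(x, a1, a2, a3, a4):
    """Common four-parameter logistic mapping from IQA studies (assumed
    form, not the patent's verbatim formula).  Maps a raw objective score
    x toward the subjective (DMOS) scale: tends to a1 for large x, to a2
    for small x, with midpoint a3 and slope controlled by a4."""
    return (a1 - a2) / (1.0 + math.exp(-(x - a3) / a4)) + a2
```

The parameters a1..a4 correspond to the α_1..α_4 adjusted during the nonlinear regression; at x = a3 the mapping returns the midpoint (a1 + a2) / 2.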
1) The Mean Absolute Error (MAE) between the subjective and objective scores after nonlinear regression reflects the mean error level between the objective and subjective quality evaluation results; the smaller it is, the more accurate the image quality evaluation result. It is defined as

MAE = (1/N) Σ_{i=1}^{N} |x_i - y_i|
2) The Root Mean Square Error (RMSE) between the subjective and objective scores after nonlinear regression reflects the accuracy of the objective evaluation result; the smaller the error, the more accurate the image quality evaluation result. It is defined as

RMSE = sqrt( (1/N) Σ_{i=1}^{N} (x_i - y_i)^2 )
3) The Pearson linear Correlation Coefficient (CC) between the subjective and objective scores after nonlinear regression reflects the consistency and accuracy of the objective evaluation result; its range is [-1, 1], and the closer its absolute value is to 1, the better the correlation between the subjective and objective evaluation methods. It is defined as

CC = Σ_{i=1}^{N} (x_i - x̄)(y_i - ȳ) / sqrt( Σ_{i=1}^{N} (x_i - x̄)^2 · Σ_{i=1}^{N} (y_i - ȳ)^2 )
4) The Spearman Rank Order Correlation Coefficient (SROCC) between the subjective and objective scores after nonlinear regression is a widely used nonparametric statistic that reflects the monotonicity between the objective and subjective quality evaluation results; its range is [-1, 1], and the closer its absolute value is to 1, the better the consistency of the subjective and objective evaluation methods. It is defined as

SROCC = 1 - 6 Σ_{i=1}^{N} (u_i - v_i)^2 / (N(N^2 - 1))
5) The Outlier Ratio (OR) between the subjective and objective scores after nonlinear regression reflects the stability and predictability of the objective evaluation model; its range is [0, 1], and the smaller the value, the better the subjective-objective consistency and the predictive power of the evaluation model. It is defined as

OR = N_out / N
where N is the total number of images in the database, i.e., 982; x_i and y_i denote the objective and subjective evaluation values of the i-th image after nonlinear regression; u_i and v_i denote the ranks of the subjective and objective evaluation values of the i-th image among all evaluation values in the database; and N_out is the number of images whose objective evaluation value deviates from the subjective evaluation value by more than twice the standard deviation of the subjective evaluation values.
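The five indexes can be computed with numpy as follows. Two simplifications are assumed here: ties in the Spearman ranks are broken by sort order (sufficient for distinct scores), and the outlier threshold is taken as twice the standard deviation of the subjective scores, per the definition above.

```python
import numpy as np

def rank(a):
    """Ranks 1..N by value (ties broken by sort order; an assumption
    adequate for distinct scores)."""
    order = np.argsort(a)
    r = np.empty(len(a), dtype=np.int64)
    r[order] = np.arange(1, len(a) + 1)
    return r

def iqa_metrics(x, y):
    """MAE, RMSE, CC, SROCC and outlier ratio OR between regressed
    objective scores x and subjective DMOS values y."""
    x = np.asarray(x, dtype=float)
    y = np.asarray(y, dtype=float)
    err = x - y
    mae = np.abs(err).mean()
    rmse = np.sqrt((err ** 2).mean())
    cc = np.corrcoef(x, y)[0, 1]
    srocc = np.corrcoef(rank(x), rank(y))[0, 1]
    # Outlier: error exceeding twice the std of the subjective scores
    outlier = np.abs(err) > 2.0 * y.std()
    return {"MAE": mae, "RMSE": rmse, "CC": cc,
            "SROCC": srocc, "OR": outlier.mean()}
```

With perfectly matching subjective and objective scores the result is MAE = RMSE = OR = 0 and CC = SROCC = 1, the ideal case against which the algorithms in Table 1 are compared.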
Fig. 3 shows scatter diagrams of each algorithm against the subjective DMOS values. Each point represents an image: the abscissa is the algorithm's objective quality score for that image, the ordinate is the image's subjective DMOS value, and the solid line is the fitted curve. The closer the scatter points lie to the fitted curve, the better the algorithm agrees with the subjective evaluation results. The 982 scatter points of the proposed method lie closest to the fitted curve, showing that after nonlinear fitting the proposed method performs better than the other four methods.
TABLE 1. Performance comparison of image quality evaluation methods on the LIVE image database
The experimental data in Table 1 show that the proposed quality evaluation method performs best on all evaluation indexes: it has the smallest mean absolute error and root mean square error, the highest correlation with subjective visual perception, and the lowest outlier ratio. Because the PSNR algorithm ignores the correlation between pixels and treats every pixel equally, its overall performance is the worst of the six compared algorithms. The SSIM algorithm evaluates image quality using the structural information of the image, which relates to the human visual perception mechanism; the MS-SSIM algorithm adds multi-resolution analysis on top of SSIM to evaluate quality at multiple scales, so its performance exceeds PSNR and SSIM. The Y-SVD algorithm, which applies singular value decomposition only to the luminance component, already clearly outperforms PSNR, indicating that SVD-based approaches have inherent advantages. Visually, the curve obtained by nonlinear fitting of the GMSD algorithm to the DMOS values is closest to a straight line, but some scatter points are dispersed far from the curve. The last row of Table 1 shows that the proposed quality evaluation method is significantly superior to the conventional PSNR, structural similarity (SSIM), multi-scale structural similarity (MS-SSIM), singular value decomposition (SVD), and gradient magnitude similarity deviation (GMSD) algorithms, indicating that the proposed quaternion- and HVS-based image quality evaluation algorithm better reflects the subjective visual perception of images by the human eye.
Because the 982 images comprise 5 sub-libraries with different distortion types, to further demonstrate the superiority of the proposed quality evaluation method, this embodiment compares the performance of the HVS-QSVD algorithm with the GMSD and SSIM algorithms on each of the 5 sub-libraries. As shown in Fig. 4, the graphs are arranged in five groups of three, each group giving the nonlinear fitting plots of the HVS-QSVD, GMSD, and SSIM algorithms. Groups (a) through (e) of Fig. 4 correspond to JPEG2000, JPEG, white Gaussian noise, Gaussian blur, and fast-fading Rayleigh channel distortion, respectively. It can be seen that under every distortion type the proposed HVS-QSVD algorithm fits the subjective evaluation values better than GMSD and SSIM.
In the color image quality evaluation method based on the HVS and quaternions described in Embodiment 1, a mathematical model is constructed from the visual characteristics of the human eye so that the evaluation result agrees more closely with human perception, and quaternion singular value decomposition is used to extract the feature information of the image, improving on the traditional approach of splitting the R, G, B channels. The experimental results show that the evaluation result is more consistent with how the human eye perceives the image.
The present invention and its embodiments have been described above schematically and without limitation; the drawings show only one embodiment of the invention, and the invention is not limited thereto. Therefore, structural modes and embodiments similar to the technical solution, designed by those skilled in the art under the teaching of the invention without departing from its spirit and without inventive effort, shall fall within the scope of protection of the invention.
Claims (7)
1. A color image quality evaluation method based on HVS and quaternion comprises the following steps:
step one, constructing a mathematical evaluation model of an original reference image and a distorted image to be evaluated by analyzing human visual characteristics, wherein the mathematical evaluation model comprises a spatial position function Q_L of the image, a local variance Q_V, a texture edge complexity function Q_TE, and a color function Q_C;
step two, taking Q_L, Q_V, and Q_TE as the three imaginary parts of a quaternion and Q_C as the real part, respectively constructing quaternion matrices of the original reference image and the distorted image to be evaluated, and performing singular value decomposition on the quaternion matrices to obtain the singular value eigenvectors of the images;
and step three, measuring the image distortion degree using the Euclidean distance between the singular value feature vectors of the original reference image and the distorted image to be evaluated.
2. A HVS and quaternion based color image quality assessment method according to claim 1, characterized by: the specific process of constructing the mathematical evaluation model comprises the following steps:
(1) acquiring RGB tristimulus values of an original reference image and a distorted image to be evaluated;
(2) extracting the spatial position information of the original reference image and the distorted image to be evaluated, and constructing the spatial position function Q_L and the texture edge complexity function Q_TE;
(3) converting the original reference image and the distorted image to be evaluated from RGB space to YUV color space, extracting image luminance information to construct the local variance Q_V, and extracting image luminance and chrominance information to construct the color function Q_C.
3. The HVS and quaternion based color image quality assessment method according to claim 2, characterized in that: in step one, the spatial position function Q_L is constructed using the foveal characteristic of the human visual system. In the formula for Q_L, e_L is the quotient of the distance from the observed pixel point (i, j) to the image center pixel point (M/2, N/2) and d; e_c is a constant.
4. An HVS and quaternion based color image quality assessment method according to claim 2 or 3, characterized in that: in step one, the texture edge complexity function Q_TE is constructed using the masking effect of the human visual system, said texture edge complexity function being

Q_TE = Q_T × Q_E

where Q_T is the texture complexity function of pixel point (i, j) and Q_E is the edge complexity function of pixel point (i, j).
5. The HVS and quaternion-based color image quality evaluation method according to claim 4, characterized in that: in step one, the local variance Q_V is constructed using the multichannel characteristic of the human visual system, wherein non-overlapping blocks I_{i,j} are obtained by partitioning the luminance component of the image, L is the number of pixel points η_p contained in an image block I_{i,j}, and the block mean is

Ī_{i,j} = (1/L) Σ_{p=1}^{L} η_p.
6. The HVS and quaternion based color image quality assessment method according to claim 5, characterized in that: said color function is

Q_C = αQ_L + βQ_U

where Q_L is the luminance information of the image, Q_U is the chrominance information of the image, and α and β are the weights of luminance and chrominance, respectively.
7. The HVS and quaternion based color image quality assessment method according to claim 6, characterized in that: the Euclidean distance in step three is

D = sqrt( Σ_{i=1}^{K} (λ_i - λ̂_i)^2 )

where λ_i is a component of the singular value eigenvector of the original reference image, λ̂_i is the corresponding component for the distorted image to be evaluated, and K is the minimum of the lengths of the two singular value eigenvectors, i.e., the minimum of the ranks of the two quaternion matrices.
Priority Applications (1)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
CN201410650245.9A CN104361593B (en) | 2014-11-14 | 2014-11-14 | A kind of color image quality evaluation method based on HVS and quaternary number |
Publications (2)
Publication Number | Publication Date |
---|---|
CN104361593A true CN104361593A (en) | 2015-02-18 |
CN104361593B CN104361593B (en) | 2017-09-19 |
Family
ID=52528851
Family Applications (1)
Application Number | Title | Priority Date | Filing Date |
---|---|---|---|
CN201410650245.9A Active CN104361593B (en) | 2014-11-14 | 2014-11-14 | A kind of color image quality evaluation method based on HVS and quaternary number |
Country Status (1)
Country | Link |
---|---|
CN (1) | CN104361593B (en) |
Cited By (15)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
CN105528776A (en) * | 2015-08-07 | 2016-04-27 | 上海仙梦软件技术有限公司 | SDP quality evaluation method for image format JPEG |
CN105574854A (en) * | 2015-12-10 | 2016-05-11 | 小米科技有限责任公司 | Method and device for determining image oneness |
CN106600597A (en) * | 2016-12-22 | 2017-04-26 | 华中科技大学 | Non-reference color image quality evaluation method based on local binary pattern |
CN106683082A (en) * | 2016-12-19 | 2017-05-17 | 华中科技大学 | Method for evaluating quality of full reference color image based on quaternion |
CN107862678A (en) * | 2017-10-19 | 2018-03-30 | 宁波大学 | A kind of eye fundus image reference-free quality evaluation method |
WO2018140158A1 (en) * | 2017-01-30 | 2018-08-02 | Euclid Discoveries, Llc | Video characterization for smart enconding based on perceptual quality optimization |
CN109191431A (en) * | 2018-07-27 | 2019-01-11 | 天津大学 | High dynamic color image quality evaluation method based on characteristic similarity |
CN109345520A (en) * | 2018-09-20 | 2019-02-15 | 江苏商贸职业学院 | A kind of quality evaluating method of image definition |
CN109389591A (en) * | 2018-09-30 | 2019-02-26 | 西安电子科技大学 | Color image quality evaluation method based on colored description |
CN109643125A (en) * | 2016-06-28 | 2019-04-16 | 柯尼亚塔有限公司 | For training the 3D virtual world true to nature of automated driving system to create and simulation |
CN109903247A (en) * | 2019-02-22 | 2019-06-18 | 西安工程大学 | Color image high accuracy grey scale method based on Gauss color space correlation |
CN110793472A (en) * | 2019-11-11 | 2020-02-14 | 桂林理工大学 | Grinding surface roughness detection method based on quaternion singular value entropy index |
CN112771570A (en) * | 2018-08-29 | 2021-05-07 | 瑞典爱立信有限公司 | Video fidelity metric |
CN112950723A (en) * | 2021-03-05 | 2021-06-11 | 湖南大学 | Robot camera calibration method based on edge scale self-adaptive defocus fuzzy estimation |
CN116152249A (en) * | 2023-04-20 | 2023-05-23 | 济宁立德印务有限公司 | Intelligent digital printing quality detection method |
Families Citing this family (1)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
WO2020043280A1 (en) * | 2018-08-29 | 2020-03-05 | Telefonaktiebolaget Lm Ericsson (Publ) | Image fidelity measure |
Citations (2)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
US20030076990A1 (en) * | 2001-08-08 | 2003-04-24 | Mitsubishi Electric Research Laboratories, Inc. | Rendering deformable 3D models recovered from videos |
CN1897634A (en) * | 2006-06-08 | 2007-01-17 | 复旦大学 | Image-quality estimation based on supercomplex singular-value decomposition |
Non-Patent Citations (3)
Title |
---|
REN TONGQUN等: "3-D Free-form Shape Measuring System Using Unconstrained Range Sensor", 《CHINESE JOURNAL OF MECHANICAL ENGINEERING》 * |
HE Yeming et al.: "Quaternion model for video quality evaluation based on HVS feature parameter extraction", Computer Applications and Software * |
WANG Yuqing et al.: "Color image quality evaluation method based on quaternions", Journal of North University of China (Natural Science Edition) * |
Cited By (27)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
CN105528776A (en) * | 2015-08-07 | 2016-04-27 | 上海仙梦软件技术有限公司 | SDP quality evaluation method for image format JPEG |
CN105574854A (en) * | 2015-12-10 | 2016-05-11 | 小米科技有限责任公司 | Method and device for determining image oneness |
CN105574854B (en) * | 2015-12-10 | 2019-02-12 | 小米科技有限责任公司 | Determine the monistic method and device of image |
CN109643125A (en) * | 2016-06-28 | 2019-04-16 | 柯尼亚塔有限公司 | For training the 3D virtual world true to nature of automated driving system to create and simulation |
CN109643125B (en) * | 2016-06-28 | 2022-11-15 | 柯尼亚塔有限公司 | Realistic 3D virtual world creation and simulation for training an autonomous driving system |
CN106683082A (en) * | 2016-12-19 | 2017-05-17 | 华中科技大学 | Method for evaluating quality of full reference color image based on quaternion |
CN106683082B (en) * | 2016-12-19 | 2019-08-13 | 华中科技大学 | It is a kind of complete with reference to color image quality evaluation method based on quaternary number |
CN106600597B (en) * | 2016-12-22 | 2019-04-12 | 华中科技大学 | It is a kind of based on local binary patterns without reference color image quality evaluation method |
CN106600597A (en) * | 2016-12-22 | 2017-04-26 | 华中科技大学 | Non-reference color image quality evaluation method based on local binary pattern |
US11228766B2 (en) | 2017-01-30 | 2022-01-18 | Euclid Discoveries, Llc | Dynamic scaling for consistent video quality in multi-frame size encoding |
US11350105B2 (en) | 2017-01-30 | 2022-05-31 | Euclid Discoveries, Llc | Selection of video quality metrics and models to optimize bitrate savings in video encoding applications |
WO2018140158A1 (en) * | 2017-01-30 | 2018-08-02 | Euclid Discoveries, Llc | Video characterization for smart enconding based on perceptual quality optimization |
US11159801B2 (en) | 2017-01-30 | 2021-10-26 | Euclid Discoveries, Llc | Video characterization for smart encoding based on perceptual quality optimization |
US10757419B2 (en) | 2017-01-30 | 2020-08-25 | Euclid Discoveries, Llc | Video characterization for smart encoding based on perceptual quality optimization |
CN107862678A (en) * | 2017-10-19 | 2018-03-30 | 宁波大学 | A kind of eye fundus image reference-free quality evaluation method |
CN107862678B (en) * | 2017-10-19 | 2020-03-17 | 宁波大学 | Fundus image non-reference quality evaluation method |
CN109191431A (en) * | 2018-07-27 | 2019-01-11 | 天津大学 | High dynamic color image quality evaluation method based on characteristic similarity |
CN112771570A (en) * | 2018-08-29 | 2021-05-07 | 瑞典爱立信有限公司 | Video fidelity metric |
US11394978B2 (en) | 2018-08-29 | 2022-07-19 | Telefonaktiebolaget Lm Ericsson (Publ) | Video fidelity measure |
CN109345520A (en) * | 2018-09-20 | 2019-02-15 | 江苏商贸职业学院 | A kind of quality evaluating method of image definition |
CN109389591B (en) * | 2018-09-30 | 2020-11-20 | 西安电子科技大学 | Color descriptor-based color image quality evaluation method |
CN109389591A (en) * | 2018-09-30 | 2019-02-26 | 西安电子科技大学 | Color image quality evaluation method based on colored description |
CN109903247A (en) * | 2019-02-22 | 2019-06-18 | 西安工程大学 | Color image high accuracy grey scale method based on Gauss color space correlation |
CN110793472B (en) * | 2019-11-11 | 2021-07-27 | 桂林理工大学 | Grinding surface roughness detection method based on quaternion singular value entropy index |
CN110793472A (en) * | 2019-11-11 | 2020-02-14 | 桂林理工大学 | Grinding surface roughness detection method based on quaternion singular value entropy index |
CN112950723A (en) * | 2021-03-05 | 2021-06-11 | 湖南大学 | Robot camera calibration method based on edge scale self-adaptive defocus fuzzy estimation |
CN116152249A (en) * | 2023-04-20 | 2023-05-23 | 济宁立德印务有限公司 | Intelligent digital printing quality detection method |
Also Published As
Publication number | Publication date |
---|---|
CN104361593B (en) | 2017-09-19 |
Similar Documents
Publication | Publication Date | Title |
---|---|---|
CN104361593B (en) | A kind of color image quality evaluation method based on HVS and quaternary number | |
CN111709902B (en) | Infrared and visible light image fusion method based on self-attention mechanism | |
CN110046673B (en) | No-reference tone mapping image quality evaluation method based on multi-feature fusion | |
Panetta et al. | No reference color image contrast and quality measures | |
Panetta et al. | Human-visual-system-inspired underwater image quality measures | |
Yue et al. | Biologically inspired blind quality assessment of tone-mapped images | |
Mohammadi et al. | Subjective and objective quality assessment of image: A survey | |
Zheng et al. | Qualitative and quantitative comparisons of multispectral night vision colorization techniques | |
Amirshahi et al. | Image quality assessment by comparing CNN features between images | |
Lee et al. | Toward a no-reference image quality assessment using statistics of perceptual color descriptors | |
He et al. | Image quality assessment based on S-CIELAB model | |
CN109255358B (en) | 3D image quality evaluation method based on visual saliency and depth map | |
Yue et al. | Blind stereoscopic 3D image quality assessment via analysis of naturalness, structure, and binocular asymmetry | |
CN106934770B (en) | A kind of method and apparatus for evaluating haze image defog effect | |
CN103780895B (en) | A kind of three-dimensional video quality evaluation method | |
Liu et al. | No-reference image quality assessment method based on visual parameters | |
Geng et al. | A stereoscopic image quality assessment model based on independent component analysis and binocular fusion property | |
Ernawan et al. | TMT quantization table generation based on psychovisual threshold for image compression | |
CN108010023B (en) | High dynamic range image quality evaluation method based on tensor domain curvature analysis | |
Chen et al. | Blind stereo image quality assessment based on binocular visual characteristics and depth perception | |
Chang et al. | Sparse correlation coefficient for objective image quality assessment | |
CN106960432B (en) | A kind of no reference stereo image quality evaluation method | |
Toprak et al. | A new full-reference image quality metric based on just noticeable difference | |
Jadhav et al. | Performance evaluation of structural similarity index metric in different colorspaces for HVS based assessment of quality of colour images | |
Yuan et al. | Color image quality assessment with multi deep convolutional networks |
Legal Events
Date | Code | Title | Description |
---|---|---|---|
C06 | Publication | ||
PB01 | Publication | ||
C10 | Entry into substantive examination | ||
SE01 | Entry into force of request for substantive examination | ||
GR01 | Patent grant | ||