CN104282019A - Blind image quality evaluation method based on natural scene statistics and perceived quality propagation - Google Patents

Blind image quality evaluation method based on natural scene statistics and perceived quality propagation

Info

Publication number
CN104282019A
CN104282019A, CN201410473339A, CN201410473339.3A
Authority
CN
China
Prior art keywords
image
test image
quality
sampling
undistorted
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Granted
Application number
CN201410473339.3A
Other languages
Chinese (zh)
Other versions
CN104282019B (en)
Inventor
李宏亮
吴庆波
熊健
李威
罗冰
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
University of Electronic Science and Technology of China
Original Assignee
University of Electronic Science and Technology of China
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by University of Electronic Science and Technology of China filed Critical University of Electronic Science and Technology of China
Priority to CN201410473339.3A priority Critical patent/CN104282019B/en
Publication of CN104282019A publication Critical patent/CN104282019A/en
Application granted granted Critical
Publication of CN104282019B publication Critical patent/CN104282019B/en
Expired - Fee Related legal-status Critical Current
Anticipated expiration legal-status Critical


Classifications

    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06T IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T7/00 Image analysis
    • G06T7/0002 Inspection of images, e.g. flaw detection
    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06T IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T2207/00 Indexing scheme for image analysis or image enhancement
    • G06T2207/30 Subject of image; Context of image processing
    • G06T2207/30168 Image quality inspection

Landscapes

  • Engineering & Computer Science (AREA)
  • Quality & Reliability (AREA)
  • Computer Vision & Pattern Recognition (AREA)
  • Physics & Mathematics (AREA)
  • General Physics & Mathematics (AREA)
  • Theoretical Computer Science (AREA)
  • Testing, Inspecting, Measuring Of Stereoscopic Televisions And Televisions (AREA)
  • Image Analysis (AREA)

Abstract

The invention provides a blind image quality evaluation method based on natural scene statistics and perceived quality propagation. The method computes the Fields of Experts (FoE) gradient responses of a test image and of a large number of undistorted natural images and builds histograms of these responses; the KL divergence between the test-image histogram and the undistorted-image histograms gives the absolute distortion level of the test image. Quality-aware features are then extracted from the test image and from a set of labeled distorted images, the N labeled images most similar to the test image are found from the chi-square distances between features, and the weighted sum of their quality scores gives the relative distortion level of the test image. The final predicted image quality score is obtained by combining the two predicted scores. Compared with existing representative no-reference image quality evaluation methods, the method is simple and efficient and does not require a large number of manually labeled samples.

Description

Blind image quality evaluation method based on natural scene statistics and perceived quality propagation
Technical field
The present invention relates to image processing technology, and in particular to perceptual visual signal processing.
Background art
An efficient image perceptual quality evaluation method is a key technology for monitoring the quality of multimedia services. At present, reliable image quality evaluation methods are mainly of the full-reference and reduced-reference types, which require access to information from the undistorted original image. In many application environments, however, this requirement cannot be met.
Blind (no-reference) image quality evaluation methods can predict perceived quality from the distorted image alone. Existing blind methods usually rely on supervised learning, such as support vector regression, to directly train a black-box mapping from image features to perceived quality scores, which is then used to estimate image quality. To ensure the robustness of the model, these methods need a large number of manually labeled images for training. Moreover, because the mapping is a black box, these methods cannot clearly describe the relationship between image features and perceived quality.
Summary of the invention
The technical problem to be solved by the present invention is to provide a blind (no-reference) image quality evaluation method that can describe the relationship between image features and quality scores.
The technical solution adopted by the present invention to solve the above problem is a blind image quality evaluation method based on natural scene statistics and perceived quality propagation, comprising the following steps:
Step 1) Natural scene statistics:
Compute the Fields of Experts (FoE) gradient responses of the test image and of all undistorted images, and build histograms to obtain the FoE gradient response histogram distribution $P_d$ of the test image and the FoE response histogram distribution $P_u(i)$ of the i-th undistorted image, $i = 1, 2, \ldots, K$, where $K$ is the total number of undistorted images. Compute the relative entropy (KL divergence) between the test image and the undistorted images:

$Q_{NSS} = \sum_{i=1}^{K} P_d \log_2\left(\frac{P_d}{P_u(i)}\right)$
Step 2) Perceived quality propagation:
1-1: Extract an image texture feature, a global gradient feature, and a down-sampling-based boundary strength feature from the test image. The down-sampling-based boundary strength feature is extracted as follows: down-sample the image by a factor of 1/8; represent the boundary strength value of each point of the down-sampled image by the maximum of its gradients in the vertical and horizontal directions; then build a histogram of the boundary strength values of all points; the normalized histogram is the down-sampling-based boundary strength feature.
1-2: Compute the feature chi-square distance $D$ between the test image and each labeled distorted image, a labeled distorted image being a distorted image whose quality has been scored manually: $D = \sum_i d(F_i^q, F_i^r)$, where $F_i^q$ and $F_i^r$ denote the i-th feature vector of the test image and of the labeled distorted image, respectively, and $i = \{1, 2, 3\}$ corresponds to the image texture feature, the global gradient feature, and the down-sampling-based boundary strength feature.
1-3: Select the first N labeled distorted images in order of increasing feature chi-square distance $D$, and compute the corresponding weights $w_n$ from these N feature chi-square distances, where $d_n$ denotes the feature chi-square distance between the test image and the n-th selected labeled distorted image (ordered from smallest to largest).
1-4: Use the weights $w_n$ to form the weighted sum of the quality scores $DMOS_n$ of the N labeled distorted images, giving the predicted score $Q_{PQP}$ of the test image:

$Q_{PQP} = \sum_{n=1}^{N} w_n \cdot DMOS_n$
Step 3) Set weight parameters for the relative entropy (KL divergence) $Q_{NSS}$ and the predicted score $Q_{PQP}$, and combine $Q_{NSS}$ and $Q_{PQP}$ according to these weights to obtain the final predicted score $Q$.
The present invention computes the Fields of Experts (FoE) gradient responses of the test image and of a large number of undistorted natural images, and builds the response histogram distributions of the test image and of all undistorted images. The KL divergence between the two distributions yields the absolute distortion level of the test image. Next, quality-aware features are extracted from the test image and from the labeled distorted images, and the N labeled images most similar to the test image are found according to the chi-square distances between features. The weighted sum of the quality scores of these labeled images yields the relative distortion level of the test image. Finally, the final predicted image quality score is obtained by combining the two predicted scores.
The beneficial effect of the invention is that, compared with existing representative no-reference image quality evaluation methods, the method is simple and efficient and does not require a large number of manually labeled samples.
Brief description of the drawings
Fig. 1: Schematic block diagram of the present invention.
Detailed description of the embodiments
To evaluate the quality of a no-reference image effectively, the present invention consists of three steps: a natural scene statistics step, a perceived quality propagation step, and a combined scoring step. The natural scene statistics step estimates the absolute distortion by comparing the statistical differences between the test image and a large number of undistorted natural images. The quality propagation step estimates the relative distortion by propagating the quality scores of a set of labeled images to the similar test image. Finally, the predictions of the two modules are combined into the final predicted score.
This embodiment is implemented on the MATLAB 2009b software platform, as shown in Fig. 1:
Step 1: Compute the FoE gradient responses of the test image and of the selected undistorted images, and build the histogram distributions of each. Let $P_d$ denote the response distribution of the test image and $P_u$ the response distribution of all the undistorted images. The quality score of the natural scene statistics module is then the KL divergence between the two, namely

$Q_{NSS} = \sum_{i=1}^{K} P_d \log_2\left(\frac{P_d}{P_u(i)}\right)$

When the test image is distorted, its FoE gradient response distribution differs from that of the undistorted images: when the test image is blurred, its FoE gradient response distribution is flatter than that of the undistorted images, and when the test image is noisy, its FoE gradient response distribution exhibits a sharp peak.
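As an illustration of how $Q_{NSS}$ can be computed, the following sketch builds the response histograms and accumulates the KL-divergence score over the K undistorted-image histograms. It is written in Python/NumPy rather than the MATLAB of the embodiment, it assumes the FoE filter responses are already available, and the bin count and value range are illustrative assumptions not fixed by the patent.

```python
import numpy as np

def response_histogram(response_map, bins=64, rng=(-1.0, 1.0)):
    """Histogram of an FoE gradient-response map, normalized to sum to 1.

    All images must share the same bin grid (bins, rng), otherwise the
    KL divergence between histograms is not well defined.
    """
    hist, _ = np.histogram(response_map, bins=bins, range=rng)
    hist = hist.astype(np.float64) + 1e-12      # avoid log(0) and division by zero
    return hist / hist.sum()

def q_nss(p_d, p_u_list):
    """Q_NSS = sum over the K undistorted images of KL(P_d || P_u(i)), in bits."""
    return float(sum(np.sum(p_d * np.log2(p_d / p_u)) for p_u in p_u_list))

# Usage sketch: `foe_response` would be the FoE filter response of the test image,
# `undistorted_responses` the responses of the K undistorted natural images.
# p_d = response_histogram(foe_response)
# p_u_list = [response_histogram(r) for r in undistorted_responses]
# score = q_nss(p_d, p_u_list)
```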
Step 2: Perceived quality propagation consists of the following three sub-steps.
First step: Extract quality-sensitive features from the test image, comprising an image texture feature, a global gradient feature, and a down-sampling-based boundary strength (DSBS) feature. The image texture feature can be instantiated with the SFTA (Segmentation-based Fractal Texture Analysis) feature and the global gradient feature with the GIST feature. To extract the down-sampling-based boundary strength feature, first down-sample the image by a factor of 1/8; the boundary strength of each point of the down-sampled image is represented by the maximum of its gradients in the vertical and horizontal directions. Then build a histogram of the boundary strength values of all points; the normalized histogram is the DSBS feature vector. The DSBS feature reflects the blocking artifacts introduced by image compression.
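A minimal Python/NumPy sketch of the DSBS feature extraction described above; the decimation-based 1/8 down-sampling, the gradient operator, and the 32-bin histogram are illustrative assumptions, since the patent does not fix these details:

```python
import numpy as np

def dsbs_feature(gray_image, bins=32):
    """Down-sampling-based boundary strength (DSBS) feature, illustrative sketch.

    gray_image: 2-D array of grayscale intensities.
    """
    img = np.asarray(gray_image, dtype=np.float64)

    # 1/8 down-sampling by simple decimation (one possible 1/8 resampler).
    small = img[::8, ::8]

    # Boundary strength of each point = max of vertical and horizontal gradient magnitudes.
    gy, gx = np.gradient(small)
    boundary_strength = np.maximum(np.abs(gx), np.abs(gy))

    # Histogram of all boundary-strength values, normalized to sum to 1.
    hist, _ = np.histogram(boundary_strength, bins=bins)
    hist = hist.astype(np.float64)
    return hist / (hist.sum() + 1e-12)
```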
Second step: Let $F_i^q$ and $F_i^r$ denote the i-th feature vector of the test image and of a labeled image, respectively, and let $d(F_i^q, F_i^r)$ denote the chi-square distance between them. The total feature distance between the test image and a labeled image can then be expressed as $D = \sum_{i=1}^{3} d(F_i^q, F_i^r)$, where $i = \{1, 2, 3\}$ corresponds to the SFTA, GIST, and DSBS features, respectively.
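The per-feature chi-square distance and the total distance D might be computed as follows; the symmetric chi-square form and the plain sum over the three per-feature distances are assumptions, since the patent does not spell out the exact variant it uses:

```python
import numpy as np

def chi_square(f_q, f_r, eps=1e-12):
    """Symmetric chi-square distance between two (histogram-like) feature vectors."""
    f_q = np.asarray(f_q, dtype=np.float64)
    f_r = np.asarray(f_r, dtype=np.float64)
    return 0.5 * float(np.sum((f_q - f_r) ** 2 / (f_q + f_r + eps)))

def total_feature_distance(test_features, labeled_features):
    """D = sum of chi-square distances over the SFTA, GIST and DSBS feature vectors."""
    return sum(chi_square(fq, fr) for fq, fr in zip(test_features, labeled_features))
```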
Third step: Find the 5 labeled images with the smallest total distance $D$, and compute the corresponding weights $w_n$ from their feature distances, where $d_n$ denotes the feature chi-square distance between the test image and the n-th selected labeled image (ordered from smallest to largest distance).
The predicted score of this module is then expressed as the weighted sum of the quality scores DMOS of the labeled images, where DMOS ranges from 0 to 100, 0 being the best quality and 100 the worst:

$Q_{PQP} = \sum_{n=1}^{N} w_n \cdot DMOS_n$
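The propagation step can then be sketched as below. The exact weighting function $w_n$ is not reproduced in this text, so the normalized inverse-distance weighting used here is purely an assumption; only the selection of the N closest labeled images and the weighted sum $Q_{PQP} = \sum_n w_n \cdot DMOS_n$ follow the description:

```python
import numpy as np

def q_pqp(distances, dmos_scores, n_neighbors=5, eps=1e-12):
    """Propagate quality from the N labeled images closest in feature distance.

    distances:   feature distances D between the test image and each labeled image.
    dmos_scores: DMOS of each labeled image (0 = best, 100 = worst).
    """
    distances = np.asarray(distances, dtype=np.float64)
    dmos_scores = np.asarray(dmos_scores, dtype=np.float64)

    order = np.argsort(distances)[:n_neighbors]     # N labeled images with smallest D
    d_n = distances[order]

    # Assumed weighting: closer labeled images get larger weights, normalized to sum to 1.
    w_n = 1.0 / (d_n + eps)
    w_n /= w_n.sum()

    return float(np.sum(w_n * dmos_scores[order]))  # Q_PQP = sum_n w_n * DMOS_n
```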
Step 3: The final predicted score is obtained by combining the quality scores of the first two steps:

$Q = Q_{NSS}^{\gamma} \cdot Q_{PQP}^{1-\gamma}$

where $\gamma$ is the weight parameter balancing the two modules; here it is set to 0.2.
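Combining the two module scores with γ = 0.2, as described above, reduces to a single expression (this sketch assumes both scores are positive, which holds for a KL divergence and a DMOS-based score):

```python
def final_score(q_nss, q_pqp, gamma=0.2):
    """Final prediction Q = Q_NSS**gamma * Q_PQP**(1 - gamma)."""
    return (q_nss ** gamma) * (q_pqp ** (1.0 - gamma))
```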

Claims (5)

1. A blind image quality evaluation method based on natural scene statistics and perceived quality propagation, characterized in that it comprises the following steps:
Step 1) Natural scene statistics:
Compute the Fields of Experts (FoE) gradient responses of the test image and of all undistorted images, and build histograms to obtain the FoE gradient response histogram distribution $P_d$ of the test image and the FoE response histogram distribution $P_u(i)$ of the i-th undistorted image, $i = 1, 2, \ldots, K$, where $K$ is the total number of undistorted images; compute the relative entropy (KL divergence) between the test image and the undistorted images:

$Q_{NSS} = \sum_{i=1}^{K} P_d \log_2\left(\frac{P_d}{P_u(i)}\right)$

Step 2) Perceived quality propagation:
1-1: extract an image texture feature, a global gradient feature, and a down-sampling-based boundary strength feature from the test image; the down-sampling-based boundary strength feature is extracted as follows: down-sample the image by a factor of 1/8, represent the boundary strength value of each point of the down-sampled image by the maximum of its gradients in the vertical and horizontal directions, then build a histogram of the boundary strength values of all points; the normalized histogram is the down-sampling-based boundary strength feature;
1-2: compute the feature chi-square distance $D$ between the test image and each labeled distorted image, a labeled distorted image being a distorted image whose quality has been scored manually, $D = \sum_i d(F_i^q, F_i^r)$, where $F_i^q$ and $F_i^r$ denote the i-th feature vector of the test image and of the labeled distorted image, respectively, and $i = \{1, 2, 3\}$ corresponds to the image texture feature, the global gradient feature, and the down-sampling-based boundary strength feature;
1-3: select the first N labeled distorted images in order of increasing feature chi-square distance $D$, and compute the corresponding weights $w_n$ from these N feature chi-square distances, where $d_n$ denotes the feature chi-square distance between the test image and the n-th selected labeled distorted image (ordered from smallest to largest);
1-4: use the weights $w_n$ to form the weighted sum of the quality scores $DMOS_n$ of the N labeled distorted images, obtaining the predicted score $Q_{PQP}$ of the test image:

$Q_{PQP} = \sum_{n=1}^{N} w_n \cdot DMOS_n$

Step 3) Set weight parameters for the relative entropy (KL divergence) $Q_{NSS}$ and the predicted score $Q_{PQP}$, and combine $Q_{NSS}$ and $Q_{PQP}$ according to these weights to obtain the final predicted score $Q$.
2. The blind image quality evaluation method based on natural scene statistics and perceived quality propagation according to claim 1, characterized in that the final predicted score is $Q = Q_{NSS}^{\gamma} \cdot Q_{PQP}^{1-\gamma}$, where $\gamma$ is the weight parameter.
3. The blind image quality evaluation method based on natural scene statistics and perceived quality propagation according to claim 2, characterized in that γ = 0.2.
4. The blind image quality evaluation method based on natural scene statistics and perceived quality propagation according to claim 1, characterized in that the down-sampling is 1/8 down-sampling.
5. The blind image quality evaluation method based on natural scene statistics and perceived quality propagation according to claim 1, characterized in that N = 5.
CN201410473339.3A 2014-09-16 2014-09-16 Blind image quality evaluation method based on natural scene statistics and perceived quality propagation Expired - Fee Related CN104282019B (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN201410473339.3A CN104282019B (en) 2014-09-16 2014-09-16 Blind image quality evaluation method based on natural scene statistics and perceived quality propagation

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
CN201410473339.3A CN104282019B (en) 2014-09-16 2014-09-16 Blind image quality evaluation method based on natural scene statistics and perceived quality propagation

Publications (2)

Publication Number Publication Date
CN104282019A true CN104282019A (en) 2015-01-14
CN104282019B CN104282019B (en) 2017-06-13

Family

ID=52256869

Family Applications (1)

Application Number Title Priority Date Filing Date
CN201410473339.3A Expired - Fee Related CN104282019B (en) 2014-09-16 2014-09-16 Blind image quality evaluation method based on natural scene statistics and perceived quality propagation

Country Status (1)

Country Link
CN (1) CN104282019B (en)

Cited By (6)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN104902277A (en) * 2015-06-08 2015-09-09 浙江科技学院 Non-reference image quality evaluation method based on monogenic binary coding
CN106815839A (en) * 2017-01-18 2017-06-09 中国科学院上海高等研究院 A kind of image quality blind evaluation method
WO2017107867A1 (en) * 2015-12-22 2017-06-29 成都理想境界科技有限公司 Image quality evaluation method and apparatus
CN109584242A (en) * 2018-11-24 2019-04-05 天津大学 Maximum entropy and KL divergence are without reference contrast distorted image quality evaluating method
CN109635142A (en) * 2018-11-15 2019-04-16 北京市商汤科技开发有限公司 Image-selecting method and device, electronic equipment and storage medium
CN111932521A (en) * 2020-08-13 2020-11-13 Oppo(重庆)智能科技有限公司 Image quality testing method and device, server and computer readable storage medium


Patent Citations (4)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
KR20090108388A (en) * 2008-04-11 2009-10-15 엔에이치엔(주) Method and System for Computing Quality Value of Image
CN102930545A (en) * 2012-11-07 2013-02-13 复旦大学 Statistical measure method for image quality blind estimation
CN103258326A (en) * 2013-04-19 2013-08-21 复旦大学 Information fidelity method for image quality blind evaluation
CN103778636A (en) * 2014-01-22 2014-05-07 上海交通大学 Feature construction method for non-reference image quality evaluation

Non-Patent Citations (2)

* Cited by examiner, † Cited by third party
Title
ANUSH KRISHNA MOORTHY, ET AL.: "Blind Image Quality Assessment: From Natural Scene Statistics to Perceptual Quality", 《IEEE TRANSACTIONS ON IMAGE PROCESSING》 *
QINGBO WU, ET AL.: "No Reference Image Quality Metric via Distortion Identification and Multi-Channel Label Transfer", 《2014 IEEE INTERNATIONAL SYMPOSIUM ON CIRCUITS AND SYSTEMS (ISCAS)》 *

Cited By (10)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN104902277A (en) * 2015-06-08 2015-09-09 浙江科技学院 Non-reference image quality evaluation method based on monogenic binary coding
WO2017107867A1 (en) * 2015-12-22 2017-06-29 成都理想境界科技有限公司 Image quality evaluation method and apparatus
CN106910180A (en) * 2015-12-22 2017-06-30 成都理想境界科技有限公司 A kind of image quality measure method and device
CN106910180B (en) * 2015-12-22 2019-08-20 成都理想境界科技有限公司 A kind of image quality measure method and device
CN106815839A (en) * 2017-01-18 2017-06-09 中国科学院上海高等研究院 A kind of image quality blind evaluation method
CN106815839B (en) * 2017-01-18 2019-11-15 中国科学院上海高等研究院 A kind of image quality blind evaluation method
CN109635142A (en) * 2018-11-15 2019-04-16 北京市商汤科技开发有限公司 Image-selecting method and device, electronic equipment and storage medium
CN109584242A (en) * 2018-11-24 2019-04-05 天津大学 Maximum entropy and KL divergence are without reference contrast distorted image quality evaluating method
CN111932521A (en) * 2020-08-13 2020-11-13 Oppo(重庆)智能科技有限公司 Image quality testing method and device, server and computer readable storage medium
CN111932521B (en) * 2020-08-13 2023-01-03 Oppo(重庆)智能科技有限公司 Image quality testing method and device, server and computer readable storage medium

Also Published As

Publication number Publication date
CN104282019B (en) 2017-06-13

Similar Documents

Publication Publication Date Title
CN103996192B (en) Non-reference image quality evaluation method based on high-quality natural image statistical magnitude model
CN104282019A (en) Blind image quality evaluation method based on natural scene statistics and perceived quality propagation
CN102333233B (en) Stereo image quality objective evaluation method based on visual perception
CN105208374B (en) A kind of non-reference picture assessment method for encoding quality based on deep learning
CN104902267B (en) No-reference image quality evaluation method based on gradient information
CN102421007B (en) Image quality evaluating method based on multi-scale structure similarity weighted aggregate
CN102413328B (en) Double compression detection method and system of joint photographic experts group (JPEG) image
CN103475898A (en) Non-reference image quality assessment method based on information entropy characters
CN103200421B (en) No-reference image quality evaluation method based on Curvelet transformation and phase coincidence
CN101819638B (en) Establishment method of pornographic detection model and pornographic detection method
CN106447646A (en) Quality blind evaluation method for unmanned aerial vehicle image
CN101561932B (en) Method and device for detecting real-time movement target under dynamic and complicated background
CN103955926A (en) Method for remote sensing image change detection based on Semi-NMF
CN104658001A (en) Non-reference asymmetric distorted stereo image objective quality assessment method
CN106447647A (en) No-reference quality evaluation method of compression perception recovery images
CN104268590A (en) Blind image quality evaluation method based on complementarity combination characteristics and multiphase regression
CN106097315A (en) A kind of underwater works crack extract method based on sonar image
CN110766658B (en) Non-reference laser interference image quality evaluation method
CN102722888A (en) Stereoscopic image objective quality evaluation method based on physiological and psychological stereoscopic vision
CN103745466A (en) Image quality evaluation method based on independent component analysis
CN105528776A (en) SDP quality evaluation method for image format JPEG
CN104680180A (en) Polarimetric SAR image classification method on basis of K-Means and sparse own coding
CN104376565A (en) Non-reference image quality evaluation method based on discrete cosine transform and sparse representation
CN103679719A (en) Image segmentation method
CN105427313A (en) Deconvolutional network and adaptive inference network based SAR image segmentation method

Legal Events

Date Code Title Description
C06 Publication
PB01 Publication
C10 Entry into substantive examination
SE01 Entry into force of request for substantive examination
GR01 Patent grant
CF01 Termination of patent right due to non-payment of annual fee

Granted publication date: 20170613

Termination date: 20190916