CN105741274A - No-reference image sharpness evaluation method based on high-level visual characteristics - Google Patents

No-reference image sharpness evaluation method based on high-level visual characteristics

Info

Publication number
CN105741274A
CN105741274A CN201610051353.3A
Authority
CN
China
Prior art keywords
change rate
map
image
high-level visual
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Granted
Application number
CN201610051353.3A
Other languages
Chinese (zh)
Other versions
CN105741274B (en)
Inventor
应凌楷
李子印
张聪聪
张刘刘
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
China Jiliang University
Original Assignee
China Jiliang University
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by China Jiliang University filed Critical China Jiliang University
Priority to CN201610051353.3A priority Critical patent/CN105741274B/en
Publication of CN105741274A publication Critical patent/CN105741274A/en
Application granted granted Critical
Publication of CN105741274B publication Critical patent/CN105741274B/en
Active legal-status Critical Current
Anticipated expiration legal-status Critical

Classifications

    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06TIMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T7/00Image analysis
    • G06T7/0002Inspection of images, e.g. flaw detection
    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06TIMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T2207/00Indexing scheme for image analysis or image enhancement
    • G06T2207/30Subject of image; Context of image processing
    • G06T2207/30168Image quality inspection

Landscapes

  • Engineering & Computer Science (AREA)
  • Quality & Reliability (AREA)
  • Computer Vision & Pattern Recognition (AREA)
  • Physics & Mathematics (AREA)
  • General Physics & Mathematics (AREA)
  • Theoretical Computer Science (AREA)
  • Compression Or Coding Systems Of Tv Signals (AREA)
  • Testing, Inspecting, Measuring Of Stereoscopic Televisions And Televisions (AREA)

Abstract

The present invention discloses a no-reference image sharpness evaluation method based on high-level visual characteristics, and relates to image quality assessment technology. The method first computes a color change-rate map over the pixels of the image, which describes both local sharpness and global structure information, and then simulates high-level visual activity by combining the psychovisual redundancy characteristic with the locally excitatory, globally inhibitory mechanism of neuronal activity. The proposed method agrees with subjective evaluation and is more accurate and robust than existing methods. Because optimized calculation formulas are used, the method is simple and efficient to implement and has low computational complexity, which gives it considerable value both for theoretical research such as image quality assessment and for practical engineering applications such as automated production.

Description

No-reference image sharpness evaluation method based on high-level visual characteristics
Technical field
The present invention relates to the field of image quality assessment, and in particular to a no-reference digital image sharpness assessment method.
Background art
A digital image can be distorted at every stage from acquisition, processing and storage to transmission and display. These distortions not only degrade the viewing experience but also impair image analysis and understanding algorithms at higher semantic levels, so image sharpness needs to be evaluated accurately and objectively. At the same time, sharpness is increasingly the leading indicator of the quality of a digital imaging system. Since the undistorted original corresponding to a distorted image is rarely available, no-reference sharpness evaluation has become a research topic in its own right and is currently both a difficulty and a focus in the field of image quality assessment.
Without an original image as reference, a no-reference sharpness assessment method is harder to construct than a full-reference method built on the original image. In recent years a variety of no-reference sharpness evaluation methods have been proposed; they fall broadly into two classes, spatial-domain and transform-domain. Existing spatial-domain methods make wide use of the gray-level change-rate map as the image feature, but converting a color image to gray scale often loses important visual information and reduces the accuracy of the sharpness evaluation. Most existing transform-domain methods, in turn, are limited in their range of application by drawbacks such as high computational complexity.
Summary of the invention
In view of the deficiencies of the prior art, the present invention provides a no-reference image sharpness evaluation method based on high-level visual characteristics that achieves fast and accurate no-reference sharpness evaluation.
The technical solution adopted by the present invention is as follows: first, the color change-rate map is computed to obtain the initial structure-information feature of the input image; this color change-rate map is then processed by a de-redundancy filter and by a neural impulse prediction filter to obtain two features of the high-level visual characteristics; finally, the two features are pooled to obtain the sharpness index of the input image. The specific steps are as follows:
1. Compute the color change-rate map of the input image
For an input image g to be evaluated, of width M and height N pixels, the spatial coordinates of a pixel are denoted (x, y), and the spatial coordinates of a pixel in the neighborhood of position (x, y) are denoted (i, j).
Decompose the color image into R, G and B channel images, and on each channel compute the change rate V(x, y) of that channel image with the following single-channel change-rate operator:
V(x, y) = max{ |g(i, j) − g(x, y)| : i = x−1, x, x+1; j = y−1, y, y+1 };
Compute the two-norm of the weighted three-channel change rates to obtain the color change-rate map of image g:
V_C = sqrt( (w_R·V_R)² + (w_G·V_G)² + (w_B·V_B)² ),
where V_R, V_G and V_B are the single-channel change rates obtained by applying the operator V(x, y) to the R, G and B channels of g, i.e. the change-rate maps of the three channel images, and w_R, w_G and w_B are the weights of the respective channels.
2. Refine the color change-rate map with a de-redundancy filter to obtain the first feature of the high-level visual characteristics
For the color change-rate map V_C, first apply the discrete cosine transform F to obtain F(u, v), set a subset S of its low-frequency components to zero to obtain the refined spectrum R, and then apply the inverse discrete cosine transform iF to R to obtain the reconstructed image RRSI(x, y):
RRSI(x, y) = iF{ R[ F(V_C) ] };
3. Use a neural impulse prediction filter to obtain the second feature of the high-level visual characteristics
For the color change-rate map V_C, first compute the normalized change rate at each point:
V_C'(x, y) = V_C(x, y) / V_C,max,
where the numerator is the value of the color change-rate map at point (x, y) and the denominator is the maximum value over the whole color change-rate map V_C; then predict the neural impulse NIPF(x, y) caused by the change rate at each point with the following high-pass filter:
NIPF(x, y; α, σ) = α / ( 2σ·Γ(1/α)·sqrt( Γ(1/α)/Γ(3/α) ) ) · exp{ −[ |V_C'(x, y) − 1| / ( σ·sqrt( Γ(1/α)/Γ(3/α) ) ) ]^α },
where α is the shape parameter, σ is the standard deviation, and Γ(·) is the gamma function.
4. Pool the two features to obtain the sharpness index of the input image g
From the two feature maps of the high-level visual characteristics obtained in steps 2 and 3, RRSI(x, y) and NIPF(x, y), compute the sharpness index of the input image g with the following formula:
S_INI = Σ_{y=0}^{N−1} Σ_{x=0}^{M−1} NIPF(x, y) · RRSI(x, y).
Compared with the prior art, the beneficial effects of the invention are as follows: the proposed method combines two high-level visual characteristics, agrees with subjective evaluation, and is more accurate and robust than existing methods; it uses optimized calculation formulas, so it is simpler and more efficient to implement and has very low computational complexity, which makes it valuable both for theoretical research such as image quality assessment and for practical engineering applications such as automated production.
Brief description of the drawings
Fig. 1 is the implementation flowchart of the method proposed by the present invention;
Fig. 2 is the input image;
Fig. 3 is the color change-rate map corresponding to the input image;
Fig. 4 is the first feature map RRSI;
Fig. 5 is the neural impulse prediction filter function plotted over its domain [0, 1];
Fig. 6a to Fig. 6d are further test images.
Detailed description of the invention
The technical solution of the present invention is described clearly and completely below through a specific embodiment, with reference to the accompanying drawings.
The workflow of the proposed no-reference image sharpness evaluation method based on high-level visual characteristics is shown in Fig. 1, and Fig. 2 is the input image g. The main steps are as follows:
1. Compute the color change-rate map of the input image
For an input image g of width 512 pixels and height 512 pixels, the spatial coordinates of a pixel are denoted (x, y); the spatial coordinates of the pixels in the 3×3 window neighborhood of that coordinate are denoted (i, j).
First compute the change rate V(x, y) on each of the R, G and B channels with the following single-channel change-rate operator:
V(x, y) = max{ |g(x, y) − g(i, j)| : i = x−1, x, x+1; j = y−1, y, y+1 },
then compute the two-norm of the weighted three-channel change rates to obtain the color change-rate map of image g:
V_C = sqrt( (0.299·V_R)² + (0.587·V_G)² + (0.114·V_B)² ),
where V_R, V_G and V_B are the single-channel change rates V(x, y) of the R, G and B channels of the input image g, and the channel weights are 0.299, 0.587 and 0.114 respectively. Fig. 3 is the map V_C corresponding to the input image g.
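For reference, a minimal sketch of this step in Python with NumPy follows; the function names, the replicate padding at the image border, and the assumption that g is an H×W×3 floating-point array are illustrative choices rather than details fixed by the patent.

```python
import numpy as np

def channel_change_rate(ch):
    """Single-channel change rate: max absolute difference between each pixel
    and the pixels of its 3x3 neighborhood."""
    h, w = ch.shape
    padded = np.pad(ch, 1, mode="edge")  # border handling is an assumption
    v = np.zeros_like(ch)
    for di in (-1, 0, 1):
        for dj in (-1, 0, 1):
            shifted = padded[1 + di:1 + di + h, 1 + dj:1 + dj + w]
            v = np.maximum(v, np.abs(shifted - ch))
    return v

def color_change_rate_map(g, weights=(0.299, 0.587, 0.114)):
    """Color change-rate map V_C: two-norm of the weighted per-channel change rates."""
    vr, vg, vb = (channel_change_rate(g[..., k]) for k in range(3))
    w_r, w_g, w_b = weights
    return np.sqrt((w_r * vr) ** 2 + (w_g * vg) ** 2 + (w_b * vb) ** 2)
```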
2. Refine the color change-rate map with a de-redundancy filter to obtain the first feature of the high-level visual characteristics
For the color change-rate map V_C, first apply the discrete cosine transform F to obtain F(u, v); then set the DC component to zero to obtain the refined spectrum R, and apply the inverse discrete cosine transform iF to R to obtain the reconstructed image RRSI(x, y). Fig. 4 is this first feature map RRSI(x, y).
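A sketch of this de-redundancy filter under the parameters of this embodiment, where only the DC component is zeroed; SciPy's dctn/idctn are used here as one possible realization of the transform pair F and iF.

```python
from scipy.fft import dctn, idctn

def de_redundancy_filter(vc):
    """First feature RRSI: 2-D DCT, zero the DC coefficient, inverse 2-D DCT."""
    spectrum = dctn(vc, norm="ortho")     # F(u, v)
    spectrum[0, 0] = 0.0                  # refined spectrum R: remove only the DC term
    return idctn(spectrum, norm="ortho")  # RRSI(x, y)
```

Claim 4 allows a larger low-frequency set S to be zeroed, provided it contains the DC component and every frequency (u, v) in S satisfies u < M/2 and v < N/2.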
3. Use a neural impulse prediction filter to obtain the second feature of the high-level visual characteristics
For the color change-rate map V_C, first compute the normalized change rate at each point:
V_C'(x, y) = V_C(x, y) / V_C,max,
where the numerator is the value of the color change-rate map at point (x, y) and the denominator is the maximum value over the whole color change-rate map; then predict the neural impulse NIPF(x, y) caused by the change rate at each point with the following high-pass filter:
NIPF(x, y; α = 5, σ = 0.2) = α / ( 2σ·Γ(1/α)·sqrt( Γ(1/α)/Γ(3/α) ) ) · exp{ −[ |V_C'(x, y) − 1| / ( σ·sqrt( Γ(1/α)/Γ(3/α) ) ) ]^α },
where Γ(·) is the gamma function; Fig. 5 shows this neural impulse prediction filter function over its domain [0, 1].
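The following sketch implements this filter, reading the formula above as a generalized-Gaussian-shaped response centered at V_C' = 1 with shape parameter α = 5 and standard deviation σ = 0.2; the exact normalization constant is an assumption and only rescales the final index.

```python
import numpy as np
from scipy.special import gamma

def neural_impulse_prediction(vc, alpha=5.0, sigma=0.2):
    """Second feature NIPF: response to the normalized change rate V_C',
    strongest where the change rate approaches its maximum."""
    vc_norm = vc / vc.max()                                          # V_C'(x, y)
    beta = sigma * np.sqrt(gamma(1.0 / alpha) / gamma(3.0 / alpha))  # scale implied by alpha and sigma
    coeff = alpha / (2.0 * beta * gamma(1.0 / alpha))                # normalization (assumed form)
    return coeff * np.exp(-(np.abs(vc_norm - 1.0) / beta) ** alpha)
```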
4. Pool the two features to obtain the sharpness index of the input image g
From the two feature maps of the high-level visual characteristics obtained in steps 2 and 3, RRSI(x, y) and NIPF(x, y), compute the sharpness index SINI of the input image g with the following formula:
S_INI = Σ_{y=0}^{N−1} Σ_{x=0}^{M−1} NIPF(x, y) · RRSI(x, y),
which gives the sharpness evaluation value SINI = 0.3417 for the input image of Fig. 2.
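Putting the sketches above together, the pooling step is an element-wise product followed by a sum. The helper functions and the assumption that the input image is scaled to [0, 1] are illustrative, so this sketch will not necessarily reproduce SINI = 0.3417 exactly.

```python
import numpy as np

def sharpness_index(g):
    """Sharpness index SINI for an RGB image g given as an H x W x 3 float array."""
    vc = color_change_rate_map(g)                                # step 1
    rrsi = de_redundancy_filter(vc)                              # step 2
    nipf = neural_impulse_prediction(vc, alpha=5.0, sigma=0.2)   # step 3
    return float(np.sum(nipf * rrsi))                            # step 4: pooling

# Example with a synthetic 512 x 512 test image:
# rng = np.random.default_rng(0)
# print(sharpness_index(rng.random((512, 512, 3))))
```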
The relevant parameters used in this embodiment of the invention are as follows:
Local window size                   3×3
Spectrum components set to zero     DC component only
Shape parameter α                   5
Standard deviation σ                0.2
Applying the method of this embodiment of the present invention in the same way to Fig. 6a to Fig. 6d gives the evaluation results in the following table:
Test image    Fig. 6a    Fig. 6b    Fig. 6c    Fig. 6d
SINI value    0.2428     0.1864     0.0987     0.0242
A larger SINI value in the table indicates better image quality; these values show that the method proposed by the present invention agrees with subjective evaluation by the human eye.
The above is only a preferred embodiment of the present invention and is not intended to limit the present invention; any modification, equivalent replacement, improvement, etc. made within the spirit and principles of the present invention shall fall within the scope of protection of the present invention.

Claims (5)

1. A no-reference image sharpness evaluation method based on high-level visual characteristics, characterized in that the color change-rate map of the input image is first computed, this map is then filtered by a de-redundancy filter and by a neural impulse prediction filter to obtain two features of the high-level visual characteristics, and the two features are finally pooled to obtain the sharpness index of the input image; the specific steps are as follows:
(1) compute the color change-rate map of the input image:
for an input image g to be evaluated, of width M and height N pixels, the spatial coordinates of a pixel are denoted (x, y), and the spatial coordinates of a pixel in the neighborhood of position (x, y) are denoted (i, j):
decompose the color image into R, G and B channel images, compute the change rate V(x, y) of each channel image on each channel with the single-channel change-rate operator, and then compute the two-norm of the weighted three-channel change rates to obtain the color change-rate map of image g:
V_C = sqrt( (w_R·V_R)² + (w_G·V_G)² + (w_B·V_B)² ),
where V_R, V_G and V_B are the single-channel change rates obtained by applying the operator V(x, y) to the R, G and B channels of g, i.e. the change-rate maps of the three channel images, and w_R, w_G and w_B are the weights of the respective channels;
(2) refine the color change-rate map with a de-redundancy filter to obtain the first feature of the high-level visual characteristics:
for the color change-rate map V_C, first apply a spectral transform F to obtain F(u, v), set a subset S of its low-frequency components to zero to obtain the refined spectrum R, and then apply the inverse spectral transform iF to R to obtain the reconstructed image RRSI(x, y):
RRSI(x, y) = iF{ R[ F(V_C) ] };
(3) use a neural impulse prediction filter to obtain the second feature of the high-level visual characteristics:
for the color change-rate map V_C, first normalize its values, then predict with a high-pass filter the neural impulse NIPF(x, y) caused by the change rate at each point;
(4) pool the two features to obtain the sharpness index of the input image g:
from the two feature maps of the high-level visual characteristics obtained in steps (2) and (3), RRSI(x, y) and NIPF(x, y), compute the sharpness index of the input image g with the following formula:
S_INI = Σ_{y=0}^{N−1} Σ_{x=0}^{M−1} NIPF(x, y) · RRSI(x, y).
2. The no-reference image sharpness evaluation method based on high-level visual characteristics according to claim 1, characterized in that the high-level visual characteristics comprise two kinds: one is the visual redundancy characteristic, and the other is the locally excitatory, globally inhibitory mechanism of neuronal activity.
3. The no-reference image sharpness evaluation method based on high-level visual characteristics according to claim 1, characterized in that the color change-rate map is a spatial-domain secondary image that locally reflects the sharpness at each pixel of the input image g and globally reflects the structure information of the input image.
4. The no-reference image sharpness evaluation method based on high-level visual characteristics according to claim 1, characterized in that the spectral transform in step (2) is the discrete cosine transform or the discrete Fourier transform, and that the set S satisfies two restrictive conditions: (i) it must contain the DC component, and (ii) the frequency (u, v) of every frequency component in S satisfies u < M/2 and v < N/2.
5. The no-reference image sharpness evaluation method based on high-level visual characteristics according to claim 1, characterized in that in step (3) the normalization used by the neural impulse prediction filter is a linear or nonlinear monotonic function, and the threshold of the high-pass filter used is 0.6 to 1.
CN201610051353.3A 2016-01-25 2016-01-25 No-reference image sharpness evaluation method based on high-level visual characteristics Active CN105741274B (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN201610051353.3A CN105741274B (en) 2016-01-25 2016-01-25 No-reference image sharpness evaluation method based on high-level visual characteristics

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
CN201610051353.3A CN105741274B (en) 2016-01-25 2016-01-25 No-reference image sharpness evaluation method based on high-level visual characteristics

Publications (2)

Publication Number Publication Date
CN105741274A true CN105741274A (en) 2016-07-06
CN105741274B CN105741274B (en) 2018-12-18

Family

ID=56246618

Family Applications (1)

Application Number Title Priority Date Filing Date
CN201610051353.3A Active CN105741274B (en) 2016-01-25 2016-01-25 No-reference image sharpness evaluation method based on high-level visual characteristics

Country Status (1)

Country Link
CN (1) CN105741274B (en)

Citations (3)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN101562758A (en) * 2009-04-16 2009-10-21 浙江大学 Method for objectively evaluating image quality based on region weight and visual characteristics of human eyes
US9202141B2 (en) * 2009-08-03 2015-12-01 Indian Institute Of Technology Bombay System for creating a capsule representation of an instructional video
CN105118060A (en) * 2015-08-19 2015-12-02 杭州电子科技大学 Image sharpness measuring method combined with visual analysis

Patent Citations (3)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN101562758A (en) * 2009-04-16 2009-10-21 浙江大学 Method for objectively evaluating image quality based on region weight and visual characteristics of human eyes
US9202141B2 (en) * 2009-08-03 2015-12-01 Indian Institute Of Technology Bombay System for creating a capsule representation of an instructional video
CN105118060A (en) * 2015-08-19 2015-12-02 杭州电子科技大学 Image sharpness measuring method combined with visual analysis

Non-Patent Citations (3)

* Cited by examiner, † Cited by third party
Title
ZHANG TONG et al.: "Method of Non-reference Image Quality Assessment of SSIM based on Regional Weighted Entropy", 2014 Tenth International Conference on Computational Intelligence and Security *
YING LINGKAI et al.: "No-reference sharpness evaluation fusing gradient information and an HVS filter", Journal of Image and Graphics *
FAN YUANYUAN et al.: "No-reference image sharpness evaluation based on contrast sensitivity", Optics and Precision Engineering *

Also Published As

Publication number Publication date
CN105741274B (en) 2018-12-18

Similar Documents

Publication Publication Date Title
CN106600597B (en) It is a kind of based on local binary patterns without reference color image quality evaluation method
CN104036479B (en) Multi-focus image fusion method based on non-negative matrix factorization
CN103475897B (en) Adaptive image quality evaluation method based on distortion type judgment
CN108052980A (en) Air quality grade detection method based on image
CN106204447A (en) The super resolution ratio reconstruction method with convolutional neural networks is divided based on total variance
CN105118067A (en) Image segmentation method based on Gaussian smoothing filter
CN106127688A (en) A kind of super-resolution image reconstruction method and system thereof
CN104376565B (en) Based on discrete cosine transform and the non-reference picture quality appraisement method of rarefaction representation
CN103679661B (en) A kind of self adaptation remote sensing image fusion method based on significance analysis
CN105303561A (en) Image preprocessing grayscale space division method
CN104036493B (en) No-reference image quality evaluation method based on multifractal spectrum
CN102567973A (en) Image denoising method based on improved shape self-adaptive window
CN105160647A (en) Panchromatic multi-spectral image fusion method
CN110717892B (en) Tone mapping image quality evaluation method
CN105225238A (en) A kind of gray space division methods of the Image semantic classification based on mean filter
CN105282543A (en) Total blindness three-dimensional image quality objective evaluation method based on three-dimensional visual perception
CN104036502A (en) No-reference fuzzy distorted stereo image quality evaluation method
CN105607288A (en) Intelligent glasses omnibearing vehicle part completeness detection method based on acoustic detection assistance
CN103338380A (en) Adaptive image quality objective evaluation method
CN107194873A (en) Low-rank nuclear norm canonical facial image ultra-resolution method based on coupling dictionary learning
CN106651829A (en) Non-reference image objective quality evaluation method based on energy and texture analysis
CN104408473A (en) Distance metric learning-based cotton grading method and device
CN103700077B (en) A kind of method for adaptive image enhancement based on human-eye visual characteristic
CN107948635A (en) It is a kind of based on degenerate measurement without refer to sonar image quality evaluation method
CN107194896A (en) A kind of background suppression method and system based on neighbour structure

Legal Events

Date Code Title Description
C06 Publication
PB01 Publication
CB02 Change of applicant information
CB02 Change of applicant information

Address after: No. 258 Xueyuan Street, Jianggan District, Hangzhou, Zhejiang Province, 310018

Applicant after: CHINA JILIANG UNIVERSITY

Address before: No. 258 Xueyuan Street, Jianggan District, Hangzhou, Zhejiang Province, 310018

Applicant before: China Jiliang University

C10 Entry into substantive examination
SE01 Entry into force of request for substantive examination
GR01 Patent grant
GR01 Patent grant