CN102708567A - Visual perception-based three-dimensional image quality objective evaluation method - Google Patents


Info

Publication number
CN102708567A
CN102708567A
Authority
CN
China
Prior art keywords
dis
pixel
org
coordinate position
expression
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Granted
Application number
CN2012101440391A
Other languages
Chinese (zh)
Other versions
CN102708567B (en
Inventor
邵枫
顾珊波
郁梅
蒋刚毅
李福翠
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Ningbo University
Original Assignee
Ningbo University
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Ningbo University filed Critical Ningbo University
Priority to CN201210144039.1A priority Critical patent/CN102708567B/en
Publication of CN102708567A publication Critical patent/CN102708567A/en
Application granted granted Critical
Publication of CN102708567B publication Critical patent/CN102708567B/en
Expired - Fee Related legal-status Critical Current
Anticipated expiration legal-status Critical

Links

Images

Landscapes

  • Image Analysis (AREA)
  • Image Processing (AREA)

Abstract

The invention discloses a visual perception-based objective quality evaluation method for three-dimensional (stereoscopic) images. First, an objective evaluation metric is computed for each pixel from the local phase and local amplitude features of the left and right viewpoint images of the stereoscopic image, and a region-detection method partitions the stereoscopic image into an occlusion region, a binocular suppression region and a binocular fusion region. Each region is then evaluated separately, and the three evaluation results are fused into the final objective image quality prediction. The advantages of the method are that the extracted local phase and local amplitude features are highly stable and reflect quality degradation of the stereoscopic image well, and that evaluating the occlusion, binocular suppression and binocular fusion regions separately reflects the perceptual characteristics of the human visual system, effectively improving the correlation between the objective evaluation results and subjective perception.

Description

A visual perception-based objective quality evaluation method for stereoscopic images
Technical field
The present invention relates to an image quality evaluation method, and in particular to a visual perception-based objective quality evaluation method for stereoscopic images.
Background technology
With the rapid development of image coding and stereoscopic display technology, stereoscopic imaging has attracted ever wider attention and application and has become a current research focus. Stereoscopic imaging exploits the binocular parallax principle of the human eyes: the two eyes independently receive the left- and right-viewpoint images of the same scene, and the brain fuses them into a binocular disparity percept, producing a stereoscopic image with a sense of depth and realism. Owing to the acquisition system, storage compression and transmission equipment, a stereoscopic image inevitably suffers a series of distortions, and, unlike a single-channel image, the quality of both channels must be guaranteed simultaneously, so quality assessment of stereoscopic images is of great significance. At present, however, effective objective methods for evaluating stereoscopic image quality are lacking, so establishing an effective objective stereoscopic image quality evaluation model is of crucial importance.
Existing objective stereoscopic image quality evaluation methods simply apply planar (2D) image quality metrics directly to stereoscopic images. However, the process by which the left and right viewpoint images are fused into a depth percept is not a simple superposition of the two views, and it is difficult to express with a simple mathematical model. Therefore, how to simulate binocular fusion effectively during quality evaluation, and how to modulate the objective result according to the visual masking characteristics of the human eye so that it better matches human visual perception, are problems that need to be studied and solved in objective quality evaluation of stereoscopic images.
Summary of the invention
The technical problem to be solved by the invention is to provide a visual perception-based objective quality evaluation method for stereoscopic images that effectively improves the correlation between objective evaluation results and subjective perception.
The technical solution adopted by the invention to solve the above problem is a visual perception-based objective quality evaluation method for stereoscopic images, characterized by comprising the following steps:
1. Let S_org be the original, undistorted stereoscopic image and S_dis the distorted stereoscopic image to be evaluated. Denote the left and right viewpoint images of S_org as {L_org(x,y)} and {R_org(x,y)}, and those of S_dis as {L_dis(x,y)} and {R_dis(x,y)}, where (x,y) is the coordinate of a pixel in the left and right viewpoint images, 1 ≤ x ≤ W, 1 ≤ y ≤ H, W is the width and H the height of the viewpoint images, and L_org(x,y), R_org(x,y), L_dis(x,y) and R_dis(x,y) denote the value of the pixel at coordinate (x,y) in the corresponding image;
2. Using the visual masking effects of human stereoscopic perception with respect to background luminance and contrast, extract the binocular minimum discernable change images of {L_dis(x,y)} and {R_dis(x,y)}, denoted {J_L^dis(x,y)} and {J_R^dis(x,y)} respectively, where J_L^dis(x,y) denotes the value of the pixel at coordinate (x,y) in {J_L^dis(x,y)} and J_R^dis(x,y) denotes the value of the pixel at coordinate (x,y) in {J_R^dis(x,y)};
3. Using a region-detection algorithm, obtain the region type p of each pixel in {L_dis(x,y)} and {R_dis(x,y)}, where p ∈ {1, 2, 3}: p=1 denotes the occlusion region, p=2 the binocular suppression region, and p=3 the binocular fusion region. Denote the occlusion region formed by all pixels of {L_dis(x,y)} with p=1 as Ω_L^nc, the binocular suppression region formed by all pixels with p=2 as Ω_L^bs, and the binocular fusion region formed by all pixels with p=3 as Ω_L^bf; likewise denote the corresponding regions of {R_dis(x,y)} as Ω_R^nc, Ω_R^bs and Ω_R^bf;
4. Compute the local phase and local amplitude features of each pixel in {L_org(x,y)}, {R_org(x,y)}, {L_dis(x,y)} and {R_dis(x,y)}. Collect the local phase and local amplitude features of all pixels of {L_org(x,y)} into the sets {LP_L^org(x,y)} and {LA_L^org(x,y)}, those of {R_org(x,y)} into {LP_R^org(x,y)} and {LA_R^org(x,y)}, those of {L_dis(x,y)} into {LP_L^dis(x,y)} and {LA_L^dis(x,y)}, and those of {R_dis(x,y)} into {LP_R^dis(x,y)} and {LA_R^dis(x,y)}, where each entry denotes the local phase or local amplitude feature of the pixel at coordinate (x,y) in the corresponding image;
5. From the local phase and local amplitude features of each pixel in {L_org(x,y)} and {L_dis(x,y)}, compute the objective evaluation metric of each pixel in {L_dis(x,y)}, collected into the set {Q_L(x,y)}, with Q_L(x,y) = w_LP × S_L^LP(x,y) + w_LA × S_L^LA(x,y) + b. Similarly, from the local phase and local amplitude features of each pixel in {R_org(x,y)} and {R_dis(x,y)}, compute the objective evaluation metric of each pixel in {R_dis(x,y)}, collected into the set {Q_R(x,y)}, with Q_R(x,y) = w_LP × S_R^LP(x,y) + w_LA × S_R^LA(x,y) + b. Here Q_L(x,y) is the objective evaluation metric of the pixel at coordinate (x,y) in {L_dis(x,y)}, S_L^LP(x,y) = (2 × LP_L^org(x,y) × LP_L^dis(x,y) + T1) / (LP_L^org(x,y)² + LP_L^dis(x,y)² + T1), S_L^LA(x,y) = (2 × LA_L^org(x,y) × LA_L^dis(x,y) + T2) / (LA_L^org(x,y)² + LA_L^dis(x,y)² + T2); Q_R(x,y) is the objective evaluation metric of the pixel at coordinate (x,y) in {R_dis(x,y)}, S_R^LP(x,y) = (2 × LP_R^org(x,y) × LP_R^dis(x,y) + T1) / (LP_R^org(x,y)² + LP_R^dis(x,y)² + T1), S_R^LA(x,y) = (2 × LA_R^org(x,y) × LA_R^dis(x,y) + T2) / (LA_R^org(x,y)² + LA_R^dis(x,y)² + T2); w_LP, w_LA and b are training parameters, and T1 and T2 are control parameters;
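The per-pixel metric of step 5 combines two SSIM-style similarity terms over the phase and amplitude feature maps. A minimal NumPy sketch follows; the default values of w_LP, w_LA, b, T1 and T2 are illustrative placeholders, not the trained parameters of the patent:

```python
import numpy as np

def pixel_quality(lp_org, lp_dis, la_org, la_dis,
                  w_lp=0.5, w_la=0.5, b=0.0, t1=1e-4, t2=1e-4):
    """Per-pixel objective score Q = w_LP*S^LP + w_LA*S^LA + b, where each
    S term is a similarity ratio of the original and distorted feature maps."""
    s_lp = (2.0 * lp_org * lp_dis + t1) / (lp_org**2 + lp_dis**2 + t1)
    s_la = (2.0 * la_org * la_dis + t2) / (la_org**2 + la_dis**2 + t2)
    return w_lp * s_lp + w_la * s_la + b
```

With identical feature maps both similarity terms equal 1, so the score is w_lp + w_la + b; any mismatch pulls the corresponding term toward 0.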
6. According to the region type of each pixel in {L_dis(x,y)} and {R_dis(x,y)}, and using the perceptual characteristics of the human visual system for the occlusion region, compute the objective evaluation metric of the occlusion region of S_dis, denoted Q_nc: Q_nc = (Σ_{(x,y)∈Ω_L^nc} Q_L(x,y) + Σ_{(x,y)∈Ω_R^nc} Q_R(x,y)) / (N_L^nc + N_R^nc), where N_L^nc is the number of pixels with region type p=1 in {L_dis(x,y)} and N_R^nc is the number of pixels with region type p=1 in {R_dis(x,y)};
7. According to the region type of each pixel in {L_dis(x,y)} and {R_dis(x,y)}, and using the perceptual characteristics of the human visual system for the binocular suppression region, compute the objective evaluation metric of the binocular suppression region of S_dis, denoted Q_bs: Q_bs = max(Q_L^bs, Q_R^bs), where max() returns the maximum of its arguments, Q_L^bs = Σ_{(x,y)∈Ω_L^bs} Q_L(x,y) × (1 / J_L^dis(x,y)) / Σ_{(x,y)∈Ω_L^bs} (1 / J_L^dis(x,y)), and Q_R^bs = Σ_{(x,y)∈Ω_R^bs} Q_R(x,y) × (1 / J_R^dis(x,y)) / Σ_{(x,y)∈Ω_R^bs} (1 / J_R^dis(x,y));
8. According to the region type of each pixel in {L_dis(x,y)} and {R_dis(x,y)}, and using the perceptual characteristics of the human visual system for the binocular fusion region, compute the objective evaluation metric of the binocular fusion region of S_dis, denoted Q_bf: Q_bf = 0.7 × (Q_L^bf + Q_R^bf), where Q_L^bf = Σ_{(x,y)∈Ω_L^bf} Q_L(x,y) × (1 / J_L^dis(x,y)) / Σ_{(x,y)∈Ω_L^bf} (1 / J_L^dis(x,y)) and Q_R^bf = Σ_{(x,y)∈Ω_R^bf} Q_R(x,y) × (1 / J_R^dis(x,y)) / Σ_{(x,y)∈Ω_R^bf} (1 / J_R^dis(x,y));
9. Fuse the objective evaluation metric Q_nc of the occlusion region of S_dis, the objective evaluation metric Q_bs of the binocular suppression region of S_dis and the objective evaluation metric Q_bf of the binocular fusion region of S_dis to obtain the objective image quality prediction of S_dis, denoted Q: Q = w_nc × Q_nc + w_bs × Q_bs + w_bf × Q_bf, where w_nc, w_bs and w_bf are weighting parameters.
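Steps 6 through 9 pool the per-pixel scores over the three region types and merge the pooled values. The sketch below assumes per-pixel score maps (qL, qR), JND maps (jL, jR) and region-type maps (pL, pR) are already available; the fusion weights are placeholders, not the patent's trained values:

```python
import numpy as np

def fuse_scores(qL, qR, jL, jR, pL, pR, w_nc=0.3, w_bs=0.4, w_bf=0.3):
    """Pool per-pixel scores over region types (p=1 occlusion, p=2 binocular
    suppression, p=3 binocular fusion) and merge them into one score."""
    # Step 6: plain average of both views' scores over occluded pixels.
    occ = np.concatenate([qL[pL == 1], qR[pR == 1]])
    q_nc = float(occ.mean()) if occ.size else 0.0

    def wmean(q, j, m):
        # JND-inverse weighted mean of one view's scores over mask m.
        if not m.any():
            return 0.0
        w = 1.0 / j[m]
        return float((q[m] * w).sum() / w.sum())

    # Step 7: suppression region takes the better (max) of the two views.
    q_bs = max(wmean(qL, jL, pL == 2), wmean(qR, jR, pR == 2))
    # Step 8: fusion region uses 0.7 x (left + right).
    q_bf = 0.7 * (wmean(qL, jL, pL == 3) + wmean(qR, jR, pR == 3))
    # Step 9: weighted merge of the three regional scores.
    return w_nc * q_nc + w_bs * q_bs + w_bf * q_bf
```

Weighting each pixel by 1/J follows steps 7 and 8: pixels with a low visibility threshold (where distortion is more perceptible) contribute more to the pooled score.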
The detailed process of step 2 is:
2.-1. Compute the visual threshold set of the luminance masking effect of {L_dis(x,y)}, denoted {T_l(x,y)} (the defining formula appears in the original only as an image), where T_l(x,y) is the luminance-masking visual threshold of the pixel at coordinate (x,y) in {L_dis(x,y)} and bg_l(x,y) is the mean luminance of all pixels in the 5 × 5 window centered on the pixel at (x,y) in {L_dis(x,y)};
2.-2. Compute the visual threshold set of the contrast masking effect of {L_dis(x,y)}, denoted {T_c(x,y)}: T_c(x,y) = K(bg_l(x,y)) + eh_l(x,y), where T_c(x,y) is the contrast-masking visual threshold of the pixel at coordinate (x,y) in {L_dis(x,y)}, eh_l(x,y) is the mean gradient value obtained by edge-filtering the pixel at (x,y) in {L_dis(x,y)} in the horizontal and vertical directions, and K(bg_l(x,y)) = −10⁻⁶ × (0.7 × bg_l(x,y)² + 32 × bg_l(x,y)) + 0.07;
2.-3. Merge the luminance-masking threshold set {T_l(x,y)} and the contrast-masking threshold set {T_c(x,y)} of {L_dis(x,y)} to obtain the binocular minimum discernable change image of {L_dis(x,y)}, denoted {J_L^dis(x,y)}: J_L^dis(x,y) = T_l(x,y) + T_c(x,y);
2.-4. Compute the visual threshold set of the luminance masking effect of {R_dis(x,y)}, denoted {T_r(x,y)} (the defining formula appears in the original only as an image), where T_r(x,y) is the luminance-masking visual threshold of the pixel at coordinate (x,y) in {R_dis(x,y)} and bg_r(x,y) is the mean luminance of all pixels in the 5 × 5 window centered on the pixel at (x,y) in {R_dis(x,y)};
2.-5. Compute the visual threshold set of the contrast masking effect of {R_dis(x,y)}, denoted {T_c′(x,y)}: T_c′(x,y) = K(bg_r(x,y)) + eh_r(x,y), where T_c′(x,y) is the contrast-masking visual threshold of the pixel at coordinate (x,y) in {R_dis(x,y)}, eh_r(x,y) is the mean gradient value obtained by edge-filtering the pixel at (x,y) in {R_dis(x,y)} in the horizontal and vertical directions, and K(bg_r(x,y)) = −10⁻⁶ × (0.7 × bg_r(x,y)² + 32 × bg_r(x,y)) + 0.07;
2.-6. Merge the luminance-masking threshold set {T_r(x,y)} and the contrast-masking threshold set {T_c′(x,y)} of {R_dis(x,y)} to obtain the binocular minimum discernable change image of {R_dis(x,y)}, denoted {J_R^dis(x,y)}: J_R^dis(x,y) = T_r(x,y) + T_c′(x,y).
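The JND construction of steps 2-1 to 2-3 can be sketched as below. Two pieces are assumptions: the luminance-masking formula T_l is shown in the original only as an image, so the widely used Chou–Li curve stands in for it, and a simple finite-difference gradient stands in for the unspecified horizontal/vertical edge filter; only K(bg) and J = T_l + T_c are taken verbatim from the text:

```python
import numpy as np

def jnd_map(img):
    """Binocular JND map J = T_l + T_c for one view (steps 2-1..2-3).
    T_l uses the Chou-Li luminance-masking curve as an assumed stand-in."""
    img = img.astype(np.float64)
    h, w = img.shape
    # bg: mean luminance over a 5x5 window (edges use replicated borders).
    pad = np.pad(img, 2, mode='edge')
    bg = np.zeros_like(img)
    for dy in range(5):
        for dx in range(5):
            bg += pad[dy:dy + h, dx:dx + w]
    bg /= 25.0
    # T_l: luminance masking threshold (assumed Chou-Li form, not from text).
    t_l = np.where(bg <= 127,
                   17.0 * (1.0 - np.sqrt(bg / 127.0)) + 3.0,
                   3.0 / 128.0 * (bg - 127.0) + 3.0)
    # eh: mean horizontal/vertical gradient magnitude (simple stand-in filter).
    eh = 0.5 * (np.abs(np.gradient(img, axis=1)) +
                np.abs(np.gradient(img, axis=0)))
    # T_c = K(bg) + eh, with K exactly as given in step 2-2.
    k = -1e-6 * (0.7 * bg**2 + 32.0 * bg) + 0.07
    return t_l + k + eh
```

For a flat mid-gray view (all pixels 127) the gradient term vanishes, T_l = 3 and K(127) ≈ 0.0546, giving J ≈ 3.055 everywhere.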
The detailed process of obtaining the region type of each pixel in {L_dis(x,y)} and {R_dis(x,y)} with the region-detection algorithm in step 3 is:
3.-1. Use a block-matching algorithm to compute the disparity image between {L_org(x,y)} and {R_org(x,y)}, denoted {d_org^L(x,y)}, where d_org^L(x,y) is the value of the pixel at coordinate (x,y);
3.-2. Use the block-matching algorithm to compute the disparity image between {L_dis(x,y)} and {R_dis(x,y)}, denoted {d_dis^L(x,y)}, where d_dis^L(x,y) is the value of the pixel at coordinate (x,y);
3.-3. Judge whether the value d_dis^L(x1,y1) of the pixel at coordinate (x1,y1) in {d_dis^L(x,y)} is 255; if so, mark the region type of the pixel at (x1,y1) in {L_dis(x,y)} as p=1 and go to step 3.-6, otherwise go to step 3.-4, where 1 ≤ x1 ≤ W, 1 ≤ y1 ≤ H;
3.-4. Judge whether the value d_dis^L(x1,y1) of the pixel at coordinate (x1,y1) in {d_dis^L(x,y)} is greater than the value d_org^L(x1,y1) of the pixel at (x1,y1) in {d_org^L(x,y)}; if so, mark the region type of the pixel at (x1,y1) in {L_dis(x,y)} as p=2 and go to step 3.-6, otherwise go to step 3.-5;
3.-5. Judge whether d_dis^L(x1,y1) is less than or equal to d_org^L(x1,y1); if so, mark the region type of the pixel at (x1,y1) in {L_dis(x,y)} as p=3;
3.-6. Return to step 3.-3 to determine the region types of the remaining pixels in {L_dis(x,y)}, until the region types of all pixels in {L_dis(x,y)} have been determined;
3.-7. Use the block-matching algorithm to compute the disparity image between {R_org(x,y)} and {L_org(x,y)}, denoted {d_org^R(x,y)}, where d_org^R(x,y) is the value of the pixel at coordinate (x,y);
3.-8. Use the block-matching algorithm to compute the disparity image between {R_dis(x,y)} and {L_dis(x,y)}, denoted {d_dis^R(x,y)}, where d_dis^R(x,y) is the value of the pixel at coordinate (x,y);
3.-9. Judge whether the value d_dis^R(x1,y1) of the pixel at coordinate (x1,y1) in {d_dis^R(x,y)} is 255; if so, mark the region type of the pixel at (x1,y1) in {R_dis(x,y)} as p=1 and go to step 3.-12, otherwise go to step 3.-10, where 1 ≤ x1 ≤ W, 1 ≤ y1 ≤ H;
3.-10. Judge whether the value d_dis^R(x1,y1) of the pixel at coordinate (x1,y1) in {d_dis^R(x,y)} is greater than the value d_org^R(x1,y1) of the pixel at (x1,y1) in {d_org^R(x,y)}; if so, mark the region type of the pixel at (x1,y1) in {R_dis(x,y)} as p=2 and go to step 3.-12, otherwise go to step 3.-11;
3.-11. Judge whether d_dis^R(x1,y1) is less than or equal to d_org^R(x1,y1); if so, mark the region type of the pixel at (x1,y1) in {R_dis(x,y)} as p=3;
3.-12. Return to step 3.-9 to determine the region types of the remaining pixels in {R_dis(x,y)}, until the region types of all pixels in {R_dis(x,y)} have been determined.
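The per-pixel decision cascade of steps 3-3 to 3-5 (and its mirror for the right view) can be vectorized. This sketch assumes, as the text indicates, that the block matcher records unmatched (occluded) pixels as 255 in the distorted disparity image, and that the suppression/fusion split compares the distorted disparity value against the original one; the disparity-image symbols themselves appear in the original only as images:

```python
import numpy as np

def classify_regions(d_org, d_dis):
    """Region-type map: p=1 (occlusion) where block matching failed
    (disparity 255), p=2 (binocular suppression) where the distorted
    disparity exceeds the original, p=3 (binocular fusion) otherwise."""
    p = np.full(d_dis.shape, 3, dtype=np.uint8)   # default: fusion region
    p[d_dis > d_org] = 2                          # suppression region
    p[d_dis == 255] = 1                           # occlusion test wins
    return p
```

Applying the occlusion mask last reproduces the cascade order: the 255 test of step 3-3 takes precedence over the comparisons of steps 3-4 and 3-5.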
The process of obtaining the local phase and local amplitude features of each pixel in {L_org(x,y)}, {R_org(x,y)}, {L_dis(x,y)} and {R_dis(x,y)} in step 4 is:
4.-1. Apply the phase congruency transform to each pixel in {L_org(x,y)} to obtain its even-symmetric and odd-symmetric frequency responses at different scales and orientations; for the pixel at coordinate (x,y) in {L_org(x,y)}, denote them e_{α,θ}(x,y) and o_{α,θ}(x,y) respectively, where α is the scale factor of the filter, 1 ≤ α ≤ 4, and θ is the orientation factor of the filter, 1 ≤ θ ≤ 4;
4.-2. Compute the phase congruency of each pixel in {L_org(x,y)} at each orientation; for the pixel at coordinate (x,y), denote it PC_θ(x,y): PC_θ(x,y) = E_θ(x,y) / Σ_{α=1}^{4} A_{α,θ}(x,y), where A_{α,θ}(x,y) = √(e_{α,θ}(x,y)² + o_{α,θ}(x,y)²), E_θ(x,y) = √(F_θ(x,y)² + H_θ(x,y)²), F_θ(x,y) = Σ_{α=1}^{4} e_{α,θ}(x,y), and H_θ(x,y) = Σ_{α=1}^{4} o_{α,θ}(x,y);
4.-3. Using the orientation of maximum phase congruency of each pixel in {L_org(x,y)}, compute its local phase and local amplitude features. For the pixel at coordinate (x,y) in {L_org(x,y)}, first find the maximum of its phase congruency PC_θ(x,y) over the orientations, then find the orientation at which this maximum occurs, denoted θm, and then compute from θm the local phase feature LP_L^org(x,y) and local amplitude feature LA_L^org(x,y): LP_L^org(x,y) = arctan(H_{θm}(x,y), F_{θm}(x,y)), LA_L^org(x,y) = Σ_{α=1}^{4} A_{α,θm}(x,y), where F_{θm}(x,y) = Σ_{α=1}^{4} e_{α,θm}(x,y), H_{θm}(x,y) = Σ_{α=1}^{4} o_{α,θm}(x,y), A_{α,θm}(x,y) = √(e_{α,θm}(x,y)² + o_{α,θm}(x,y)²), e_{α,θm}(x,y) and o_{α,θm}(x,y) are the even-symmetric and odd-symmetric frequency responses of the pixel at (x,y) in {L_org(x,y)} at the different scales and the orientation θm of maximum phase congruency, and arctan(·,·) is the two-argument arctangent function;
4.-4. Following the operations of steps 4.-1 to 4.-3 used for {L_org(x,y)}, obtain in the same manner the local phase and local amplitude features of each pixel in {R_org(x,y)}, {L_dis(x,y)} and {R_dis(x,y)}.
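Given the even/odd filter responses of step 4-1, the math of steps 4-2 and 4-3 reduces to sums, norms and an argmax over orientations. The sketch below assumes the responses are precomputed elsewhere (the log-Gabor filter bank is not shown), and adds a small eps to the PC denominator to avoid division by zero, which the patent text does not specify:

```python
import numpy as np

def local_phase_amplitude(e, o, eps=1e-10):
    """Steps 4-2/4-3 from precomputed filter responses.
    e, o: even/odd responses, shape (4 scales, 4 orientations, H, W)."""
    f = e.sum(axis=0)                      # F_theta: sum of even responses
    h = o.sum(axis=0)                      # H_theta: sum of odd responses
    a = np.sqrt(e**2 + o**2)               # A_{alpha,theta}: local energy
    pc = np.sqrt(f**2 + h**2) / (a.sum(axis=0) + eps)   # PC_theta
    theta_m = pc.argmax(axis=0)            # orientation of max congruency
    yy, xx = np.indices(theta_m.shape)     # per-pixel gather indices
    lp = np.arctan2(h[theta_m, yy, xx], f[theta_m, yy, xx])  # local phase
    la = a[:, theta_m, yy, xx].sum(axis=0)                   # local amplitude
    return lp, la
```

np.arctan2 is the two-argument arctangent called for by LP = arctan(H, F), so the recovered phase lands in the correct quadrant.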
Compared with the prior art, the invention has the following advantages:
1) The method accounts for the different perceptual responses that different regions elicit in stereoscopic viewing: the stereoscopic image is divided into an occlusion region, a binocular suppression region and a binocular fusion region that are evaluated separately, and the individual results are fused into the final evaluation score, so that the result better matches human visual perception.
2) The method computes the local phase and local amplitude features of each pixel in the left and right viewpoint images to obtain a per-pixel objective evaluation metric; since these features are highly stable and reflect quality degradation of the stereoscopic image well, the correlation between objective results and subjective perception is effectively improved.
3) The method derives the binocular minimum discernable change image from the stereoscopic vision characteristics of the human eye and applies different degrees of weighting to the per-pixel objective evaluation metrics of the occlusion, binocular suppression and binocular fusion regions, so that the evaluation result better matches human visual perception and the correlation between objective results and subjective perception is improved.
Description of drawings
Fig. 1 is the overall implementation block diagram of the method;
Fig. 2a is the left viewpoint image of the Akko stereoscopic image (size 640 × 480);
Fig. 2b is the right viewpoint image of the Akko stereoscopic image (size 640 × 480);
Fig. 3a is the left viewpoint image of the Altmoabit stereoscopic image (size 1024 × 768);
Fig. 3b is the right viewpoint image of the Altmoabit stereoscopic image (size 1024 × 768);
Fig. 4a is the left viewpoint image of the Balloons stereoscopic image (size 1024 × 768);
Fig. 4b is the right viewpoint image of the Balloons stereoscopic image (size 1024 × 768);
Fig. 5a is the left viewpoint image of the Doorflower stereoscopic image (size 1024 × 768);
Fig. 5b is the right viewpoint image of the Doorflower stereoscopic image (size 1024 × 768);
Fig. 6a is the left viewpoint image of the Kendo stereoscopic image (size 1024 × 768);
Fig. 6b is the right viewpoint image of the Kendo stereoscopic image (size 1024 × 768);
Fig. 7a is the left viewpoint image of the LeaveLaptop stereoscopic image (size 1024 × 768);
Fig. 7b is the right viewpoint image of the LeaveLaptop stereoscopic image (size 1024 × 768);
Fig. 8a is the left viewpoint image of the Lovebierd1 stereoscopic image (size 1024 × 768);
Fig. 8b is the right viewpoint image of the Lovebierd1 stereoscopic image (size 1024 × 768);
Fig. 9a is the left viewpoint image of the Newspaper stereoscopic image (size 1024 × 768);
Fig. 9b is the right viewpoint image of the Newspaper stereoscopic image (size 1024 × 768);
Figure 10a is the left viewpoint image of the Puppy stereoscopic image (size 720 × 480);
Figure 10b is the right viewpoint image of the Puppy stereoscopic image (size 720 × 480);
Figure 11a is the left viewpoint image of the Soccer2 stereoscopic image (size 720 × 480);
Figure 11b is the right viewpoint image of the Soccer2 stereoscopic image (size 720 × 480);
Figure 12a is the left viewpoint image of the Horse stereoscopic image (size 720 × 480);
Figure 12b is the right viewpoint image of the Horse stereoscopic image (size 720 × 480);
Figure 13a is the left viewpoint image of the Xmas stereoscopic image (size 640 × 480);
Figure 13b is the right viewpoint image of the Xmas stereoscopic image (size 640 × 480);
Figure 14 is the scatter plot of the objective image quality prediction versus the mean subjective score difference for each distorted stereoscopic image in the distorted stereoscopic image set.
Embodiment
The present invention is described in further detail below with reference to the accompanying drawings.
The visual perception-based objective stereoscopic image quality evaluation method proposed by the invention has the overall implementation block diagram shown in Fig. 1 and comprises the following steps:
1. Let S_org be the original, undistorted stereoscopic image and S_dis the distorted stereoscopic image to be evaluated. Denote the left and right viewpoint images of S_org as {L_org(x,y)} and {R_org(x,y)}, and those of S_dis as {L_dis(x,y)} and {R_dis(x,y)}, where (x,y) is the coordinate of a pixel in the left and right viewpoint images, 1 ≤ x ≤ W, 1 ≤ y ≤ H, W is the width and H the height of the viewpoint images, and L_org(x,y), R_org(x,y), L_dis(x,y) and R_dis(x,y) denote the value of the pixel at coordinate (x,y) in the corresponding image.
2. Studies of the human visual system show that the human eye is insensitive to small changes in image attributes or noise unless the strength of the change exceeds a certain threshold, the just noticeable difference (JND). Moreover, the visual masking effect of the human eye is a local effect influenced by factors such as background luminance and texture complexity: the brighter the background and the more complex the texture, the higher the threshold. The invention therefore uses the visual masking effects of human stereoscopic perception with respect to background luminance and contrast to extract the binocular minimum discernable change images of {L_dis(x,y)} and {R_dis(x,y)}, denoted {J_L^dis(x,y)} and {J_R^dis(x,y)} respectively, where J_L^dis(x,y) denotes the value of the pixel at coordinate (x,y) in {J_L^dis(x,y)} and J_R^dis(x,y) denotes the value of the pixel at coordinate (x,y) in {J_R^dis(x,y)}.
In this embodiment, the detailed process of step 2 is:
2.-1. Compute the visual threshold set of the luminance masking effect of {L_dis(x,y)}, denoted {T_l(x,y)} (the defining formula appears in the original only as an image), where T_l(x,y) is the luminance-masking visual threshold of the pixel at coordinate (x,y) in {L_dis(x,y)} and bg_l(x,y) is the mean luminance of all pixels in the 5 × 5 window centered on the pixel at (x,y) in {L_dis(x,y)}.
2.-2. Calculate the visual threshold set of the contrast masking effect of {L_dis(x,y)}, denoted {T_c(x,y)}, T_c(x,y) = K(bg_l(x,y)) + eh_l(x,y), where T_c(x,y) denotes the visual threshold of the contrast masking effect of the pixel at coordinate position (x,y) in {L_dis(x,y)}, eh_l(x,y) denotes the average gradient value obtained by applying horizontal and vertical edge filtering to the pixel at coordinate position (x,y) in {L_dis(x,y)}, and K(bg_l(x,y)) = -10^-6 × (0.7 × bg_l(x,y)^2 + 32 × bg_l(x,y)) + 0.07.
2.-3. Merge the visual threshold set {T_l(x,y)} of the luminance masking effect and the visual threshold set {T_c(x,y)} of the contrast masking effect of {L_dis(x,y)} to obtain the binocular minimum discernable change image of {L_dis(x,y)}, denoted {J_L^dis(x,y)}: J_L^dis(x,y) = T_l(x,y) + T_c(x,y).
2.-4. Calculate the visual threshold set of the luminance masking effect of {R_dis(x,y)}, denoted {T_r(x,y)}, where T_r(x,y) denotes the visual threshold of the luminance masking effect of the pixel at coordinate position (x,y) in {R_dis(x,y)}, and bg_r(x,y) denotes the average luminance of all pixels in the 5×5 window centered on the pixel at coordinate position (x,y) in {R_dis(x,y)}.
2.-5. Calculate the visual threshold set of the contrast masking effect of {R_dis(x,y)}, denoted {T_c'(x,y)}, T_c'(x,y) = K(bg_r(x,y)) + eh_r(x,y), where T_c'(x,y) denotes the visual threshold of the contrast masking effect of the pixel at coordinate position (x,y) in {R_dis(x,y)}, eh_r(x,y) denotes the average gradient value obtained by applying horizontal and vertical edge filtering to the pixel at coordinate position (x,y) in {R_dis(x,y)}, and K(bg_r(x,y)) = -10^-6 × (0.7 × bg_r(x,y)^2 + 32 × bg_r(x,y)) + 0.07.
2.-6. Merge the visual threshold set {T_r(x,y)} of the luminance masking effect and the visual threshold set {T_c'(x,y)} of the contrast masking effect of {R_dis(x,y)} to obtain the binocular minimum discernable change image of {R_dis(x,y)}, denoted {J_R^dis(x,y)}: J_R^dis(x,y) = T_r(x,y) + T_c'(x,y).
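The two-term JND construction of steps 2.-1 to 2.-6 can be sketched in NumPy as below. The function name `binocular_jnd` is hypothetical, and the piecewise luminance-masking formula for T_l is an assumption borrowed from the common background-luminance JND model in the literature, since the patent's own T_l formula survives only as an image; the contrast term follows K(bg) and T_c as given above.

```python
import numpy as np

def binocular_jnd(img):
    """Sketch of the binocular minimum discernable change (JND) map:
    J(x,y) = T_l(x,y) + T_c(x,y).  T_l uses an assumed piecewise
    background-luminance model; T_c = K(bg) + eh as in the text."""
    img = np.asarray(img, dtype=np.float64)
    h, w = img.shape
    # bg: average luminance over the 5x5 window centered on each pixel
    padded = np.pad(img, 2, mode='edge')
    bg = np.zeros_like(img)
    for dy in range(5):
        for dx in range(5):
            bg += padded[dy:dy + h, dx:dx + w]
    bg /= 25.0
    # T_l: assumed piecewise luminance-masking threshold
    t_l = np.where(bg <= 127,
                   17.0 * (1.0 - np.sqrt(bg / 127.0)) + 3.0,
                   3.0 / 128.0 * (bg - 127.0) + 3.0)
    # eh: average gradient after horizontal and vertical edge filtering
    gy, gx = np.gradient(img)
    eh = (np.abs(gx) + np.abs(gy)) / 2.0
    # T_c = K(bg) + eh, with K(bg) = -1e-6*(0.7*bg^2 + 32*bg) + 0.07
    t_c = -1e-6 * (0.7 * bg ** 2 + 32.0 * bg) + 0.07 + eh
    return t_l + t_c  # J(x,y) = T_l(x,y) + T_c(x,y)
```

Applied to {L_dis(x,y)} and {R_dis(x,y)} this yields {J_L^dis(x,y)} and {J_R^dis(x,y)}; a uniform image yields a spatially uniform threshold map, as expected of a purely local masking model.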
3. Use a region detection algorithm to obtain the region type of each pixel in {L_dis(x,y)} and {R_dis(x,y)}, denoted p, where p ∈ {1, 2, 3}: p=1 denotes an occlusion region, p=2 denotes a binocular suppression region, and p=3 denotes a binocular fusion region. Then denote the occlusion region formed by all pixels of region type p=1 in {L_dis(x,y)} as Ω_L^nc, the binocular suppression region formed by all pixels of region type p=2 in {L_dis(x,y)} as Ω_L^bs, the binocular fusion region formed by all pixels of region type p=3 in {L_dis(x,y)} as Ω_L^bf, the occlusion region formed by all pixels of region type p=1 in {R_dis(x,y)} as Ω_R^nc, the binocular suppression region formed by all pixels of region type p=2 in {R_dis(x,y)} as Ω_R^bs, and the binocular fusion region formed by all pixels of region type p=3 in {R_dis(x,y)} as Ω_R^bf.
In this specific embodiment, the detailed process of obtaining the region type of each pixel in {L_dis(x,y)} and {R_dis(x,y)} with the region detection algorithm in step 3 is:
3.-1. Use a block matching algorithm to calculate the disparity image between {L_org(x,y)} and {R_org(x,y)}, denoted {d_L^org(x,y)}, where d_L^org(x,y) denotes the pixel value of the pixel at coordinate position (x,y) in {d_L^org(x,y)}.
3.-2. Use a block matching algorithm to calculate the disparity image between {L_dis(x,y)} and {R_dis(x,y)}, denoted {d_L^dis(x,y)}, where d_L^dis(x,y) denotes the pixel value of the pixel at coordinate position (x,y) in {d_L^dis(x,y)}.
3.-3. Judge whether the pixel value d_L^dis(x1,y1) of the pixel at coordinate position (x1,y1) in {d_L^dis(x,y)} is 255; if so, mark the region type of the pixel at coordinate position (x1,y1) in {L_dis(x,y)} as p=1 and then execute step 3.-6; otherwise, execute step 3.-4, where 1≤x1≤W, 1≤y1≤H.
3.-4. Judge whether the inter-view matching error value of the pixel at coordinate position (x1,y1) is greater than the pixel value J_L^dis(x1,y1) of the pixel at coordinate position (x1,y1) in {J_L^dis(x,y)}; if so, mark the region type of the pixel at coordinate position (x1,y1) in {L_dis(x,y)} as p=2 and then execute step 3.-6; otherwise, execute step 3.-5.
3.-5. Judge whether the inter-view matching error value of the pixel at coordinate position (x1,y1) is less than or equal to the pixel value J_L^dis(x1,y1) of the pixel at coordinate position (x1,y1) in {J_L^dis(x,y)}; if so, mark the region type of the pixel at coordinate position (x1,y1) in {L_dis(x,y)} as p=3.
3.-6. Return to step 3.-3 to continue determining the region type of the remaining pixels in {L_dis(x,y)}, until the region types of all pixels in {L_dis(x,y)} have been determined.
3.-7. Use a block matching algorithm to calculate the disparity image between {R_org(x,y)} and {L_org(x,y)}, denoted {d_R^org(x,y)}, where d_R^org(x,y) denotes the pixel value of the pixel at coordinate position (x,y) in {d_R^org(x,y)}.
3.-8. Use a block matching algorithm to calculate the disparity image between {R_dis(x,y)} and {L_dis(x,y)}, denoted {d_R^dis(x,y)}, where d_R^dis(x,y) denotes the pixel value of the pixel at coordinate position (x,y) in {d_R^dis(x,y)}.
3.-9. Judge whether the pixel value d_R^dis(x1,y1) of the pixel at coordinate position (x1,y1) in {d_R^dis(x,y)} is 255; if so, mark the region type of the pixel at coordinate position (x1,y1) in {R_dis(x,y)} as p=1 and then execute step 3.-12; otherwise, execute step 3.-10, where 1≤x1≤W, 1≤y1≤H.
3.-10. Judge whether the inter-view matching error value of the pixel at coordinate position (x1,y1) is greater than the pixel value J_R^dis(x1,y1) of the pixel at coordinate position (x1,y1) in {J_R^dis(x,y)}; if so, mark the region type of the pixel at coordinate position (x1,y1) in {R_dis(x,y)} as p=2 and then execute step 3.-12; otherwise, execute step 3.-11.
3.-11. Judge whether the inter-view matching error value of the pixel at coordinate position (x1,y1) is less than or equal to the pixel value J_R^dis(x1,y1) of the pixel at coordinate position (x1,y1) in {J_R^dis(x,y)}; if so, mark the region type of the pixel at coordinate position (x1,y1) in {R_dis(x,y)} as p=3.
3.-12. Return to step 3.-9 to continue determining the region type of the remaining pixels in {R_dis(x,y)}, until the region types of all pixels in {R_dis(x,y)} have been determined.
Here, the block matching algorithm is the existing classic block matching algorithm. Its basic idea is to partition the image into small blocks; for each block of the left viewpoint image (respectively, the right viewpoint image), the block with maximum correlation is sought in the right viewpoint image (respectively, the left viewpoint image), and the spatial displacement between the two blocks is the disparity.
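The per-pixel labelling of steps 3.-3 to 3.-5 can be condensed into a vectorized sketch. This is an assumption-laden illustration, not the patent's exact procedure: it presumes unmatched pixels carry the disparity value 255 (as the 255 test above suggests), and that the suppression test compares an already-computed inter-view matching-error map against the binocular JND map, since the exact compared quantities survive only as figures in the source.

```python
import numpy as np

OCCLUSION, SUPPRESSION, FUSION = 1, 2, 3  # region types p = 1, 2, 3

def classify_regions(disparity_dis, match_error, jnd):
    """Label each pixel of a distorted view: occlusion (p=1) where
    disparity matching failed, binocular suppression (p=2) where the
    inter-view matching error exceeds the binocular JND threshold,
    binocular fusion (p=3) otherwise (assumed decision rule)."""
    p = np.full(disparity_dis.shape, FUSION, dtype=np.int32)
    p[match_error > jnd] = SUPPRESSION      # error exceeds JND threshold
    p[disparity_dis == 255] = OCCLUSION     # matching failed (value 255)
    return p
```

Running the same function on the left and right distorted views yields the six sets Ω_L^nc, Ω_L^bs, Ω_L^bf and Ω_R^nc, Ω_R^bs, Ω_R^bf as boolean masks `p == 1`, `p == 2`, `p == 3`.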
4. Calculate the local phase feature and local amplitude feature of each pixel in {L_org(x,y)}, {R_org(x,y)}, {L_dis(x,y)} and {R_dis(x,y)} respectively. Express the local phase features and local amplitude features of all pixels in {L_org(x,y)} as the sets {LP_L^org(x,y)} and {LA_L^org(x,y)} respectively, those of all pixels in {R_org(x,y)} as {LP_R^org(x,y)} and {LA_R^org(x,y)}, those of all pixels in {L_dis(x,y)} as {LP_L^dis(x,y)} and {LA_L^dis(x,y)}, and those of all pixels in {R_dis(x,y)} as {LP_R^dis(x,y)} and {LA_R^dis(x,y)}, where LP_L^org(x,y) denotes the local phase feature of the pixel at coordinate position (x,y) in {L_org(x,y)}, LA_L^org(x,y) denotes the local amplitude feature of the pixel at coordinate position (x,y) in {L_org(x,y)}, LP_R^org(x,y) and LA_R^org(x,y) denote the local phase and local amplitude features of the pixel at coordinate position (x,y) in {R_org(x,y)}, LP_L^dis(x,y) and LA_L^dis(x,y) denote the local phase and local amplitude features of the pixel at coordinate position (x,y) in {L_dis(x,y)}, and LP_R^dis(x,y) and LA_R^dis(x,y) denote the local phase and local amplitude features of the pixel at coordinate position (x,y) in {R_dis(x,y)}.
In this specific embodiment, the process of obtaining the local phase feature and local amplitude feature of each pixel in {L_org(x,y)}, {R_org(x,y)}, {L_dis(x,y)} and {R_dis(x,y)} in step 4 is:
4.-1. Apply the phase congruency transform to each pixel in {L_org(x,y)} to obtain the even-symmetric and odd-symmetric frequency responses of each pixel in {L_org(x,y)} at different scales and orientations. Denote the even-symmetric frequency response of the pixel at coordinate position (x,y) in {L_org(x,y)} at scale α and orientation θ as e_{α,θ}(x,y), and the odd-symmetric frequency response as o_{α,θ}(x,y), where α denotes the scale factor of the filter, 1≤α≤4, and θ denotes the orientation factor of the filter, 1≤θ≤4.
4.-2. Calculate the phase congruency feature of each pixel in {L_org(x,y)} at different orientations. Denote the phase congruency feature of the pixel at coordinate position (x,y) in {L_org(x,y)} at orientation θ as PC_θ(x,y): PC_θ(x,y) = E_θ(x,y) / Σ_{α=1}^{4} A_{α,θ}(x,y), where A_{α,θ}(x,y) = sqrt(e_{α,θ}(x,y)^2 + o_{α,θ}(x,y)^2), E_θ(x,y) = sqrt(F_θ(x,y)^2 + H_θ(x,y)^2), F_θ(x,y) = Σ_{α=1}^{4} e_{α,θ}(x,y), and H_θ(x,y) = Σ_{α=1}^{4} o_{α,θ}(x,y).
4.-3. According to the orientation corresponding to the maximum phase congruency feature of each pixel in {L_org(x,y)}, calculate the local phase feature and local amplitude feature of each pixel in {L_org(x,y)}. For the pixel at coordinate position (x,y) in {L_org(x,y)}, first find the maximum among its phase congruency features PC_θ(x,y) over the different orientations; next find the orientation corresponding to this maximum phase congruency feature, denoted θ_m; then, according to θ_m, calculate the local phase feature and local amplitude feature of the pixel at coordinate position (x,y) in {L_org(x,y)}, denoted LP_L^org(x,y) and LA_L^org(x,y) respectively: LP_L^org(x,y) = arctan(H_{θm}(x,y), F_{θm}(x,y)), LA_L^org(x,y) = Σ_{α=1}^{4} A_{α,θm}(x,y), where F_{θm}(x,y) = Σ_{α=1}^{4} e_{α,θm}(x,y), H_{θm}(x,y) = Σ_{α=1}^{4} o_{α,θm}(x,y), A_{α,θm}(x,y) = sqrt(e_{α,θm}(x,y)^2 + o_{α,θm}(x,y)^2), e_{α,θm}(x,y) denotes the even-symmetric frequency response of the pixel at coordinate position (x,y) in {L_org(x,y)} at the different scales and at the orientation θ_m corresponding to the maximum phase congruency feature, o_{α,θm}(x,y) denotes the corresponding odd-symmetric frequency response, and arctan(·,·) is the two-argument arctangent function.
4.-4. Following the operations of steps 4.-1 to 4.-3 for obtaining the local phase feature and local amplitude feature of each pixel in {L_org(x,y)}, obtain in the same manner the local phase feature and local amplitude feature of each pixel in {R_org(x,y)}, {L_dis(x,y)} and {R_dis(x,y)}.
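Given the filter responses of step 4.-1, steps 4.-2 and 4.-3 reduce to array arithmetic. The sketch below assumes the even/odd responses have already been computed (e.g., by a log-Gabor filter bank, which the patent does not name explicitly) and are stacked as arrays of shape (4 scales, 4 orientations, H, W); the small `eps` guard against division by zero is an added assumption.

```python
import numpy as np

def local_phase_amplitude(e, o):
    """Compute per-pixel local phase LP and local amplitude LA along the
    orientation of maximum phase congruency, following steps 4.-2/4.-3.
    e, o: even/odd responses of shape (4, 4, H, W)."""
    eps = 1e-8                       # numerical guard (assumption)
    F = e.sum(axis=0)                # F_theta = sum_alpha e_{alpha,theta}
    H = o.sum(axis=0)                # H_theta = sum_alpha o_{alpha,theta}
    A = np.sqrt(e ** 2 + o ** 2)     # A_{alpha,theta}
    E = np.sqrt(F ** 2 + H ** 2)     # E_theta
    PC = E / (A.sum(axis=0) + eps)   # PC_theta = E_theta / sum_alpha A
    theta_m = PC.argmax(axis=0)      # orientation of maximum congruency
    iy, ix = np.indices(theta_m.shape)
    F_m = F[theta_m, iy, ix]
    H_m = H[theta_m, iy, ix]
    LA = A[:, theta_m, iy, ix].sum(axis=0)  # LA = sum_alpha A_{alpha,theta_m}
    LP = np.arctan2(H_m, F_m)               # LP = arctan(H_theta_m, F_theta_m)
    return LP, LA
```

The same routine applied to the four images yields {LP_L^org, LA_L^org}, {LP_R^org, LA_R^org}, {LP_L^dis, LA_L^dis} and {LP_R^dis, LA_R^dis}.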
5. According to the local phase and local amplitude features of each pixel in {L_org(x,y)} and {L_dis(x,y)}, calculate the objective evaluation metric of each pixel in {L_dis(x,y)}, and express the objective evaluation metrics of all pixels in {L_dis(x,y)} as the set {Q_L(x,y)}: Q_L(x,y) = w_LP × S_L^LP(x,y) + w_LA × S_L^LA(x,y) + b. According to the local phase and local amplitude features of each pixel in {R_org(x,y)} and {R_dis(x,y)}, calculate the objective evaluation metric of each pixel in {R_dis(x,y)}, and express the objective evaluation metrics of all pixels in {R_dis(x,y)} as the set {Q_R(x,y)}: Q_R(x,y) = w_LP × S_R^LP(x,y) + w_LA × S_R^LA(x,y) + b, where Q_L(x,y) denotes the objective evaluation metric of the pixel at coordinate position (x,y) in {L_dis(x,y)}, S_L^LP(x,y) = (2 × LP_L^org(x,y) × LP_L^dis(x,y) + T1) / (LP_L^org(x,y)^2 + LP_L^dis(x,y)^2 + T1), S_L^LA(x,y) = (2 × LA_L^org(x,y) × LA_L^dis(x,y) + T2) / (LA_L^org(x,y)^2 + LA_L^dis(x,y)^2 + T2), Q_R(x,y) denotes the objective evaluation metric of the pixel at coordinate position (x,y) in {R_dis(x,y)}, S_R^LP(x,y) = (2 × LP_R^org(x,y) × LP_R^dis(x,y) + T1) / (LP_R^org(x,y)^2 + LP_R^dis(x,y)^2 + T1), S_R^LA(x,y) = (2 × LA_R^org(x,y) × LA_R^dis(x,y) + T2) / (LA_R^org(x,y)^2 + LA_R^dis(x,y)^2 + T2), w_LP, w_LA and b are training parameters, and T1 and T2 are control parameters.
In this specific embodiment, according to the different effects of the local phase feature and the local amplitude feature on stereoscopic image quality change, w_LP = 0.9834, w_LA = 0.2915 and b = 0 are taken, together with T1 = 0.85 and T2 = 160.
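The per-pixel metric of step 5 is an SSIM-style similarity of the two feature maps and can be sketched directly (the function name `objective_metric` is illustrative; the default parameters are those given in this embodiment):

```python
import numpy as np

def objective_metric(lp_org, la_org, lp_dis, la_dis,
                     w_lp=0.9834, w_la=0.2915, b=0.0, t1=0.85, t2=160.0):
    """Per-pixel objective evaluation metric Q = w_LP*S^LP + w_LA*S^LA + b,
    with SSIM-style similarities of local phase and local amplitude maps."""
    s_lp = (2 * lp_org * lp_dis + t1) / (lp_org ** 2 + lp_dis ** 2 + t1)
    s_la = (2 * la_org * la_dis + t2) / (la_org ** 2 + la_dis ** 2 + t2)
    return w_lp * s_lp + w_la * s_la + b
```

When the distorted features equal the original ones, both similarity terms equal 1, so every pixel scores w_LP + w_LA + b = 1.2749; distortion pushes the similarities, and hence Q_L(x,y) or Q_R(x,y), below this maximum.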
6. The occlusion regions of a distorted stereoscopic image mainly consist of pixels for which disparity matching fails, comprising the occlusion region of the left viewpoint image and the occlusion region of the right viewpoint image; the human visual system perceives such occlusion regions mainly through monocular vision. The present invention therefore, according to the region type of each pixel in {L_dis(x,y)} and {R_dis(x,y)}, and exploiting the visual perception characteristic of the human visual system for occlusion regions, calculates the objective evaluation metric of the occlusion regions in S_dis, denoted Q_nc: Q_nc = (Σ_{(x,y)∈Ω_L^nc} Q_L(x,y) + Σ_{(x,y)∈Ω_R^nc} Q_R(x,y)) / (N_L^nc + N_R^nc), where N_L^nc denotes the number of pixels of region type p=1 in {L_dis(x,y)}, and N_R^nc denotes the number of pixels of region type p=1 in {R_dis(x,y)}.
7. The characteristics of the human visual system show that if the image content in the corresponding retinal areas of the left and right eyes differs greatly, or the disparity between the two is large, the human visual system often cannot perform a binocular fusion operation on the two conflicting pieces of information and instead resorts to binocular masking; during binocular masking, the viewpoint with higher quality tends to suppress the viewpoint with lower quality. The present invention therefore, according to the region type of each pixel in {L_dis(x,y)} and {R_dis(x,y)}, and exploiting the visual perception characteristic of the human visual system for binocular suppression regions, calculates the objective evaluation metric of the binocular suppression regions in S_dis, denoted Q_bs: Q_bs = max(Q_L^bs, Q_R^bs), where max() is the maximum function, Q_L^bs = Σ_{(x,y)∈Ω_L^bs} Q_L(x,y) × (1 / J_L^dis(x,y)) / Σ_{(x,y)∈Ω_L^bs} (1 / J_L^dis(x,y)), and Q_R^bs = Σ_{(x,y)∈Ω_R^bs} Q_R(x,y) × (1 / J_R^dis(x,y)) / Σ_{(x,y)∈Ω_R^bs} (1 / J_R^dis(x,y)).
8. The human visual system shows that if the image content in the corresponding retinal areas of the left and right eyes differs little, the human visual system performs a binocular superposition (fusion) operation on this region, so that the visual sensitivity of two eyes to this region is 1.4 times that of a single eye. The present invention therefore, according to the region type of each pixel in {L_dis(x,y)} and {R_dis(x,y)}, and exploiting the visual perception characteristic of the human visual system for binocular fusion regions, calculates the objective evaluation metric of the binocular fusion regions in S_dis, denoted Q_bf: Q_bf = 0.7 × (Q_L^bf + Q_R^bf), where Q_L^bf = Σ_{(x,y)∈Ω_L^bf} Q_L(x,y) × (1 / J_L^dis(x,y)) / Σ_{(x,y)∈Ω_L^bf} (1 / J_L^dis(x,y)), and Q_R^bf = Σ_{(x,y)∈Ω_R^bf} Q_R(x,y) × (1 / J_R^dis(x,y)) / Σ_{(x,y)∈Ω_R^bf} (1 / J_R^dis(x,y)).
9. Fuse the objective evaluation metric Q_nc of the occlusion regions in S_dis, the objective evaluation metric Q_bs of the binocular suppression regions in S_dis and the objective evaluation metric Q_bf of the binocular fusion regions in S_dis to obtain the objective image quality evaluation prediction value of S_dis, denoted Q: Q = w_nc × Q_nc + w_bs × Q_bs + w_bf × Q_bf, where w_nc, w_bs and w_bf are weighting parameters. In this specific embodiment, w_nc = 0.1163, w_bf = 0.4119 and w_bs = 0.4718 are taken.
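Steps 6 to 9 pool the per-pixel metrics over the three region types and fuse the three scores. A minimal NumPy sketch under the stated formulas (the helper names and the zero fallback for an empty region are assumptions; the default weights are the embodiment's values):

```python
import numpy as np

def pooled_quality(q_l, q_r, p_l, p_r, j_l, j_r,
                   w_nc=0.1163, w_bs=0.4718, w_bf=0.4119):
    """Fuse per-pixel metrics q_l, q_r using region labels p_l, p_r
    (1 = occlusion, 2 = suppression, 3 = fusion) and JND maps j_l, j_r."""
    # Step 6: occlusion regions, plain average over both views
    nc = np.concatenate([q_l[p_l == 1], q_r[p_r == 1]])
    q_nc = nc.mean() if nc.size else 0.0
    # JND-weighted per-view average with weight 1/J (steps 7 and 8)
    def jnd_avg(q, j, mask):
        if not mask.any():
            return 0.0
        w = 1.0 / j[mask]
        return (q[mask] * w).sum() / w.sum()
    # Step 7: binocular suppression, the higher-quality view dominates
    q_bs = max(jnd_avg(q_l, j_l, p_l == 2), jnd_avg(q_r, j_r, p_r == 2))
    # Step 8: binocular fusion, superposition with gain 0.7
    q_bf = 0.7 * (jnd_avg(q_l, j_l, p_l == 3) + jnd_avg(q_r, j_r, p_r == 3))
    # Step 9: weighted fusion of the three region scores
    return w_nc * q_nc + w_bs * q_bs + w_bf * q_bf
```

The `max` in step 7 realizes the suppression assumption (the better view masks the worse one), while the 0.7 gain in step 8 reflects the 1.4-fold binocular sensitivity applied to the average of the two views.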
In the present embodiment, the 12 undistorted stereoscopic images shown in Fig. 2a and Fig. 2b, Fig. 3a and Fig. 3b, Fig. 4a and Fig. 4b, Fig. 5a and Fig. 5b, Fig. 6a and Fig. 6b, Fig. 7a and Fig. 7b, Fig. 8a and Fig. 8b, Fig. 9a and Fig. 9b, Fig. 10a and Fig. 10b, Fig. 11a and Fig. 11b, Fig. 12a and Fig. 12b, and Fig. 13a and Fig. 13b are used to build a set of 312 distorted stereoscopic images under JPEG compression, JPEG2000 compression, Gaussian blur, white noise and H.264 coding distortion of various degrees, and the correlation between the objective image quality evaluation prediction value of each distorted stereoscopic image S_dis obtained by the method of the invention and the mean subjective score difference is analysed. The set contains 60 distorted stereoscopic images with JPEG compression, 60 with JPEG2000 compression, 60 with Gaussian blur, 60 with white noise, and 72 with H.264 coding distortion. The mean subjective score difference of each distorted stereoscopic image in the set is obtained with an existing subjective quality evaluation method and denoted DMOS, DMOS = 100 − MOS, where MOS denotes the mean opinion score and DMOS ∈ [0, 100].
In the present embodiment, 4 objective parameters commonly used for assessing image quality evaluation methods are used as evaluation indices: the Pearson linear correlation coefficient (PLCC) under the nonlinear regression condition, the Spearman rank-order correlation coefficient (SROCC), the Kendall rank-order correlation coefficient (KROCC), and the root mean squared error (RMSE). PLCC and RMSE reflect the accuracy of the objective model in evaluating distorted stereoscopic images, while SROCC and KROCC reflect its monotonicity. The objective image quality evaluation prediction values of the distorted stereoscopic images calculated by the method of the invention are fitted with a five-parameter logistic function; the higher the PLCC, SROCC and KROCC values and the lower the RMSE value, the better the correlation between the objective evaluation method and the mean subjective score difference. The PLCC, SROCC, KROCC and RMSE coefficients reflecting the performance of the objective stereoscopic image evaluation model are listed in Table 1. From the data listed in Table 1, the correlation between the final objective image quality evaluation prediction value obtained by the method of the invention and the mean subjective score difference is very high, showing that the objective evaluation result is consistent with subjective human perception, which is sufficient to demonstrate the validity of the method of the invention.
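The evaluation protocol above can be sketched as follows. The five-parameter logistic form is the standard VQEG mapping (an assumption, since the patent does not spell it out), and the correlation helpers are plain NumPy implementations; the rank-based SROCC below assumes untied scores.

```python
import numpy as np

def logistic5(x, b1, b2, b3, b4, b5):
    """Assumed five-parameter logistic mapping applied to objective
    scores before computing PLCC/RMSE against DMOS."""
    return b1 * (0.5 - 1.0 / (1.0 + np.exp(b2 * (x - b3)))) + b4 * x + b5

def plcc(x, y):
    """Pearson linear correlation coefficient."""
    x, y = np.asarray(x, float), np.asarray(y, float)
    xm, ym = x - x.mean(), y - y.mean()
    return (xm * ym).sum() / np.sqrt((xm ** 2).sum() * (ym ** 2).sum())

def srocc(x, y):
    """Spearman rank-order correlation (no-ties assumption)."""
    rank = lambda v: np.argsort(np.argsort(np.asarray(v))).astype(float)
    return plcc(rank(x), rank(y))

def rmse(x, y):
    """Root mean squared error between mapped scores and DMOS."""
    d = np.asarray(x, float) - np.asarray(y, float)
    return np.sqrt((d ** 2).mean())
```

In practice the five logistic parameters would be obtained by nonlinear least squares (e.g., a curve-fitting routine) on the 312 (Q, DMOS) pairs before PLCC and RMSE are computed.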
Figure 14 gives the scatter plot of the objective image quality evaluation prediction value against the mean subjective score difference for each distorted stereoscopic image in the set; the more concentrated the scatter points, the better the consistency between the objective model and subjective perception. As can be seen from Figure 14, the scatter plot obtained by the method of the invention is quite concentrated, and the goodness of fit with the subjective evaluation data is high.
Table 1. Correlation between the objective image quality evaluation prediction values of the distorted stereoscopic images obtained by the method of the invention and the subjective scores

Claims (4)

1. A visual-perception-based method for objective evaluation of stereoscopic image quality, characterized by comprising the following steps:
1. Let S_org be the original undistorted stereoscopic image and S_dis the distorted stereoscopic image to be evaluated. Denote the left viewpoint image of S_org as {L_org(x,y)}, the right viewpoint image of S_org as {R_org(x,y)}, the left viewpoint image of S_dis as {L_dis(x,y)}, and the right viewpoint image of S_dis as {R_dis(x,y)}, where (x,y) denotes the coordinate position of a pixel in the left and right viewpoint images, 1≤x≤W, 1≤y≤H, W denotes the width of the left and right viewpoint images, H denotes the height of the left and right viewpoint images, L_org(x,y) denotes the pixel value of the pixel at coordinate position (x,y) in {L_org(x,y)}, R_org(x,y) denotes the pixel value of the pixel at coordinate position (x,y) in {R_org(x,y)}, L_dis(x,y) denotes the pixel value of the pixel at coordinate position (x,y) in {L_dis(x,y)}, and R_dis(x,y) denotes the pixel value of the pixel at coordinate position (x,y) in {R_dis(x,y)};
2. Exploit the visual masking effect of human stereoscopic visual perception with respect to background illumination and contrast to extract the binocular minimum discernable change image of {L_dis(x,y)} and the binocular minimum discernable change image of {R_dis(x,y)}, denoted {J_L^dis(x,y)} and {J_R^dis(x,y)} respectively, where J_L^dis(x,y) denotes the pixel value of the pixel at coordinate position (x,y) in the binocular minimum discernable change image {J_L^dis(x,y)} of {L_dis(x,y)}, and J_R^dis(x,y) denotes the pixel value of the pixel at coordinate position (x,y) in the binocular minimum discernable change image {J_R^dis(x,y)} of {R_dis(x,y)};
3. Use a region detection algorithm to obtain the region type of each pixel in {L_dis(x,y)} and {R_dis(x,y)}, denoted p, where p ∈ {1, 2, 3}: p=1 denotes an occlusion region, p=2 denotes a binocular suppression region, and p=3 denotes a binocular fusion region. Then denote the occlusion region formed by all pixels of region type p=1 in {L_dis(x,y)} as Ω_L^nc, the binocular suppression region formed by all pixels of region type p=2 in {L_dis(x,y)} as Ω_L^bs, the binocular fusion region formed by all pixels of region type p=3 in {L_dis(x,y)} as Ω_L^bf, the occlusion region formed by all pixels of region type p=1 in {R_dis(x,y)} as Ω_R^nc, the binocular suppression region formed by all pixels of region type p=2 in {R_dis(x,y)} as Ω_R^bs, and the binocular fusion region formed by all pixels of region type p=3 in {R_dis(x,y)} as Ω_R^bf;
4. Calculate the local phase feature and local amplitude feature of each pixel in {L_org(x,y)}, {R_org(x,y)}, {L_dis(x,y)} and {R_dis(x,y)} respectively. Express the local phase features and local amplitude features of all pixels in {L_org(x,y)} as the sets {LP_L^org(x,y)} and {LA_L^org(x,y)} respectively, those of all pixels in {R_org(x,y)} as {LP_R^org(x,y)} and {LA_R^org(x,y)}, those of all pixels in {L_dis(x,y)} as {LP_L^dis(x,y)} and {LA_L^dis(x,y)}, and those of all pixels in {R_dis(x,y)} as {LP_R^dis(x,y)} and {LA_R^dis(x,y)}, where LP_L^org(x,y) denotes the local phase feature of the pixel at coordinate position (x,y) in {L_org(x,y)}, LA_L^org(x,y) denotes the local amplitude feature of the pixel at coordinate position (x,y) in {L_org(x,y)}, LP_R^org(x,y) and LA_R^org(x,y) denote the local phase and local amplitude features of the pixel at coordinate position (x,y) in {R_org(x,y)}, LP_L^dis(x,y) and LA_L^dis(x,y) denote the local phase and local amplitude features of the pixel at coordinate position (x,y) in {L_dis(x,y)}, and LP_R^dis(x,y) and LA_R^dis(x,y) denote the local phase and local amplitude features of the pixel at coordinate position (x,y) in {R_dis(x,y)};
5. According to the local phase and local amplitude features of each pixel in {L_org(x,y)} and {L_dis(x,y)}, calculate the objective evaluation metric of each pixel in {L_dis(x,y)}, and express the objective evaluation metrics of all pixels in {L_dis(x,y)} as the set {Q_L(x,y)}: Q_L(x,y) = w_LP × S_L^LP(x,y) + w_LA × S_L^LA(x,y) + b. According to the local phase and local amplitude features of each pixel in {R_org(x,y)} and {R_dis(x,y)}, calculate the objective evaluation metric of each pixel in {R_dis(x,y)}, and express the objective evaluation metrics of all pixels in {R_dis(x,y)} as the set {Q_R(x,y)}: Q_R(x,y) = w_LP × S_R^LP(x,y) + w_LA × S_R^LA(x,y) + b, where Q_L(x,y) denotes the objective evaluation metric of the pixel at coordinate position (x,y) in {L_dis(x,y)}, S_L^LP(x,y) = (2 × LP_L^org(x,y) × LP_L^dis(x,y) + T1) / (LP_L^org(x,y)^2 + LP_L^dis(x,y)^2 + T1), S_L^LA(x,y) = (2 × LA_L^org(x,y) × LA_L^dis(x,y) + T2) / (LA_L^org(x,y)^2 + LA_L^dis(x,y)^2 + T2), Q_R(x,y) denotes the objective evaluation metric of the pixel at coordinate position (x,y) in {R_dis(x,y)}, S_R^LP(x,y) = (2 × LP_R^org(x,y) × LP_R^dis(x,y) + T1) / (LP_R^org(x,y)^2 + LP_R^dis(x,y)^2 + T1), S_R^LA(x,y) = (2 × LA_R^org(x,y) × LA_R^dis(x,y) + T2) / (LA_R^org(x,y)^2 + LA_R^dis(x,y)^2 + T2), w_LP, w_LA and b are training parameters, and T1 and T2 are control parameters;
6. According to the region type of each pixel in {L_dis(x,y)} and {R_dis(x,y)}, and exploiting the visual perception characteristic of the human visual system for occlusion regions, calculate the objective evaluation metric of the occlusion regions in S_dis, denoted Q_nc: Q_nc = (Σ_{(x,y)∈Ω_L^nc} Q_L(x,y) + Σ_{(x,y)∈Ω_R^nc} Q_R(x,y)) / (N_L^nc + N_R^nc), where N_L^nc denotes the number of pixels of region type p=1 in {L_dis(x,y)}, and N_R^nc denotes the number of pixels of region type p=1 in {R_dis(x,y)};
7. According to the region type of each pixel in {L_dis(x,y)} and {R_dis(x,y)}, and exploiting the visual perception characteristic of the human visual system for binocular suppression regions, calculate the objective evaluation metric of the binocular suppression regions in S_dis, denoted Q_bs: Q_bs = max(Q_L^bs, Q_R^bs), where max() is the maximum function, Q_L^bs = Σ_{(x,y)∈Ω_L^bs} Q_L(x,y) × (1 / J_L^dis(x,y)) / Σ_{(x,y)∈Ω_L^bs} (1 / J_L^dis(x,y)), and Q_R^bs = Σ_{(x,y)∈Ω_R^bs} Q_R(x,y) × (1 / J_R^dis(x,y)) / Σ_{(x,y)∈Ω_R^bs} (1 / J_R^dis(x,y));
8. According to the region type of each pixel in {L_dis(x,y)} and {R_dis(x,y)}, and exploiting the visual perception characteristic of the human visual system for binocular fusion regions, calculate the objective evaluation metric of the binocular fusion regions in S_dis, denoted Q_bf: Q_bf = 0.7 × (Q_L^bf + Q_R^bf), where Q_L^bf = Σ_{(x,y)∈Ω_L^bf} Q_L(x,y) × (1 / J_L^dis(x,y)) / Σ_{(x,y)∈Ω_L^bf} (1 / J_L^dis(x,y)), and Q_R^bf = Σ_{(x,y)∈Ω_R^bf} Q_R(x,y) × (1 / J_R^dis(x,y)) / Σ_{(x,y)∈Ω_R^bf} (1 / J_R^dis(x,y));
9. Fuse the objective evaluation metric Q_nc of the occlusion regions in S_dis, the objective evaluation metric Q_bs of the binocular suppression regions in S_dis and the objective evaluation metric Q_bf of the binocular fusion regions in S_dis to obtain the objective image quality evaluation prediction value of S_dis, denoted Q: Q = w_nc × Q_nc + w_bs × Q_bs + w_bf × Q_bf, where w_nc, w_bs and w_bf are weighting parameters.
2. The visual-perception-based method for objective evaluation of stereoscopic image quality according to claim 1, characterized in that the detailed process of step 2 is:
2.-1. Calculate the visual threshold set of the luminance masking effect of {L_dis(x,y)}, denoted {T_l(x,y)}, where T_l(x,y) denotes the visual threshold of the luminance masking effect of the pixel at coordinate position (x,y) in {L_dis(x,y)}, and bg_l(x,y) denotes the average luminance of all pixels in the 5×5 window centered on the pixel at coordinate position (x,y) in {L_dis(x,y)};
2.-2. Calculate the visual threshold set of the contrast masking effect of {L_dis(x,y)}, denoted {T_c(x,y)}, T_c(x,y) = K(bg_l(x,y)) + eh_l(x,y), where T_c(x,y) denotes the visual threshold of the contrast masking effect of the pixel at coordinate position (x,y) in {L_dis(x,y)}, eh_l(x,y) denotes the average gradient value obtained by applying horizontal and vertical edge filtering to the pixel at coordinate position (x,y) in {L_dis(x,y)}, and K(bg_l(x,y)) = -10^-6 × (0.7 × bg_l(x,y)^2 + 32 × bg_l(x,y)) + 0.07;
2-3. Combine the visual threshold set {T_l(x,y)} of the luminance masking effect and the visual threshold set {T_c(x,y)} of the contrast masking effect of {L_dis(x,y)} to obtain the binocular minimum perceptible distortion image of {L_dis(x,y)}, denoted J_L^dis(x,y) = T_l(x,y) + T_c(x,y);
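Steps 2-1 to 2-3 for one view can be sketched as below. The luminance-masking formula is shown only as an image in the source, so a widely used piecewise background-luminance model is substituted here and is explicitly an assumption; K(bg) and the 5×5 background mean follow the text. The edge-gradient map eh is taken as a precomputed input (in the patent it comes from horizontal and vertical edge filtering). All names are illustrative.

```python
import math

def background_luminance(img, x, y):
    """Mean luminance over a 5x5 window centred at (x, y), clamped at borders."""
    h, w = len(img), len(img[0])
    total = count = 0
    for dy in range(-2, 3):
        for dx in range(-2, 3):
            yy = min(max(y + dy, 0), h - 1)
            xx = min(max(x + dx, 0), w - 1)
            total += img[yy][xx]
            count += 1
    return total / count

def luminance_threshold(bg):
    # ASSUMED piecewise luminance-masking model; the patent's own formula
    # is an unreadable image in the source.
    if bg <= 127:
        return 17.0 * (1.0 - math.sqrt(bg / 127.0)) + 3.0
    return 3.0 / 128.0 * (bg - 127.0) + 3.0

def k_factor(bg):
    # K(bg) = -1e-6 * (0.7*bg^2 + 32*bg) + 0.07, as given in step 2-2.
    return -1e-6 * (0.7 * bg * bg + 32.0 * bg) + 0.07

def jnd(img, edge_gradient):
    """Per-pixel JND map: T_l (luminance masking) + T_c (contrast masking)."""
    h, w = len(img), len(img[0])
    out = [[0.0] * w for _ in range(h)]
    for y in range(h):
        for x in range(w):
            bg = background_luminance(img, x, y)
            t_l = luminance_threshold(bg)
            t_c = k_factor(bg) + edge_gradient[y][x]   # step 2-2
            out[y][x] = t_l + t_c                       # step 2-3
    return out
```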
2-4. Calculate the visual threshold set of the luminance masking effect of {R_dis(x,y)}, denoted {T_r(x,y)} (the defining formula is rendered only as an image in the source), where T_r(x,y) denotes the visual threshold of the luminance masking effect of the pixel at coordinate (x,y) in {R_dis(x,y)}, and bg_r(x,y) denotes the mean luminance of all pixels in a 5×5 window centred on the pixel at coordinate (x,y) in {R_dis(x,y)};
2-5. Calculate the visual threshold set of the contrast masking effect of {R_dis(x,y)}, denoted {T_c'(x,y)}: T_c'(x,y) = K(bg_r(x,y)) + eh_r(x,y), where T_c'(x,y) denotes the visual threshold of the contrast masking effect of the pixel at coordinate (x,y) in {R_dis(x,y)}, eh_r(x,y) denotes the mean gradient value obtained by applying horizontal and vertical edge filtering to the pixel at coordinate (x,y) in {R_dis(x,y)}, and K(bg_r(x,y)) = −10^(−6) × (0.7 × bg_r(x,y)² + 32 × bg_r(x,y)) + 0.07;
2-6. Combine the visual threshold set {T_r(x,y)} of the luminance masking effect and the visual threshold set {T_c'(x,y)} of the contrast masking effect of {R_dis(x,y)} to obtain the binocular minimum perceptible distortion image of {R_dis(x,y)}, denoted J_R^dis(x,y) = T_r(x,y) + T_c'(x,y).
3. The visual perception-based three-dimensional image quality objective evaluation method according to claim 1 or 2, wherein the detailed process of obtaining the region type of each pixel in {L_dis(x,y)} and {R_dis(x,y)} with the region detection algorithm in said step 3 is:
3-1. Calculate the disparity image between {L_org(x,y)} and {R_org(x,y)} by block matching, denoted {d_L^org(x,y)} (this symbol is a reconstruction; the original notation is rendered only as images in the source), where d_L^org(x,y) denotes the pixel value at coordinate (x,y);
3-2. Calculate the disparity image between {L_dis(x,y)} and {R_dis(x,y)} by block matching, denoted {d_L^dis(x,y)}, where d_L^dis(x,y) denotes the pixel value at coordinate (x,y);
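The block-matching step (3-1/3-2) can be sketched as follows: for each pixel of the left view, search a horizontal range in the right view for the block with the smallest sum of absolute differences (SAD). The value 255 for "no match" mirrors how step 3-3 appears to flag occluded pixels; the block size, search range, and rejection rule are illustrative assumptions, not the patent's parameters.

```python
# Minimal SAD block-matching sketch over lists-of-lists grayscale images.
# Parameters (block, max_disp, reject) are illustrative assumptions.

def block_match_disparity(left, right, block=3, max_disp=8, reject=255, max_sad=None):
    h, w = len(left), len(left[0])
    half = block // 2
    disp = [[reject] * w for _ in range(h)]   # reject marks unmatched pixels

    def sad(x_l, x_r, y):
        # Sum of absolute differences between two blocks, clamped at borders.
        s = 0
        for dy in range(-half, half + 1):
            for dx in range(-half, half + 1):
                yy = min(max(y + dy, 0), h - 1)
                xl = min(max(x_l + dx, 0), w - 1)
                xr = min(max(x_r + dx, 0), w - 1)
                s += abs(left[yy][xl] - right[yy][xr])
        return s

    for y in range(h):
        for x in range(w):
            best, best_d = None, reject
            for d in range(0, max_disp + 1):
                if x - d < 0:
                    break
                cost = sad(x, x - d, y)
                if best is None or cost < best:
                    best, best_d = cost, d
            if best is not None and (max_sad is None or best <= max_sad):
                disp[y][x] = best_d
    return disp
```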
3-3. Determine whether the pixel value at coordinate (x_1,y_1) in the disparity image (the operand is rendered only as an image in the source) is 255; if so, mark the region type of the pixel at (x_1,y_1) in {L_dis(x,y)} as p=1 and go to step 3-6; otherwise go to step 3-4, where 1≤x_1≤W and 1≤y_1≤H;
3-4. Determine whether the pixel value at coordinate (x_1,y_1) in the first comparison image is greater than the pixel value at coordinate (x_1,y_1) in the second (both operands are rendered only as images in the source); if so, mark the region type of the pixel at (x_1,y_1) in {L_dis(x,y)} as p=2 and go to step 3-6; otherwise go to step 3-5;
3-5. Determine whether the pixel value at coordinate (x_1,y_1) in the first comparison image is less than or equal to the pixel value at coordinate (x_1,y_1) in the second; if so, mark the region type of the pixel at (x_1,y_1) in {L_dis(x,y)} as p=3;
3-6. Return to step 3-3 to determine the region type of the remaining pixels in {L_dis(x,y)}, until the region type of every pixel in {L_dis(x,y)} has been determined;
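The per-pixel labelling of steps 3-3 to 3-5 can be sketched as below. The two operand images of the comparisons in steps 3-4/3-5 are rendered only as images in the source; purely for illustration, this sketch compares the distorted-pair disparity map against the original-pair disparity map. The labels follow the patent's three region types.

```python
# Hypothetical sketch of region labelling. The choice of comparison operands
# is an assumption (the source renders them only as images).

def classify_regions(d_org, d_dis, occluded=255):
    """Return a label map: 1 = occlusion, 2 = binocular suppression,
    3 = binocular fusion."""
    h, w = len(d_dis), len(d_dis[0])
    labels = [[0] * w for _ in range(h)]
    for y in range(h):
        for x in range(w):
            if d_dis[y][x] == occluded:        # step 3-3: unmatched pixel
                labels[y][x] = 1
            elif d_dis[y][x] > d_org[y][x]:    # step 3-4 (assumed operands)
                labels[y][x] = 2
            else:                               # step 3-5: <= case
                labels[y][x] = 3
    return labels
```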
3-7. Calculate the disparity image between {R_org(x,y)} and {L_org(x,y)} by block matching, denoted {d_R^org(x,y)} (this symbol is a reconstruction; the original notation is rendered only as images in the source), where d_R^org(x,y) denotes the pixel value at coordinate (x,y);
3-8. Calculate the disparity image between {R_dis(x,y)} and {L_dis(x,y)} by block matching, denoted {d_R^dis(x,y)}, where d_R^dis(x,y) denotes the pixel value at coordinate (x,y);
3-9. Determine whether the pixel value at coordinate (x_1,y_1) in the disparity image (the operand is rendered only as an image in the source) is 255; if so, mark the region type of the pixel at (x_1,y_1) in {R_dis(x,y)} as p=1 and go to step 3-12; otherwise go to step 3-10, where 1≤x_1≤W and 1≤y_1≤H;
3-10. Determine whether the pixel value at coordinate (x_1,y_1) in the first comparison image is greater than the pixel value at coordinate (x_1,y_1) in the second (both operands are rendered only as images in the source); if so, mark the region type of the pixel at (x_1,y_1) in {R_dis(x,y)} as p=2 and go to step 3-12; otherwise go to step 3-11;
3-11. Determine whether the pixel value at coordinate (x_1,y_1) in the first comparison image is less than or equal to the pixel value at coordinate (x_1,y_1) in the second; if so, mark the region type of the pixel at (x_1,y_1) in {R_dis(x,y)} as p=3;
3-12. Return to step 3-9 to determine the region type of the remaining pixels in {R_dis(x,y)}, until the region type of every pixel in {R_dis(x,y)} has been determined.
4. The visual perception-based three-dimensional image quality objective evaluation method according to claim 3, wherein the process of obtaining the local phase feature and local amplitude feature of each pixel in {L_org(x,y)}, {R_org(x,y)}, {L_dis(x,y)} and {R_dis(x,y)} in said step 4 is:
4-1. Apply the phase congruency transform to each pixel in {L_org(x,y)} to obtain its even-symmetric and odd-symmetric frequency responses at different scales and orientations; denote the even-symmetric frequency response of the pixel at coordinate (x,y) in {L_org(x,y)} at scale α and orientation θ as e_{α,θ}(x,y), and the odd-symmetric frequency response as o_{α,θ}(x,y), where α denotes the scale factor of the filter, 1≤α≤4, and θ denotes the orientation factor of the filter, 1≤θ≤4;
4-2. Calculate the phase congruency feature of each pixel in {L_org(x,y)} at each orientation; denote the phase congruency feature of the pixel at coordinate (x,y) in {L_org(x,y)} at orientation θ as PC_θ(x,y): PC_θ(x,y) = E_θ(x,y) / Σ_{α=1}^{4} A_{α,θ}(x,y), where A_{α,θ}(x,y) = sqrt(e_{α,θ}(x,y)² + o_{α,θ}(x,y)²), E_θ(x,y) = sqrt(F_θ(x,y)² + H_θ(x,y)²), F_θ(x,y) = Σ_{α=1}^{4} e_{α,θ}(x,y), and H_θ(x,y) = Σ_{α=1}^{4} o_{α,θ}(x,y);
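The phase congruency of step 4-2 can be sketched for a single pixel and orientation as below. The per-scale responses are supplied directly (in practice they come from a log-Gabor filter bank, which is not reproduced here); the small epsilon guarding the division is an implementation detail not stated in the claim.

```python
import math

def phase_congruency(e, o, eps=1e-8):
    """PC_theta for one pixel: e, o are lists of per-scale even/odd responses."""
    F = sum(e)                      # F_theta = sum over scales of e_alpha
    H = sum(o)                      # H_theta = sum over scales of o_alpha
    E = math.hypot(F, H)            # E_theta = sqrt(F^2 + H^2)
    A = sum(math.hypot(ev, ov) for ev, ov in zip(e, o))  # sum of amplitudes
    return E / (A + eps)
```

When the per-scale responses are all in phase (e.g. all even components equal, odd components zero), PC approaches 1; when they cancel, PC approaches 0.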
4-3. According to the orientation of the maximum phase congruency feature of each pixel in {L_org(x,y)}, calculate its local phase feature and local amplitude feature: for the pixel at coordinate (x,y) in {L_org(x,y)}, first find its maximum phase congruency feature PC_θ(x,y) over the orientations, then record the corresponding orientation as θ_m, and finally compute from θ_m the local phase feature and local amplitude feature, denoted LP_L^org(x,y) and LA_L^org(x,y) respectively: LP_L^org(x,y) = arctan(H_{θ_m}(x,y), F_{θ_m}(x,y)), LA_L^org(x,y) = Σ_{α=1}^{4} A_{α,θ_m}(x,y), where F_{θ_m}(x,y) = Σ_{α=1}^{4} e_{α,θ_m}(x,y), H_{θ_m}(x,y) = Σ_{α=1}^{4} o_{α,θ_m}(x,y), A_{α,θ_m}(x,y) = sqrt(e_{α,θ_m}(x,y)² + o_{α,θ_m}(x,y)²), e_{α,θ_m}(x,y) denotes the even-symmetric frequency response of the pixel at coordinate (x,y) in {L_org(x,y)} at the different scales and at the orientation θ_m of maximum phase congruency, o_{α,θ_m}(x,y) denotes the corresponding odd-symmetric frequency response, and arctan(·,·) denotes the two-argument arctangent function;
4-4. Following the operations of steps 4-1 to 4-3 for {L_org(x,y)}, obtain in the same manner the local phase feature and local amplitude feature of each pixel in {R_org(x,y)}, {L_dis(x,y)} and {R_dis(x,y)}.
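Steps 4-3/4-4 can be sketched for a single pixel as below: pick the orientation with maximum phase congruency, then compute the local phase LP = arctan(H, F) and the local amplitude LA as the sum of per-scale amplitudes at that orientation. Inputs are per-orientation lists of per-scale (even, odd) responses; all names are illustrative.

```python
import math

def local_phase_amplitude(responses):
    """responses: dict {theta: (e_list, o_list)} of per-scale responses
    for one pixel. Returns (theta_m, local phase, local amplitude)."""
    def pc(e, o):
        # Phase congruency at one orientation (as in step 4-2).
        F, H = sum(e), sum(o)
        A = sum(math.hypot(ev, ov) for ev, ov in zip(e, o))
        return math.hypot(F, H) / (A + 1e-8)

    theta_m = max(responses, key=lambda t: pc(*responses[t]))  # dominant orientation
    e, o = responses[theta_m]
    F, H = sum(e), sum(o)
    lp = math.atan2(H, F)                                       # local phase
    la = sum(math.hypot(ev, ov) for ev, ov in zip(e, o))        # local amplitude
    return theta_m, lp, la
```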
CN201210144039.1A 2012-05-11 2012-05-11 Visual perception-based three-dimensional image quality objective evaluation method Expired - Fee Related CN102708567B (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN201210144039.1A CN102708567B (en) 2012-05-11 2012-05-11 Visual perception-based three-dimensional image quality objective evaluation method

Publications (2)

Publication Number Publication Date
CN102708567A true CN102708567A (en) 2012-10-03
CN102708567B CN102708567B (en) 2014-12-10

Family

ID=46901287

Family Applications (1)

Application Number Title Priority Date Filing Date
CN201210144039.1A Expired - Fee Related CN102708567B (en) 2012-05-11 2012-05-11 Visual perception-based three-dimensional image quality objective evaluation method

Country Status (1)

Country Link
CN (1) CN102708567B (en)

Citations (7)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US20050089246A1 (en) * 2003-10-27 2005-04-28 Huitao Luo Assessing image quality
WO2008115410A2 (en) * 2007-03-16 2008-09-25 Sti Medical Systems, Llc A method to provide automated quality feedback to imaging devices to achieve standardized imaging data
US20090116713A1 (en) * 2007-10-18 2009-05-07 Michelle Xiao-Hong Yan Method and system for human vision model guided medical image quality assessment
CN101833766A (en) * 2010-05-11 2010-09-15 天津大学 Stereo image objective quality evaluation algorithm based on GSSIM
CN101841726A (en) * 2010-05-24 2010-09-22 宁波大学 Three-dimensional video asymmetrical coding method
CN101872479A (en) * 2010-06-09 2010-10-27 宁波大学 Three-dimensional image objective quality evaluation method
CN102271279A (en) * 2011-07-22 2011-12-07 宁波大学 Objective analysis method for just noticeable change step length of stereo images

Cited By (17)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN103108209B (en) * 2012-12-28 2015-03-11 宁波大学 Stereo image objective quality evaluation method based on integration of visual threshold value and passage
CN103108209A (en) * 2012-12-28 2013-05-15 宁波大学 Stereo image objective quality evaluation method based on integration of visual threshold value and passage
CN105122305A (en) * 2013-01-22 2015-12-02 美国莱迪思半导体公司 Mechanism for facilitating dynamic phase detection with high jitter tolerance for images of media streams
WO2014113915A1 (en) * 2013-01-22 2014-07-31 Silicon Image, Inc. Mechanism for facilitating dynamic phase detection with high jitter tolerance for images of media streams
US8941780B2 (en) 2013-01-22 2015-01-27 Silicon Image, Inc. Mechanism for facilitating dynamic phase detection with high jitter tolerance for images of media streams
CN105122305B (en) * 2013-01-22 2018-10-30 美国莱迪思半导体公司 Mechanism for benefiting the dynamic phasing detection to the high-jitter-tolerance of the image of Media Stream
US9392145B2 (en) 2013-01-22 2016-07-12 Lattice Semiconductor Corporation Mechanism for facilitating dynamic phase detection with high jitter tolerance for images of media streams
CN103096125A (en) * 2013-02-22 2013-05-08 吉林大学 Stereoscopic video visual comfort evaluation method based on region segmentation
CN103413298B (en) * 2013-07-17 2016-02-24 宁波大学 A kind of objective evaluation method for quality of stereo images of view-based access control model characteristic
CN103413298A (en) * 2013-07-17 2013-11-27 宁波大学 Three-dimensional image objective evaluation method based on visual characteristics
CN103914835A (en) * 2014-03-20 2014-07-09 宁波大学 Non-reference quality evaluation method for fuzzy distortion three-dimensional images
CN103914835B (en) * 2014-03-20 2016-08-17 宁波大学 A kind of reference-free quality evaluation method for fuzzy distortion stereo-picture
CN104574399A (en) * 2015-01-06 2015-04-29 天津大学 Image quality evaluation method based on multi-scale vision significance and gradient magnitude
CN105376563A (en) * 2015-11-17 2016-03-02 浙江科技学院 No-reference three-dimensional image quality evaluation method based on binocular fusion feature similarity
CN105828061A (en) * 2016-05-11 2016-08-03 宁波大学 Virtual viewpoint quality evaluation method based on visual masking effect
CN106022362A (en) * 2016-05-13 2016-10-12 天津大学 Reference-free image quality objective evaluation method for JPEG2000 compression distortion
CN113362315A (en) * 2021-06-22 2021-09-07 中国科学技术大学 Image quality evaluation method and evaluation model based on multi-algorithm fusion

Also Published As

Publication number Publication date
CN102708567B (en) 2014-12-10

Similar Documents

Publication Publication Date Title
CN102708567B (en) Visual perception-based three-dimensional image quality objective evaluation method
CN102333233B (en) Stereo image quality objective evaluation method based on visual perception
CN102595185B (en) Stereo image quality objective evaluation method
CN104394403B (en) A kind of stereoscopic video quality method for objectively evaluating towards compression artefacts
CN103413298B (en) A kind of objective evaluation method for quality of stereo images of view-based access control model characteristic
CN102209257A (en) Stereo image quality objective evaluation method
CN103136748B (en) The objective evaluation method for quality of stereo images of a kind of feature based figure
CN102843572B (en) Phase-based stereo image quality objective evaluation method
CN104954778A (en) Objective stereo image quality assessment method based on perception feature set
CN103400378A (en) Method for objectively evaluating quality of three-dimensional image based on visual characteristics of human eyes
CN102999911B (en) Three-dimensional image quality objective evaluation method based on energy diagrams
CN104202594A (en) Video quality evaluation method based on three-dimensional wavelet transform
CN103369348B (en) Three-dimensional image quality objective evaluation method based on regional importance classification
CN102903107A (en) Three-dimensional picture quality objective evaluation method based on feature fusion
CN103745457B (en) A kind of three-dimensional image objective quality evaluation method
CN102999912B (en) A kind of objective evaluation method for quality of stereo images based on distortion map
CN103108209B (en) Stereo image objective quality evaluation method based on integration of visual threshold value and passage
CN102737380B (en) Stereo image quality objective evaluation method based on gradient structure tensor
CN102708568A (en) Stereoscopic image objective quality evaluation method on basis of structural distortion
CN104243974B (en) A kind of stereoscopic video quality method for objectively evaluating based on Three-dimensional DCT
CN103914835A (en) Non-reference quality evaluation method for fuzzy distortion three-dimensional images
CN105898279A (en) Stereoscopic image quality objective evaluation method
CN103200420A (en) Three-dimensional picture quality objective evaluation method based on three-dimensional visual attention
CN105069794A (en) Binocular rivalry based totally blind stereo image quality evaluation method
CN102271279B (en) Objective analysis method for just noticeable change step length of stereo images

Legal Events

Date Code Title Description
C06 Publication
PB01 Publication
C10 Entry into substantive examination
SE01 Entry into force of request for substantive examination
C14 Grant of patent or utility model
GR01 Patent grant
CF01 Termination of patent right due to non-payment of annual fee
CF01 Termination of patent right due to non-payment of annual fee

Granted publication date: 20141210

Termination date: 20170511