CN102708567A - Visual perception-based three-dimensional image quality objective evaluation method - Google Patents
Abstract
The invention discloses a visual perception-based objective quality evaluation method for three-dimensional (stereoscopic) images. First, an objective evaluation metric is obtained for each pixel by computing the local phase feature and local amplitude feature of every pixel in the left and right viewpoint images of the stereoscopic image, and the image is divided into an occlusion region, a binocular suppression region and a binocular fusion region by a region detection method. Each region is then evaluated separately, and the evaluation results are fused to obtain the final objective image quality prediction value. The advantages of the method are that the extracted local phase and local amplitude information is highly stable and reflects quality changes of the stereoscopic image well, and that evaluating the occlusion, binocular suppression and binocular fusion regions separately matches the perceptual characteristics of the human visual system, effectively improving the correlation between the objective evaluation results and subjective perception.
Description
Technical field
The present invention relates to an image quality evaluation method, and in particular to an objective quality evaluation method for stereoscopic images based on visual perception.
Background art
With the rapid development of image coding and stereoscopic display technology, stereoscopic image technology has attracted increasingly wide attention and application and has become a current research focus. Stereoscopic imaging exploits the binocular parallax principle of the human eye: the two eyes independently receive the left and right viewpoint images of the same scene, and the brain fuses them to form binocular parallax, so that the viewer enjoys a stereoscopic image with depth perception and realism. Because of the influence of the acquisition system and of storage, compression and transmission equipment, a series of distortions is inevitably introduced into stereoscopic images; moreover, compared with single-channel images, a stereoscopic image must guarantee the quality of both channels simultaneously, so its quality assessment is of great significance. At present, however, stereoscopic image quality lacks an effective objective evaluation method. Establishing an effective objective evaluation model for stereoscopic image quality is therefore of crucial importance.
Existing objective evaluation methods for stereoscopic image quality simply apply planar image quality evaluation methods directly to stereoscopic images. However, the process by which the left and right viewpoint images of a stereoscopic image are fused to produce depth perception is not a simple superposition of the two viewpoint images, and it is difficult to describe with simple mathematical methods. How to simulate binocular stereo fusion effectively in the evaluation process, and how to modulate the objective evaluation results according to the visual masking characteristics of the human eye so that they conform better to the human visual system, are therefore problems that need to be studied and solved in the objective quality evaluation of stereoscopic images.
Summary of the invention
The technical problem to be solved by the invention is to provide a visual perception-based objective quality evaluation method for stereoscopic images that can effectively improve the correlation between objective evaluation results and subjective perception.
The technical solution adopted by the invention to solve the above technical problem is a visual perception-based objective quality evaluation method for stereoscopic images, characterized by comprising the following steps:
1. Let S_org denote the original undistorted stereo image and S_dis the distorted stereo image to be evaluated. Denote the left and right viewpoint images of S_org as {L_org(x, y)} and {R_org(x, y)}, and the left and right viewpoint images of S_dis as {L_dis(x, y)} and {R_dis(x, y)}, where (x, y) denotes the coordinate position of a pixel in the left or right viewpoint image, 1 ≤ x ≤ W and 1 ≤ y ≤ H, W denotes the width of the left and right viewpoint images, H denotes their height, and L_org(x, y), R_org(x, y), L_dis(x, y) and R_dis(x, y) denote the pixel values of the pixels at coordinate position (x, y) in {L_org(x, y)}, {R_org(x, y)}, {L_dis(x, y)} and {R_dis(x, y)}, respectively;
2. Using the visual masking effects of human stereoscopic perception with respect to background luminance and contrast, extract the binocular just-noticeable-difference images of {L_dis(x, y)} and {R_dis(x, y)}, in which the pixel value at coordinate position (x, y) gives, for the corresponding view, the minimum change that is perceptible at that pixel;
3. Using a region detection algorithm, obtain the area type of each pixel in {L_dis(x, y)} and {R_dis(x, y)}, denoted p, where p ∈ {1, 2, 3}: p=1 denotes the occlusion region, p=2 the binocular suppression region and p=3 the binocular fusion region. The pixels of area type p=1 in {L_dis(x, y)} then constitute its occlusion region, the pixels of area type p=2 its binocular suppression region and the pixels of area type p=3 its binocular fusion region; likewise, the pixels of area types p=1, p=2 and p=3 in {R_dis(x, y)} constitute its occlusion region, binocular suppression region and binocular fusion region;
4. Compute the local phase feature and the local amplitude feature of each pixel in {L_org(x, y)}, {R_org(x, y)}, {L_dis(x, y)} and {R_dis(x, y)}, and represent the local phase features and the local amplitude features of all pixels in each of these images as sets, where each set element is the local phase feature or the local amplitude feature of the pixel at coordinate position (x, y) in the corresponding image;
5. From the local phase and local amplitude features of each pixel in {L_org(x, y)} and {L_dis(x, y)}, compute the objective evaluation metric of each pixel in {L_dis(x, y)}, and represent the objective evaluation metrics of all pixels in {L_dis(x, y)} as the set {Q_L(x, y)}; likewise, from the local phase and local amplitude features of each pixel in {R_org(x, y)} and {R_dis(x, y)}, compute the objective evaluation metric of each pixel in {R_dis(x, y)}, represented as the set {Q_R(x, y)}. Here Q_L(x, y) and Q_R(x, y) denote the objective evaluation metrics of the pixels at coordinate position (x, y) in {L_dis(x, y)} and {R_dis(x, y)} respectively, w_LP, w_LA and b are training parameters, and T_1 and T_2 are control parameters;
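The exact per-pixel formula of step 5 appears as an image in the source. A hedged sketch of one plausible form, assuming an SSIM-style similarity for each feature stabilised by the control parameters T_1 and T_2 and a weighted combination with w_LP, w_LA and b (all functional forms and numeric values below are assumptions for illustration, not the patent's trained model):

```python
def feature_similarity(f_org, f_dis, t):
    """SSIM-style similarity between a reference and a distorted feature
    value, stabilised by a control parameter t (assumed role of T1/T2)."""
    return (2.0 * f_org * f_dis + t) / (f_org**2 + f_dis**2 + t)

def pixel_metric(lp_org, lp_dis, la_org, la_dis,
                 w_lp=0.5, w_la=0.5, b=0.0, t1=0.85, t2=160.0):
    """Assumed combination Q = w_LP*S_LP + w_LA*S_LA + b, using the
    local phase (lp) and local amplitude (la) features of one pixel.
    Weights and control values are illustrative placeholders."""
    s_lp = feature_similarity(lp_org, lp_dis, t1)
    s_la = feature_similarity(la_org, la_dis, t2)
    return w_lp * s_lp + w_la * s_la + b

# Identical features give similarity 1 for both terms, hence Q = 1.0.
print(pixel_metric(0.3, 0.3, 12.0, 12.0))  # 1.0
```

With this kind of similarity, undistorted pixels score 1 and the score decreases as the phase or amplitude features diverge.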
6. According to the area type of each pixel in {L_dis(x, y)} and {R_dis(x, y)}, and using the perceptual characteristics of the human visual system with respect to occlusion regions, compute the objective evaluation metric of the occlusion region of S_dis, denoted Q_nc, normalizing by the number of pixels of area type p=1 in {L_dis(x, y)} and the number of pixels of area type p=1 in {R_dis(x, y)};
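The pooling formula of step 6 is given as an image in the source; what the text states is that the per-pixel metrics are pooled over the p=1 pixels of both views and normalized by their counts. A minimal sketch, assuming simple averaging as the pooling rule:

```python
def occlusion_metric(q_l, q_r, types_l, types_r):
    """Average the per-pixel metrics Q_L and Q_R over the pixels whose
    area type is p = 1 (occlusion) in each view; averaging over the
    pixel counts is an assumed pooling rule consistent with the text."""
    total, count = 0.0, 0
    for q_img, t_img in ((q_l, types_l), (q_r, types_r)):
        for q_row, t_row in zip(q_img, t_img):
            for q, t in zip(q_row, t_row):
                if t == 1:
                    total += q
                    count += 1
    return total / count if count else 0.0

# Averages 0.8 (left view) and 0.6 (right view) over the two p = 1 pixels.
print(occlusion_metric([[0.8, 0.2]], [[0.6, 0.4]], [[1, 3]], [[1, 2]]))
```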
7. According to the area type of each pixel in {L_dis(x, y)} and {R_dis(x, y)}, and using the perceptual characteristics of the human visual system with respect to binocular suppression regions, compute the objective evaluation metric of the binocular suppression region of S_dis, denoted Q_bs, where max() is the maximum function;
8. According to the area type of each pixel in {L_dis(x, y)} and {R_dis(x, y)}, and using the perceptual characteristics of the human visual system with respect to binocular fusion regions, compute the objective evaluation metric of the binocular fusion region of S_dis, denoted Q_bf;
9. Fuse the objective evaluation metric Q_nc of the occlusion region of S_dis, the objective evaluation metric Q_bs of its binocular suppression region and the objective evaluation metric Q_bf of its binocular fusion region to obtain the objective image quality prediction value of S_dis, denoted Q: Q = w_nc × Q_nc + w_bs × Q_bs + w_bf × Q_bf, where w_nc, w_bs and w_bf are weighting parameters.
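The fusion of step 9 is a plain weighted sum of the three region metrics. A minimal sketch (the weight values below are illustrative placeholders, not the patent's trained parameters):

```python
def fuse_quality(q_nc, q_bs, q_bf, w_nc, w_bs, w_bf):
    """Fuse the region metrics into the final objective prediction Q
    (step 9): Q = w_nc*Q_nc + w_bs*Q_bs + w_bf*Q_bf."""
    return w_nc * q_nc + w_bs * q_bs + w_bf * q_bf

# Illustrative weights only; the patent sets these separately.
Q = fuse_quality(0.8, 0.7, 0.9, w_nc=0.2, w_bs=0.3, w_bf=0.5)
print(round(Q, 3))  # 0.2*0.8 + 0.3*0.7 + 0.5*0.9 = 0.82
```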
The detailed process of step 2. is:
2.-1. Compute the visual threshold set of the luminance masking effect of {L_dis(x, y)}, denoted {T_l(x, y)}, where T_l(x, y) denotes the visual threshold of the luminance masking effect of the pixel at coordinate position (x, y) in {L_dis(x, y)}, and bg_l(x, y) denotes the average luminance of all pixels in the 5 × 5 window centered on the pixel at coordinate position (x, y) in {L_dis(x, y)};
2.-2. Compute the visual threshold set of the contrast masking effect of {L_dis(x, y)}, denoted {T_c(x, y)}, with T_c(x, y) = K(bg_l(x, y)) + eh_l(x, y), where T_c(x, y) denotes the visual threshold of the contrast masking effect of the pixel at coordinate position (x, y) in {L_dis(x, y)}, eh_l(x, y) denotes the average gradient value obtained by applying horizontal and vertical edge filters to the pixel at coordinate position (x, y) in {L_dis(x, y)}, and K(bg_l(x, y)) = −10^(−6) × (0.7 × bg_l(x, y)^2 + 32 × bg_l(x, y)) + 0.07;
2.-3. Merge the visual threshold set {T_l(x, y)} of the luminance masking effect of {L_dis(x, y)} with the visual threshold set {T_c(x, y)} of its contrast masking effect to obtain the binocular just-noticeable-difference image of {L_dis(x, y)};
2.-4. Compute the visual threshold set of the luminance masking effect of {R_dis(x, y)}, denoted {T_r(x, y)}, where T_r(x, y) denotes the visual threshold of the luminance masking effect of the pixel at coordinate position (x, y) in {R_dis(x, y)}, and bg_r(x, y) denotes the average luminance of all pixels in the 5 × 5 window centered on the pixel at coordinate position (x, y) in {R_dis(x, y)};
2.-5. Compute the visual threshold set of the contrast masking effect of {R_dis(x, y)}, denoted {T_c′(x, y)}, with T_c′(x, y) = K(bg_r(x, y)) + eh_r(x, y), where T_c′(x, y) denotes the visual threshold of the contrast masking effect of the pixel at coordinate position (x, y) in {R_dis(x, y)}, eh_r(x, y) denotes the average gradient value obtained by applying horizontal and vertical edge filters to the pixel at coordinate position (x, y) in {R_dis(x, y)}, and K(bg_r(x, y)) = −10^(−6) × (0.7 × bg_r(x, y)^2 + 32 × bg_r(x, y)) + 0.07;
2.-6. Merge the visual threshold set {T_r(x, y)} of the luminance masking effect of {R_dis(x, y)} with the visual threshold set {T_c′(x, y)} of its contrast masking effect to obtain the binocular just-noticeable-difference image of {R_dis(x, y)}.
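The contrast masking threshold of steps 2.-2 and 2.-5 is fully specified by the text: T_c(x, y) = K(bg(x, y)) + eh(x, y) with K(bg) = −10⁻⁶ × (0.7 × bg² + 32 × bg) + 0.07. A sketch of that part, assuming a simple clipped 5 × 5 box mean for the background luminance bg and taking the edge-filter average gradient eh as an input (the luminance threshold T_l and the merging rule are given by formulas not reproduced in this text):

```python
def K(bg):
    """K(bg) = -1e-6 * (0.7*bg**2 + 32*bg) + 0.07 (steps 2.-2 / 2.-5)."""
    return -1e-6 * (0.7 * bg * bg + 32.0 * bg) + 0.07

def background_luminance(img, x, y):
    """Average luminance of the 5x5 window centered on (x, y); the window
    is clipped at the image border (border handling is an assumption,
    the patent does not specify it)."""
    h, w = len(img), len(img[0])
    vals = [img[j][i]
            for j in range(max(0, y - 2), min(h, y + 3))
            for i in range(max(0, x - 2), min(w, x + 3))]
    return sum(vals) / len(vals)

def contrast_threshold(img, eh, x, y):
    """T_c(x, y) = K(bg(x, y)) + eh(x, y)."""
    return K(background_luminance(img, x, y)) + eh[y][x]
```

For example, on a flat patch of luminance 100 with zero gradient, the threshold reduces to K(100) = 0.0598.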
The detailed process of obtaining the area type of each pixel in {L_dis(x, y)} and {R_dis(x, y)} with the region detection algorithm in step 3. is:
3.-1. Use a block matching algorithm to compute the disparity image between {L_org(x, y)} and {R_org(x, y)}, denoted {d_org(x, y)}, where d_org(x, y) denotes the pixel value of the pixel at coordinate position (x, y) in {d_org(x, y)};
3.-2. Use a block matching algorithm to compute the disparity image between {L_dis(x, y)} and {R_dis(x, y)}, denoted {d_dis(x, y)}, where d_dis(x, y) denotes the pixel value of the pixel at coordinate position (x, y) in {d_dis(x, y)};
3.-3. Judge whether the pixel value d_dis(x_1, y_1) of the pixel at coordinate position (x_1, y_1) in {d_dis(x, y)} is 255; if so, mark the area type of the pixel at coordinate position (x_1, y_1) in {L_dis(x, y)} as p=1 and execute step 3.-6; otherwise, execute step 3.-4, where 1 ≤ x_1 ≤ W and 1 ≤ y_1 ≤ H;
3.-4. Judge whether the pixel value of the pixel at coordinate position (x_1, y_1) in {d_dis(x, y)} is greater than the pixel value of the pixel at coordinate position (x_1, y_1) in {d_org(x, y)}; if so, mark the area type of the pixel at coordinate position (x_1, y_1) in {L_dis(x, y)} as p=2 and execute step 3.-6; otherwise, execute step 3.-5;
3.-5. Judge whether the pixel value of the pixel at coordinate position (x_1, y_1) in {d_dis(x, y)} is less than or equal to the pixel value of the pixel at coordinate position (x_1, y_1) in {d_org(x, y)}; if so, mark the area type of the pixel at coordinate position (x_1, y_1) in {L_dis(x, y)} as p=3;
3.-6. Return to step 3.-3 to determine the area types of the remaining pixels in {L_dis(x, y)}, until the area types of all pixels in {L_dis(x, y)} have been determined;
3.-7. Use a block matching algorithm to compute the disparity image between {R_org(x, y)} and {L_org(x, y)}, denoted {d′_org(x, y)}, where d′_org(x, y) denotes the pixel value of the pixel at coordinate position (x, y) in {d′_org(x, y)};
3.-8. Use a block matching algorithm to compute the disparity image between {R_dis(x, y)} and {L_dis(x, y)}, denoted {d′_dis(x, y)}, where d′_dis(x, y) denotes the pixel value of the pixel at coordinate position (x, y) in {d′_dis(x, y)};
3.-9. Judge whether the pixel value d′_dis(x_1, y_1) of the pixel at coordinate position (x_1, y_1) in {d′_dis(x, y)} is 255; if so, mark the area type of the pixel at coordinate position (x_1, y_1) in {R_dis(x, y)} as p=1 and execute step 3.-12; otherwise, execute step 3.-10, where 1 ≤ x_1 ≤ W and 1 ≤ y_1 ≤ H;
3.-10. Judge whether the pixel value of the pixel at coordinate position (x_1, y_1) in {d′_dis(x, y)} is greater than the pixel value of the pixel at coordinate position (x_1, y_1) in {d′_org(x, y)}; if so, mark the area type of the pixel at coordinate position (x_1, y_1) in {R_dis(x, y)} as p=2 and execute step 3.-12; otherwise, execute step 3.-11;
3.-11. Judge whether the pixel value of the pixel at coordinate position (x_1, y_1) in {d′_dis(x, y)} is less than or equal to the pixel value of the pixel at coordinate position (x_1, y_1) in {d′_org(x, y)}; if so, mark the area type of the pixel at coordinate position (x_1, y_1) in {R_dis(x, y)} as p=3;
3.-12. Return to step 3.-9 to determine the area types of the remaining pixels in {R_dis(x, y)}, until the area types of all pixels in {R_dis(x, y)} have been determined.
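The per-pixel classification of steps 3.-3 to 3.-6 can be sketched as follows. This is a sketch under assumptions: the value 255 in the distorted disparity map marks occlusion, as the text states, while the exact quantities compared in steps 3.-4 and 3.-5 are given by formulas not reproduced in this text — the comparison below between the distorted and original disparity values is an illustrative stand-in:

```python
OCCLUSION, SUPPRESSION, FUSION = 1, 2, 3

def classify_regions(d_org, d_dis):
    """Assign an area type p in {1, 2, 3} to every pixel.
    d_org, d_dis: disparity maps (lists of rows) of the original and
    distorted stereo pairs. 255 in d_dis marks an occluded pixel (p = 1);
    the >/<= split used for p = 2 vs p = 3 is an assumed stand-in for
    the patent's comparison."""
    types = []
    for row_org, row_dis in zip(d_org, d_dis):
        out = []
        for v_org, v_dis in zip(row_org, row_dis):
            if v_dis == 255:
                out.append(OCCLUSION)
            elif v_dis > v_org:
                out.append(SUPPRESSION)
            else:
                out.append(FUSION)
        types.append(out)
    return types

# Tiny example: one occluded, one suppressed, one fused pixel.
print(classify_regions([[10, 10, 10]], [[255, 12, 9]]))  # [[1, 2, 3]]
```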
The process of obtaining the local phase feature and local amplitude feature of each pixel in {L_org(x, y)}, {R_org(x, y)}, {L_dis(x, y)} and {R_dis(x, y)} in step 4. is:
4.-1. Apply the phase congruency transform to each pixel in {L_org(x, y)} to obtain the even-symmetric and odd-symmetric frequency responses of each pixel at different scales and directions; denote the even-symmetric and odd-symmetric frequency responses of the pixel at coordinate position (x, y) in {L_org(x, y)} at the different scales and directions as e_α,θ(x, y) and o_α,θ(x, y), respectively, where α denotes the scale factor of the filter, 1 ≤ α ≤ 4, and θ denotes the direction factor of the filter, 1 ≤ θ ≤ 4;
4.-2. Compute the phase congruency feature of each pixel in {L_org(x, y)} at the different directions, denoting the phase congruency feature of the pixel at coordinate position (x, y) in direction θ as PC_θ(x, y);
4.-3. According to the direction corresponding to the maximum phase congruency feature of each pixel in {L_org(x, y)}, compute the local phase feature and local amplitude feature of each pixel in {L_org(x, y)}: for the pixel at coordinate position (x, y), first find the maximum of its phase congruency features PC_θ(x, y) over the different directions, then find the direction corresponding to this maximum, denoted θ_m, and finally compute the local phase feature and local amplitude feature of the pixel from its even-symmetric and odd-symmetric frequency responses at the different scales in the direction θ_m, where arctan() denotes the arctangent function;
4.-4. Following the operations of steps 4.-1 to 4.-3 for obtaining the local phase and local amplitude features of each pixel in {L_org(x, y)}, obtain in the same manner the local phase feature and local amplitude feature of each pixel in {R_org(x, y)}, {L_dis(x, y)} and {R_dis(x, y)}.
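Step 4.-3 derives the two features from the even- and odd-symmetric responses summed over the scales in the dominant direction θ_m; the exact formulas appear as images in the source. The sketch below uses the standard definitions from the phase congruency literature — local phase as the arctangent of the summed odd response over the summed even response, local amplitude as the energy of the two sums — which is an assumption consistent with the text's mention of the arctangent function:

```python
import math

def local_phase_amplitude(e_scales, o_scales):
    """Given the even-symmetric responses e_scales and the odd-symmetric
    responses o_scales of one pixel over the scales alpha = 1..4 in the
    dominant direction theta_m, return (local phase, local amplitude).
    Assumed forms: LP = atan2(sum_o, sum_e), LA = sqrt(sum_e^2 + sum_o^2)."""
    se = sum(e_scales)
    so = sum(o_scales)
    return math.atan2(so, se), math.hypot(se, so)

lp, la = local_phase_amplitude([1.0, 1.0], [0.0, 0.0])
print(lp, la)  # 0.0 2.0 — zero phase, amplitude 2 for a purely even response
```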
Compared with the prior art, the invention has the following advantages:
1) The method of the invention takes into account that the perception of stereoscopic depth responds differently in different regions: the stereo image is divided into an occlusion region, a binocular suppression region and a binocular fusion region, each region is evaluated separately, and the evaluation results are fused into a final score, making the evaluation result conform better to the human visual system.
2) The method of the invention computes the local phase feature and local amplitude feature of each pixel in the left and right viewpoint images of the stereo image to obtain an objective evaluation metric for each pixel; because the obtained local phase and local amplitude features are highly stable and reflect quality changes of the stereo image well, the correlation between objective evaluation results and subjective perception is effectively improved.
3) The method of the invention obtains the binocular just-noticeable-difference image according to the stereoscopic vision characteristics of the human eye, and weights the objective evaluation metrics of the pixels in the occlusion region, binocular suppression region and binocular fusion region to different degrees, making the evaluation result conform better to the human visual system and improving the correlation between objective evaluation results and subjective perception.
Description of drawings
Fig. 1 is the overall implementation block diagram of the method of the invention;
Fig. 2a is the left viewpoint image of the Akko stereo image (size 640 × 480);
Fig. 2b is the right viewpoint image of the Akko stereo image (size 640 × 480);
Fig. 3a is the left viewpoint image of the Altmoabit stereo image (size 1024 × 768);
Fig. 3b is the right viewpoint image of the Altmoabit stereo image (size 1024 × 768);
Fig. 4a is the left viewpoint image of the Balloons stereo image (size 1024 × 768);
Fig. 4b is the right viewpoint image of the Balloons stereo image (size 1024 × 768);
Fig. 5a is the left viewpoint image of the Doorflower stereo image (size 1024 × 768);
Fig. 5b is the right viewpoint image of the Doorflower stereo image (size 1024 × 768);
Fig. 6a is the left viewpoint image of the Kendo stereo image (size 1024 × 768);
Fig. 6b is the right viewpoint image of the Kendo stereo image (size 1024 × 768);
Fig. 7a is the left viewpoint image of the LeaveLaptop stereo image (size 1024 × 768);
Fig. 7b is the right viewpoint image of the LeaveLaptop stereo image (size 1024 × 768);
Fig. 8a is the left viewpoint image of the Lovebird1 stereo image (size 1024 × 768);
Fig. 8b is the right viewpoint image of the Lovebird1 stereo image (size 1024 × 768);
Fig. 9a is the left viewpoint image of the Newspaper stereo image (size 1024 × 768);
Fig. 9b is the right viewpoint image of the Newspaper stereo image (size 1024 × 768);
Fig. 10a is the left viewpoint image of the Puppy stereo image (size 720 × 480);
Fig. 10b is the right viewpoint image of the Puppy stereo image (size 720 × 480);
Fig. 11a is the left viewpoint image of the Soccer2 stereo image (size 720 × 480);
Fig. 11b is the right viewpoint image of the Soccer2 stereo image (size 720 × 480);
Fig. 12a is the left viewpoint image of the Horse stereo image (size 720 × 480);
Fig. 12b is the right viewpoint image of the Horse stereo image (size 720 × 480);
Fig. 13a is the left viewpoint image of the Xmas stereo image (size 640 × 480);
Fig. 13b is the right viewpoint image of the Xmas stereo image (size 640 × 480);
Fig. 14 is a scatter plot of the objective image quality prediction value against the mean subjective score difference for each distorted stereo image in the distorted stereo image set.
Embodiment
The present invention is described in further detail below in conjunction with the accompanying drawings and an embodiment.
The visual perception-based objective quality evaluation method for stereoscopic images proposed by the invention has the overall implementation block diagram shown in Fig. 1; it comprises the following steps:
1. Let S_org denote the original undistorted stereo image and S_dis the distorted stereo image to be evaluated. Denote the left and right viewpoint images of S_org as {L_org(x, y)} and {R_org(x, y)}, and the left and right viewpoint images of S_dis as {L_dis(x, y)} and {R_dis(x, y)}, where (x, y) denotes the coordinate position of a pixel in the left or right viewpoint image, 1 ≤ x ≤ W and 1 ≤ y ≤ H, W denotes the width of the left and right viewpoint images, H denotes their height, and L_org(x, y), R_org(x, y), L_dis(x, y) and R_dis(x, y) denote the pixel values of the pixels at coordinate position (x, y) in {L_org(x, y)}, {R_org(x, y)}, {L_dis(x, y)} and {R_dis(x, y)}, respectively.
2. Studies of the human visual system show that the human eye is insensitive to small changes of an attribute, or to noise, in an image unless the intensity of the change exceeds a certain threshold; this threshold is the just noticeable difference (JND). Moreover, the visual masking effect of the human eye is a local effect influenced by factors such as background illuminance and texture complexity: the brighter the background and the more complex the texture, the higher the threshold. The invention therefore uses the visual masking effects of human stereoscopic perception with respect to background luminance and contrast to extract the binocular just-noticeable-difference images of {L_dis(x, y)} and {R_dis(x, y)}, in which the pixel value at coordinate position (x, y) gives, for the corresponding view, the minimum change that is perceptible at that pixel.
In this embodiment, the detailed process of step 2. is:
2.-1. Compute the visual threshold set of the luminance masking effect of {L_dis(x, y)}, denoted {T_l(x, y)}, where T_l(x, y) denotes the visual threshold of the luminance masking effect of the pixel at coordinate position (x, y) in {L_dis(x, y)}, and bg_l(x, y) denotes the average luminance of all pixels in the 5 × 5 window centered on the pixel at coordinate position (x, y) in {L_dis(x, y)}.
2.-2. Compute the visual threshold set of the contrast masking effect of {L_dis(x, y)}, denoted {T_c(x, y)}, with T_c(x, y) = K(bg_l(x, y)) + eh_l(x, y), where T_c(x, y) denotes the visual threshold of the contrast masking effect of the pixel at coordinate position (x, y) in {L_dis(x, y)}, eh_l(x, y) denotes the average gradient value obtained by applying horizontal and vertical edge filters to the pixel at coordinate position (x, y) in {L_dis(x, y)}, and K(bg_l(x, y)) = −10^(−6) × (0.7 × bg_l(x, y)^2 + 32 × bg_l(x, y)) + 0.07.
2.-3. Merge the visual threshold set {T_l(x, y)} of the luminance masking effect of {L_dis(x, y)} with the visual threshold set {T_c(x, y)} of its contrast masking effect to obtain the binocular just-noticeable-difference image of {L_dis(x, y)}.
2.-4. Compute the visual threshold set of the luminance masking effect of {R_dis(x, y)}, denoted {T_r(x, y)}, where T_r(x, y) denotes the visual threshold of the luminance masking effect of the pixel at coordinate position (x, y) in {R_dis(x, y)}, and bg_r(x, y) denotes the average luminance of all pixels in the 5 × 5 window centered on the pixel at coordinate position (x, y) in {R_dis(x, y)}.
2.-5. Compute the visual threshold set of the contrast masking effect of {R_dis(x, y)}, denoted {T_c′(x, y)}, with T_c′(x, y) = K(bg_r(x, y)) + eh_r(x, y), where T_c′(x, y) denotes the visual threshold of the contrast masking effect of the pixel at coordinate position (x, y) in {R_dis(x, y)}, eh_r(x, y) denotes the average gradient value obtained by applying horizontal and vertical edge filters to the pixel at coordinate position (x, y) in {R_dis(x, y)}, and K(bg_r(x, y)) = −10^(−6) × (0.7 × bg_r(x, y)^2 + 32 × bg_r(x, y)) + 0.07.
2.-6. Merge the visual threshold set {T_r(x, y)} of the luminance masking effect of {R_dis(x, y)} with the visual threshold set {T_c′(x, y)} of its contrast masking effect to obtain the binocular just-noticeable-difference image of {R_dis(x, y)}.
3. Using a region detection algorithm, obtain the area type of each pixel in {L_dis(x, y)} and {R_dis(x, y)}, denoted p, where p ∈ {1, 2, 3}: p=1 denotes the occlusion region, p=2 the binocular suppression region and p=3 the binocular fusion region. The pixels of area type p=1 in {L_dis(x, y)} then constitute its occlusion region, the pixels of area type p=2 its binocular suppression region and the pixels of area type p=3 its binocular fusion region; likewise, the pixels of area types p=1, p=2 and p=3 in {R_dis(x, y)} constitute its occlusion region, binocular suppression region and binocular fusion region.
In this embodiment, the detailed process of obtaining the area type of each pixel in {L_dis(x, y)} and {R_dis(x, y)} with the region detection algorithm in step 3. is:
3.-1. Use a block matching algorithm to compute the disparity image between {L_org(x, y)} and {R_org(x, y)}, denoted d_L^org(x, y), where d_L^org(x, y) denotes the pixel value of the pixel whose coordinate position is (x, y) in the disparity image.
3.-2. Use a block matching algorithm to compute the disparity image between {L_dis(x, y)} and {R_dis(x, y)}, denoted d_L^dis(x, y), where d_L^dis(x, y) denotes the pixel value of the pixel whose coordinate position is (x, y) in the disparity image.
3.-3. Judge whether the pixel value d_L^dis(x1, y1) of the pixel whose coordinate position is (x1, y1) is 255, the marker for pixels whose disparity matching failed. If so, mark the area type of the pixel whose coordinate position is (x1, y1) in {L_dis(x, y)} as p = 1 and go to step 3.-6; otherwise go to step 3.-4; here 1 ≤ x1 ≤ W, 1 ≤ y1 ≤ H.
3.-4. Judge whether the binocular difference value at coordinate position (x1, y1) is greater than the binocular just-noticeable-difference threshold J_L(x1, y1) at the same position. If so, mark the area type of the pixel whose coordinate position is (x1, y1) in {L_dis(x, y)} as p = 2 and go to step 3.-6; otherwise go to step 3.-5.
3.-5. Judge whether the binocular difference value at coordinate position (x1, y1) is less than or equal to the binocular just-noticeable-difference threshold J_L(x1, y1); if so, mark the area type of the pixel whose coordinate position is (x1, y1) in {L_dis(x, y)} as p = 3.
3.-6. Return to step 3.-3 and continue to determine the area type of the remaining pixels in {L_dis(x, y)}, until the area types of all pixels in {L_dis(x, y)} have been determined.
3.-7. Use a block matching algorithm to compute the disparity image between {R_org(x, y)} and {L_org(x, y)}, denoted d_R^org(x, y), where d_R^org(x, y) denotes the pixel value of the pixel whose coordinate position is (x, y) in the disparity image.
3.-8. Use a block matching algorithm to compute the disparity image between {R_dis(x, y)} and {L_dis(x, y)}, denoted d_R^dis(x, y), where d_R^dis(x, y) denotes the pixel value of the pixel whose coordinate position is (x, y) in the disparity image.
3.-9. Judge whether the pixel value d_R^dis(x1, y1) of the pixel whose coordinate position is (x1, y1) is 255. If so, mark the area type of the pixel whose coordinate position is (x1, y1) in {R_dis(x, y)} as p = 1 and go to step 3.-12; otherwise go to step 3.-10; here 1 ≤ x1 ≤ W, 1 ≤ y1 ≤ H.
3.-10. Judge whether the binocular difference value at coordinate position (x1, y1) is greater than the binocular just-noticeable-difference threshold J_R(x1, y1) at the same position. If so, mark the area type of the pixel whose coordinate position is (x1, y1) in {R_dis(x, y)} as p = 2 and go to step 3.-12; otherwise go to step 3.-11.
3.-11. Judge whether the binocular difference value at coordinate position (x1, y1) is less than or equal to the binocular just-noticeable-difference threshold J_R(x1, y1); if so, mark the area type of the pixel whose coordinate position is (x1, y1) in {R_dis(x, y)} as p = 3.
3.-12. Return to step 3.-9 and continue to determine the area type of the remaining pixels in {R_dis(x, y)}, until the area types of all pixels in {R_dis(x, y)} have been determined.
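The classification logic of steps 3.-3 to 3.-6 can be sketched as follows. The exact quantities compared in steps 3.-4 and 3.-5 are given only as formula images in the source, so an inter-view difference map and the binocular JND map from step 2. are assumed here, and all function and variable names are illustrative.

```python
import numpy as np

def classify_regions(disparity_dis, interview_diff, jnd, occlusion_marker=255):
    """Per-pixel region types following steps 3.-3 to 3.-6.

    p=1 occlusion, p=2 binocular suppression, p=3 binocular fusion.
    `interview_diff` and `jnd` stand in for the compared quantities,
    whose exact formulas appear only as images in the source.
    """
    p = np.empty(disparity_dis.shape, dtype=np.uint8)
    occluded = disparity_dis == occlusion_marker        # step 3.-3
    suppressed = ~occluded & (interview_diff > jnd)     # step 3.-4
    fused = ~occluded & ~suppressed                     # step 3.-5
    p[occluded] = 1
    p[suppressed] = 2
    p[fused] = 3
    return p
```

The same routine serves both viewpoints, once with the left-image inputs and once with the right-image inputs.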
Here, the block matching algorithm is the classic existing block matching algorithm. Its basic idea is to partition the image into small blocks; for each block of the left viewpoint image (respectively, the right viewpoint image), find the block with the highest correlation in the right viewpoint image (respectively, the left viewpoint image); the spatial displacement between the two blocks is the disparity.
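A minimal sketch of such a classic block matching algorithm, using sum of absolute differences (SAD) as the correlation measure; the block size, search range and SAD criterion are illustrative choices, not taken from the patent.

```python
import numpy as np

def block_matching_disparity(left, right, block=8, max_disp=32, unmatched=255):
    """For each block of the left image, search along the same row for the
    minimum-SAD block in the right image; the horizontal displacement is
    the disparity. Blocks with no admissible candidate keep the value
    `unmatched` (255), matching the occlusion marker tested in step 3.-3."""
    h, w = left.shape
    disp = np.full((h, w), unmatched, dtype=np.float64)
    for by in range(0, h - block + 1, block):
        for bx in range(0, w - block + 1, block):
            ref = left[by:by + block, bx:bx + block].astype(np.float64)
            best_d, best_cost = None, np.inf
            for d in range(0, min(max_disp, bx) + 1):
                cand = right[by:by + block, bx - d:bx - d + block].astype(np.float64)
                cost = np.abs(ref - cand).sum()   # sum of absolute differences
                if cost < best_cost:
                    best_cost, best_d = cost, d
            if best_d is not None:
                disp[by:by + block, bx:bx + block] = best_d
    return disp
```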
4. Compute the local phase feature and local amplitude feature of each pixel in {L_org(x, y)}, {R_org(x, y)}, {L_dis(x, y)} and {R_dis(x, y)}. The local phase features and local amplitude features of all pixels in {L_org(x, y)} are represented as the sets {LP_org^L(x, y)} and {LA_org^L(x, y)}, those of {R_org(x, y)} as {LP_org^R(x, y)} and {LA_org^R(x, y)}, those of {L_dis(x, y)} as {LP_dis^L(x, y)} and {LA_dis^L(x, y)}, and those of {R_dis(x, y)} as {LP_dis^R(x, y)} and {LA_dis^R(x, y)}, where LP_org^L(x, y) and LA_org^L(x, y) denote the local phase feature and the local amplitude feature of the pixel whose coordinate position is (x, y) in {L_org(x, y)}, and likewise for the other sets.
In this specific embodiment, the acquisition process in step 4. of the local phase feature and local amplitude feature of each pixel in {L_org(x, y)}, {R_org(x, y)}, {L_dis(x, y)} and {R_dis(x, y)} is:
4.-1. Apply the phase congruency transform to each pixel in {L_org(x, y)} to obtain the even-symmetric and odd-symmetric frequency responses of each pixel at different scales and orientations; the even-symmetric frequency response of the pixel whose coordinate position is (x, y) at scale α and orientation θ is denoted e_α,θ(x, y), and the odd-symmetric frequency response is denoted o_α,θ(x, y), where α denotes the scale factor of the filter, 1 ≤ α ≤ 4, and θ denotes the orientation factor of the filter, 1 ≤ θ ≤ 4.
4.-2. Compute the phase congruency feature of each pixel in {L_org(x, y)} at each orientation; the phase congruency feature of the pixel whose coordinate position is (x, y) at orientation θ is denoted PC_θ(x, y).
4.-3. Compute the local phase feature and local amplitude feature of each pixel in {L_org(x, y)} according to the orientation corresponding to its maximum phase congruency feature. For the pixel whose coordinate position is (x, y) in {L_org(x, y)}: first find the maximum among its phase congruency features PC_θ(x, y) over the different orientations; next find the orientation corresponding to this maximum phase congruency feature, denoted θ_m; then from θ_m compute the local phase feature and local amplitude feature of the pixel, denoted LP_org^L(x, y) and LA_org^L(x, y) respectively, where the even-symmetric and odd-symmetric frequency responses of the pixel at the different scales are taken at the orientation θ_m corresponding to the maximum phase congruency feature, and arctan(·) is the arctangent function.
4.-4. Following the operations of steps 4.-1 to 4.-3 for obtaining the local phase feature and local amplitude feature of each pixel in {L_org(x, y)}, obtain in the same manner the local phase feature and local amplitude feature of each pixel in {R_org(x, y)}, {L_dis(x, y)} and {R_dis(x, y)}.
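Steps 4.-1 to 4.-3 can be sketched with log-Gabor filters, a standard choice for the phase congruency transform. The patent does not specify its filter parameters, so the center frequencies and bandwidths below are assumptions, and summed local energy per orientation is used as a stand-in for locating the maximum phase congruency orientation θ_m.

```python
import numpy as np

def local_phase_amplitude(img, scales=4, orients=4):
    """Per-pixel local phase LP and local amplitude LA, taken at the
    orientation with the largest summed energy over scales (a proxy for
    the maximum phase congruency orientation of step 4.-3)."""
    h, w = img.shape
    F = np.fft.fft2(img)
    fy = np.fft.fftfreq(h)[:, None]
    fx = np.fft.fftfreq(w)[None, :]
    radius = np.hypot(fx, fy)
    radius[0, 0] = 1.0                      # avoid log(0) at DC
    angle = np.arctan2(fy, fx)
    e_sum = np.zeros((orients, h, w))
    o_sum = np.zeros((orients, h, w))
    for t in range(orients):
        theta0 = t * np.pi / orients
        dtheta = np.abs(np.arctan2(np.sin(angle - theta0), np.cos(angle - theta0)))
        spread = np.exp(-dtheta**2 / (2 * (np.pi / (2 * orients))**2))
        for s in range(scales):
            f0 = 0.25 / (2.0**s)            # assumed center frequency per scale
            loggabor = np.exp(-np.log(radius / f0)**2 / (2 * np.log(0.55)**2))
            loggabor[0, 0] = 0.0
            resp = np.fft.ifft2(F * loggabor * spread)
            e_sum[t] += resp.real           # even-symmetric response e_{α,θ}
            o_sum[t] += resp.imag           # odd-symmetric response o_{α,θ}
    energy = np.hypot(e_sum, o_sum)
    tm = np.argmax(energy, axis=0)          # stand-in for θ_m
    idx = (tm, np.arange(h)[:, None], np.arange(w)[None, :])
    LP = np.arctan2(o_sum[idx], e_sum[idx])     # local phase feature
    LA = np.hypot(e_sum[idx], o_sum[idx])       # local amplitude feature
    return LP, LA
```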
5. From the local phase feature and local amplitude feature of each pixel in {L_org(x, y)} and {L_dis(x, y)}, compute the objective evaluation metric of each pixel in {L_dis(x, y)}, and represent the objective evaluation metrics of all pixels in {L_dis(x, y)} as the set {Q_L(x, y)}. From the local phase feature and local amplitude feature of each pixel in {R_org(x, y)} and {R_dis(x, y)}, compute the objective evaluation metric of each pixel in {R_dis(x, y)}, and represent the objective evaluation metrics of all pixels in {R_dis(x, y)} as the set {Q_R(x, y)}. Here Q_L(x, y) denotes the objective evaluation metric of the pixel whose coordinate position is (x, y) in {L_dis(x, y)}, Q_R(x, y) denotes the objective evaluation metric of the pixel whose coordinate position is (x, y) in {R_dis(x, y)}, w_LP, w_LA and b are training parameters, and T1 and T2 are control parameters.
In this specific embodiment, according to the different influence of the local phase feature and the local amplitude feature on stereo image quality change, we take w_LP = 0.9834, w_LA = 0.2915, b = 0, T1 = 0.85 and T2 = 160.
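The per-pixel combination formula of step 5. is not reproduced in this text. The sketch below shows one plausible form under the stated parameters, namely a weighted sum of SSIM-style similarity terms for local phase (stabilized by T1) and local amplitude (stabilized by T2); it is an assumption, not the patent's exact formula.

```python
import numpy as np

# Training parameters and control parameters from the text.
W_LP, W_LA, B = 0.9834, 0.2915, 0.0
T1, T2 = 0.85, 160.0

def objective_metric(lp_org, lp_dis, la_org, la_dis):
    """Hypothetical per-pixel metric: similarity of local phase plus
    similarity of local amplitude, weighted by the training parameters."""
    s_lp = (2 * lp_org * lp_dis + T1) / (lp_org**2 + lp_dis**2 + T1)
    s_la = (2 * la_org * la_dis + T2) / (la_org**2 + la_dis**2 + T2)
    return W_LP * s_lp + W_LA * s_la + B
```

For an undistorted pixel (identical features) both similarity terms equal 1, so the metric reaches its maximum w_LP + w_LA + b.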
6. The occlusion regions in a distorted stereo image consist mainly of pixels for which disparity matching fails, and comprise the occlusion region of the left viewpoint image and the occlusion region of the right viewpoint image; the human visual system perceives such occlusion regions mainly through monocular vision. Therefore, according to the area type of each pixel in {L_dis(x, y)} and {R_dis(x, y)}, and using the visual perception characteristics of the human visual system for occlusion regions, the present invention computes the objective evaluation metric of the occlusion regions in S_dis, denoted Q_nc, where N_L^nc denotes the number of pixels of area type p = 1 in {L_dis(x, y)} and N_R^nc denotes the number of pixels of area type p = 1 in {R_dis(x, y)}.
7. The characteristics of the human visual system show that if the image contents in the corresponding retinal areas of the left and right eyes differ greatly, or the disparity between them is large, the human visual system often cannot perform the binocular fusion operation on the two conflicting pieces of information and instead turns to binocular masking; during binocular masking, the higher-quality viewpoint tends to suppress the lower-quality viewpoint. Therefore, according to the area type of each pixel in {L_dis(x, y)} and {R_dis(x, y)}, and using the visual perception characteristics of the human visual system for binocular suppression regions, the present invention computes the objective evaluation metric of the binocular suppression regions in S_dis, denoted Q_bs, where max(·) is the maximum function.
8. The human visual system shows that if the image contents in the corresponding retinal areas of the left and right eyes differ only slightly, the human visual system performs a binocular superposition (fusion) operation on that region, so that the visual sensitivity of two eyes to the region is about 1.4 times that of a single eye. Therefore, according to the area type of each pixel in {L_dis(x, y)} and {R_dis(x, y)}, and using the visual perception characteristics of the human visual system for binocular fusion regions, the present invention computes the objective evaluation metric of the binocular fusion regions in S_dis, denoted Q_bf.
9. Fuse the objective evaluation metric Q_nc of the occlusion regions in S_dis, the objective evaluation metric Q_bs of the binocular suppression regions in S_dis and the objective evaluation metric Q_bf of the binocular fusion regions in S_dis to obtain the objective image quality prediction value of S_dis, denoted Q: Q = w_nc × Q_nc + w_bs × Q_bs + w_bf × Q_bf, where w_nc, w_bs and w_bf are weighting parameters. In this specific embodiment, we take w_nc = 0.1163, w_bf = 0.4119 and w_bs = 0.4718.
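Steps 6. to 9. can be sketched as follows. The per-region pooling formulas appear only as images in the source, so simple means are assumed here (with max() for the suppression regions, which the text does mention); only the final weighted fusion Q = w_nc × Q_nc + w_bs × Q_bs + w_bf × Q_bf is taken directly from the text.

```python
import numpy as np

# Weighting parameters from the text.
W_NC, W_BS, W_BF = 0.1163, 0.4718, 0.4119

def fuse_quality(q_l, q_r, p_l, p_r):
    """Pool the per-pixel metrics over each region type (p=1 occlusion,
    p=2 suppression, p=3 fusion) in both views, then fuse. The pooling
    rules are assumptions; the weights are the patent's."""
    def mean_over(q, mask):
        return float(q[mask].mean()) if mask.any() else 0.0
    # occlusion: joint mean over both views, normalized by the pixel counts
    q_nc = (q_l[p_l == 1].sum() + q_r[p_r == 1].sum()) / \
           max((p_l == 1).sum() + (p_r == 1).sum(), 1)
    # suppression: the better (dominant) viewpoint wins, per the text's max()
    q_bs = max(mean_over(q_l, p_l == 2), mean_over(q_r, p_r == 2))
    # fusion: both views contribute equally
    q_bf = 0.5 * (mean_over(q_l, p_l == 3) + mean_over(q_r, p_r == 3))
    return W_NC * q_nc + W_BS * q_bs + W_BF * q_bf
```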
In the present embodiment, the 12 undistorted stereo images shown in Figs. 2a and 2b through Figs. 13a and 13b are used to build a set of 312 distorted stereo images covering different degrees of JPEG compression, JPEG2000 compression, Gaussian blur, white noise and H.264 coding distortion, and the correlation between the objective image quality prediction value of each distorted stereo image S_dis obtained by the method of the invention and its mean subjective score difference is analyzed. The set contains 60 JPEG-compressed, 60 JPEG2000-compressed, 60 Gaussian-blurred and 60 white-noise distorted stereo images, and 72 H.264-coded distorted stereo images. The mean subjective score difference of each distorted stereo image in the set is obtained with an existing subjective quality evaluation method and denoted DMOS, DMOS = 100 − MOS, where MOS denotes the mean opinion score and DMOS ∈ [0, 100].
In the present embodiment, four objective parameters commonly used for assessing image quality evaluation methods are used as evaluation indices: the Pearson linear correlation coefficient (PLCC) under a nonlinear regression condition, the Spearman rank-order correlation coefficient (SROCC), the Kendall rank-order correlation coefficient (KROCC), and the root mean squared error (RMSE). PLCC and RMSE reflect the prediction accuracy of the objective model on distorted stereo images, while SROCC and KROCC reflect its prediction monotonicity. The objective image quality prediction values of the distorted stereo images computed by the method of the invention are fitted with a five-parameter logistic function; the higher the PLCC, SROCC and KROCC values and the lower the RMSE value, the better the correlation between the objective evaluation method and the mean subjective score differences. The PLCC, SROCC, KROCC and RMSE coefficients reflecting the performance of the stereo image objective evaluation model are listed in Table 1. The data in Table 1 show that the correlation between the final objective image quality prediction values of the distorted stereo images obtained by the method of the invention and the mean subjective score differences is very high, indicating that the objective evaluation results agree well with subjective human perception and demonstrating the effectiveness of the method of the invention.
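The evaluation protocol above can be sketched as follows, assuming the five-parameter logistic form commonly used in image quality assessment studies (the patent does not spell out its fitting function).

```python
import numpy as np
from scipy import stats
from scipy.optimize import curve_fit

def logistic5(q, b1, b2, b3, b4, b5):
    """Five-parameter logistic mapping from objective scores to DMOS."""
    return b1 * (0.5 - 1.0 / (1.0 + np.exp(b2 * (q - b3)))) + b4 * q + b5

def evaluate(q, dmos):
    """Fit the logistic mapping, then report PLCC/SROCC/KROCC/RMSE."""
    popt, _ = curve_fit(logistic5, q, dmos,
                        p0=[np.ptp(dmos), 1.0, float(np.mean(q)), 0.0,
                            float(np.mean(dmos))],
                        maxfev=10000)
    pred = logistic5(q, *popt)
    plcc = stats.pearsonr(pred, dmos)[0]     # accuracy (after fitting)
    srocc = stats.spearmanr(q, dmos)[0]      # monotonicity
    krocc = stats.kendalltau(q, dmos)[0]     # monotonicity
    rmse = float(np.sqrt(np.mean((pred - dmos) ** 2)))
    return plcc, srocc, krocc, rmse
```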
Fig. 14 shows the scatter plot of the objective image quality prediction values versus the mean subjective score differences of the distorted stereo images in the set; the more concentrated the scatter points, the better the consistency between the objective model and subjective perception. As can be seen from Fig. 14, the scatter plot obtained with the method of the invention is relatively concentrated, and the goodness of fit with the subjective evaluation data is high.
Table 1. Correlation between the objective image quality prediction values of the distorted stereo images obtained with the method of the invention and the subjective scores
Claims (4)
1. A visual-perception-based objective stereo image quality evaluation method, characterized by comprising the following steps:
1. Let S_org be the original undistorted stereo image and S_dis the distorted stereo image to be evaluated. Denote the left viewpoint image of S_org as {L_org(x, y)} and the right viewpoint image of S_org as {R_org(x, y)}; denote the left viewpoint image of S_dis as {L_dis(x, y)} and the right viewpoint image of S_dis as {R_dis(x, y)}. Here (x, y) denotes the coordinate position of a pixel in the left and right viewpoint images, 1 ≤ x ≤ W, 1 ≤ y ≤ H, W denotes the width of the left and right viewpoint images, H denotes the height of the left and right viewpoint images, L_org(x, y) denotes the pixel value of the pixel whose coordinate position is (x, y) in {L_org(x, y)}, R_org(x, y) denotes the pixel value of the pixel whose coordinate position is (x, y) in {R_org(x, y)}, L_dis(x, y) denotes the pixel value of the pixel whose coordinate position is (x, y) in {L_dis(x, y)}, and R_dis(x, y) denotes the pixel value of the pixel whose coordinate position is (x, y) in {R_dis(x, y)};
2. Using the visual masking effects of background illumination and contrast in human stereoscopic visual perception, extract the binocular just-noticeable difference (JND) image of {L_dis(x, y)} and the binocular JND image of {R_dis(x, y)}, denoted {J_L(x, y)} and {J_R(x, y)} respectively, where J_L(x, y) denotes the pixel value of the pixel whose coordinate position is (x, y) in the binocular JND image {J_L(x, y)} of {L_dis(x, y)}, and J_R(x, y) denotes the pixel value of the pixel whose coordinate position is (x, y) in the binocular JND image {J_R(x, y)} of {R_dis(x, y)};
3. Using a region detection algorithm, obtain the area type of each pixel in {L_dis(x, y)} and {R_dis(x, y)}, denoted p, where p ∈ {1, 2, 3}: p = 1 denotes the occlusion region, p = 2 denotes the binocular suppression region, and p = 3 denotes the binocular fusion region. Then the occlusion region formed by all pixels of type p = 1 in {L_dis(x, y)} is denoted Ω_L^nc, the binocular suppression region formed by all pixels of type p = 2 in {L_dis(x, y)} is denoted Ω_L^bs, the binocular fusion region formed by all pixels of type p = 3 in {L_dis(x, y)} is denoted Ω_L^bf, the occlusion region formed by all pixels of type p = 1 in {R_dis(x, y)} is denoted Ω_R^nc, the binocular suppression region formed by all pixels of type p = 2 in {R_dis(x, y)} is denoted Ω_R^bs, and the binocular fusion region formed by all pixels of type p = 3 in {R_dis(x, y)} is denoted Ω_R^bf;
4. Compute the local phase feature and local amplitude feature of each pixel in {L_org(x, y)}, {R_org(x, y)}, {L_dis(x, y)} and {R_dis(x, y)}. The local phase features and local amplitude features of all pixels in {L_org(x, y)} are represented as the sets {LP_org^L(x, y)} and {LA_org^L(x, y)}, those of {R_org(x, y)} as {LP_org^R(x, y)} and {LA_org^R(x, y)}, those of {L_dis(x, y)} as {LP_dis^L(x, y)} and {LA_dis^L(x, y)}, and those of {R_dis(x, y)} as {LP_dis^R(x, y)} and {LA_dis^R(x, y)}, where LP_org^L(x, y) and LA_org^L(x, y) denote the local phase feature and the local amplitude feature of the pixel whose coordinate position is (x, y) in {L_org(x, y)}, and likewise for the other sets;
5. From the local phase feature and local amplitude feature of each pixel in {L_org(x, y)} and {L_dis(x, y)}, compute the objective evaluation metric of each pixel in {L_dis(x, y)}, and represent the objective evaluation metrics of all pixels in {L_dis(x, y)} as the set {Q_L(x, y)}; from the local phase feature and local amplitude feature of each pixel in {R_org(x, y)} and {R_dis(x, y)}, compute the objective evaluation metric of each pixel in {R_dis(x, y)}, and represent the objective evaluation metrics of all pixels in {R_dis(x, y)} as the set {Q_R(x, y)}; here Q_L(x, y) denotes the objective evaluation metric of the pixel whose coordinate position is (x, y) in {L_dis(x, y)}, Q_R(x, y) denotes the objective evaluation metric of the pixel whose coordinate position is (x, y) in {R_dis(x, y)}, w_LP, w_LA and b are training parameters, and T1 and T2 are control parameters;
6. According to the area type of each pixel in {L_dis(x, y)} and {R_dis(x, y)}, and using the visual perception characteristics of the human visual system for occlusion regions, compute the objective evaluation metric of the occlusion regions in S_dis, denoted Q_nc, where N_L^nc denotes the number of pixels of area type p = 1 in {L_dis(x, y)} and N_R^nc denotes the number of pixels of area type p = 1 in {R_dis(x, y)};
7. According to the area type of each pixel in {L_dis(x, y)} and {R_dis(x, y)}, and using the visual perception characteristics of the human visual system for binocular suppression regions, compute the objective evaluation metric of the binocular suppression regions in S_dis, denoted Q_bs, where max(·) is the maximum function;
8. According to the area type of each pixel in {L_dis(x, y)} and {R_dis(x, y)}, and using the visual perception characteristics of the human visual system for binocular fusion regions, compute the objective evaluation metric of the binocular fusion regions in S_dis, denoted Q_bf;
9. Fuse the objective evaluation metric Q_nc of the occlusion regions in S_dis, the objective evaluation metric Q_bs of the binocular suppression regions in S_dis and the objective evaluation metric Q_bf of the binocular fusion regions in S_dis to obtain the objective image quality prediction value of S_dis, denoted Q: Q = w_nc × Q_nc + w_bs × Q_bs + w_bf × Q_bf, where w_nc, w_bs and w_bf are weighting parameters.
2. The visual-perception-based objective stereo image quality evaluation method according to claim 1, characterized in that the detailed process of said step 2. is:
2.-1. compute the visual threshold set of the luminance masking effect of {L_dis(x, y)}, denoted {T_l(x, y)}, where T_l(x, y) denotes the visual threshold of the luminance masking effect of the pixel whose coordinate position is (x, y) in {L_dis(x, y)}, and bg_l(x, y) denotes the mean luminance of all pixels in a 5 × 5 window centered on the pixel whose coordinate position is (x, y) in {L_dis(x, y)};
2.-2. compute the visual threshold set of the contrast masking effect of {L_dis(x, y)}, denoted {T_c(x, y)}, T_c(x, y) = K(bg_l(x, y)) + eh_l(x, y), where T_c(x, y) denotes the visual threshold of the contrast masking effect of the pixel whose coordinate position is (x, y) in {L_dis(x, y)}, eh_l(x, y) denotes the mean gradient value obtained by separately edge-filtering the pixel whose coordinate position is (x, y) in {L_dis(x, y)} in the horizontal and vertical directions, and K(bg_l(x, y)) = −10^−6 × (0.7 × bg_l(x, y)^2 + 32 × bg_l(x, y)) + 0.07;
2.-3. combine the visual threshold set {T_l(x, y)} of the luminance masking effect and the visual threshold set {T_c(x, y)} of the contrast masking effect of {L_dis(x, y)} to obtain the binocular JND image {J_L(x, y)} of {L_dis(x, y)};
2.-4. compute the visual threshold set of the luminance masking effect of {R_dis(x, y)}, denoted {T_r(x, y)}, where T_r(x, y) denotes the visual threshold of the luminance masking effect of the pixel whose coordinate position is (x, y) in {R_dis(x, y)}, and bg_r(x, y) denotes the mean luminance of all pixels in a 5 × 5 window centered on the pixel whose coordinate position is (x, y) in {R_dis(x, y)};
2.-5. compute the visual threshold set of the contrast masking effect of {R_dis(x, y)}, denoted {T_c'(x, y)}, T_c'(x, y) = K(bg_r(x, y)) + eh_r(x, y), where T_c'(x, y) denotes the visual threshold of the contrast masking effect of the pixel whose coordinate position is (x, y) in {R_dis(x, y)}, eh_r(x, y) denotes the mean gradient value obtained by separately edge-filtering the pixel whose coordinate position is (x, y) in {R_dis(x, y)} in the horizontal and vertical directions, and K(bg_r(x, y)) = −10^−6 × (0.7 × bg_r(x, y)^2 + 32 × bg_r(x, y)) + 0.07; the two threshold sets are then combined to obtain the binocular JND image {J_R(x, y)} of {R_dis(x, y)}.
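The contrast masking threshold of claim 2 is fully specified (T_c = K(bg) + eh with K(bg) = −10^−6 × (0.7 × bg^2 + 32 × bg) + 0.07), while the luminance threshold formula and the combination rule appear only as images in the source; the sketch below therefore assumes a standard Chou-Li-style luminance threshold and a per-pixel maximum for the combination.

```python
import numpy as np

def binocular_jnd(img):
    """Per-pixel luminance threshold T_l, contrast threshold T_c, and an
    assumed combined JND map. Only T_c's formula is taken from the claim;
    T_l and the max-combination are assumptions."""
    img = img.astype(np.float64)
    h, w = img.shape
    pad = np.pad(img, 2, mode='edge')
    # bg: mean luminance over the 5x5 neighbourhood (claim 2, steps 2.-1/2.-4)
    bg = np.zeros_like(img)
    for dy in range(5):
        for dx in range(5):
            bg += pad[dy:dy + h, dx:dx + w]
    bg /= 25.0
    # assumed Chou-Li-style luminance masking threshold
    t_l = np.where(bg <= 127,
                   17.0 * (1.0 - np.sqrt(bg / 127.0)) + 3.0,
                   3.0 / 128.0 * (bg - 127.0) + 3.0)
    # eh: mean of horizontal and vertical gradient magnitudes
    gx = np.abs(np.gradient(img, axis=1))
    gy = np.abs(np.gradient(img, axis=0))
    eh = 0.5 * (gx + gy)
    # T_c = K(bg) + eh, with K(bg) as given in the claim
    k = -1e-6 * (0.7 * bg**2 + 32.0 * bg) + 0.07
    t_c = k + eh
    return t_l, t_c, np.maximum(t_l, t_c)   # assumed combination
```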
3. The visual-perception-based objective stereo image quality evaluation method according to claim 1 or 2, characterized in that the detailed process in said step 3. of using the region detection algorithm to obtain the area type of each pixel in {L_dis(x, y)} and {R_dis(x, y)} is:
3.-1. use a block matching algorithm to compute the disparity image between {L_org(x, y)} and {R_org(x, y)}, denoted d_L^org(x, y), where d_L^org(x, y) denotes the pixel value of the pixel whose coordinate position is (x, y) in the disparity image;
3.-2. use a block matching algorithm to compute the disparity image between {L_dis(x, y)} and {R_dis(x, y)}, denoted d_L^dis(x, y), where d_L^dis(x, y) denotes the pixel value of the pixel whose coordinate position is (x, y) in the disparity image;
3.-3. judge whether the pixel value d_L^dis(x1, y1) of the pixel whose coordinate position is (x1, y1) is 255; if so, mark the area type of the pixel whose coordinate position is (x1, y1) in {L_dis(x, y)} as p = 1 and go to step 3.-6; otherwise go to step 3.-4; here 1 ≤ x1 ≤ W, 1 ≤ y1 ≤ H;
3.-4. judge whether the binocular difference value at coordinate position (x1, y1) is greater than the binocular JND threshold J_L(x1, y1); if so, mark the area type of the pixel whose coordinate position is (x1, y1) in {L_dis(x, y)} as p = 2 and go to step 3.-6; otherwise go to step 3.-5;
3.-5. judge whether the binocular difference value at coordinate position (x1, y1) is less than or equal to the binocular JND threshold J_L(x1, y1); if so, mark the area type of the pixel whose coordinate position is (x1, y1) in {L_dis(x, y)} as p = 3;
3.-6. return to step 3.-3 and continue to determine the area type of the remaining pixels in {L_dis(x, y)}, until the area types of all pixels in {L_dis(x, y)} have been determined;
3.-7. use a block matching algorithm to compute the disparity image between {R_org(x, y)} and {L_org(x, y)}, denoted d_R^org(x, y), where d_R^org(x, y) denotes the pixel value of the pixel whose coordinate position is (x, y) in the disparity image;
3.-8. use a block matching algorithm to compute the disparity image between {R_dis(x, y)} and {L_dis(x, y)}, denoted d_R^dis(x, y), where d_R^dis(x, y) denotes the pixel value of the pixel whose coordinate position is (x, y) in the disparity image;
3.-9. judge whether the pixel value d_R^dis(x1, y1) of the pixel whose coordinate position is (x1, y1) is 255; if so, mark the area type of the pixel whose coordinate position is (x1, y1) in {R_dis(x, y)} as p = 1 and go to step 3.-12; otherwise go to step 3.-10; here 1 ≤ x1 ≤ W, 1 ≤ y1 ≤ H;
3.-10. judge whether the binocular difference value at coordinate position (x1, y1) is greater than the binocular JND threshold J_R(x1, y1); if so, mark the area type of the pixel whose coordinate position is (x1, y1) in {R_dis(x, y)} as p = 2 and go to step 3.-12; otherwise go to step 3.-11;
3.-11. judge whether the binocular difference value at coordinate position (x1, y1) is less than or equal to the binocular JND threshold J_R(x1, y1); if so, mark the area type of the pixel whose coordinate position is (x1, y1) in {R_dis(x, y)} as p = 3;
3.-12. return to step 3.-9 and continue to determine the area type of the remaining pixels in {R_dis(x, y)}, until the area types of all pixels in {R_dis(x, y)} have been determined.
4. The visual-perception-based objective stereo image quality evaluation method according to claim 3, characterized in that the acquisition process in said step 4. of the local phase feature and local amplitude feature of each pixel in {L_org(x, y)}, {R_org(x, y)}, {L_dis(x, y)} and {R_dis(x, y)} is:
4.-1. apply the phase congruency transform to each pixel in {L_org(x, y)} to obtain the even-symmetric and odd-symmetric frequency responses of each pixel at different scales and orientations; the even-symmetric frequency response of the pixel whose coordinate position is (x, y) at scale α and orientation θ is denoted e_α,θ(x, y), and the odd-symmetric frequency response is denoted o_α,θ(x, y), where α denotes the scale factor of the filter, 1 ≤ α ≤ 4, and θ denotes the orientation factor of the filter, 1 ≤ θ ≤ 4;
4.-2. compute the phase congruency feature of each pixel in {L_org(x, y)} at each orientation; the phase congruency feature of the pixel whose coordinate position is (x, y) at orientation θ is denoted PC_θ(x, y);
4.-3. compute the local phase feature and local amplitude feature of each pixel in {L_org(x, y)} according to the orientation corresponding to its maximum phase congruency feature: for the pixel whose coordinate position is (x, y) in {L_org(x, y)}, first find the maximum among its phase congruency features PC_θ(x, y) over the different orientations, next find the orientation corresponding to this maximum phase congruency feature, denoted θ_m, and then from θ_m compute the local phase feature and local amplitude feature of the pixel, denoted LP_org^L(x, y) and LA_org^L(x, y) respectively, where the even-symmetric and odd-symmetric frequency responses of the pixel at the different scales are taken at the orientation θ_m corresponding to the maximum phase congruency feature, and arctan(·) is the arctangent function;
4.-4. following the operations of steps 4.-1 to 4.-3 for obtaining the local phase feature and local amplitude feature of each pixel in {L_org(x, y)}, obtain in the same manner the local phase feature and local amplitude feature of each pixel in {R_org(x, y)}, {L_dis(x, y)} and {R_dis(x, y)}.
Priority Applications (1)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
CN201210144039.1A CN102708567B (en) | 2012-05-11 | 2012-05-11 | Visual perception-based three-dimensional image quality objective evaluation method |
Publications (2)
Publication Number | Publication Date |
---|---|
CN102708567A true CN102708567A (en) | 2012-10-03 |
CN102708567B CN102708567B (en) | 2014-12-10 |
Family
ID=46901287
Family Applications (1)
Application Number | Title | Priority Date | Filing Date |
---|---|---|---|
CN201210144039.1A Expired - Fee Related CN102708567B (en) | 2012-05-11 | 2012-05-11 | Visual perception-based three-dimensional image quality objective evaluation method |
Country Status (1)
Country | Link |
---|---|
CN (1) | CN102708567B (en) |
Cited By (10)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
CN103096125A (en) * | 2013-02-22 | 2013-05-08 | 吉林大学 | Stereoscopic video visual comfort evaluation method based on region segmentation |
CN103108209A (en) * | 2012-12-28 | 2013-05-15 | 宁波大学 | Stereo image objective quality evaluation method based on integration of visual threshold value and passage |
CN103413298A (en) * | 2013-07-17 | 2013-11-27 | 宁波大学 | Three-dimensional image objective evaluation method based on visual characteristics |
CN103914835A (en) * | 2014-03-20 | 2014-07-09 | 宁波大学 | Non-reference quality evaluation method for fuzzy distortion three-dimensional images |
WO2014113915A1 (en) * | 2013-01-22 | 2014-07-31 | Silicon Image, Inc. | Mechanism for facilitating dynamic phase detection with high jitter tolerance for images of media streams |
CN104574399A (en) * | 2015-01-06 | 2015-04-29 | 天津大学 | Image quality evaluation method based on multi-scale vision significance and gradient magnitude |
CN105376563A (en) * | 2015-11-17 | 2016-03-02 | 浙江科技学院 | No-reference three-dimensional image quality evaluation method based on binocular fusion feature similarity |
CN105828061A (en) * | 2016-05-11 | 2016-08-03 | 宁波大学 | Virtual viewpoint quality evaluation method based on visual masking effect |
CN106022362A (en) * | 2016-05-13 | 2016-10-12 | 天津大学 | Reference-free image quality objective evaluation method for JPEG2000 compression distortion |
CN113362315A (en) * | 2021-06-22 | 2021-09-07 | 中国科学技术大学 | Image quality evaluation method and evaluation model based on multi-algorithm fusion |
Citations (7)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
US20050089246A1 (en) * | 2003-10-27 | 2005-04-28 | Huitao Luo | Assessing image quality |
WO2008115410A2 (en) * | 2007-03-16 | 2008-09-25 | Sti Medical Systems, Llc | A method to provide automated quality feedback to imaging devices to achieve standardized imaging data |
US20090116713A1 (en) * | 2007-10-18 | 2009-05-07 | Michelle Xiao-Hong Yan | Method and system for human vision model guided medical image quality assessment |
CN101833766A (en) * | 2010-05-11 | 2010-09-15 | 天津大学 | Stereo image objective quality evaluation algorithm based on GSSIM |
CN101841726A (en) * | 2010-05-24 | 2010-09-22 | 宁波大学 | Three-dimensional video asymmetrical coding method |
CN101872479A (en) * | 2010-06-09 | 2010-10-27 | 宁波大学 | Three-dimensional image objective quality evaluation method |
CN102271279A (en) * | 2011-07-22 | 2011-12-07 | 宁波大学 | Objective analysis method for just noticeable change step length of stereo images |
- 2012-05-11: CN application CN201210144039.1A granted as patent CN102708567B (en); status: not active, Expired - Fee Related
Cited By (17)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
CN103108209B (en) * | 2012-12-28 | 2015-03-11 | 宁波大学 | Stereo image objective quality evaluation method based on visual threshold and channel fusion
CN103108209A (en) * | 2012-12-28 | 2013-05-15 | 宁波大学 | Stereo image objective quality evaluation method based on visual threshold and channel fusion
CN105122305A (en) * | 2013-01-22 | 2015-12-02 | 美国莱迪思半导体公司 | Mechanism for facilitating dynamic phase detection with high jitter tolerance for images of media streams |
WO2014113915A1 (en) * | 2013-01-22 | 2014-07-31 | Silicon Image, Inc. | Mechanism for facilitating dynamic phase detection with high jitter tolerance for images of media streams |
US8941780B2 (en) | 2013-01-22 | 2015-01-27 | Silicon Image, Inc. | Mechanism for facilitating dynamic phase detection with high jitter tolerance for images of media streams |
CN105122305B (en) * | 2013-01-22 | 2018-10-30 | 美国莱迪思半导体公司 | Mechanism for facilitating dynamic phase detection with high jitter tolerance for images of media streams
US9392145B2 (en) | 2013-01-22 | 2016-07-12 | Lattice Semiconductor Corporation | Mechanism for facilitating dynamic phase detection with high jitter tolerance for images of media streams |
CN103096125A (en) * | 2013-02-22 | 2013-05-08 | 吉林大学 | Stereoscopic video visual comfort evaluation method based on region segmentation |
CN103413298B (en) * | 2013-07-17 | 2016-02-24 | 宁波大学 | Objective stereo image quality evaluation method based on visual characteristics
CN103413298A (en) * | 2013-07-17 | 2013-11-27 | 宁波大学 | Three-dimensional image objective evaluation method based on visual characteristics |
CN103914835A (en) * | 2014-03-20 | 2014-07-09 | 宁波大学 | Non-reference quality evaluation method for fuzzy distortion three-dimensional images |
CN103914835B (en) * | 2014-03-20 | 2016-08-17 | 宁波大学 | Non-reference quality evaluation method for fuzzy distortion three-dimensional images
CN104574399A (en) * | 2015-01-06 | 2015-04-29 | 天津大学 | Image quality evaluation method based on multi-scale vision significance and gradient magnitude |
CN105376563A (en) * | 2015-11-17 | 2016-03-02 | 浙江科技学院 | No-reference three-dimensional image quality evaluation method based on binocular fusion feature similarity |
CN105828061A (en) * | 2016-05-11 | 2016-08-03 | 宁波大学 | Virtual viewpoint quality evaluation method based on visual masking effect |
CN106022362A (en) * | 2016-05-13 | 2016-10-12 | 天津大学 | Reference-free image quality objective evaluation method for JPEG2000 compression distortion |
CN113362315A (en) * | 2021-06-22 | 2021-09-07 | 中国科学技术大学 | Image quality evaluation method and evaluation model based on multi-algorithm fusion |
Also Published As
Publication number | Publication date |
---|---|
CN102708567B (en) | 2014-12-10 |
Similar Documents
Publication | Publication Date | Title |
---|---|---|
CN102708567B (en) | Visual perception-based three-dimensional image quality objective evaluation method | |
CN102333233B (en) | Stereo image quality objective evaluation method based on visual perception | |
CN102595185B (en) | Stereo image quality objective evaluation method | |
CN104394403B (en) | Objective stereoscopic video quality evaluation method for compression artifacts | |
CN103413298B (en) | Objective stereo image quality evaluation method based on visual characteristics | |
CN102209257A (en) | Stereo image quality objective evaluation method | |
CN103136748B (en) | Stereo image quality objective evaluation method based on feature maps | |
CN102843572B (en) | Phase-based stereo image quality objective evaluation method | |
CN104954778A (en) | Objective stereo image quality assessment method based on perception feature set | |
CN103400378A (en) | Method for objectively evaluating quality of three-dimensional image based on visual characteristics of human eyes | |
CN102999911B (en) | Three-dimensional image quality objective evaluation method based on energy diagrams | |
CN104202594A (en) | Video quality evaluation method based on three-dimensional wavelet transform | |
CN103369348B (en) | Three-dimensional image quality objective evaluation method based on regional importance classification | |
CN102903107A (en) | Three-dimensional picture quality objective evaluation method based on feature fusion | |
CN103745457B (en) | Three-dimensional image objective quality evaluation method | |
CN102999912B (en) | Objective stereo image quality evaluation method based on distortion maps | |
CN103108209B (en) | Stereo image objective quality evaluation method based on visual threshold and channel fusion | |
CN102737380B (en) | Stereo image quality objective evaluation method based on gradient structure tensor | |
CN102708568A (en) | Stereoscopic image objective quality evaluation method on basis of structural distortion | |
CN104243974B (en) | Objective stereoscopic video quality evaluation method based on three-dimensional DCT | |
CN103914835A (en) | Non-reference quality evaluation method for fuzzy distortion three-dimensional images | |
CN105898279A (en) | Stereoscopic image quality objective evaluation method | |
CN103200420A (en) | Three-dimensional picture quality objective evaluation method based on three-dimensional visual attention | |
CN105069794A (en) | Binocular rivalry based totally blind stereo image quality evaluation method | |
CN102271279B (en) | Objective analysis method for just noticeable change step length of stereo images |
Legal Events
Date | Code | Title | Description |
---|---|---|---|
C06 | Publication | ||
PB01 | Publication | ||
C10 | Entry into substantive examination | ||
SE01 | Entry into force of request for substantive examination | ||
C14 | Grant of patent or utility model | ||
GR01 | Patent grant | ||
CF01 | Termination of patent right due to non-payment of annual fee | ||
Granted publication date: 2014-12-10; termination date: 2017-05-11