CN104754322A - Stereoscopic video comfort evaluation method and device - Google Patents
- Publication number
- CN104754322A (application CN201310740605.XA)
- Authority
- CN
- China
- Prior art keywords
- frame
- video
- video segment
- stereoscopic video
- disparity
- Prior art date
- Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
- Granted
Links
Classifications
- H—ELECTRICITY · H04—ELECTRIC COMMUNICATION TECHNIQUE · H04N—PICTORIAL COMMUNICATION, e.g. TELEVISION
- H04N13/00—Stereoscopic video systems; Multi-view video systems; Details thereof
- H04N13/10—Processing, recording or transmission of stereoscopic or multi-view image signals
- H04N13/106—Processing image signals
- H04N13/144—Processing image signals for flicker reduction
- H04N17/00—Diagnosis, testing or measuring for television systems or their details
- H04N2013/0074—Stereoscopic image analysis
- H04N2013/0081—Depth or disparity estimation from stereoscopic image signals
Abstract
The invention discloses a stereoscopic video comfort evaluation method and device. The method comprises the steps of: acquiring the video frames of a stereoscopic video segment; extracting spatial-domain features and temporal-domain features of the video frames of the stereoscopic video segment; and determining the viewing comfort of the stereoscopic video segment according to the spatial-domain features and temporal-domain features of its video frames. With the method and device of the invention, the comfort of a stereoscopic video can be evaluated.
Description
Technical field
The present invention relates to video technology in the communications field, and in particular to a stereoscopic video comfort evaluation method and device.
Background technology
A stereoscopic video usually carries two video channels. With stereo glasses, the left and right eyes each see a different picture, so the point on which the eyes focus (on the screen) and the point at which the two lines of sight converge (in front of or behind the screen) do not lie in the same plane, producing a 3D picture with a certain depth of field. In this 3D display technique based on the binocular parallax principle, the physiological states of accommodation and convergence differ from those of normal viewing, so prolonged viewing can cause visual fatigue.
Comfort is one of the main problems affecting the development of stereoscopic video, and evaluating the comfort of a stereoscopic video is a prerequisite for improving its viewing comfort.
Summary of the invention
Embodiments of the present invention provide a stereoscopic video comfort evaluation method and device, so as to evaluate the comfort of a stereoscopic video.
In a first aspect, a stereoscopic video comfort evaluation method is provided, the method comprising:
Obtaining the video frames of a stereoscopic video segment;
Extracting the spatial-domain features and temporal-domain features of the video frames of the stereoscopic video segment;
Determining the viewing comfort of the stereoscopic video segment according to the spatial-domain features and temporal-domain features of its video frames.
With reference to the first aspect, in a first implementation, extracting the spatial-domain features and temporal-domain features of the video frames of the stereoscopic video segment comprises:
Estimating the disparity of the video frames of the stereoscopic video segment;
Determining the visual focus position of the video frames according to their disparity and motion information;
Determining the viewing region of the video frames according to the visual focus position;
Extracting the spatial-domain features and temporal-domain features of the video frames according to their disparity and viewing region.
With reference to the first implementation of the first aspect, in a second implementation, determining the visual focus position of the video frames of the stereoscopic video segment according to their disparity and motion information comprises:
Determining the weight of each pixel in a video frame of the stereoscopic video segment, and determining the position of the pixel with the maximum weight as the visual focus position of that video frame.
With reference to the second implementation of the first aspect, in a third implementation, the weight of a pixel is calculated according to the following formula:
W = γ*|mv| + η*|disp_crossed| + λ*|δd|
wherein W is the weight of the pixel at coordinate (x, y), and γ, η and λ are weighting values; mv = (d_x, d_y) is the planar motion vector of the pixel at (x, y), d_x and d_y being the horizontal and vertical displacements of that pixel, obtained by searching between the video frame containing the pixel and an adjacent video frame; disp_crossed is the crossed disparity value; δd is the difference between the mean disparities of the matching blocks in the video frame of the stereoscopic video segment and its adjacent video frame.
With reference to the second or third implementation of the first aspect, in a fourth implementation, if several pixels in a video frame of the stereoscopic video segment share the maximum weight, the position of the one among them closest to the centre of the video frame is determined as the visual focus position of that video frame.
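The weight-map construction and the centre-distance tie-break described above can be sketched as follows. This is a minimal sketch: the function and array names, the default weighting values, and the NumPy formulation are illustrative, not part of the patent.

```python
import numpy as np

def visual_focus(mv_mag, disp_crossed, delta_d, gamma=1.0, eta=1.0, lam=1.0):
    """Weight map W = gamma*|mv| + eta*|disp_crossed| + lam*|delta_d|;
    the visual focus is the max-weight pixel, ties broken by distance
    to the image centre."""
    W = gamma * np.abs(mv_mag) + eta * np.abs(disp_crossed) + lam * np.abs(delta_d)
    ys, xs = np.nonzero(W == W.max())        # all pixels sharing the maximum weight
    h, w = W.shape
    cy, cx = (h - 1) / 2.0, (w - 1) / 2.0    # image centre
    k = int(np.argmin((ys - cy) ** 2 + (xs - cx) ** 2))
    return int(ys[k]), int(xs[k])            # (row, col) of the visual focus
```

The three input maps would in practice come from motion estimation and disparity estimation; here they are taken as given.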
With reference to any one of the first to fourth implementations of the first aspect, in a fifth implementation, the spatial-domain features comprise one or any combination of the following parameters: the reference disparity, the percentage of the reference disparity, and the first viewing comfort factor;
Extracting the spatial-domain features of the video frames of the stereoscopic video segment according to their disparity and viewing region comprises:
Determining the disparity set corresponding to a video frame of the stereoscopic video segment, the disparity set being the set of pixel disparities within the viewing region of the video frame such that the number of pixels corresponding to each disparity value in the set is greater than a set threshold; and determining the minimum disparity value in the set as the reference disparity of the video frame;
Calculating, within the viewing region of the video frame, the ratio of the number of pixels whose disparity equals the reference disparity to the number of valid pixels, obtaining the percentage of the reference disparity of the video frame, a valid pixel being a pixel whose absolute disparity is smaller than the search range;
Determining the first viewing comfort factor of the video frame according to whether the video frame exhibits the frame effect and whether it satisfies the near-bottom/far-top spatial layout: the first viewing comfort factor takes a first value when the video frame exhibits no frame effect and satisfies the layout, a second value when it exhibits the frame effect but satisfies the layout, a third value when it exhibits no frame effect but does not satisfy the layout, and a fourth value when it exhibits the frame effect and does not satisfy the layout; the first to fourth values are preset, the first value is smaller than the fourth value, and the third value lies between the first value and the second value and equals neither. The frame effect means that, for a video frame, if an object at the screen edge images with crossed disparity and part of the object extends beyond the screen, the frame effect is present; the near-bottom/far-top spatial layout means that objects imaged at the bottom of the screen have a perceived depth close to the viewer, while objects imaged at the top of the screen have a perceived depth far from the viewer.
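The four-case mapping above can be written as a small lookup. The concrete values v1..v4 below are illustrative placeholders, chosen only to satisfy the stated constraints (first value smaller than the fourth; third value strictly between the first and second, equal to neither):

```python
def first_comfort_factor(has_frame_effect, near_bottom_far_top,
                         v1=0.0, v2=2.0, v3=1.0, v4=3.0):
    """Map (frame effect present?, near-bottom/far-top layout satisfied?)
    to one of the four preset values, as in the fifth implementation."""
    if not has_frame_effect and near_bottom_far_top:
        return v1          # no frame effect, layout satisfied
    if has_frame_effect and near_bottom_far_top:
        return v2          # frame effect, layout satisfied
    if not has_frame_effect and not near_bottom_far_top:
        return v3          # no frame effect, layout not satisfied
    return v4              # frame effect, layout not satisfied
```

Detecting the frame effect and the spatial layout themselves (edge objects with crossed disparity partly outside the screen; depth ordering between bottom and top of the frame) is left outside this sketch.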
With reference to any one of the first to fifth implementations of the first aspect, in a sixth implementation, the temporal-domain features comprise a second viewing comfort factor;
Extracting the temporal-domain features of the video frames of the stereoscopic video segment according to their disparity and viewing region comprises:
Calculating the second viewing comfort factor of the video frames according to the change of their reference disparity in the time domain, and/or the change in the time domain of the frequency of occurrence of their reference disparity; the magnitude of the second viewing comfort factor represents the degree to which the reference disparity of a video frame changes in the depth direction.
With reference to the sixth implementation of the first aspect, in a seventh implementation, calculating the second viewing comfort factor of the video frames of the stereoscopic video segment according to the change of their reference disparity in the time domain and/or the change in the time domain of the frequency of occurrence of their reference disparity comprises:
Dividing the stereoscopic video segment into sub-segments such that, within a sub-segment, the reference disparity of the video frames varies monotonically at the same rate; the change of the reference disparity in the time domain is calculated according to the following formula:
V1_i = (disp_last - disp_first) / (Np - 1)
The change in the time domain of the frequency of occurrence of the reference disparity is calculated according to the following formula:
V2_i = P(minDisp_i) - P(minDisp_{i-1})
The second viewing comfort factor of a video frame is calculated according to the following formula:
VC2_i = γ*|V1_i| + μ*|V2_i|
wherein V1_i represents the change of the reference disparity of the i-th frame in the time domain; disp_first and disp_last are the reference disparities of the first and last frames of the sub-segment containing the i-th frame, and Np is the number of video frames in that sub-segment; V2_i represents the change in the time domain of the frequency of occurrence of the reference disparity of the i-th frame, P(minDisp_i) and P(minDisp_{i-1}) being the percentages of the reference disparity of the i-th frame and the (i-1)-th frame; VC2_i is the second viewing comfort factor of the i-th frame, and γ and μ are weighting values; the i-th frame is any video frame in the stereoscopic video segment.
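A sketch of the seventh implementation. The source names the weights γ and μ but the combining formula itself is not reproduced in this text, so the form gamma*|V1| + mu*|V2| is an assumption; the function name, argument names, and default weights are illustrative:

```python
def second_comfort_factor(ref_disp, ref_pct, i, seg_start, seg_end,
                          gamma=0.5, mu=0.5):
    """ref_disp[j]: reference disparity of frame j; ref_pct[j]: percentage of
    that reference disparity. [seg_start, seg_end] is the sub-segment that
    contains frame i, within which ref_disp varies monotonically at one rate."""
    n_p = seg_end - seg_start + 1                               # Np frames in the sub-segment
    v1 = (ref_disp[seg_end] - ref_disp[seg_start]) / (n_p - 1)  # change of reference disparity
    v2 = ref_pct[i] - ref_pct[i - 1]                            # change of its occurrence frequency
    return gamma * abs(v1) + mu * abs(v2)
```

The eighth implementation below differs only in V1: it uses the frame-to-frame difference minDisp_i - minDisp_{i-1} instead of the per-sub-segment slope, which amounts to replacing the `v1` line with that difference.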
With reference to the sixth implementation of the first aspect, in an eighth implementation, calculating the second viewing comfort factor of the video frames of the stereoscopic video segment according to the change of their reference disparity in the time domain and/or the change in the time domain of the frequency of occurrence of their reference disparity comprises:
The change of the reference disparity in the time domain is calculated according to the following formula:
V1_i = minDisp_i - minDisp_{i-1}
The change in the time domain of the frequency of occurrence of the reference disparity is calculated according to the following formula:
V2_i = P(minDisp_i) - P(minDisp_{i-1})
The second viewing comfort factor of a video frame is calculated according to the following formula:
VC2_i = γ*|V1_i| + μ*|V2_i|
wherein V1_i represents the change of the reference disparity of the i-th frame in the time domain, minDisp_i and minDisp_{i-1} being the reference disparities of the i-th frame and the (i-1)-th frame; V2_i represents the change in the time domain of the frequency of occurrence of the reference disparity of the i-th frame, P(minDisp_i) and P(minDisp_{i-1}) being the percentages of the reference disparity of the i-th frame and the (i-1)-th frame; VC2_i is the second viewing comfort factor of the i-th frame, and γ and μ are weighting values; the i-th frame is any video frame in the stereoscopic video segment.
With reference to the first aspect or any one of its first to eighth implementations, in a ninth implementation, determining the viewing comfort of the stereoscopic video segment according to the spatial-domain features and temporal-domain features of its video frames comprises:
Calculating the viewing comfort of each video frame of the stereoscopic video segment according to its spatial-domain features and temporal-domain features;
Dividing the stereoscopic video segment into sub-segments according to the visual focus positions of its video frames, such that the shift of the visual focus position between the video frames of a sub-segment is not greater than a set shift threshold; and calculating the viewing comfort of each sub-segment according to the viewing comfort of the video frames in it;
Calculating the viewing comfort of the stereoscopic video segment according to the viewing comfort of the sub-segments.
With reference to the ninth implementation of the first aspect, in a tenth implementation, the viewing comfort of a video frame of the stereoscopic video segment is calculated according to the following formula:
frame_vc_i = α*Spatial_frame_vc_i + β*Temporal_frame_vc_i
wherein frame_vc_i is the viewing comfort of the i-th frame; Spatial_frame_vc_i is the viewing comfort of the i-th frame determined by its spatial-domain features, and Temporal_frame_vc_i is the viewing comfort of the i-th frame determined by its temporal-domain features; α and β are weighting values; Disp_distribution_i is the first viewing comfort factor of the i-th frame, minDisp_i is the reference disparity of the i-th frame, and P(minDisp_i) is the percentage of the reference disparity of the i-th frame; VC2_i is the second viewing comfort factor of the i-th frame; b1, b2 and c1 are model parameters with set values; the i-th frame is any video frame in the stereoscopic video segment.
The reference disparity of a video frame is obtained as follows: determine the disparity set corresponding to the video frame, the disparity set being the set of pixel disparities within the viewing region of the video frame such that the number of pixels corresponding to each disparity value in the set is greater than a set threshold; the minimum disparity value in the set is determined as the reference disparity of the video frame.
The percentage of the reference disparity of a video frame is: within the viewing region of the video frame, the ratio of the number of pixels whose disparity equals the reference disparity to the number of valid pixels.
The first viewing comfort factor of a video frame is determined according to whether the video frame exhibits the frame effect and whether it satisfies the near-bottom/far-top spatial layout. The frame effect means that, for a video frame, if an object at the screen edge images with crossed disparity and part of the object extends beyond the screen, the frame effect is present; the near-bottom/far-top spatial layout means that objects imaged at the bottom of the screen have a perceived depth close to the viewer, while objects imaged at the top of the screen have a perceived depth far from the viewer.
The second viewing comfort factor of a video frame is determined according to the change of the reference disparity of the video frame in the time domain, and/or the change in the time domain of the frequency of occurrence of the reference disparity; the magnitude of the second viewing comfort factor represents the degree to which the reference disparity of the video frame changes in the depth direction.
With reference to the ninth or tenth implementation of the first aspect, in an eleventh implementation, the viewing comfort of a sub-segment of the stereoscopic video segment is calculated according to the following formula:
wherein seg_vc_k is the comfort of the k-th sub-segment, frame_vc_i is the viewing comfort of the i-th frame in the k-th sub-segment, N_F is the number of video frames in the k-th sub-segment, and P_1 and c2 are set values; the k-th sub-segment is any sub-segment of the stereoscopic video segment.
With reference to any one of the ninth to eleventh implementations of the first aspect, in a twelfth implementation, the viewing comfort of the stereoscopic video segment is calculated according to the following formula:
wherein Q is the viewing comfort of the stereoscopic video segment, seg_vc_k is the comfort of the k-th sub-segment of the stereoscopic video segment, N_S is the number of sub-segments in the stereoscopic video segment, and P_2 and c2 are set values.
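The two-level aggregation (frames into sub-segments, sub-segments into the segment score Q) can be sketched as below. The pooling formulas themselves are not reproduced in this text, so a Minkowski-style mean with exponent P and scale c is assumed; the function names and default parameter values are illustrative:

```python
def pool(values, p, c):
    """Assumed Minkowski-style pooling with exponent p and scale c,
    standing in for the patent's formulas with set values P_1/c2 and P_2/c2."""
    n = len(values)
    return c * (sum(v ** p for v in values) / n) ** (1.0 / p)

def segment_comfort(frame_vc_per_subseg, p1=2.0, p2=2.0, c2=1.0):
    """frame_vc_per_subseg: one list of per-frame comfort scores per
    sub-segment. Pool frames into sub-segment scores, then sub-segments
    into one segment score, mirroring the two-level structure above."""
    sub_scores = [pool(frames, p1, c2) for frames in frame_vc_per_subseg]
    return pool(sub_scores, p2, c2)
```

With p = 1 and c = 1 the pooling reduces to a plain mean; larger exponents weight the least comfortable frames more heavily, a common design choice in quality pooling.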
With reference to the first aspect or any one of its first to eighth implementations, in a thirteenth implementation, determining the viewing comfort of the stereoscopic video segment according to the spatial-domain features and temporal-domain features of its video frames comprises:
Calculating the spatial-domain features of the stereoscopic video segment according to the spatial-domain features of its video frames;
Calculating the temporal-domain features of the stereoscopic video segment according to the temporal-domain features of its video frames;
Calculating the viewing comfort of the stereoscopic video segment according to its spatial-domain features and temporal-domain features.
With reference to the thirteenth implementation of the first aspect, in a fourteenth implementation, the spatial-domain features of the video frames of the stereoscopic video segment comprise one or any combination of the following parameters: the reference disparity of the video frames, the percentage of the reference disparity, and the first viewing comfort factor;
The spatial-domain features of the video frames are obtained as follows:
Within the viewing region of a video frame, the disparity value of the pixels satisfying the following conditions is determined as the reference disparity of the video frame: the number of pixels corresponding to the reference disparity within the viewing region is greater than a set pixel-count threshold, and the reference disparity is the minimum disparity value in the disparity set; the disparity set is the set of pixel disparities within the viewing region of the video frame such that the number of pixels corresponding to each disparity value in the set is greater than the set threshold;
The percentage of the reference disparity of a video frame is: within the viewing region of the video frame, the ratio of the number of pixels whose disparity equals the reference disparity to the number of valid pixels;
The first viewing comfort factor of a video frame is determined according to whether the video frame exhibits the frame effect and whether it satisfies the near-bottom/far-top spatial layout; the frame effect means that, for a video frame, if an object at the screen edge images with crossed disparity and part of the object extends beyond the screen, the frame effect is present; the near-bottom/far-top spatial layout means that objects imaged at the bottom of the screen have a perceived depth close to the viewer, while objects imaged at the top of the screen have a perceived depth far from the viewer;
Calculating the spatial-domain features of the stereoscopic video segment according to the spatial-domain features of its video frames comprises:
Calculating the weighted average of the reference disparity of all video frames in the stereoscopic video segment according to the following formula:
avgDisp = (1/N) * Σ_i minDisp_i * P(minDisp_i)
wherein avgDisp is the weighted average of the reference disparity of all video frames in the stereoscopic video segment, N is the number of video frames in the segment, minDisp_i is the reference disparity of the i-th frame, and P(minDisp_i) is the percentage of the reference disparity of the i-th frame; the i-th frame is any video frame in the stereoscopic video segment;
Calculating the mean of the first viewing comfort factor of all video frames in the stereoscopic video segment according to the following formula:
avgDistribution = (1/N) * Σ_i Disp_distribution_i
wherein avgDistribution is the mean of the first viewing comfort factor of all video frames in the stereoscopic video segment, Disp_distribution_i is the first viewing comfort factor of the i-th frame, and N is the number of video frames in the segment; the i-th frame is any video frame in the stereoscopic video segment.
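The two segment-level spatial statistics can be sketched as follows. The percentage-weighted form of the disparity average is an assumption consistent with the symbol definitions (the formula images are not reproduced in this text), and all names are illustrative:

```python
def segment_spatial_features(ref_disp, ref_pct, first_factor):
    """ref_disp[i], ref_pct[i]: reference disparity of frame i and its
    percentage; first_factor[i]: first viewing comfort factor of frame i.
    Returns the percentage-weighted average reference disparity and the
    plain mean of the first comfort factor over all N frames."""
    n = len(ref_disp)
    weighted_avg = sum(d * p for d, p in zip(ref_disp, ref_pct)) / n
    mean_factor = sum(first_factor) / n
    return weighted_avg, mean_factor
```

Weighting each frame's reference disparity by its percentage lets disparities that cover more of the viewing region contribute more to the segment-level statistic.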
With reference to the thirteenth or fourteenth implementation of the first aspect, in a fifteenth implementation, the temporal-domain features of the video frames of the stereoscopic video segment comprise:
The second viewing comfort factor of the video frames, determined according to the change of the reference disparity of the video frames in the time domain and/or the change in the time domain of the frequency of occurrence of the reference disparity; the magnitude of the second viewing comfort factor represents the degree to which the reference disparity of a video frame changes in the depth direction;
Calculating the temporal-domain features of the stereoscopic video segment according to the temporal-domain features of its video frames comprises:
Calculating the second viewing comfort factor of the stereoscopic video segment according to the following formula:
wherein VC2_seg is the second viewing comfort factor of the stereoscopic video segment, VC2_i is the second viewing comfort factor of the i-th frame in the segment, P_i is a set value determined according to the signs of V1_i and minDisp_i, and N is the number of video frames in the segment; the i-th frame is any video frame in the stereoscopic video segment.
With reference to any one of the thirteenth to fifteenth implementations of the first aspect, in a sixteenth implementation, the viewing comfort of the stereoscopic video segment is calculated according to the following formula:
VC = α*Spatial_vc + β*Temporal_vc
wherein VC is the viewing comfort of the stereoscopic video segment, and α and β are weighting values; b1 and c3 are set values; Spatial_vc is the viewing comfort of the stereoscopic video segment determined by its spatial-domain features, and Temporal_vc is the viewing comfort of the stereoscopic video segment determined by its temporal-domain features; avgDistribution is the mean of the first viewing comfort factor of all video frames in the segment, avgDisp is the weighted average of the reference disparity of all video frames in the segment, and VC2_seg is the second viewing comfort factor of the segment.
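The final combination is the one linear formula given explicitly above; how Spatial_vc and Temporal_vc are derived from the averaged features (via the set values b1 and c3) is model-dependent and not reproduced in this text. A minimal sketch, with illustrative default weights:

```python
def overall_comfort(spatial_vc, temporal_vc, alpha=0.5, beta=0.5):
    """VC = alpha*Spatial_vc + beta*Temporal_vc: the segment-level
    combination of the spatial and temporal comfort scores."""
    return alpha * spatial_vc + beta * temporal_vc
```

With alpha + beta = 1 the result stays on the same scale as the two input scores, which keeps the final VC interpretable on the same comfort scale as its components.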
In a second aspect, a stereoscopic video comfort evaluation device is provided, the device comprising:
An acquisition module, configured to obtain the video frames of a stereoscopic video segment;
An extraction module, configured to extract the spatial-domain features and temporal-domain features of the video frames of the stereoscopic video segment obtained by the acquisition module;
An evaluation module, configured to determine the viewing comfort of the stereoscopic video segment according to the spatial-domain features and temporal-domain features extracted by the extraction module.
With reference to the second aspect, in a first implementation, the extraction module is specifically configured to: estimate the disparity of the video frames of the stereoscopic video segment;
Determine the visual focus position of the video frames according to their disparity and motion information;
Determine the viewing region of the video frames according to the visual focus position;
Extract the spatial-domain features and temporal-domain features of the video frames according to their disparity and viewing region.
With reference to the first implementation of the second aspect, in a second implementation, the extraction module is specifically configured to determine the weight of each pixel in a video frame of the stereoscopic video segment, and to determine the position of the pixel with the maximum weight as the visual focus position of that video frame.
With reference to the second implementation of the second aspect, in a third implementation, the extraction module calculates the weight of a pixel according to the following formula:
W = γ*|mv| + η*|disp_crossed| + λ*|δd|
wherein W is the weight of the pixel at coordinate (x, y), and γ, η and λ are weighting values; mv = (d_x, d_y) is the planar motion vector of the pixel at (x, y), d_x and d_y being the horizontal and vertical displacements of that pixel, obtained by searching between the video frame containing the pixel and an adjacent video frame; disp_crossed is the crossed disparity value; δd is the difference between the mean disparities of the matching blocks in the video frame of the stereoscopic video segment and its adjacent video frame.
With reference to the second or third implementation of the second aspect, in a fourth implementation, the extraction module is specifically configured to, if several pixels in a video frame of the stereoscopic video segment share the maximum weight, determine the position of the one among them closest to the centre of the video frame as the visual focus position of that video frame.
In conjunction with the first of second aspect to any one implementation in the 4th kind of implementation, in the 5th kind of implementation, described Spatial characteristic comprises one of following parameter or combination in any: with reference to parallax, the percentage with reference to parallax, the first viewing comfort degree factor;
The extraction module is specifically configured to: determine the disparity set corresponding to a video frame of the stereoscopic video segment, the disparity set being the set of pixel disparities within the viewing area of the video frame in which the number of pixels corresponding to each disparity value is greater than a set threshold; and determine the minimum disparity value in the disparity set as the reference disparity of the video frame of the stereoscopic video segment;
Calculate, within the viewing area of the video frame of the stereoscopic video segment, the ratio of the number of pixels whose disparity equals the reference disparity to the number of valid pixels, to obtain the percentage of the reference disparity of the video frame; a valid pixel is a pixel whose absolute disparity value is less than the search range;
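A small sketch of the reference disparity and its percentage as just defined, assuming the disparities of the pixels inside the viewing area are supplied as a flat list; names are illustrative:

```python
from collections import Counter

def reference_disparity(disparities, count_thresh, search_range):
    """disparities: disparity values of the pixels inside the viewing area.
    Valid pixels have |disparity| < search_range; the disparity set keeps
    values whose pixel count exceeds count_thresh; the reference disparity
    is the minimum of that set, returned with its share of valid pixels."""
    valid = [d for d in disparities if abs(d) < search_range]
    counts = Counter(valid)
    disparity_set = [d for d, c in counts.items() if c > count_thresh]
    ref = min(disparity_set)
    return ref, counts[ref] / len(valid)
```

For example, with disparities {-2 ×3, 0 ×3, 5 ×5, 9 ×1}, a count threshold of 2 and a search range of 8, the value 9 is invalid, the disparity set is {-2, 0, 5}, and the reference disparity is -2 with percentage 3/11.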
Determine the first viewing comfort factor of the video frame of the stereoscopic video segment according to whether the video frame exhibits the frame effect and whether it satisfies the lower-near, upper-far spatial layout: the first viewing comfort factor takes a first value when the video frame has no frame effect and satisfies the lower-near, upper-far spatial layout; a second value when the video frame has the frame effect but satisfies the lower-near, upper-far spatial layout; a third value when the video frame has no frame effect but does not satisfy the lower-near, upper-far spatial layout; and a fourth value when the video frame has the frame effect and does not satisfy the lower-near, upper-far spatial layout. The first, second, third and fourth values are preset values; the first value is less than the fourth value, and the third value lies between the first value and the second value and is equal to neither. The frame effect means that, for a video frame, if an object at the screen edge images with crossed disparity and a part of the object extends beyond the screen range, the frame effect exists; the lower-near, upper-far spatial layout means that objects imaged at the bottom of the screen corresponding to the video frame have a perceived depth close to the viewer, while objects imaged at the top of the screen have a perceived depth far from the viewer.
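The four-way case split above can be sketched as a lookup; the concrete values v1..v4 are illustrative placeholders chosen only to satisfy the stated ordering (v1 < v4; v3 strictly between v1 and v2, equal to neither):

```python
def first_comfort_factor(has_frame_effect, lower_near_upper_far,
                         v1=0.0, v2=0.5, v3=0.25, v4=1.0):
    """Map the two binary conditions to the four preset values."""
    if not has_frame_effect and lower_near_upper_far:
        return v1   # no frame effect, layout satisfied
    if has_frame_effect and lower_near_upper_far:
        return v2   # frame effect present, layout satisfied
    if not has_frame_effect:
        return v3   # no frame effect, layout not satisfied
    return v4       # frame effect present, layout not satisfied
```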
In conjunction with any one of the first to the fifth implementations of the second aspect, in a sixth implementation, the time-domain features comprise a second viewing comfort factor;
The extraction module is specifically configured to calculate the second viewing comfort factor of a video frame of the stereoscopic video segment according to the temporal change of the reference disparity of the video frame and/or the temporal change of the frequency of occurrence of the reference disparity of the video frame; the magnitude of the second viewing comfort factor represents the degree to which the reference disparity of the video frame changes in the depth direction.
In conjunction with the sixth implementation of the second aspect, in a seventh implementation, the extraction module is specifically configured to: divide the stereoscopic video segment into sub-segments, such that within each sub-segment the reference disparity of the video frames changes monotonically at an identical rate; and calculate the temporal change of the reference disparity of a video frame of the stereoscopic video segment according to the following formula:
V1_i = (disp_last - disp_first) / (Np - 1)
The temporal change of the frequency of occurrence of the reference disparity of the video frame is calculated according to the following formula:
The second viewing comfort factor of the video frame of the stereoscopic video segment is calculated according to the following formula:
where V1_i represents the temporal change of the reference disparity of the i-th frame; disp_first and disp_last are respectively the reference disparities of the first and last frames of the sub-segment to which the i-th frame belongs; Np is the number of video frames in the sub-segment to which the i-th frame belongs; V2_i represents the temporal change of the frequency of occurrence of the reference disparity of the i-th frame, and P(minDisp_i) and P(minDisp_i-1) are respectively the percentages of the reference disparities of the i-th and (i-1)-th frames; F2_i denotes the second viewing comfort factor of the i-th frame, and γ and μ are weighting values; the i-th frame is any video frame in the stereoscopic video segment.
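A sketch of the seventh implementation under stated assumptions: a sub-segment is given by the indices of its first and last frames, per-frame reference disparities and percentages are supplied as lists, and the final combination γ·V1 + μ·V2 is an assumption (the claim names γ and μ as weighting values without stating the formula):

```python
def second_comfort_factor(ref_disp, pct, i, first, last, gamma=0.5, mu=0.5):
    """ref_disp[k] / pct[k]: reference disparity of frame k and its
    percentage. [first, last] delimit the sub-segment containing frame i,
    inside which ref_disp changes monotonically at a constant rate."""
    np_frames = last - first + 1
    v1 = (ref_disp[last] - ref_disp[first]) / (np_frames - 1)  # V1_i
    v2 = pct[i] - pct[i - 1]            # V2_i (assumed difference form)
    return gamma * v1 + mu * v2         # assumed linear combination
```

With ref_disp = [-4, -2, 0, 2] over one sub-segment, V1 = (2 - (-4)) / 3 = 2 for every frame of the sub-segment.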
In conjunction with the sixth implementation of the second aspect, in an eighth implementation, the extraction module is specifically configured to calculate the temporal change of the reference disparity of a video frame of the stereoscopic video segment according to the following formula:
V1_i = minDisp_i - minDisp_i-1
The temporal change of the frequency of occurrence of the reference disparity of the video frame is calculated according to the following formula:
The second viewing comfort factor of the video frame of the stereoscopic video segment is calculated according to the following formula:
where V1_i represents the temporal change of the reference disparity of the i-th frame; minDisp_i and minDisp_i-1 are respectively the reference disparities of the i-th and (i-1)-th frames; V2_i represents the temporal change of the frequency of occurrence of the reference disparity of the i-th frame, and P(minDisp_i) and P(minDisp_i-1) are respectively the percentages of the reference disparities of the i-th and (i-1)-th frames; F2_i denotes the second viewing comfort factor of the i-th frame, and γ and μ are weighting values; the i-th frame is any video frame in the stereoscopic video segment.
In conjunction with the second aspect or any one of the first to the eighth implementations of the second aspect, in a ninth implementation, the evaluation module is specifically configured to:
calculate the viewing comfort of each video frame of the stereoscopic video segment according to the spatial-domain features and time-domain features of that video frame;
divide the stereoscopic video segment into sub-segments according to the visual focus positions of the video frames, such that the visual focus shift of the video frames within each sub-segment is not greater than a set shift threshold, and calculate the viewing comfort of each sub-segment according to the viewing comfort of the video frames in that sub-segment;
and calculate the viewing comfort of the stereoscopic video segment according to the viewing comfort of the sub-segments.
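The frame → sub-segment → segment pooling just described can be sketched as follows; plain averaging is used at each level as a stand-in for the patent's parameterised pooling formulas, and the focus-shift threshold drives the sub-segment split (all names are illustrative):

```python
def split_by_focus(focus, shift_thresh):
    """Split frame indices into sub-segments: a new sub-segment starts when
    the visual focus moves more than shift_thresh from the previous frame."""
    segs, cur = [], [0]
    for i in range(1, len(focus)):
        (y0, x0), (y1, x1) = focus[i - 1], focus[i]
        if ((y1 - y0) ** 2 + (x1 - x0) ** 2) ** 0.5 > shift_thresh:
            segs.append(cur)
            cur = []
        cur.append(i)
    segs.append(cur)
    return segs

def segment_comfort(frame_vc, focus, shift_thresh):
    """Pool frame comfort into sub-segment comfort, then into segment comfort."""
    subs = split_by_focus(focus, shift_thresh)
    sub_vc = [sum(frame_vc[i] for i in s) / len(s) for s in subs]
    return sum(sub_vc) / len(sub_vc)
```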
In conjunction with the ninth implementation of the second aspect, in a tenth implementation, the evaluation module is specifically configured to calculate the viewing comfort of a video frame of the stereoscopic video segment according to the following formula:
where frame_vc_i is the viewing comfort of the i-th frame; Spatial_frame_vc_i is the viewing comfort of the i-th frame determined by the spatial-domain features of the i-th frame; Temperal_frame_vc_i is the viewing comfort of the i-th frame determined by the time-domain features of the i-th frame; α and β are weighting values; disp_distribution_i is the first viewing comfort factor of the i-th frame; minDisp_i is the reference disparity of the i-th frame; P(minDisp_i) is the percentage of the reference disparity of the i-th frame; F2_i is the second viewing comfort factor of the i-th frame; b1, b2 and c1 are model parameters with set values; the i-th frame is any video frame in the stereoscopic video segment;
The reference disparity of a video frame of the stereoscopic video segment is obtained as follows: determine the disparity set corresponding to the video frame, the disparity set being the set of pixel disparities within the viewing area of the video frame in which the number of pixels corresponding to each disparity value is greater than the set threshold; and determine the minimum disparity value in the disparity set as the reference disparity of the video frame;
The percentage of the reference disparity of a video frame of the stereoscopic video segment is the ratio, within the viewing area of the video frame, of the number of pixels whose disparity equals the reference disparity to the number of valid pixels;
The first viewing comfort factor of a video frame of the stereoscopic video segment is determined according to whether the video frame exhibits the frame effect and whether it satisfies the lower-near, upper-far spatial layout; the frame effect means that, for a video frame, if an object at the screen edge images with crossed disparity and a part of the object extends beyond the screen range, the frame effect exists; the lower-near, upper-far spatial layout means that objects imaged at the bottom of the screen corresponding to the video frame have a perceived depth close to the viewer, while objects imaged at the top of the screen have a perceived depth far from the viewer;
The second viewing comfort factor of a video frame of the stereoscopic video segment is determined according to the temporal change of the reference disparity of the video frame and/or the temporal change of the frequency of occurrence of the reference disparity of the video frame; the magnitude of the second viewing comfort factor represents the degree to which the reference disparity of the video frame changes in the depth direction.
In conjunction with the ninth or the tenth implementation of the second aspect, in an eleventh implementation, the evaluation module is specifically configured to calculate the viewing comfort of the sub-segments of the stereoscopic video segment according to the following formula:
where shot_vc_k is the comfort of the k-th sub-segment; frame_vc_i is the viewing comfort of the i-th frame in the k-th sub-segment; N_F is the number of video frames in the k-th sub-segment; P_1 and c2 are set values; the k-th sub-segment is any sub-segment in the stereoscopic video segment.
In conjunction with any one of the ninth to the eleventh implementations of the second aspect, in a twelfth implementation, the evaluation module is specifically configured to calculate the viewing comfort of the stereoscopic video segment according to the following formula:
where Q is the viewing comfort of the stereoscopic video segment; shot_vc_k is the comfort of the k-th sub-segment in the stereoscopic video segment; N_S is the number of sub-segments in the stereoscopic video segment; P_2 and c2 are set values.
In conjunction with the second aspect or any one of the first to the twelfth implementations of the second aspect, in a thirteenth implementation, the evaluation module is specifically configured to: calculate the spatial-domain features of the stereoscopic video segment according to the spatial-domain features of the video frames of the stereoscopic video segment;
calculate the time-domain features of the stereoscopic video segment according to the time-domain features of the video frames of the stereoscopic video segment;
and calculate the viewing comfort of the stereoscopic video segment according to the spatial-domain features and time-domain features of the stereoscopic video segment.
In conjunction with the thirteenth implementation of the second aspect, in a fourteenth implementation, the spatial-domain features of the video frames of the stereoscopic video segment comprise one or any combination of the following parameters: the reference disparity of the video frames of the stereoscopic video segment, the percentage of the reference disparity, and the first viewing comfort factor;
The spatial-domain features of the video frames of the stereoscopic video segment are obtained as follows:
Within the viewing area of a video frame of the stereoscopic video segment, the disparity value of the pixels satisfying the following condition is determined as the reference disparity of the video frame: the number of pixels corresponding to the reference disparity within the viewing area is greater than a set pixel-quantity threshold, and the reference disparity is the minimum disparity value in the disparity set; the disparity set is the set of pixel disparities within the viewing area of the video frame in which the number of pixels corresponding to each disparity value is greater than the set threshold;
The percentage of the reference disparity of a video frame of the stereoscopic video segment is the ratio, within the viewing area of the video frame, of the number of pixels whose disparity equals the reference disparity to the number of valid pixels;
The first viewing comfort factor of a video frame of the stereoscopic video segment is determined according to whether the video frame exhibits the frame effect and whether it satisfies the lower-near, upper-far spatial layout; the frame effect means that, for a video frame, if an object at the screen edge images with crossed disparity and a part of the object extends beyond the screen range, the frame effect exists; the lower-near, upper-far spatial layout means that objects imaged at the bottom of the screen corresponding to the video frame have a perceived depth close to the viewer, while objects imaged at the top of the screen have a perceived depth far from the viewer;
The evaluation module is specifically configured to calculate the weighted average of the reference disparities of all video frames in the stereoscopic video segment according to the following formula:
where avgDisp is the weighted average of the reference disparities of all video frames in the stereoscopic video segment; N is the number of video frames in the stereoscopic video segment; minDisp_i is the reference disparity of the i-th frame; P(minDisp_i) is the percentage of the reference disparity of the i-th frame; the i-th frame is any video frame in the stereoscopic video segment;
The mean of the first viewing comfort factors of all video frames in the stereoscopic video segment is calculated according to the following formula:
where avgDistribution is the mean of the first viewing comfort factors of all video frames in the stereoscopic video segment; disp_distribution_i is the first viewing comfort factor of the i-th frame; N is the number of video frames in the stereoscopic video segment; the i-th frame is any video frame in the stereoscopic video segment.
In conjunction with the thirteenth or the fourteenth implementation of the second aspect, in a fifteenth implementation, the time-domain features of the video frames of the stereoscopic video segment comprise:
the second viewing comfort factor of the video frames of the stereoscopic video segment, determined according to the temporal change of the reference disparity of the video frames and/or the temporal change of the frequency of occurrence of the reference disparity; the magnitude of the second viewing comfort factor represents the degree to which the reference disparity of a video frame changes in the depth direction;
The evaluation module is specifically configured to calculate the second viewing comfort factor of the stereoscopic video segment according to the following formula:
where F2 is the second viewing comfort factor of the stereoscopic video segment; F2_i is the second viewing comfort factor of the i-th frame in the stereoscopic video segment; P_i is a set value whose value is determined according to the signs of V1_i and minDisp_i; N is the number of video frames in the stereoscopic video segment; the i-th frame is any video frame in the stereoscopic video segment.
In conjunction with any one of the thirteenth to the fifteenth implementations of the second aspect, in a sixteenth implementation, the evaluation module is specifically configured to calculate the viewing comfort of the stereoscopic video segment according to the following formula:
VC=α*Spatial_vc+β*Temperal_vc
where VC is the viewing comfort of the stereoscopic video segment; α and β are weighting values; b1 and c3 are set values; Spatial_vc is the viewing comfort of the stereoscopic video segment determined by its spatial-domain features, and Temperal_vc is the viewing comfort of the stereoscopic video segment determined by its time-domain features; avgDistribution is the mean of the first viewing comfort factors of all video frames in the stereoscopic video segment; avgDisp is the weighted average of the reference disparities of all video frames in the stereoscopic video segment; F2 is the second viewing comfort factor of the stereoscopic video segment.
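The final combination VC = α·Spatial_vc + β·Temperal_vc is the one formula the sixteenth implementation states explicitly; a trivial sketch, with α = 0.6 and β = 0.4 as illustrative weighting values only:

```python
def overall_comfort(spatial_vc, temperal_vc, alpha=0.6, beta=0.4):
    """VC = alpha * Spatial_vc + beta * Temperal_vc."""
    return alpha * spatial_vc + beta * temperal_vc
```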
In a third aspect, a stereoscopic video comfort evaluation device is provided, the device comprising:
a transceiver, configured to obtain the video frames of a stereoscopic video segment;
a processor, configured to extract the spatial-domain features and time-domain features of the video frames of the stereoscopic video segment obtained by the transceiver, and to determine the viewing comfort of the stereoscopic video segment according to the extracted spatial-domain features and time-domain features of the video frames.
In conjunction with the third aspect, in a first implementation, the processor is specifically configured to: estimate the disparity of the video frames of the stereoscopic video segment;
determine the visual focus position of the video frames of the stereoscopic video segment according to the disparity and motion information of the video frames;
determine the viewing area of the video frames of the stereoscopic video segment according to the visual focus position of the video frames;
and extract the spatial-domain features and time-domain features of the video frames of the stereoscopic video segment according to the disparity and the viewing area of the video frames.
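The four-step processing order above can be sketched as an orchestration skeleton; every step is injected as a callable, since the claim fixes the order of the steps but not the algorithms (all names are hypothetical):

```python
def evaluate_segment(frames, estimate_disparity, find_focus,
                     determine_area, extract_features, score):
    """Pipeline: disparity estimation -> visual focus -> viewing area ->
    feature extraction per frame, then a comfort score over all features."""
    features = []
    for frame in frames:
        disparity = estimate_disparity(frame)
        focus = find_focus(disparity, frame)
        area = determine_area(focus, disparity)
        features.append(extract_features(disparity, area))
    return score(features)
```

With stub callables the skeleton runs end to end, which is all it is meant to show.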
In conjunction with the first implementation of the third aspect, in a second implementation, the processor is specifically configured to determine the weight of each pixel in a video frame of the stereoscopic video segment, and to determine the position of the pixel with the maximum weight as the visual focus position of the video frame of the stereoscopic video segment.
In conjunction with the second implementation of the third aspect, in a third implementation, the processor calculates the pixel weights using the following formula:
W=γ*|mv|+η*|disp_crossed|+λ*|δd|
where W is the weight of the pixel at coordinates (x, y); γ, η and λ are weighting values; mv = (d_x, d_y) denotes the planar motion vector of the pixel at (x, y), in which d_x and d_y are respectively the horizontal and vertical displacements of that pixel, obtained by searching between the video frame containing the pixel at (x, y) and an adjacent video frame; disp_crossed is the crossed disparity value; δd is the difference between the mean disparity of the matched blocks in the video frame of the stereoscopic video segment and in its adjacent video frame.
In conjunction with the second or the third implementation of the third aspect, in a fourth implementation, the processor is specifically configured to: if there are multiple pixels with the maximum weight in a video frame of the stereoscopic video segment, determine, among those pixels, the position of the pixel closest to the image center of the video frame as the visual focus position of the video frame of the stereoscopic video segment.
In conjunction with any one of the first to the fourth implementations of the third aspect, in a fifth implementation, the spatial-domain features comprise one or any combination of the following parameters: a reference disparity, a percentage of the reference disparity, and a first viewing comfort factor;
The processor is specifically configured to: determine the disparity set corresponding to a video frame of the stereoscopic video segment, the disparity set being the set of pixel disparities within the viewing area of the video frame in which the number of pixels corresponding to each disparity value is greater than a set threshold; and determine the minimum disparity value in the disparity set as the reference disparity of the video frame of the stereoscopic video segment;
Calculate, within the viewing area of the video frame of the stereoscopic video segment, the ratio of the number of pixels whose disparity equals the reference disparity to the number of valid pixels, to obtain the percentage of the reference disparity of the video frame; a valid pixel is a pixel whose absolute disparity value is less than the search range;
Determine the first viewing comfort factor of the video frame of the stereoscopic video segment according to whether the video frame exhibits the frame effect and whether it satisfies the lower-near, upper-far spatial layout: the first viewing comfort factor takes a first value when the video frame has no frame effect and satisfies the lower-near, upper-far spatial layout; a second value when the video frame has the frame effect but satisfies the lower-near, upper-far spatial layout; a third value when the video frame has no frame effect but does not satisfy the lower-near, upper-far spatial layout; and a fourth value when the video frame has the frame effect and does not satisfy the lower-near, upper-far spatial layout. The first, second, third and fourth values are preset values; the first value is less than the fourth value, and the third value lies between the first value and the second value and is equal to neither. The frame effect means that, for a video frame, if an object at the screen edge images with crossed disparity and a part of the object extends beyond the screen range, the frame effect exists; the lower-near, upper-far spatial layout means that objects imaged at the bottom of the screen corresponding to the video frame have a perceived depth close to the viewer, while objects imaged at the top of the screen have a perceived depth far from the viewer.
In conjunction with any one of the first to the fifth implementations of the third aspect, in a sixth implementation, the time-domain features comprise a second viewing comfort factor;
The processor is specifically configured to calculate the second viewing comfort factor of a video frame of the stereoscopic video segment according to the temporal change of the reference disparity of the video frame and/or the temporal change of the frequency of occurrence of the reference disparity of the video frame; the magnitude of the second viewing comfort factor represents the degree to which the reference disparity of the video frame changes in the depth direction.
In conjunction with the sixth implementation of the third aspect, in a seventh implementation, the processor is specifically configured to: divide the stereoscopic video segment into sub-segments, such that within each sub-segment the reference disparity of the video frames changes monotonically at an identical rate; and calculate the temporal change of the reference disparity of a video frame of the stereoscopic video segment according to the following formula:
V1_i = (disp_last - disp_first) / (Np - 1)
The temporal change of the frequency of occurrence of the reference disparity of the video frame is calculated according to the following formula:
The second viewing comfort factor of the video frame of the stereoscopic video segment is calculated according to the following formula:
where V1_i represents the temporal change of the reference disparity of the i-th frame; disp_first and disp_last are respectively the reference disparities of the first and last frames of the sub-segment to which the i-th frame belongs; Np is the number of video frames in the sub-segment to which the i-th frame belongs; V2_i represents the temporal change of the frequency of occurrence of the reference disparity of the i-th frame, and P(minDisp_i) and P(minDisp_i-1) are respectively the percentages of the reference disparities of the i-th and (i-1)-th frames; F2_i denotes the second viewing comfort factor of the i-th frame, and γ and μ are weighting values; the i-th frame is any video frame in the stereoscopic video segment.
In conjunction with the sixth implementation of the third aspect, in an eighth implementation, the processor is specifically configured to calculate the temporal change of the reference disparity of a video frame of the stereoscopic video segment according to the following formula:
V1_i = minDisp_i - minDisp_i-1
The temporal change of the frequency of occurrence of the reference disparity of the video frame is calculated according to the following formula:
The second viewing comfort factor of the video frame of the stereoscopic video segment is calculated according to the following formula:
where V1_i represents the temporal change of the reference disparity of the i-th frame; minDisp_i and minDisp_i-1 are respectively the reference disparities of the i-th and (i-1)-th frames; V2_i represents the temporal change of the frequency of occurrence of the reference disparity of the i-th frame, and P(minDisp_i) and P(minDisp_i-1) are respectively the percentages of the reference disparities of the i-th and (i-1)-th frames; F2_i denotes the second viewing comfort factor of the i-th frame, and γ and μ are weighting values; the i-th frame is any video frame in the stereoscopic video segment.
In conjunction with the third aspect or any one of the first to the eighth implementations of the third aspect, in a ninth implementation, the processor is specifically configured to:
calculate the viewing comfort of each video frame of the stereoscopic video segment according to the spatial-domain features and time-domain features of that video frame;
divide the stereoscopic video segment into sub-segments according to the visual focus positions of the video frames, such that the visual focus shift of the video frames within each sub-segment is not greater than a set shift threshold, and calculate the viewing comfort of each sub-segment according to the viewing comfort of the video frames in that sub-segment;
and calculate the viewing comfort of the stereoscopic video segment according to the viewing comfort of the sub-segments.
In conjunction with the ninth implementation of the third aspect, in a tenth implementation, the processor is specifically configured to calculate the viewing comfort of a video frame of the stereoscopic video segment according to the following formula:
where frame_vc_i is the viewing comfort of the i-th frame; Spatial_frame_vc_i is the viewing comfort of the i-th frame determined by the spatial-domain features of the i-th frame; Temperal_frame_vc_i is the viewing comfort of the i-th frame determined by the time-domain features of the i-th frame; α and β are weighting values; disp_distribution_i is the first viewing comfort factor of the i-th frame; minDisp_i is the reference disparity of the i-th frame; P(minDisp_i) is the percentage of the reference disparity of the i-th frame; F2_i is the second viewing comfort factor of the i-th frame; b1, b2 and c1 are model parameters with set values; the i-th frame is any video frame in the stereoscopic video segment;
The reference disparity of a video frame of the stereoscopic video segment is obtained as follows: determine the disparity set corresponding to the video frame, the disparity set being the set of pixel disparities within the viewing area of the video frame in which the number of pixels corresponding to each disparity value is greater than the set threshold; and determine the minimum disparity value in the disparity set as the reference disparity of the video frame;
The percentage of the reference disparity of a video frame of the stereoscopic video segment is the ratio, within the viewing area of the video frame, of the number of pixels whose disparity equals the reference disparity to the number of valid pixels;
The first viewing comfort factor of a video frame of the stereoscopic video segment is determined according to whether the video frame exhibits the frame effect and whether it satisfies the lower-near, upper-far spatial layout; the frame effect means that, for a video frame, if an object at the screen edge images with crossed disparity and a part of the object extends beyond the screen range, the frame effect exists; the lower-near, upper-far spatial layout means that objects imaged at the bottom of the screen corresponding to the video frame have a perceived depth close to the viewer, while objects imaged at the top of the screen have a perceived depth far from the viewer;
The second viewing comfort factor of a video frame of the stereoscopic video segment is determined according to the temporal change of the reference disparity of the video frame and/or the temporal change of the frequency of occurrence of the reference disparity of the video frame; the magnitude of the second viewing comfort factor represents the degree to which the reference disparity of the video frame changes in the depth direction.
With reference to the ninth or tenth implementation of the third aspect, in an eleventh implementation, the processor is specifically configured to calculate the viewing comfort of the stereoscopic video segment according to the following formula, wherein VC_k is the comfort level of the k-th subsegment, VC_i is the viewing comfort of the i-th frame in the k-th subsegment, N_f is the number of video frames in the k-th subsegment, P_1 and c2 are set values, and the k-th subsegment is any subsegment of the stereoscopic video segment.
With reference to any one of the ninth to eleventh implementations of the third aspect, in a twelfth implementation, the processor is specifically configured to calculate the viewing comfort of the stereoscopic video segment according to the following formula, wherein Q is the viewing comfort of the stereoscopic video segment, VC_k is the comfort level of the k-th subsegment of the segment, N_s is the number of subsegments in the segment, and P_2 and c2 are set values.
With reference to the third aspect or any one of its first to twelfth implementations, in a thirteenth implementation, the processor is specifically configured to: calculate the spatial-domain feature of the stereoscopic video segment from the spatial-domain features of its video frames;
calculate the temporal-domain feature of the segment from the temporal-domain features of its video frames;
and calculate the viewing comfort of the segment from its spatial-domain and temporal-domain features.
With reference to the thirteenth implementation of the third aspect, in a fourteenth implementation, the spatial-domain feature of a video frame of the stereoscopic video segment comprises one or any combination of the following parameters: the reference disparity of the frame, the percentage of the reference disparity, and the first viewing-comfort factor.
The spatial-domain feature of a video frame is obtained as follows:
Within the viewing area of the frame, the disparity value of the pixels satisfying the following condition is taken as the reference disparity of the frame: the number of pixels having the reference disparity exceeds the set pixel-count threshold, and the reference disparity is the minimum value in the disparity set; the disparity set is the set of pixel disparities within the viewing area of the frame for which the pixel count of each disparity value exceeds the set threshold.
The percentage of the reference disparity of the frame is: within the viewing area of the frame, the ratio of the number of pixels whose disparity equals the reference disparity to the number of valid pixels.
The first viewing-comfort factor of the frame is determined according to whether the frame exhibits a frame effect and whether it satisfies the bottom-near, top-far spatial layout. A frame effect exists when an object imaged at the screen edge has crossed disparity and part of the object extends beyond the screen. Bottom-near, top-far means that objects imaged at the bottom of the screen have a perceived depth close to the viewer, while objects imaged at the top have a perceived depth far from the viewer.
The processor is specifically configured to calculate the weighted average of the reference disparities of all video frames in the stereoscopic video segment according to the following formula, wherein N is the number of video frames in the segment, minDisp_i is the reference disparity of the i-th frame, P(minDisp_i) is the percentage of the reference disparity of the i-th frame, and the i-th frame is any video frame in the segment;
and to calculate the mean of the first viewing-comfort factors of all video frames in the segment according to the following formula, wherein disp_distribution_i is the first viewing-comfort factor of the i-th frame, N is the number of video frames in the segment, and the i-th frame is any video frame in the segment.
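The weighted-average formula itself is rendered as an image in the original publication. Since its symbol list pairs each minDisp_i with its percentage P(minDisp_i), one plausible reading weights each frame's reference disparity by that percentage; the normalisation below is an assumption, not taken from the source:

```python
def weighted_ref_disparity(min_disp, pct):
    """Hypothetical reconstruction of the segment-level weighted average:
    average the per-frame reference disparities minDisp_i, each weighted
    by its percentage P(minDisp_i)."""
    total = sum(pct)
    if total == 0:
        return 0.0
    return sum(d * p for d, p in zip(min_disp, pct)) / total
```

For instance, two frames with reference disparities -10 and -2 and equal percentages average to -6 under this reading.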
With reference to the thirteenth or fourteenth implementation of the third aspect, in a fifteenth implementation, the temporal-domain feature of a video frame of the stereoscopic video segment comprises:
The second viewing-comfort factor of the frame, which is determined according to the change of the frame's reference disparity over time, and/or the change over time of the frequency with which the reference disparity occurs; the magnitude of the second viewing-comfort factor represents the degree to which the reference disparity of the frame changes in the depth direction.
The processor is specifically configured to calculate the second viewing-comfort factor of the stereoscopic video segment according to the following formula, wherein V_d is the second viewing-comfort factor of the segment, V_d,i is the second viewing-comfort factor of the i-th frame in the segment, P_i is a set value whose value is determined according to the signs of V1_i and minDisp_i, N is the number of video frames in the segment, and the i-th frame is any video frame in the segment.
With reference to any one of the thirteenth to fifteenth implementations of the third aspect, in a sixteenth implementation, the processor is specifically configured to calculate the viewing comfort of the stereoscopic video segment according to the following formula:
VC = α*Spatial_vc + β*Temporal_vc
wherein VC is the viewing comfort of the stereoscopic video segment, α and β are weights, and b1 and c3 are set values; Spatial_vc is the comfort score of the segment determined by the spatial-domain feature and Temporal_vc is the comfort score determined by the temporal-domain feature; the remaining symbols of the formula are the mean of the first viewing-comfort factors of all frames in the segment, the weighted average of the reference disparities of all frames in the segment, and the second viewing-comfort factor of the segment.
In the above embodiments of the invention, the spatial-domain and temporal-domain features of the video frames in a stereoscopic video segment are extracted, and the viewing comfort of the whole segment is evaluated from them. The evaluation scheme proposed by the embodiments considers the effect on stereoscopic viewing comfort of both the spatial distribution of disparity (the spatial-domain feature) and its temporal distribution (the temporal-domain feature), and can emphasise the impact of low-comfort frames on overall comfort, so the comfort of a stereoscopic video can be evaluated comparatively objectively.
Brief description of the drawings
To explain the technical solutions in the embodiments of the invention more clearly, the drawings used in describing the embodiments are briefly introduced below. Obviously, the drawings described below show only some embodiments of the invention; those of ordinary skill in the art can derive other drawings from them without creative effort.
Fig. 1 is a schematic flowchart of stereoscopic video comfort evaluation provided by an embodiment of the invention;
Fig. 2 is a schematic flowchart of single-frame spatial-domain and temporal-domain feature extraction provided by an embodiment of the invention;
Fig. 3 is a schematic flowchart of stereoscopic video segment comfort evaluation based on single-frame spatial-domain and temporal-domain features, provided by an embodiment of the invention;
Fig. 4 is a schematic flowchart of stereoscopic video segment comfort evaluation based on single-frame spatial-domain and temporal-domain features, provided by another embodiment of the invention;
Fig. 5 is a schematic diagram of a stereoscopic video comfort evaluation device provided by an embodiment of the invention;
Fig. 6 is a schematic diagram of a stereoscopic video comfort evaluation device provided by another embodiment of the invention.
Detailed description
An embodiment of the invention first obtains the disparity maps of the video frames contained in a stereoscopic video segment, then extracts the spatial-domain and temporal-domain features of the frames, and obtains the comfort of the whole segment through a comfort evaluation model. The comfort evaluation method proposed by the embodiments takes into account the effect on stereoscopic viewing comfort of both the magnitude and the frequency of temporal disparity changes of objects. While watching a video, lower-quality fragments often have a larger impact on the overall viewing experience; the prior art cannot reflect the impact of poor frames or poor fragments, whereas the evaluation method proposed by the embodiments can emphasise the impact of low-comfort frames.
To make the objects, technical solutions and advantages of the invention clearer, the invention is described in further detail below with reference to the drawings. Obviously, the described embodiments are only some, not all, of the embodiments of the invention. All other embodiments obtained by those of ordinary skill in the art from the embodiments of the invention without creative effort fall within the scope of protection of the invention.
Referring to Fig. 1, a schematic flowchart of stereoscopic video comfort evaluation provided by an embodiment of the invention, the flow may comprise:
Step 101: obtain the video frames of the stereoscopic video segment to be evaluated.
Step 102: extract the spatial-domain and temporal-domain features of the video frames of the segment.
Step 103: determine the viewing comfort of the segment from the spatial-domain and temporal-domain features of its video frames.
Step 102 may be realised as follows: for each video frame of the stereoscopic video segment, estimate the disparity of the frame, determine the visual focus position of the frame from its disparity and motion information, determine the viewing area of the frame from the visual focus position, and extract the features from the disparity and the viewing area of the frame. To describe step 102 more clearly, Fig. 2 shows an optional implementation of step 102 in Fig. 1; the flow may comprise:
Step 201: initialize i, i.e. set i = 1.
Step 202: estimate the disparity of the i-th frame.
Usually each frame of a stereoscopic video comprises a left-eye image and a right-eye image, called a stereo image pair. The disparity of a pixel in a video frame is the distance between the matched pixels of the stereo image pair; crossed disparity is conventionally negative and uncrossed disparity positive. Many disparity estimation methods exist for stereoscopic video; the embodiment of the invention may use a stereo matching algorithm for disparity estimation, for example a colour-segmentation-based stereo matching algorithm that yields a dense disparity map. After the disparity is obtained by stereo matching, filtering can further be applied to remove mismatched points, making the resulting disparity map smooth, continuous and accurate; in a concrete implementation, median filtering can be used to remove outliers.
When a colour-segmentation-based stereo matching algorithm is used to obtain a dense disparity map, a search window usually has to be defined, and stereo matching (i.e. searching for matching pixels) is performed within it. For example, 32 pixels may be searched to the left and to the right of the current pixel (the search range is then 32 and the disparity range is -32 to 32), and the matching pixel is sought within this range. In the embodiment of the invention, the size of the search window can be adjusted to the content of the video segment. The window size depends not only on the resolution of the video but also on the overall disparity magnitude of the frames: if the window is too small, some pixels will find no match; if it is too large, the probability of mismatches increases and the accuracy of the resulting disparity map drops. To avoid these problems, the search window size may be chosen close to the disparity amplitude of the frames.
When disparity is estimated with a colour-segmentation-based stereo matching algorithm, some pixels may obtain no disparity, for example because of occlusion. To solve this, in the embodiment of the invention, if a pixel finds no matching pixel during stereo matching, its disparity may be set to an agreed value indicating the failure; for instance, with the search range of 32 above, the disparity of an unmatched pixel is set to the search range plus one, i.e. 33.
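The conventions above (fixed horizontal search range, sentinel disparity for unmatched pixels) can be sketched with plain block matching. This is an illustration only: the SAD cost and the max_cost acceptance threshold are assumptions, not the colour-segmentation algorithm the text names.

```python
import numpy as np

def estimate_disparity(left, right, search_range=32, block=3, max_cost=10.0):
    """Block-matching sketch: for each pixel of the left image, search
    +/- search_range horizontally in the right image for the block with
    the lowest mean absolute difference. Pixels with no acceptable match
    keep the sentinel value search_range + 1, as the text suggests."""
    h, w = left.shape
    r = block // 2
    disp = np.full((h, w), search_range + 1, dtype=np.int32)  # sentinel: unmatched
    for y in range(r, h - r):
        for x in range(r, w - r):
            ref = left[y - r:y + r + 1, x - r:x + r + 1].astype(np.float64)
            best_cost, best_d = max_cost, None
            for d in range(-search_range, search_range + 1):
                xs = x + d
                if xs - r < 0 or xs + r >= w:
                    continue
                cand = right[y - r:y + r + 1, xs - r:xs + r + 1].astype(np.float64)
                cost = float(np.abs(ref - cand).mean())
                if cost < best_cost:
                    best_cost, best_d = cost, d
            if best_d is not None:
                disp[y, x] = best_d
    return disp
```

A median filter (e.g. scipy.ndimage.median_filter) could then be applied to the result to remove isolated mismatches, as the text recommends.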
Step 203: determine the visual focus position of the i-th frame from its disparity and motion information.
In this step, the weight of each pixel of the i-th frame can first be determined, and the position of the pixel with the maximum weight taken as the visual focus position. Further, if several pixels share the maximum weight, the position of the pixel among them closest to the image centre is taken as the visual focus position.
Because violent motion in a video (e.g. planar motion and/or motion in depth) and large crossed disparity easily attract the viewer's attention, the embodiment of the invention can compute the pixel weights from the planar motion of matched pixels (comprising horizontal and vertical motion), the motion in depth, and the crossed disparity. Crossed disparity means the object is imaged in front of the screen.
Concretely, the weight of a pixel of the i-th frame can be calculated according to the following formula:
W = γ*|mv| + η*|disp_crossed| + λ*|δd| ....................................[1]
where W is the weight of the pixel at coordinates (x, y); γ, η and λ are weights which may sum to 1, optionally γ=0.2, η=0.4, λ=0.4; mv = (d_x, d_y) is the planar motion vector of the pixel at (x, y), whose magnitude reflects the severity of planar motion; d_x and d_y are the horizontal and vertical displacement of the pixel at (x, y) in the i-th frame, obtained by searching the i-th frame and an adjacent frame (the adjacent frame of the i-th frame being, e.g., frame i-1 or frame i+1), for instance by comparing the coordinates of the pixel at (x, y) in frame i with those of its matching pixel in frame i-1; disp_crossed is the crossed disparity value of the pixel, which has a value only when the pixel's disparity is crossed and is set to 0 when the disparity is uncrossed; δd is the difference between the mean disparity of the matched block in the i-th frame and in the adjacent frame (e.g. frame i-1 or frame i+1), whose magnitude reflects the speed of motion in depth.
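Formula [1] and the focus-selection rule of step 203 can be sketched as follows; the per-pixel motion fields and block-disparity differences are assumed to come from the matching step.

```python
import numpy as np

def pixel_weights(dx, dy, disp, delta_d, gamma=0.2, eta=0.4, lam=0.4):
    """Per-pixel saliency weights per formula [1]:
    W = gamma*|mv| + eta*|disp_crossed| + lambda*|delta_d|,
    where disp_crossed is the disparity where it is crossed (negative),
    and 0 otherwise."""
    mv = np.hypot(dx, dy)                       # planar-motion magnitude
    disp_crossed = np.where(disp < 0, disp, 0)  # only crossed disparity counts
    return gamma * mv + eta * np.abs(disp_crossed) + lam * np.abs(delta_d)

def visual_focus(weights):
    """Position of the maximum weight; ties are broken by distance to the
    image centre, as the text describes."""
    h, w = weights.shape
    ys, xs = np.nonzero(weights == weights.max())
    cy, cx = (h - 1) / 2.0, (w - 1) / 2.0
    k = np.argmin((ys - cy) ** 2 + (xs - cx) ** 2)
    return int(ys[k]), int(xs[k])
```

With the example weights γ=0.2, η=0.4, λ=0.4, a static pixel with crossed disparity -10 gets weight 4.0 and, if maximal, becomes the visual focus.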
Step 204: determine the viewing area of the frame from the visual focus position of the i-th frame.
In this step, after the visual focus position of frame i is determined, the viewing area of the image can be determined from the resolving field of view of the human eye and the viewing distance. In the embodiment of the invention, the region within a 15° field of view centred on the focus may optionally be taken as the viewing area. For example, for a stereoscopic video segment with resolution 640*480, the viewing area of a video frame may fall within a 350*262-pixel range.
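The window extraction in step 204 can be sketched as a clamped crop around the focus. The window size is assumed to be precomputed from the 15° field of view, the viewing distance and the pixel pitch (the text only gives the 640*480 → roughly 350*262 example), so it is passed in as a parameter here.

```python
def viewing_area(focus, frame_size, window_size):
    """Clamp a window of window_size = (h, w) centred on the visual focus
    to the frame bounds; returns (top, left, bottom, right). window_size
    is assumed derived from the 15-degree field of view, e.g. about
    262 x 350 pixels for a 480 x 640 frame per the text's example."""
    fy, fx = focus
    H, W = frame_size
    h, w = window_size
    top = min(max(fy - h // 2, 0), max(H - h, 0))
    left = min(max(fx - w // 2, 0), max(W - w, 0))
    return top, left, top + min(h, H), left + min(w, W)
```

A focus near a corner simply slides the window inward so it stays inside the frame.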
Step 205: extract the spatial-domain and temporal-domain features of the i-th frame from its disparity and viewing area.
In the embodiment of the invention, the spatial-domain feature of a video frame can comprise one or any combination of the following parameters: the reference disparity of the frame (the reference disparity of the i-th frame is denoted minDisp below), the percentage of the reference disparity (denoted P(minDisp) below), the severity of the frame effect, and whether the bottom-near, top-far spatial layout is satisfied. Specifically:
The reference disparity minDisp of a video frame can be the minimum pixel disparity in the viewing area of the frame. Further, mismatches may occur during stereo matching, so some pixels occasionally have abnormally large or small disparities; such pixels are usually isolated and few. In reality, for the same object, the disparities of pixels in a neighbouring area should be identical or close, so when a disparity value is carried by only a few pixels the match is considered inaccurate; the minimum disparity can therefore be selected from the disparities of the remaining pixels after excluding these abnormal ones, and used as the reference disparity of the frame. For example, the minimum of the disparities of the pixels in the viewing area other than noise pixels can be taken as the reference disparity, where a noise pixel is one whose disparity value is carried by fewer pixels than the set pixel-count threshold. In this way noise is kept out of the reference disparity selection, which improves the accuracy of the comfort assessment.
The percentage P(minDisp) of the reference disparity is, within the viewing area of the frame, the ratio of the number of pixels whose disparity equals the reference disparity minDisp_i to the number of valid pixels, where a valid pixel is one whose absolute disparity is less than the search range. During stereo matching some pixels fail to match, and their disparity is then set to an agreed value indicating the failure; for example, with the search range of 32 mentioned above, the disparity of an unmatched pixel is set to 33. Because the search window limits the effective disparity range to [-32, 32], pixels with disparity in [-32, 32] are successfully matched pixels, i.e. valid pixels, while pixels with disparity outside [-32, 32] are invalid, i.e. mismatched, pixels; for example a pixel with disparity 33 is a mismatched pixel.
Frame effect: for a video frame, if the disparity of an object located at the screen edge (i.e. imaged at the screen edge) is crossed (i.e. the object is imaged in front of the screen) and part of the object extends beyond the screen, the frame exhibits a frame effect, which can make viewers extremely uncomfortable.
Bottom-near, top-far: the perceived depth of the objects at the bottom of the screen in the frame (i.e. imaged at the screen bottom) is close to the viewer, while the perceived depth of the objects at the top of the screen (i.e. imaged at the screen top) is far from the viewer. Video frames satisfying the bottom-near, top-far spatial layout are unlikely to cause visual fatigue.
In a concrete implementation, a viewing-comfort factor disp_distribution (which may be called the first viewing-comfort factor) can be used to represent the severity of the frame effect and whether the bottom-near, top-far layout is satisfied. The reference disparity, the percentage of the reference disparity, and disp_distribution together reflect the spatial distribution of disparity in the frame.
Taking the i-th frame as an example, the computation of the reference disparity minDisp, the percentage P(minDisp) and disp_distribution is described below.
For the i-th frame, its reference disparity minDisp_i can be determined by taking the minimum of the disparities of all pixels in the viewing area. Another optional implementation is: determine the disparity set of the i-th frame, namely the set of pixel disparities in the viewing area for which the pixel count of each disparity value exceeds the set threshold, and take the minimum value of this set as minDisp_i. For example, within an N*N statistical window (N being a pixel count), take the minimum disparity and check whether the number of pixels with this value exceeds the threshold 0.875*N*N; if so, use it as the reference disparity of the i-th frame; if not, take the minimum of the remaining disparities in the window, check it against the threshold in the same way, and repeat, excluding each rejected minimum, until a disparity passes the check.
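The iterative minimum-and-check procedure above amounts to taking the smallest disparity whose pixel count exceeds the threshold, which can be stated compactly with a value histogram; the percentage P(minDisp) of the next paragraph is included for completeness. This is an illustrative restatement, not the patent's exact implementation.

```python
import numpy as np

def reference_disparity(disp_region, count_threshold):
    """Smallest disparity in the viewing area whose pixel count exceeds
    count_threshold; isolated (likely mismatched) disparities are thereby
    excluded, as the text describes. Returns None if nothing qualifies."""
    vals, counts = np.unique(disp_region, return_counts=True)
    kept = vals[counts > count_threshold]
    return int(kept.min()) if kept.size else None

def ref_disparity_percentage(disp_region, ref_disp, search_range=32):
    """P(minDisp): pixels whose disparity equals the reference disparity,
    divided by the number of valid pixels (|disparity| <= search_range)."""
    n_valid = int(np.count_nonzero(np.abs(disp_region) <= search_range))
    if n_valid == 0:
        return 0.0
    return np.count_nonzero(disp_region == ref_disp) / n_valid
```

A single stray pixel at -30 is ignored by the count threshold, so the reference disparity falls on the well-supported value instead.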
For the i-th frame, the percentage P(minDisp) of the reference disparity is obtained by computing, within the viewing area of the i-th frame, the ratio of the number of pixels whose disparity equals the reference disparity minDisp_i to the number of valid pixels.
For the i-th frame, whether a frame effect exists can be determined by detecting, within a certain range of the screen edge corresponding to the frame, a concentration of crossed disparities below a set disparity value. For example, an N*N search window can be used within the edge range to detect whether the number of crossed disparities below a set disparity threshold T exceeds the threshold 0.75*N*N; if so, the i-th frame is judged to exhibit a frame effect, otherwise not. The value of T must be below the lower bound of the "comfort zone". The extent of the comfort zone is related to factors such as the video resolution, interpupillary distance and viewing distance, so T is not a fixed value and its choice also depends on the resolution. The comfort zone is the disparity range within which the human eye does not develop visual fatigue; if, at the current video resolution, the comfort zone is [-5 pixels, 8 pixels], then T should be less than -5. For example, for a 640*480 video frame, within 5% of the screen edge, if the number of crossed disparities below -5 detected in a 20*20 window exceeds 300, the frame is judged to exhibit a frame effect.
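The detection above can be sketched with the text's example numbers (T = -5, a 20*20 window, a 0.75 fraction, a 5% edge band); widening each band to at least one window is an assumption made so the window fits.

```python
import numpy as np

def has_frame_effect(disp, T=-5, win=20, frac=0.75, edge_frac=0.05):
    """Frame-effect check sketch: inside a band of edge_frac of the frame
    at each border, slide a win x win window; if the number of disparities
    below threshold T (strong crossed disparity) in any window exceeds
    frac * win * win, report a frame effect."""
    h, w = disp.shape
    eh = max(int(h * edge_frac), win)
    ew = max(int(w * edge_frac), win)
    bands = [disp[:eh, :], disp[-eh:, :], disp[:, :ew], disp[:, -ew:]]
    need = frac * win * win
    for band in bands:
        bh, bw = band.shape
        for y in range(0, bh - win + 1, win):
            for x in range(0, bw - win + 1, win):
                if np.count_nonzero(band[y:y + win, x:x + win] < T) > need:
                    return True
    return False
```

A block of strongly crossed disparity touching the left edge triggers the check; a frame with no crossed disparity at the edges does not.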
For the i-th frame, the frame can be divided into three regions in a certain proportion, and whether the frame satisfies the bottom-near, top-far layout judged by comparing the mean disparities of the valid pixels of the three regions. For example, the i-th frame image can be divided at Q% and (100-Q)% (0 < Q < 100) of the image height. Say the image is divided in the height direction into three regions whose heights, from top to bottom, span 0 ~ 20%, 20% ~ 80% and 80% ~ 100%; the mean disparity of the valid pixels of each region is computed, and if top_disp < middle_disp < bottom_disp, the i-th frame is judged to satisfy the bottom-near, top-far layout, where top_disp, middle_disp and bottom_disp denote the mean valid-pixel disparities of the three regions. Of course, the i-th frame can also be divided proportionally into two regions or more than three regions, and the layout judged on the same principle by comparing the mean valid-pixel disparities of the regions.
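The three-region comparison can be sketched directly; the comparison direction (top < middle < bottom) follows the condition stated in the text.

```python
import numpy as np

def is_bottom_near_top_far(disp, search_range=32):
    """Judge the 'bottom-near, top-far' layout with the text's example
    split (top 20%, middle 60%, bottom 20% of the height): the mean
    valid-pixel disparity must satisfy top < middle < bottom."""
    h = disp.shape[0]
    t, b = int(0.2 * h), int(0.8 * h)

    def mean_valid(region):
        v = region[np.abs(region) <= search_range]  # valid pixels only
        return float(v.mean()) if v.size else 0.0

    top = mean_valid(disp[:t])
    mid = mean_valid(disp[t:b])
    bot = mean_valid(disp[b:])
    return bool(top < mid < bot)
```

Flipping the same frame upside down reverses the region means and fails the check.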
As stated above, the embodiment of the invention can determine the influence factor disp_distribution of the spatial disparity layout on comfort from the severity of the frame effect and whether the bottom-near, top-far layout is satisfied. A high disp_distribution value indicates a good disparity distribution, unlikely to cause visual fatigue in the viewer; a low value indicates a poor distribution, likely to cause visual fatigue. In a concrete implementation the range of disp_distribution can be set to [0, 1]; when the frame exhibits no frame effect and satisfies the bottom-near, top-far layout, disp_distribution can take a large value, for example in the range [0.9, 1]; when the frame exhibits a frame effect and does not satisfy the bottom-near, top-far layout, disp_distribution takes a smaller value.
The embodiment of the invention gives the following possible disp_distribution values:
if only the bottom-near, top-far spatial layout is satisfied, disp_distribution = 0.8;
if only the absence of a frame effect is satisfied, disp_distribution = 0.9;
if the optimal layout is met, i.e. there is no frame effect and the bottom-near, top-far layout is satisfied, disp_distribution = 1;
if a frame effect exists and the bottom-near, top-far layout is not satisfied, disp_distribution = 0.7.
It can be seen that disp_distribution takes a first value when the frame has no frame effect and satisfies the bottom-near, top-far layout, a second value when it has a frame effect but satisfies the layout, a third value when it has no frame effect but does not satisfy the layout, and a fourth value when it has a frame effect and does not satisfy the layout; the first, second, third and fourth values are preset, the first value is greater than the fourth value, and the third value lies between the first and second values and is unequal to both. Optionally, the four values range over greater than zero and less than or equal to 1. As disp_distribution increases, the computed viewing-comfort score also increases (a higher score indicates higher viewing comfort).
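The example assignment above reduces to a four-way mapping from the two boolean tests:

```python
def disp_distribution(frame_effect, bottom_near_top_far):
    """The text's example values for the first viewing-comfort factor:
    1.0 optimal, 0.9 no-frame-effect only, 0.8 layout only, 0.7 neither."""
    if not frame_effect and bottom_near_top_far:
        return 1.0
    if not frame_effect:
        return 0.9
    if bottom_near_top_far:
        return 0.8
    return 0.7
```

Note the ordering 1.0 > 0.9 > 0.8 > 0.7 matches the constraints stated above: the first value exceeds the fourth, and the third lies between the first and second.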
The temporal-domain feature of a video frame can comprise the change in the depth direction of the disparity within the viewing area of the frame. In a concrete implementation, a viewing-comfort factor V_d (which may be called the second viewing-comfort factor) can be used to represent the influence of this change in depth.
The change of disparity in the depth direction (V_d) has two influencing factors: the change of the reference disparity over time (V_1), and the change over time of the frequency with which the reference disparity occurs (V_2); V_d = f(V_1, V_2). In the embodiment of the invention, the viewing-comfort factor V_d of a frame can be computed from the change of the frame's reference disparity over time, and/or the change over time of the frequency with which the reference disparity occurs.
For the i-th frame, one simple function measuring the depth-direction change of the parallax (V_d) is:
V_d^i=γ*V_1^i+μ*V_2^i....................................[2]
Wherein, V_1^i represents the change of the reference parallax of the i-th frame over time; V_2^i represents the change over time of the frequency with which the reference parallax of the i-th frame occurs; γ and μ are weights, for example with γ + μ = 1; optionally γ and μ both take 0.5. The values of γ and μ can be adjusted according to the relative importance of the influences of V_1^i and V_2^i. One extreme choice is γ = 1, μ = 0, which considers only the influence of the temporal change of the parallax; conversely, μ = 1, γ = 0 considers only the influence of the temporal change of the frequency with which the reference parallax occurs.
For V_1^i of the i-th frame, the embodiment of the present invention provides two optional calculation methods:
Method 1: divide the stereoscopic video segment into sub-segments such that the reference parallax of the frames of video within the same sub-segment changes monotonically at an identical rate, and calculate V_1^i of the i-th frame according to the following formula:
V_1^i=(disp_last-disp_first)/(Np-1).......................................[3]
Wherein, disp_first and disp_last are respectively the reference parallaxes of the first and last frames of the sub-segment to which the i-th frame belongs, and Np is the number of frames of video in that sub-segment.
Method 2: calculate V_1^i of the i-th frame according to the following formula:
V_1^i=minDisp_i-minDisp_(i-1)..........................................[4]
Wherein, minDisp_i and minDisp_(i-1) are respectively the reference parallaxes of the i-th frame and the (i-1)-th frame.
For the i-th frame, V_2^i can be calculated according to the following formula:
V_2^i=P(minDisp_i)-P(minDisp_(i-1)).......................................[5]
Wherein, P(minDisp_i) and P(minDisp_(i-1)) are respectively the percentages of the reference parallaxes of the i-th frame and the (i-1)-th frame.
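Both quantities are simple frame-to-frame differences; a sketch follows. The function names are hypothetical, and the difference form for V_2 is assumed from the parameters the text names (the percentages of the reference parallax of frames i and i−1):

```python
def v1_framewise(min_disp_i: float, min_disp_prev: float) -> float:
    """Formula [4]: frame-to-frame change of the reference parallax."""
    return min_disp_i - min_disp_prev

def v2_framewise(p_i: float, p_prev: float) -> float:
    """Formula [5] (assumed difference form): frame-to-frame change of the
    percentage of pixels at the reference parallax."""
    return p_i - p_prev
```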
Step 206: increment i, i.e. set i = i + 1, and judge whether the incremented i exceeds the number N of frames of video of the stereoscopic video segment; if not, return to step 202; otherwise, end the flow.
As can be seen from the above flow, relatively violent motion in a frame of video (such as planar motion and/or motion in the depth direction) and larger crossed disparity easily attract the viewer's attention. Therefore, when calculating the weight of a pixel, the embodiment of the present invention can use the planar motion of the matched pixel in the frame of video (comprising horizontal motion and vertical motion), its motion in the depth direction, and its crossed disparity; the pixel with the largest weight is chosen as the visual focus, and the viewing area is determined according to this focus. The viewing area is generally the region of interest of viewers, so the viewing comfort evaluation can be carried out for that region.
During stereoscopic video playback, the human eye cannot accurately identify the comfort of each individual frame of video; the perception of video comfort is based on video segments rather than on single frames. Therefore, in step 103 of Fig. 1, the comfort of a video segment needs to be assessed.
Fig. 3 shows one optional implementation of step 103 in Fig. 1. As shown in the figure, the method can comprise the following steps:
Step 301: calculate the viewing comfort of each frame of video of the stereoscopic video segment. The comfort of a frame of video is jointly determined by its time-domain characteristic and spatial-domain characteristic.
In a specific implementation, the comfort model of a frame of video can be (taking the i-th frame as an example):
Frame_vc_i=α*Spatial_frame_vc_i+β*Temperal_frame_vc_i....................................[6]
Wherein, Frame_vc_i is the viewing comfort of the i-th frame, Spatial_frame_vc_i is the viewing comfort of the i-th frame determined by its spatial-domain characteristic, Temperal_frame_vc_i is the viewing comfort of the i-th frame determined by its time-domain characteristic, and α and β are weights.
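The per-frame weighted-sum comfort model can be sketched as below. The default weights α = 0.8 and β = 0.2 are borrowed from the model-parameter group the text gives later for the segment-level formula [14], and are only illustrative here:

```python
def frame_comfort(spatial_vc: float, temporal_vc: float,
                  alpha: float = 0.8, beta: float = 0.2) -> float:
    """Per-frame comfort as a weighted sum of the spatial-domain and
    time-domain comfort scores, with alpha + beta = 1."""
    return alpha * spatial_vc + beta * temporal_vc
```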
Wherein, disp_distribution_i is the disp_distribution factor of the i-th frame, minDisp_i is the reference parallax of the i-th frame, P(minDisp_i) is the percentage of the reference parallax of the i-th frame, and b1 is a model parameter; V_d^i is the V_d factor of the i-th frame; b2 is a model parameter whose value is related to the signs of V_1^i and minDisp_i. b1 and b2 are model parameters trained while fitting the subjective data, and their values can be determined from the actual training results.
Optionally, if a 5-point scale (minimum 1, maximum 5) is adopted for the subjective evaluation of viewing comfort, then the parameter c1 in the above formulas 7 and 8 takes the value 4; since the subjective scale runs from 1 to 5, setting c1 to 4 ensures that the minimum output is 1 and the maximum is 5. Likewise, for other scales, such as an 11-point or 100-point system, the model parameters in formulas 7 and 8 can be adjusted accordingly.
Table 1 shows one group of optional model parameter values.
Table 1 Model parameters
Step 302: according to the visual focus positions of the frames of video of the stereoscopic video segment, divide the segment into sub-segments, each comprising at least one frame of video, such that the visual focus shift between the frames of video of a sub-segment is not greater than a set shift threshold; then calculate the viewing comfort of each sub-segment.
In this step, consecutive frames are grouped into sub-segments according to the magnitude of the shift of the focus in spatial position: if the spatial focus positions of consecutive frames are close, they are divided into the same sub-segment. For example, if the spatial focus shift δp of adjacent frames of video is less than 5, the adjacent frames are divided into one sub-segment; otherwise they are divided into different sub-segments. Wherein,
δp=sqrt(d_x²+d_y²)
where d_x and d_y respectively represent the horizontal and vertical displacement of the focus shift.
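The focus-shift test and the resulting sub-segment division can be sketched as follows. The function names and the list-of-positions interface are assumptions; the threshold of 5 follows the text, and δp is taken as the Euclidean length of the (d_x, d_y) displacement:

```python
import math

def focus_shift(dx: float, dy: float) -> float:
    """delta_p = sqrt(dx^2 + dy^2): focus shift between adjacent frames."""
    return math.hypot(dx, dy)

def split_subsegments(focus_positions, threshold: float = 5.0):
    """Group consecutive frame indices into sub-segments: a new sub-segment
    starts whenever the focus shift between adjacent frames reaches the
    threshold."""
    segments = [[0]]
    for i in range(1, len(focus_positions)):
        (x0, y0), (x1, y1) = focus_positions[i - 1], focus_positions[i]
        if focus_shift(x1 - x0, y1 - y0) < threshold:
            segments[-1].append(i)
        else:
            segments.append([i])
    return segments
```

For instance, a jump of the focus from (1, 1) to (10, 10) (shift ≈ 12.7 ≥ 5) starts a new sub-segment.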
In a specific implementation, the viewing comfort of each sub-segment can be calculated by formula 9. In formula 9, the result is the comfort level of the k-th sub-segment, N_f is the number of frames of video in the k-th sub-segment, and P_1 is a combining parameter with a set value, for example 7. P_1 can take an integer value greater than 1; its value can be obtained by training on an existing test set with the corresponding subjective comfort scores, and a reasonable value makes the correlation between the subjective and objective scores highest.
Optionally, if a 5-point scale (minimum 1, maximum 5) is adopted for the subjective evaluation of viewing comfort, then the parameter c2 in the above formulas 9 and 10 takes the value 5, i.e. the best score of the subjective evaluation. Likewise, for other scales, such as an 11-point or 100-point system, the model parameters in formulas 9 and 10 can be adjusted accordingly.
Step 303: calculate the viewing comfort of the stereoscopic video segment according to the viewing comfort of each sub-segment.
In a specific implementation, the calculation can use formula 10. In formula 10, Q is the viewing comfort of the stereoscopic video segment, N_s is the number of sub-segments in the segment, and P_2 is a time-domain combining parameter with a set value, for example 3. P_2 can take an integer value greater than 1; its value can be obtained by training on an existing test set with the corresponding subjective comfort scores, and a reasonable value makes the correlation between the subjective and objective scores highest.
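The exact expressions of formulas 9 and 10 are not reproduced in this text. The sketch below shows one plausible pooling of that shape, under the loud assumption that comfort scores are converted to discomfort (best score minus comfort) and pooled with a power mean, so that large exponents P_1 or P_2 emphasize the worst frames or sub-segments; this matches the stated goal of highlighting low-comfort frames, but the function name and the discomfort transform are hypothetical:

```python
def pooled_comfort(scores, p: int, best: float = 5.0) -> float:
    """Hypothetical stand-in for formulas [9]/[10]: power-mean pooling over
    discomfort (best - score); large p weights the worst elements more
    heavily, then the result is mapped back to a comfort score."""
    n = len(scores)
    discomfort = (sum((best - s) ** p for s in scores) / n) ** (1.0 / p)
    return best - discomfort
```

With `p = 7` (the example value of P_1), a sub-segment containing one poor frame scores noticeably below the arithmetic mean of its frames.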
Fig. 4 shows another optional implementation of step 103 in Fig. 1. In this method, segment-level statistics of the characteristic parameters are extracted for the viewing comfort evaluation model, so the comfort of each frame of video and of each sub-segment does not need to be calculated separately. As shown in the figure, the method can comprise the following steps:
Step 401: according to the spatial-domain characteristics of the frames of video of the stereoscopic video segment, calculate the spatial-domain characteristic of the segment.
The spatial-domain characteristic of the stereoscopic video segment can comprise one of the following parameters or any combination thereof: the reference parallax of the segment and the viewing comfort factor of the segment.
In a specific implementation, the reference parallax of the stereoscopic video segment can be calculated as the weighted average of the reference parallaxes of all frames of video in the segment, e.g. by formula 11. In formula 11, the result is that weighted average, N is the number of frames of video in the segment, minDisp_i is the reference parallax of the i-th frame, and P(minDisp_i) is the percentage of the reference parallax of the i-th frame.
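Formula 11 itself is not reproduced in this text; the sketch below assumes a weighted average in which each frame's reference parallax is weighted by its percentage P(minDisp_i). Both the function name and the choice of weights are assumptions consistent with the parameters the text names:

```python
def segment_reference_parallax(min_disps, percentages) -> float:
    """Assumed form of formula [11]: weighted average of the per-frame
    reference parallaxes, weighted by the percentage of pixels at each
    frame's reference parallax."""
    total = sum(percentages)
    return sum(d * p for d, p in zip(min_disps, percentages)) / total
```

With equal weights this reduces to the plain mean; unequal weights pull the segment value toward frames where the reference parallax covers more pixels.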
In a specific implementation, the viewing comfort factor of the stereoscopic video segment can be calculated as the mean value of the viewing comfort factors disp_distribution of all frames of video in the segment, e.g. by formula 12. In formula 12, the result is the mean value of the per-frame factors, and disp_distribution_i is the viewing comfort factor disp_distribution of the i-th frame.
Step 402: according to the time-domain characteristics of the frames of video of the stereoscopic video segment, calculate the time-domain characteristic of the segment.
The time-domain characteristic of the stereoscopic video segment can comprise the viewing comfort factor V_d of the segment, which can be calculated by formula 13. In formula 13, the segment-level factor V_d aggregates the per-frame factors V_d^i over the frames of the segment, and P_i is a set weight whose value is determined by the signs of V_1^i and minDisp_i; that is, the value of P_i is related to the signs of V_1^i and minDisp_i. One possible assignment is:
When V_1^i and minDisp_i have opposite signs, P_i = 0.8;
When V_1^i and minDisp_i have the same sign, P_i = 1.
Through P_i, the different influences of different motion directions on viewing comfort can be distinguished. When V_1^i and minDisp_i have the same sign, the object moves in the direction away from the screen; comfort in this case is very poor, so P_i can take a larger value, such as the above P_i = 1. When V_1^i and minDisp_i have opposite signs, the object moves toward the screen; comfort in this case is better, so P_i can take a smaller value, such as the above P_i = 0.8.
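The sign-dependent weighting P_i can be sketched as follows. Since formula [13] itself is not reproduced in this text, the aggregation of per-frame V_d values into the segment-level factor is assumed here to be a weighted mean; the function names are hypothetical:

```python
def p_weight(v1: float, min_disp: float) -> float:
    """P_i per the text: 1.0 when V_1^i and minDisp_i share a sign (object
    moving away from the screen, worse comfort), 0.8 when they differ."""
    return 1.0 if v1 * min_disp > 0 else 0.8

def segment_vd(v1_list, min_disp_list, vd_list) -> float:
    """Assumed form of formula [13]: P_i-weighted mean of per-frame V_d."""
    weights = [p_weight(v1, d) for v1, d in zip(v1_list, min_disp_list)]
    return sum(w * vd for w, vd in zip(weights, vd_list)) / sum(weights)
```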
Step 403: according to the parameters calculated in steps 401 to 402, calculate the viewing comfort of the stereoscopic video segment.
The specific calculation formula can be:
VC=α*Spatial_vc+β*Temperal_vc....................................[14]
Wherein, VC is the viewing comfort of the stereoscopic video segment, and α and β are weights, optionally with α + β = 1; b1 is a model parameter. One group of usable model parameters is: α = 0.8, β = 0.2, b1 = 0.04. The value of b1 can be obtained by training on an existing test set with the corresponding subjective comfort scores; at this value, the correlation between the subjective and objective scores is highest.
Optionally, if a 5-point scale (minimum 1, maximum 5) is adopted for the subjective evaluation of viewing comfort, then the parameter c3 in the above formulas 15 and 16 takes the value 4, which, as with c1, ensures that the outputs span the 1-to-5 range. Likewise, for other scales, such as an 11-point or 100-point system, the model parameters in formulas 15 and 16 can be adjusted accordingly.
As can be seen from the above flow, the embodiment of the present invention detects the position of the human visual focus, determines a viewing area according to the human angular field of view, and measures the viewing comfort of the stereoscopic video using the severity of the vergence-accommodation conflict within the viewing area. The severity of this conflict is measured by analyzing the spatial parallax distribution and the temporal parallax distribution of the stereoscopic video. The assessment method proposed by the embodiment of the present invention fully considers the influence of the spatial and temporal distribution of parallax on stereoscopic video comfort, and can highlight the impact of frames with poor comfort on the overall comfort.
In summary, assessment of stereoscopic video comfort is a precondition for improving stereoscopic viewing comfort, and one of the key technologies affecting the development of stereoscopic video. The objective evaluation scheme for stereoscopic video comfort proposed by the embodiment of the present invention has low computational complexity and high reliability, and can be widely applied to the assessment and monitoring of stereoscopic video comfort. Compared with existing schemes, this scheme fully considers the influence of the spatial-domain and time-domain characteristics of stereoscopic video on its comfort, takes the human visual system and viewer watching habits into account, and covers the main factors influencing viewing comfort, thereby making the stereoscopic viewing comfort evaluation relatively objective and accurate.
Based on identical technical conceive, the embodiment of the present invention additionally provides a kind of three-dimensional video-frequency evaluating apparatus.
See Fig. 5, which is a structural schematic diagram of the three-dimensional video evaluating apparatus provided by the embodiment of the present invention. The apparatus comprises:
Acquisition module 501, configured to acquire the frames of video of a stereoscopic video segment;
Extraction module 502, configured to extract the spatial-domain and time-domain characteristics of the frames of video of the stereoscopic video segment acquired by the acquisition module;
Evaluation module 503, configured to determine the viewing comfort of the stereoscopic video segment according to the spatial-domain and time-domain characteristics extracted by the extraction module.
In conjunction with said apparatus, in the implementation that the first is possible, extraction module 502 is specifically for the parallax of estimating the frame of video of described stereopsis frequency range, the visual focus position of the frame of video of described stereopsis frequency range is determined according to the parallax of the frame of video of described stereopsis frequency range and movable information, the viewing area of the frame of video of described stereopsis frequency range is determined according to the visual focus position of the frame of video of described stereopsis frequency range, and according to the viewing area of the parallax of the frame of video of described stereopsis frequency range and the frame of video of described stereopsis frequency range, extract Spatial characteristic and the time domain specification of the frame of video of described stereopsis frequency range.
In conjunction with the first possible implementation of said apparatus, in the implementation that the second is possible, in the frame of video that extraction module 502 can determine described stereopsis frequency range, the weights of each pixel, are defined as the visual focus position of the frame of video of described stereopsis frequency range by the position of the pixel with maximum weights.
In conjunction with the implementation that the second of said apparatus is possible, in the implementation that the third is possible, extraction module 502 can adopt formula (1) to calculate the weights of described pixel, the expression formula of formula (1) and the implication of relevant parameter ditto described in, no longer repeat at this.
In conjunction with the second or the third possible implementation of said apparatus, in the 4th kind of possible implementation, extraction module 502 specifically for: if the pixel in the frame of video of described stereopsis frequency range with maximum weights has multiple, then multiplely have in the pixel of maximum weights by described, the position of the pixel nearest apart from the image center location of the frame of video of described stereopsis frequency range, is defined as the visual focus position of the frame of video of described stereopsis frequency range.
In conjunction with the first of said apparatus to any one the possible implementation in the 4th kind of possible implementation, in the 5th kind of possible implementation, described Spatial characteristic comprises one of following parameter or combination in any: with reference to parallax, the percentage with reference to parallax, the first viewing comfort degree factor;
Extraction module 502 specifically for: determine the parallax set that the frame of video of described stereopsis frequency range is corresponding, described parallax set is the pixel parallax set in the frame of video viewing area of described stereopsis frequency range, and pixel quantity corresponding to each parallax value wherein is all greater than described setting threshold; Minimum parallax value in described parallax set is defined as the reference parallax of the frame of video of described stereopsis frequency range;
Calculate, within the viewing area of the frames of video of the described stereoscopic video segment, the ratio of the number of pixels whose parallax equals the reference parallax to the number of valid pixels, obtaining the percentage of the reference parallax of the frames of video of the segment; valid pixels are pixels whose parallax absolute value is less than the search range;
Determine the first viewing comfort factor of the frames of video of the described stereoscopic video segment according to whether the frames exhibit the frame effect and whether they satisfy the near-bottom/far-top spatial layout: the factor takes the first value when a frame has no frame effect and satisfies the layout, the second value when the frame effect exists but the layout is satisfied, the third value when there is no frame effect but the layout is not satisfied, and the fourth value when the frame effect exists and the layout is not satisfied; the first, second, third and fourth values are preset values, the first value is less than the fourth value, and the third value lies between the first and second values and is equal to neither. The frame effect means that, for a frame of video, an object at the screen edge images with crossed disparity while part of the object extends beyond the screen range; the near-bottom/far-top layout means that objects imaged at the bottom of the screen have a perceived depth close to the viewer while objects imaged at the top of the screen have a perceived depth far from the viewer.
In conjunction with the first of said apparatus to any one the possible implementation in the 5th kind of possible implementation, in the 6th kind of possible implementation, described time domain specification comprises the second viewing comfort degree factor;
Extraction module 502 specifically for: according to the change in time domain of the reference parallax of the frame of video of described stereopsis frequency range, and/or the frequency of the reference parallax appearance of the frame of video of described stereopsis frequency range is in the change of time domain, calculate the second viewing comfort degree factor of the frame of video of described stereopsis frequency range, the size of described second viewing comfort degree factor value represents the degree that the reference parallax of frame of video changes at depth direction.
In conjunction with the 6th kind of possible implementation of said apparatus, in the 7th kind of possible implementation, extraction module 502 specifically for: described stereopsis frequency range is carried out subsegment division, the reference parallax monotone variation of the frame of video in same subsegment and pace of change is identical, the change of reference parallax in time domain of the frame of video of described stereopsis frequency range is calculated according to formula (3), the frequency that the reference parallax calculating the frame of video of described stereopsis frequency range according to formula (5) occurs is in the change of time domain, the second viewing comfort degree factor of the frame of video of described stereopsis frequency range is calculated according to formula (2).The expression formula of formula (3), formula (2) and formula (5) and the implication of relevant parameter ditto described in, no longer repeat at this.
In conjunction with the 6th kind of possible implementation of said apparatus, in the 8th kind of possible implementation, extraction module 502 can calculate the change of reference parallax in time domain of the frame of video of described stereopsis frequency range according to formula (4), the frequency that the reference parallax calculating the frame of video of described stereopsis frequency range according to formula (5) occurs, in the change of time domain, calculates the second viewing comfort degree factor of the frame of video of described stereopsis frequency range according to formula (2).The expression formula of formula (4), formula (5) and formula (2) and the implication of relevant parameter ditto described in, no longer repeat at this.
In conjunction with the first of said apparatus or said apparatus to any one the possible implementation in the 8th kind of possible implementation, in the 9th kind of possible implementation, evaluation module 503 specifically for:
According to Spatial characteristic and the time domain specification of each frame of video of described stereopsis frequency range, calculate the viewing comfort level of each frame of video described respectively;
According to the visual focus position of each frame of video of described stereopsis frequency range, carry out subsegment division to described stereopsis frequency range, the visual focus position transfer amount of the frame of video of each subsegment is not more than setting transfer amount threshold value; According to the viewing comfort level of frame of video each in each subsegment, calculate the viewing comfort level of described each subsegment respectively;
The viewing comfort level of described stereopsis frequency range is calculated according to the viewing comfort level of each subsegment.
In conjunction with the ninth possible implementation of the above apparatus, in a tenth possible implementation, evaluation module 503 can calculate the viewing comfort of the frames of video of the described stereoscopic video segment according to formulas (6), (7) and (8); the expressions of formulas (6), (7) and (8) and the meanings of the relevant parameters are as described above and are not repeated here. The calculation methods of the parameters involved in the calculation, such as the reference parallax of the frames of video of the segment, the percentage of the reference parallax, the first viewing comfort factor and the second viewing comfort factor, are as described above and are not repeated here.
In conjunction with the ninth or tenth possible implementation of the above apparatus, in an eleventh possible implementation, evaluation module 503 is specifically configured to calculate the viewing comfort of the described stereoscopic video segment according to formula (9); the expression of formula (9) and the meanings of the relevant parameters are as described above and are not repeated here.
In conjunction with any one of the ninth to eleventh possible implementations of the above apparatus, in a twelfth possible implementation, evaluation module 503 can calculate the viewing comfort of the described stereoscopic video segment according to formula (10); the expression of formula (10) and the meanings of the relevant parameters are as described above and are not repeated here.
In conjunction with the first possible implementation of the above apparatus, or any one of the first to twelfth possible implementations, in a thirteenth possible implementation, evaluation module 503 can calculate the spatial-domain characteristic of the described stereoscopic video segment from the spatial-domain characteristics of its frames of video, calculate the time-domain characteristic of the segment from the time-domain characteristics of its frames of video, and calculate the viewing comfort of the segment from the spatial-domain and time-domain characteristics of the segment.
In conjunction with the thirteenth possible implementation of the above apparatus, in a fourteenth possible implementation, the spatial-domain characteristic of the frames of video of the described stereoscopic video segment comprises one of the following parameters or any combination thereof: the reference parallax of the frames of video of the segment, the percentage of the reference parallax, and the first viewing comfort factor; the specific calculation methods of these parameters are as described above and are not repeated here. Accordingly, evaluation module 503 can calculate the weighted average of the reference parallaxes of all frames of video in the segment according to formula (11), and calculate the mean value of the first viewing comfort factors of all frames of video in the segment according to formula (12); the expressions of formulas (11) and (12) and the meanings of the relevant parameters are as described above and are not repeated here.
In conjunction with the thirteenth or fourteenth possible implementation of the above apparatus, in a fifteenth possible implementation, the time-domain characteristic of the frames of video of the described stereoscopic video segment comprises the second viewing comfort factor of the frames of video of the segment (the specific calculation method is as described above and is not repeated here); evaluation module 503 can calculate the second viewing comfort factor of the segment according to formula (13), whose expression and relevant parameters are as described above and are not repeated here.
In conjunction with any one of the thirteenth to fifteenth possible implementations of the above apparatus, in a sixteenth possible implementation, evaluation module 503 can calculate the viewing comfort of the described stereoscopic video segment according to formulas (14), (15) and (16); the expressions of formulas (14), (15) and (16) and the meanings of the relevant parameters are as described above and are not repeated here.
Based on identical technical conceive, the embodiment of the present invention additionally provides a kind of three-dimensional video-frequency evaluating apparatus.
See Fig. 6, which is a structural schematic diagram of a three-dimensional video evaluating apparatus provided by the embodiment of the present invention. The apparatus can comprise: transceiver 601, memory 602 and processor 603. Memory 602 is configured to store information such as application programs, algorithm rules and calculation parameters, and can also be used to store intermediate results produced during processing by processor 603.
Transceiver 601, configured to acquire the frames of video of a stereoscopic video segment;
Processor 603, configured to extract the spatial-domain and time-domain characteristics of the frames of video of the stereoscopic video segment acquired by transceiver 601, and to determine the viewing comfort of the stereoscopic video segment according to the extracted spatial-domain and time-domain characteristics.
In conjunction with said apparatus, in the implementation that the first is possible, processor 603 is specifically for the parallax of estimating the frame of video of described stereopsis frequency range, the visual focus position of the frame of video of described stereopsis frequency range is determined according to the parallax of the frame of video of described stereopsis frequency range and movable information, the viewing area of the frame of video of described stereopsis frequency range is determined according to the visual focus position of the frame of video of described stereopsis frequency range, and according to the viewing area of the parallax of the frame of video of described stereopsis frequency range and the frame of video of described stereopsis frequency range, extract Spatial characteristic and the time domain specification of the frame of video of described stereopsis frequency range.
In conjunction with the first possible implementation of said apparatus, in the implementation that the second is possible, in the frame of video that processor 603 can determine described stereopsis frequency range, the weights of each pixel, are defined as the visual focus position of the frame of video of described stereopsis frequency range by the position of the pixel with maximum weights.
In conjunction with the implementation that the second of said apparatus is possible, in the implementation that the third is possible, processor 603 can adopt formula (1) to calculate the weights of described pixel, the expression formula of formula (1) and the implication of relevant parameter ditto described in, no longer repeat at this.
In conjunction with the second or the third possible implementation of said apparatus, in the 4th kind of possible implementation, processor 603 specifically for: if the pixel in the frame of video of described stereopsis frequency range with maximum weights has multiple, then multiplely have in the pixel of maximum weights by described, the position of the pixel nearest apart from the image center location of the frame of video of described stereopsis frequency range, is defined as the visual focus position of the frame of video of described stereopsis frequency range.
In a fifth possible implementation, combined with any of the first to fourth implementations of the above apparatus, the spatial-domain features comprise one or any combination of the following parameters: a reference disparity, a percentage of the reference disparity, and a first viewing-comfort factor;
Processor 603 is specifically configured to: determine the disparity set corresponding to a video frame of the stereoscopic video segment, the disparity set being the set of pixel disparities within the viewing area of the frame for which the number of pixels corresponding to each disparity value exceeds a set threshold, and take the minimum disparity value in the set as the reference disparity of the frame;
compute, within the viewing area of the frame, the ratio of the number of pixels whose disparity equals the reference disparity to the number of valid pixels, obtaining the percentage of the reference disparity of the frame, a valid pixel being a pixel whose absolute disparity is smaller than the search range;
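The reference-disparity extraction just described can be sketched as follows. The flat-list representation of the viewing-area disparities and the parameter names are illustrative assumptions, not the patent's data structures.

```python
from collections import Counter

def reference_disparity(disparities, count_thresh, search_range):
    """disparities: iterable of per-pixel disparity values inside the
    viewing area (illustrative representation)."""
    hist = Counter(disparities)
    # Disparity set: values backed by more than `count_thresh` pixels.
    candidates = [d for d, n in hist.items() if n > count_thresh]
    ref = min(candidates)                      # reference disparity
    # Valid pixels: absolute disparity smaller than the search range.
    valid = sum(1 for d in disparities if abs(d) < search_range)
    percentage = hist[ref] / valid             # share among valid pixels
    return ref, percentage

disp = [-3] * 2 + [-1] * 10 + [0] * 20 + [4] * 8
ref, pct = reference_disparity(disp, count_thresh=5, search_range=16)
print(ref, pct)  # -> -1 0.25
```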
Processor 603 also determines the first viewing-comfort factor of a video frame according to whether the frame exhibits a frame effect and whether it satisfies a near-bottom/far-top spatial layout: the factor takes a first value when the frame has no frame effect and satisfies the near-bottom/far-top layout, a second value when the frame has a frame effect but satisfies the layout, a third value when the frame has no frame effect but does not satisfy the layout, and a fourth value when the frame has a frame effect and does not satisfy the layout. The first, second, third and fourth values are preset; the first value is smaller than the fourth, and the third lies between the first and the second and equals neither. A frame effect exists when, for a video frame, an object at the screen edge images with crossed disparity and part of the object extends beyond the screen; the near-bottom/far-top layout means that objects at the bottom of the screen image at perceived depths close to the viewer while objects at the top image at perceived depths far from the viewer.
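The four-way mapping of the first viewing-comfort factor can be sketched as a simple lookup. The concrete preset values below are illustrative only; they are chosen merely to satisfy the ordering constraints stated above, and the function name is an assumption.

```python
def first_comfort_factor(has_frame_effect, near_bottom_far_top,
                         v1=1.0, v2=2.0, v3=1.5, v4=3.0):
    # Illustrative presets satisfying the stated ordering:
    # v1 < v4, and v3 strictly between v1 and v2, unequal to both.
    if not has_frame_effect and near_bottom_far_top:
        return v1   # no frame effect, layout satisfied: best case
    if has_frame_effect and near_bottom_far_top:
        return v2   # frame effect present, layout satisfied
    if not has_frame_effect and not near_bottom_far_top:
        return v3   # no frame effect, layout violated
    return v4       # frame effect present and layout violated: worst case

print(first_comfort_factor(False, True))  # -> 1.0
```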
In a sixth possible implementation, combined with any of the first to fifth implementations of the above apparatus, the temporal-domain features comprise a second viewing-comfort factor;
Processor 603 is specifically configured to: compute the second viewing-comfort factor of a video frame of the stereoscopic video segment according to the temporal variation of the frame's reference disparity and/or the temporal variation of the frequency with which the reference disparity occurs; the magnitude of the second viewing-comfort factor indicates how strongly the reference disparity of the frame varies in the depth direction.
In a seventh possible implementation, combined with the sixth, processor 603 is specifically configured to: divide the stereoscopic video segment into subsegments such that the reference disparity of the video frames within one subsegment varies monotonically at the same rate; compute the temporal variation of the reference disparity of the video frames according to formula (3), the temporal variation of the frequency of occurrence of the reference disparity according to formula (5), and the second viewing-comfort factor of the video frames according to formula (2). The expressions of formulas (2), (3) and (5) and the meanings of their parameters are as described above and are not repeated here.
In an eighth possible implementation, combined with the sixth, processor 603 computes the temporal variation of the reference disparity of the video frames according to formula (4), the temporal variation of the frequency of occurrence of the reference disparity according to formula (5), and the second viewing-comfort factor of the video frames according to formula (2). The expressions of formulas (4), (5) and (2) and the meanings of their parameters are as described above and are not repeated here.
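Formulas (2), (4) and (5) are not reproduced at this point in the text. The sketch below is therefore only a plausible reading of the parameter descriptions given with the claims (frame-to-frame differences of the reference disparity and of its percentage, combined with weighting values γ and μ), not the patent's exact formulas.

```python
def second_comfort_factor(min_disp, pct, gamma=0.5, mu=0.5):
    """min_disp[i]: reference disparity of frame i; pct[i]: its percentage.
    Returns one factor per frame (frame 0 has no predecessor -> 0.0).
    The formulas are assumed from the parameter descriptions."""
    factors = [0.0]
    for i in range(1, len(min_disp)):
        v1 = min_disp[i] - min_disp[i - 1]   # disparity change over time
        v2 = pct[i] - pct[i - 1]             # change in frequency of occurrence
        # Larger values mean stronger depth-direction variation.
        factors.append(gamma * abs(v1) + mu * abs(v2))
    return factors

print(second_comfort_factor([-2, -2, -5], [0.2, 0.2, 0.4]))
# -> [0.0, 0.0, 1.6]
```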
In a ninth possible implementation, combined with the above apparatus or with any of its first to eighth implementations, processor 603 is specifically configured to:
compute the viewing comfort of each video frame of the stereoscopic video segment from that frame's spatial-domain and temporal-domain features;
divide the stereoscopic video segment into subsegments according to the visual focus positions of its video frames, such that the shift of the visual focus position within each subsegment does not exceed a set shift threshold, and compute the viewing comfort of each subsegment from the viewing comfort of its video frames;
compute the viewing comfort of the stereoscopic video segment from the viewing comfort of the subsegments.
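The frame-to-subsegment-to-segment hierarchy can be sketched as follows. Plain averages stand in for formulas (6) through (10), whose exact forms are given elsewhere in the description; the function and parameter names are assumptions.

```python
import math

def segment_comfort(frame_comfort, focus_positions, shift_thresh):
    """frame_comfort[i]: comfort of frame i; focus_positions[i]: (x, y)
    visual focus of frame i. Plain averages are stand-ins for the
    patent's formulas (6)-(10)."""
    # Split into subsegments: start a new one when the visual focus
    # shifts more than `shift_thresh` from the previous frame.
    subsegments, current = [], [0]
    for i in range(1, len(frame_comfort)):
        (x0, y0), (x1, y1) = focus_positions[i - 1], focus_positions[i]
        if math.hypot(x1 - x0, y1 - y0) > shift_thresh:
            subsegments.append(current)
            current = []
        current.append(i)
    subsegments.append(current)
    # Subsegment comfort = mean of its frames; segment = mean of subsegments.
    sub_vc = [sum(frame_comfort[i] for i in s) / len(s) for s in subsegments]
    return sum(sub_vc) / len(sub_vc)

vc = segment_comfort([1.0, 1.0, 3.0, 3.0],
                     [(0, 0), (1, 0), (9, 9), (9, 8)], shift_thresh=2.0)
print(vc)  # -> 2.0
```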
In a tenth possible implementation, combined with the ninth, processor 603 computes the viewing comfort of the video frames of the stereoscopic video segment according to formulas (6), (7) and (8); their expressions and the meanings of their parameters are as described above and are not repeated here. The computation of the parameters involved (the reference disparity of a video frame, the percentage of the reference disparity, and the first and second viewing-comfort factors) is likewise as described above and not repeated here.
In an eleventh possible implementation, combined with the ninth or tenth, processor 603 is specifically configured to compute the viewing comfort of the stereoscopic video segment according to formula (9); its expression and the meanings of its parameters are as described above and are not repeated here.
In a twelfth possible implementation, combined with any of the ninth to eleventh, processor 603 computes the viewing comfort of the stereoscopic video segment according to formula (10); its expression and the meanings of its parameters are as described above and are not repeated here.
In a thirteenth possible implementation, combined with the above apparatus or with any of its first to twelfth implementations, processor 603 computes spatial-domain features of the stereoscopic video segment from the spatial-domain features of its video frames, computes temporal-domain features of the segment from the temporal-domain features of its video frames, and computes the viewing comfort of the segment from the segment-level spatial-domain and temporal-domain features.
In a fourteenth possible implementation, combined with the thirteenth, the spatial-domain features of the video frames of the stereoscopic video segment comprise one or any combination of the following parameters: the reference disparity of the video frames, the percentage of the reference disparity, and the first viewing-comfort factor; the methods for computing these parameters are as described above and are not repeated here. Accordingly, processor 603 computes the weighted average of the reference disparities of all video frames in the segment according to formula (11), and the mean of the first viewing-comfort factors of all video frames in the segment according to formula (12); the expressions of formulas (11) and (12) and the meanings of their parameters are as described above and are not repeated here.
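Formulas (11) and (12) are not reproduced at this point in the text; the sketch below assumes a natural reading, namely a percentage-weighted average of the reference disparities and a plain mean of the first viewing-comfort factors. Both forms, and all names, are assumptions.

```python
def segment_spatial_features(min_disp, pct, first_factor):
    """Assumed forms for formulas (11) and (12): a percentage-weighted
    average of the reference disparities, and the plain mean of the
    first viewing-comfort factors."""
    weighted_disp = (sum(d * p for d, p in zip(min_disp, pct))
                     / sum(pct))                         # formula (11), assumed
    mean_factor = sum(first_factor) / len(first_factor)  # formula (12), assumed
    return weighted_disp, mean_factor

wd, mf = segment_spatial_features([-4.0, -2.0], [0.25, 0.75], [1.0, 3.0])
print(wd, mf)  # -> -2.5 2.0
```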
In a fifteenth possible implementation, combined with the thirteenth or fourteenth, the temporal-domain features of the video frames of the stereoscopic video segment comprise the second viewing-comfort factor of the video frames (its computation is as described above and not repeated here); processor 603 computes the second viewing-comfort factor of the segment according to formula (13), whose expression and parameter meanings are as described above and are not repeated here.
In a sixteenth possible implementation, combined with any of the thirteenth to fifteenth, processor 603 computes the viewing comfort of the stereoscopic video segment according to formulas (14), (15) and (16); their expressions and the meanings of their parameters are as described above and are not repeated here.
The present invention has been described with reference to flowcharts and/or block diagrams of methods, devices (systems) and computer program products according to embodiments of the invention. It should be understood that each flow and/or block of the flowcharts and/or block diagrams, and combinations of flows and/or blocks therein, can be implemented by computer program instructions. These computer program instructions may be provided to a processor of a general-purpose computer, special-purpose computer, embedded processor or other programmable data-processing device, such that the instructions executed by the processor produce means for implementing the functions specified in one or more flows of the flowcharts and/or one or more blocks of the block diagrams.
These computer program instructions may also be stored in a computer-readable memory capable of directing a computer or other programmable data-processing device to operate in a particular manner, such that the instructions stored in the memory produce an article of manufacture including instruction means that implement the functions specified in one or more flows of the flowcharts and/or one or more blocks of the block diagrams.
These computer program instructions may also be loaded onto a computer or other programmable data-processing device, causing a series of operational steps to be performed on the computer or other programmable device so as to produce a computer-implemented process, such that the instructions executed on the computer or other programmable device provide steps for implementing the functions specified in one or more flows of the flowcharts and/or one or more blocks of the block diagrams.
Although preferred embodiments of the present invention have been described, those skilled in the art, once apprised of the basic inventive concept, may make further changes and modifications to these embodiments. The appended claims are therefore intended to be construed as covering the preferred embodiments and all changes and modifications falling within the scope of the invention.
Obviously, those skilled in the art can make various changes and variations to the present invention without departing from its spirit and scope. If these changes and variations fall within the scope of the claims of the present invention and their technical equivalents, the present invention is intended to embrace them as well.
Claims (34)
1. A stereoscopic video comfort evaluation method, characterized by comprising:
obtaining video frames of a stereoscopic video segment;
extracting spatial-domain features and temporal-domain features of the video frames of the stereoscopic video segment;
determining the viewing comfort of the stereoscopic video segment according to the spatial-domain features and temporal-domain features of its video frames.
2. The method of claim 1, characterized in that extracting the spatial-domain and temporal-domain features of the video frames of the stereoscopic video segment comprises:
estimating the disparity of the video frames of the stereoscopic video segment;
determining the visual focus position of the video frames according to their disparity and motion information;
determining the viewing area of the video frames according to their visual focus position;
extracting the spatial-domain and temporal-domain features of the video frames according to their disparity and viewing area.
3. The method of claim 2, characterized in that determining the visual focus position of the video frames of the stereoscopic video segment according to their disparity and motion information comprises:
determining a weight for each pixel in a video frame of the segment, and determining the position of the pixel with the largest weight as the visual focus position of the frame.
4. The method of claim 3, characterized in that the pixel weights are computed by the following formula:
W = γ·|mv| + η·|disp_crossed| + λ·|δd|
wherein W is the weight of the pixel at coordinates (x, y); γ, η and λ are weighting values; mv is the planar motion vector of the pixel at (x, y), whose horizontal and vertical displacements d_x and d_y are obtained by searching between the video frame containing the pixel and its adjacent video frames; disp_crossed is the crossed-disparity value; and δd is the difference between the mean disparities of the matched blocks of the video frame and of its adjacent video frames.
5. The method of claim 3 or 4, characterized in that, if several pixels in a video frame of the stereoscopic video segment share the largest weight, the position of, among those pixels, the one closest to the image center of the frame is determined as the visual focus position of the frame.
6. The method of any of claims 2-5, characterized in that the spatial-domain features comprise one or any combination of the following parameters: a reference disparity, a percentage of the reference disparity, and a first viewing-comfort factor;
extracting the spatial-domain features of the video frames of the stereoscopic video segment according to their disparity and viewing area comprises:
determining the disparity set corresponding to a video frame, the disparity set being the set of pixel disparities within the viewing area of the frame for which the number of pixels corresponding to each disparity value exceeds a set threshold, and taking the minimum disparity value in the set as the reference disparity of the frame;
computing, within the viewing area of the frame, the ratio of the number of pixels whose disparity equals the reference disparity to the number of valid pixels, obtaining the percentage of the reference disparity of the frame, a valid pixel being a pixel whose absolute disparity is smaller than the search range;
determining the first viewing-comfort factor of the frame according to whether the frame exhibits a frame effect and whether it satisfies a near-bottom/far-top spatial layout: the factor takes a first value when the frame has no frame effect and satisfies the layout, a second value when the frame has a frame effect but satisfies the layout, a third value when the frame has no frame effect but does not satisfy the layout, and a fourth value when the frame has a frame effect and does not satisfy the layout; the first, second, third and fourth values are preset, the first value is smaller than the fourth, and the third lies between the first and the second and equals neither; a frame effect exists when, for a video frame, an object at the screen edge images with crossed disparity and part of the object extends beyond the screen; the near-bottom/far-top layout means that objects at the bottom of the screen image at perceived depths close to the viewer while objects at the top image at perceived depths far from the viewer.
7. The method of any of claims 2-6, characterized in that the temporal-domain features comprise a second viewing-comfort factor;
extracting the temporal-domain features of the video frames of the stereoscopic video segment according to their disparity and viewing area comprises:
computing the second viewing-comfort factor of the video frames according to the temporal variation of their reference disparity and/or the temporal variation of the frequency with which the reference disparity occurs, the magnitude of the second viewing-comfort factor indicating how strongly the reference disparity of a frame varies in the depth direction.
8. The method of claim 7, characterized in that computing the second viewing-comfort factor of the video frames of the stereoscopic video segment according to the temporal variation of their reference disparity and/or the temporal variation of the frequency with which the reference disparity occurs comprises:
dividing the stereoscopic video segment into subsegments such that the reference disparity of the video frames within one subsegment varies monotonically at the same rate, and computing the temporal variation of the reference disparity of the video frames according to the following formula:
V1_i = (disp_last - disp_first) / (Np - 1)
computing the temporal variation of the frequency of occurrence of the reference disparity, and from it the second viewing-comfort factor of the video frames, according to the corresponding formulas (not reproduced in the source);
wherein V1_i is the temporal variation of the reference disparity of the i-th frame; disp_first and disp_last are the reference disparities of the first and last frames of the subsegment to which the i-th frame belongs; Np is the number of video frames in that subsegment; P(minDisp_i) and P(minDisp_(i-1)) are the percentages of the reference disparities of the i-th and (i-1)-th frames; γ and μ are weighting values; and the i-th frame is any video frame of the stereoscopic video segment.
9. The method of claim 7, characterized in that computing the second viewing-comfort factor of the video frames of the stereoscopic video segment according to the temporal variation of their reference disparity and/or the temporal variation of the frequency with which the reference disparity occurs comprises:
computing the temporal variation of the reference disparity of the video frames according to the following formula:
V1_i = minDisp_i - minDisp_(i-1)
computing the temporal variation of the frequency of occurrence of the reference disparity, and from it the second viewing-comfort factor of the video frames, according to the corresponding formulas (not reproduced in the source);
wherein V1_i is the temporal variation of the reference disparity of the i-th frame; minDisp_i and minDisp_(i-1) are the reference disparities of the i-th and (i-1)-th frames; P(minDisp_i) and P(minDisp_(i-1)) are the percentages of the reference disparities of the i-th and (i-1)-th frames; γ and μ are weighting values; and the i-th frame is any video frame of the stereoscopic video segment.
10. The method of any of claims 1-9, characterized in that determining the viewing comfort of the stereoscopic video segment according to the spatial-domain and temporal-domain features of its video frames comprises:
computing the viewing comfort of each video frame of the segment from that frame's spatial-domain and temporal-domain features;
dividing the segment into subsegments according to the visual focus positions of its video frames, such that the shift of the visual focus position within each subsegment does not exceed a set shift threshold, and computing the viewing comfort of each subsegment from the viewing comfort of its video frames;
computing the viewing comfort of the stereoscopic video segment from the viewing comfort of the subsegments.
11. The method of claim 10, characterized in that the viewing comfort of the video frames of the stereoscopic video segment is computed by the following formula (not reproduced in the source), wherein the viewing comfort of the i-th frame combines Spatial_frame_vc_i, the comfort of the i-th frame determined by its spatial-domain features, and Temperal_frame_vc_i, the comfort of the i-th frame determined by its temporal-domain features, with weighting values α and β; Disp_distribution_i is the first viewing-comfort factor of the i-th frame; minDisp_i is the reference disparity of the i-th frame and P(minDisp_i) the percentage of the reference disparity of the i-th frame; the second viewing-comfort factor of the i-th frame also enters the formula; b1, b2 and c1 are model parameters with set values; and the i-th frame is any video frame of the stereoscopic video segment;
the reference disparity of a video frame of the stereoscopic video segment is obtained as follows: determine the disparity set corresponding to the frame, the disparity set being the set of pixel disparities within the viewing area of the frame for which the number of pixels corresponding to each disparity value exceeds a set threshold, and take the minimum disparity value in the set as the reference disparity of the frame;
the percentage of the reference disparity of a video frame is the ratio, within the viewing area of the frame, of the number of pixels whose disparity equals the reference disparity to the number of valid pixels;
the first viewing-comfort factor of a video frame is determined according to whether the frame exhibits a frame effect and whether it satisfies a near-bottom/far-top spatial layout; a frame effect exists when, for a video frame, an object at the screen edge images with crossed disparity and part of the object extends beyond the screen; the near-bottom/far-top layout means that objects at the bottom of the screen image at perceived depths close to the viewer while objects at the top image at perceived depths far from the viewer;
the second viewing-comfort factor of a video frame is determined according to the temporal variation of the frame's reference disparity and/or the temporal variation of the frequency with which the reference disparity occurs, its magnitude indicating how strongly the reference disparity varies in the depth direction.
12. The method of claim 10 or 11, characterized in that the viewing comfort of a subsegment of the stereoscopic video segment is computed by the following formula (not reproduced in the source), wherein the comfort of the k-th subsegment is expressed in terms of the viewing comfort of the i-th frame in the k-th subsegment, N_f, the number of video frames in the k-th subsegment, and the set values P_1 and c2; the k-th subsegment is any subsegment of the stereoscopic video segment.
13. The method of any of claims 10-12, characterized in that the viewing comfort of the stereoscopic video segment is computed by the following formula (not reproduced in the source), wherein Q, the viewing comfort of the segment, is expressed in terms of the comfort of the k-th subsegment of the segment, N_s, the number of subsegments in the segment, and the set values P_2 and c2.
14. The method of any of claims 1-9, characterized in that determining the viewing comfort of the stereoscopic video segment according to the spatial-domain and temporal-domain features of its video frames comprises:
computing spatial-domain features of the segment from the spatial-domain features of its video frames;
computing temporal-domain features of the segment from the temporal-domain features of its video frames;
computing the viewing comfort of the segment from its spatial-domain and temporal-domain features.
15. The method according to claim 14, wherein the spatial-domain features of a video frame of the stereoscopic video segment comprise one of, or any combination of, the following parameters: a reference disparity of the video frame, a percentage of the reference disparity, and a first viewing-comfort factor;
the spatial-domain features of the video frame are obtained as follows:
the reference disparity of the video frame is the minimum disparity value in a disparity set, where the disparity set consists of the disparity values occurring in the viewing region of the video frame whose corresponding pixel counts each exceed a set pixel-count threshold;
the percentage of the reference disparity of the video frame is the ratio, within the viewing region of the video frame, of the number of pixels whose disparity equals the reference disparity to the number of valid pixels;
the first viewing-comfort factor of the video frame is determined according to whether the video frame exhibits a frame effect and whether it satisfies a bottom-near/top-far spatial layout; a frame effect exists when an object at the screen edge is imaged with crossed disparity and part of the object extends beyond the screen boundary; the bottom-near/top-far layout means that objects imaged at the bottom of the screen have a perceived depth close to the viewer, while objects imaged at the top of the screen have a perceived depth far from the viewer;
calculating the spatial-domain feature of the stereoscopic video segment according to the spatial-domain features of the video frames of the stereoscopic video segment comprises:
calculating the weighted average of the reference disparities of all video frames in the stereoscopic video segment according to the following formula:
minDisp_avg = (1/N) · Σ_{i=1..N} minDisp_i · P(minDisp_i)
wherein minDisp_avg is the weighted average of the reference disparities of all video frames in the stereoscopic video segment; N is the number of video frames in the stereoscopic video segment; minDisp_i is the reference disparity of the i-th frame; P(minDisp_i) is the percentage of the reference disparity of the i-th frame; and the i-th frame is any video frame in the stereoscopic video segment;
calculating the mean of the first viewing-comfort factors of all video frames in the stereoscopic video segment according to the following formula:
disp_distribution_avg = (1/N) · Σ_{i=1..N} disp_distribution_i
wherein disp_distribution_avg is the mean of the first viewing-comfort factors of all video frames in the stereoscopic video segment; disp_distribution_i is the first viewing-comfort factor of the i-th frame; N is the number of video frames in the stereoscopic video segment; and the i-th frame is any video frame in the stereoscopic video segment.
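As a concrete illustration of these per-frame spatial statistics, the following Python sketch computes the reference disparity, its percentage, and the frame-weighted average. The function names and the (1/N)-normalised form of the weighted average are assumptions for illustration, since the claim's formula images are not reproduced in this text.

```python
from collections import Counter

def reference_disparity(disparities, count_threshold):
    """Smallest disparity value whose pixel count exceeds count_threshold."""
    counts = Counter(disparities)
    candidates = [d for d, n in counts.items() if n > count_threshold]
    return min(candidates) if candidates else None

def reference_percentage(disparities, ref, search_range):
    """Ratio of reference-disparity pixels to valid pixels (|d| < search range)."""
    valid = [d for d in disparities if abs(d) < search_range]
    if not valid:
        return 0.0
    return sum(1 for d in valid if d == ref) / len(valid)

def weighted_mean_reference_disparity(min_disps, percentages):
    """(1/N)-normalised average of minDisp_i weighted by P(minDisp_i)."""
    n = len(min_disps)
    if n == 0:
        return 0.0
    return sum(d * p for d, p in zip(min_disps, percentages)) / n
```

With a viewing region of 5 pixels at disparity -3, 20 at -1, and 30 at +2 and a count threshold of 10, only -1 and +2 qualify for the disparity set, so the reference disparity is -1.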
16. The method according to claim 14 or 15, wherein the temporal-domain features of a video frame of the stereoscopic video segment comprise:
a second viewing-comfort factor of the video frame, determined according to the temporal change of the reference disparity of the video frame and/or the temporal change of the frequency with which the reference disparity occurs, the magnitude of the second viewing-comfort factor representing the degree to which the reference disparity of the video frame changes in the depth direction;
calculating the temporal-domain feature of the stereoscopic video segment according to the temporal-domain features of the video frames of the stereoscopic video segment comprises:
calculating the second viewing-comfort factor of the stereoscopic video segment from the second viewing-comfort factors F2_i of the N video frames of the segment together with set points P_i, wherein the value of P_i is determined according to the signs of V1_i and minDisp_i; N is the number of video frames in the stereoscopic video segment; and the i-th frame is any video frame in the stereoscopic video segment.
17. The method according to any one of claims 14-16, wherein the viewing comfort of the stereoscopic video segment is calculated according to the following formula:
VC=α*Spatial_vc+β*Temperal_vc
wherein VC is the viewing comfort of the stereoscopic video segment; α and β are weights; b1 and c3 are set points; Spatial_vc is the viewing comfort of the stereoscopic video segment determined from its spatial-domain feature, namely the mean disp_distribution_avg of the first viewing-comfort factors and the weighted average minDisp_avg of the reference disparities of all video frames in the segment; and Temperal_vc is the viewing comfort of the stereoscopic video segment determined from its temporal-domain feature, namely the second viewing-comfort factor of the segment.
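The linear combination in claim 17 can be sketched as follows. The default weights and the example inputs are illustrative only, and Spatial_vc / Temperal_vc are taken as precomputed values because their defining formulas (involving the set points b1 and c3) are not reproduced in this text.

```python
def segment_comfort(spatial_vc, temperal_vc, alpha=0.5, beta=0.5):
    # VC = α·Spatial_vc + β·Temperal_vc (claim 17); α, β defaults are assumptions
    return alpha * spatial_vc + beta * temperal_vc
```

Setting α = 1 and β = 0 reduces the score to the spatial-domain term alone, which makes the role of the two weights easy to check.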
18. A stereoscopic video comfort evaluation device, comprising:
an acquisition module, configured to acquire video frames of a stereoscopic video segment;
an extraction module, configured to extract spatial-domain features and temporal-domain features of the video frames of the stereoscopic video segment acquired by the acquisition module; and
an evaluation module, configured to determine the viewing comfort of the stereoscopic video segment according to the spatial-domain features and temporal-domain features of the video frames extracted by the extraction module.
19. The device according to claim 18, wherein the extraction module is specifically configured to: estimate disparities of the video frames of the stereoscopic video segment;
determine visual focus positions of the video frames of the stereoscopic video segment according to their disparities and motion information;
determine viewing regions of the video frames of the stereoscopic video segment according to their visual focus positions; and
extract the spatial-domain features and temporal-domain features of the video frames of the stereoscopic video segment according to their disparities and viewing regions.
20. The device according to claim 19, wherein the extraction module is specifically configured to determine a weight for each pixel in a video frame of the stereoscopic video segment, and to determine the position of the pixel with the maximum weight as the visual focus position of the video frame.
21. The device according to claim 20, wherein the extraction module calculates the weight of a pixel according to the following formula:
W=γ*|mv|+η*|disp_crossed|+λ*|δd|
wherein W is the weight of the pixel at coordinates (x, y); γ, η and λ are weights; mv = (d_x, d_y) is the planar motion vector of the pixel at (x, y), where d_x and d_y are respectively the horizontal and vertical displacements of the pixel, obtained by searching between the video frame containing the pixel and its adjacent video frame; disp_crossed is the crossed-disparity value; and δd is the difference between the mean disparities of the matching blocks in the video frame of the stereoscopic video segment and in its adjacent video frame.
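The pixel weight of claim 21 is a direct linear combination of motion magnitude, crossed disparity, and inter-frame disparity difference. A minimal sketch follows; the function name and the default weights are illustrative, not from the patent.

```python
import math

def pixel_weight(d_x, d_y, disp_crossed, delta_d, gamma=1.0, eta=1.0, lam=1.0):
    """W = γ·|mv| + η·|disp_crossed| + λ·|δd| for one pixel (claim 21)."""
    mv_magnitude = math.hypot(d_x, d_y)  # |mv| for motion vector (d_x, d_y)
    return gamma * mv_magnitude + eta * abs(disp_crossed) + lam * abs(delta_d)
```

For example, a pixel moving by (3, 4) with crossed disparity -2 and δd = 1 gets weight 5 + 2 + 1 = 8 under unit weights.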
22. The device according to claim 20 or 21, wherein the extraction module is specifically configured to: when multiple pixels in a video frame of the stereoscopic video segment share the maximum weight, determine, among those pixels, the position of the pixel closest to the image centre of the video frame as the visual focus position of the video frame.
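Claims 20 and 22 together describe an argmax over per-pixel weights with a centre-distance tie-break. A sketch, where the dict-based weight map and the function name are assumptions for illustration:

```python
def visual_focus(weights, width, height):
    """weights maps (x, y) -> W; returns the visual focus position (claims 20/22)."""
    w_max = max(weights.values())
    candidates = [p for p, w in weights.items() if w == w_max]
    cx, cy = (width - 1) / 2, (height - 1) / 2  # image centre
    # among maximal-weight pixels, pick the one nearest the image centre
    return min(candidates, key=lambda p: (p[0] - cx) ** 2 + (p[1] - cy) ** 2)
```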
23. The device according to any one of claims 19-22, wherein the spatial-domain features comprise one of, or any combination of, the following parameters: a reference disparity, a percentage of the reference disparity, and a first viewing-comfort factor;
the extraction module is specifically configured to: determine the disparity set corresponding to a video frame of the stereoscopic video segment, the disparity set consisting of the disparity values occurring in the viewing region of the video frame whose corresponding pixel counts each exceed the set threshold, and determine the minimum disparity value in the disparity set as the reference disparity of the video frame;
calculate, within the viewing region of the video frame, the ratio of the number of pixels whose disparity equals the reference disparity to the number of valid pixels, to obtain the percentage of the reference disparity of the video frame, a valid pixel being a pixel whose absolute disparity is less than the search range; and
determine the first viewing-comfort factor of the video frame according to whether the video frame exhibits a frame effect and whether it satisfies a bottom-near/top-far spatial layout, the first viewing-comfort factor taking a first value when the video frame exhibits no frame effect and satisfies the bottom-near/top-far layout, a second value when it exhibits a frame effect but satisfies the layout, a third value when it exhibits no frame effect but does not satisfy the layout, and a fourth value when it exhibits a frame effect and does not satisfy the layout; the first, second, third and fourth values are preset, the first value is less than the fourth value, and the third value lies between the first and second values and equals neither; a frame effect exists when an object at the screen edge is imaged with crossed disparity and part of the object extends beyond the screen boundary; the bottom-near/top-far layout means that objects imaged at the bottom of the screen have a perceived depth close to the viewer, while objects imaged at the top of the screen have a perceived depth far from the viewer.
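The four-way selection in claim 23 reduces to a lookup on the two boolean conditions. The concrete values below are illustrative; the claim only constrains them (first value less than fourth; third strictly between first and second and equal to neither):

```python
def first_comfort_factor(has_frame_effect, bottom_near_top_far,
                         values=(1.0, 4.0, 2.0, 8.0)):
    """Four-way lookup of claim 23; values are illustrative presets."""
    v1, v2, v3, v4 = values  # chosen so that v1 < v4 and v1 < v3 < v2
    if bottom_near_top_far:
        return v2 if has_frame_effect else v1  # layout satisfied
    return v4 if has_frame_effect else v3      # layout violated
```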
24. The device according to any one of claims 19-23, wherein the temporal-domain features comprise a second viewing-comfort factor;
the extraction module is specifically configured to calculate the second viewing-comfort factor of a video frame of the stereoscopic video segment according to the temporal change of the reference disparity of the video frame and/or the temporal change of the frequency with which the reference disparity occurs, the magnitude of the second viewing-comfort factor representing the degree to which the reference disparity of the video frame changes in the depth direction.
25. The device according to claim 24, wherein the extraction module is specifically configured to: divide the stereoscopic video segment into sub-segments such that within each sub-segment the reference disparities of the video frames change monotonically at an identical rate; calculate the temporal change of the reference disparity of a video frame of the stereoscopic video segment according to the following formula:
V1_i = (disp_last − disp_first) / (Np − 1)
calculate the temporal change V2_i of the frequency with which the reference disparity of the video frame occurs from the percentages P(minDisp_i) and P(minDisp_{i−1}) of the reference disparities of the i-th and (i−1)-th frames; and calculate the second viewing-comfort factor F2_i of the video frame from V1_i and V2_i using weights γ and μ;
wherein V1_i denotes the temporal change of the reference disparity of the i-th frame; disp_first and disp_last are respectively the reference disparities of the first and last frames of the sub-segment containing the i-th frame; Np is the number of video frames in the sub-segment containing the i-th frame; and the i-th frame is any video frame in the stereoscopic video segment.
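The per-sub-segment change rate V1_i of claim 25 can be sketched directly; the function name and the zero result for a single-frame sub-segment are assumptions:

```python
def disparity_change_rate(sub_segment_ref_disps):
    """V1 = (disp_last - disp_first) / (Np - 1) over one monotone sub-segment."""
    np_frames = len(sub_segment_ref_disps)
    if np_frames < 2:
        return 0.0  # a single-frame sub-segment has no change (assumption)
    return (sub_segment_ref_disps[-1] - sub_segment_ref_disps[0]) / (np_frames - 1)
```

A sub-segment whose reference disparity climbs steadily from -4 to +2 over four frames yields a rate of 2 disparity units per frame.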
26. The device according to claim 24, wherein the extraction module is specifically configured to: calculate the temporal change of the reference disparity of a video frame of the stereoscopic video segment according to the following formula:
V1_i = minDisp_i − minDisp_{i−1}
calculate the temporal change V2_i of the frequency with which the reference disparity of the video frame occurs from the percentages P(minDisp_i) and P(minDisp_{i−1}) of the reference disparities of the i-th and (i−1)-th frames; and calculate the second viewing-comfort factor F2_i of the video frame from V1_i and V2_i using weights γ and μ;
wherein V1_i denotes the temporal change of the reference disparity of the i-th frame; minDisp_i and minDisp_{i−1} are respectively the reference disparities of the i-th and (i−1)-th frames; and the i-th frame is any video frame in the stereoscopic video segment.
27. The device according to any one of claims 18-26, wherein the evaluation module is specifically configured to:
calculate the viewing comfort of each video frame of the stereoscopic video segment according to the spatial-domain features and temporal-domain features of that frame;
divide the stereoscopic video segment into sub-segments according to the visual focus positions of its video frames, such that the visual-focus shift among the video frames of each sub-segment does not exceed a set shift threshold, and calculate the viewing comfort of each sub-segment according to the viewing comfort of the video frames it contains; and
calculate the viewing comfort of the stereoscopic video segment according to the viewing comfort of the sub-segments.
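The sub-segment division of claim 27 can be sketched as a greedy scan over the frames' visual focus positions. The Euclidean shift metric and measuring the shift against the sub-segment's first frame are assumptions; the claim only requires that the shift within a sub-segment stay within a threshold.

```python
import math

def split_by_focus(focus_positions, max_shift):
    """Group consecutive frame indices while the focus shift stays within max_shift."""
    segments, current = [], [0]
    for i in range(1, len(focus_positions)):
        x0, y0 = focus_positions[current[0]]  # first frame of current sub-segment
        x, y = focus_positions[i]
        if math.hypot(x - x0, y - y0) > max_shift:
            segments.append(current)  # shift too large: start a new sub-segment
            current = [i]
        else:
            current.append(i)
    segments.append(current)
    return segments
```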
28. The device according to claim 27, wherein the evaluation module is specifically configured to calculate the viewing comfort of a video frame of the stereoscopic video segment according to the following formula:
VC_i = α*Spatial_frame_vc_i + β*Temperal_frame_vc_i
wherein VC_i is the viewing comfort of the i-th frame; Spatial_frame_vc_i is the viewing comfort of the i-th frame determined by its spatial-domain features, namely the first viewing-comfort factor disp_distribution_i, the reference disparity minDisp_i and the percentage P(minDisp_i) of the reference disparity of the i-th frame; Temperal_frame_vc_i is the viewing comfort of the i-th frame determined by its temporal-domain feature, namely the second viewing-comfort factor of the i-th frame; α and β are weights; b1, b2 and c1 are model parameters with set values; and the i-th frame is any video frame in the stereoscopic video segment;
the reference disparity of a video frame of the stereoscopic video segment is obtained as follows: the disparity set corresponding to the video frame is determined, the disparity set consisting of the disparity values occurring in the viewing region of the video frame whose corresponding pixel counts each exceed the set threshold, and the minimum disparity value in the disparity set is determined as the reference disparity of the video frame;
the percentage of the reference disparity of the video frame is the ratio, within the viewing region of the video frame, of the number of pixels whose disparity equals the reference disparity to the number of valid pixels;
the first viewing-comfort factor of the video frame is determined according to whether the video frame exhibits a frame effect and whether it satisfies a bottom-near/top-far spatial layout; a frame effect exists when an object at the screen edge is imaged with crossed disparity and part of the object extends beyond the screen boundary; the bottom-near/top-far layout means that objects imaged at the bottom of the screen have a perceived depth close to the viewer, while objects imaged at the top of the screen have a perceived depth far from the viewer;
the second viewing-comfort factor of the video frame is determined according to the temporal change of the reference disparity of the video frame and/or the temporal change of the frequency with which the reference disparity occurs, the magnitude of the second viewing-comfort factor representing the degree to which the reference disparity of the video frame changes in the depth direction.
29. The device according to claim 27 or 28, wherein the evaluation module is specifically configured to calculate the viewing comfort of each sub-segment of the stereoscopic video segment from the viewing comforts of the video frames it contains, wherein VC_k is the viewing comfort of the k-th sub-segment; VC_{k,i} is the viewing comfort of the i-th frame in the k-th sub-segment; N_f is the number of video frames in the k-th sub-segment; P_1 and c2 are set values; and the k-th sub-segment is any sub-segment in the stereoscopic video segment.
30. The device according to any one of claims 27-29, wherein the evaluation module is specifically configured to calculate the viewing comfort of the stereoscopic video segment from the viewing comforts of its sub-segments, wherein Q is the viewing comfort of the stereoscopic video segment; VC_k is the viewing comfort of the k-th sub-segment in the stereoscopic video segment; N_s is the number of sub-segments in the stereoscopic video segment; and P_2 and c2 are set points.
31. The device according to any one of claims 18-26, wherein the evaluation module is specifically configured to: calculate a spatial-domain feature of the stereoscopic video segment according to the spatial-domain features of the video frames of the stereoscopic video segment;
calculate a temporal-domain feature of the stereoscopic video segment according to the temporal-domain features of the video frames of the stereoscopic video segment; and
calculate the viewing comfort of the stereoscopic video segment according to the spatial-domain feature and the temporal-domain feature of the stereoscopic video segment.
32. The device according to claim 31, wherein the spatial-domain features of a video frame of the stereoscopic video segment comprise one of, or any combination of, the following parameters: a reference disparity of the video frame, a percentage of the reference disparity, and a first viewing-comfort factor;
the spatial-domain features of the video frame are obtained as follows:
the reference disparity of the video frame is the minimum disparity value in a disparity set, where the disparity set consists of the disparity values occurring in the viewing region of the video frame whose corresponding pixel counts each exceed a set pixel-count threshold;
the percentage of the reference disparity of the video frame is the ratio, within the viewing region of the video frame, of the number of pixels whose disparity equals the reference disparity to the number of valid pixels;
the first viewing-comfort factor of the video frame is determined according to whether the video frame exhibits a frame effect and whether it satisfies a bottom-near/top-far spatial layout; a frame effect exists when an object at the screen edge is imaged with crossed disparity and part of the object extends beyond the screen boundary; the bottom-near/top-far layout means that objects imaged at the bottom of the screen have a perceived depth close to the viewer, while objects imaged at the top of the screen have a perceived depth far from the viewer;
the evaluation module is specifically configured to calculate the weighted average of the reference disparities of all video frames in the stereoscopic video segment according to the following formula:
minDisp_avg = (1/N) · Σ_{i=1..N} minDisp_i · P(minDisp_i)
wherein minDisp_avg is the weighted average of the reference disparities of all video frames in the stereoscopic video segment; N is the number of video frames in the stereoscopic video segment; minDisp_i is the reference disparity of the i-th frame; P(minDisp_i) is the percentage of the reference disparity of the i-th frame; and the i-th frame is any video frame in the stereoscopic video segment;
and to calculate the mean of the first viewing-comfort factors of all video frames in the stereoscopic video segment according to the following formula:
disp_distribution_avg = (1/N) · Σ_{i=1..N} disp_distribution_i
wherein disp_distribution_avg is the mean of the first viewing-comfort factors of all video frames in the stereoscopic video segment; disp_distribution_i is the first viewing-comfort factor of the i-th frame; N is the number of video frames in the stereoscopic video segment; and the i-th frame is any video frame in the stereoscopic video segment.
33. The device according to claim 31 or 32, wherein the temporal-domain features of a video frame of the stereoscopic video segment comprise:
a second viewing-comfort factor of the video frame, determined according to the temporal change of the reference disparity of the video frame and/or the temporal change of the frequency with which the reference disparity occurs, the magnitude of the second viewing-comfort factor representing the degree to which the reference disparity of the video frame changes in the depth direction;
the evaluation module is specifically configured to calculate the second viewing-comfort factor of the stereoscopic video segment from the second viewing-comfort factors F2_i of the N video frames of the segment together with set points P_i, wherein the value of P_i is determined according to the signs of V1_i and minDisp_i; N is the number of video frames in the stereoscopic video segment; and the i-th frame is any video frame in the stereoscopic video segment.
34. The device according to any one of claims 31-33, wherein the evaluation module is specifically configured to calculate the viewing comfort of the stereoscopic video segment according to the following formula:
VC=α*Spatial_vc+β*Temperal_vc
wherein VC is the viewing comfort of the stereoscopic video segment; α and β are weights; b1 and c3 are set points; Spatial_vc is the viewing comfort of the stereoscopic video segment determined from its spatial-domain feature, namely the mean disp_distribution_avg of the first viewing-comfort factors and the weighted average minDisp_avg of the reference disparities of all video frames in the segment; and Temperal_vc is the viewing comfort of the stereoscopic video segment determined from its temporal-domain feature, namely the second viewing-comfort factor of the segment.
Priority Applications (6)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
CN201710203859.6A CN107155106B (en) | 2013-12-27 | 2013-12-27 | A kind of three-dimensional video-frequency Comfort Evaluation method and device |
CN201710203860.9A CN107181940B (en) | 2013-12-27 | 2013-12-27 | A kind of three-dimensional video-frequency Comfort Evaluation method and device |
CN201710203858.1A CN107155105B (en) | 2013-12-27 | 2013-12-27 | A kind of three-dimensional video-frequency Comfort Evaluation method and device |
CN201710204296.2A CN106973288B (en) | 2013-12-27 | 2013-12-27 | A kind of three-dimensional video-frequency Comfort Evaluation method and device |
CN201310740605.XA CN104754322B (en) | 2013-12-27 | 2013-12-27 | A kind of three-dimensional video-frequency Comfort Evaluation method and device |
PCT/CN2014/082079 WO2015096461A1 (en) | 2013-12-27 | 2014-07-11 | Method and device for evaluating degree of comfort of stereoscopic video |
Publications (2)
Publication Number | Publication Date |
---|---|
CN104754322A true CN104754322A (en) | 2015-07-01 |
CN104754322B CN104754322B (en) | 2018-01-23 |
Cited By (9)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
CN105335992A (en) * | 2015-10-15 | 2016-02-17 | Beijing University of Posts and Telecommunications | Method and device for determining a scoring model for three-dimensional animation scene frames |
CN106028025A (en) * | 2016-05-11 | 2016-10-12 | Jilin University | 3D video comfort evaluation method based on vergence-accommodation consistency |
CN106210710A (en) * | 2016-07-25 | 2016-12-07 | Ningbo University | Stereoscopic image visual comfort evaluation method based on multi-scale dictionaries |
CN106341677A (en) * | 2015-07-07 | 2017-01-18 | Shenzhen Institutes of Advanced Technology, Chinese Academy of Sciences | Virtual viewpoint video quality evaluation method |
CN106851246A (en) * | 2017-02-06 | 2017-06-13 | BOE Technology Group Co., Ltd. | Method and apparatus for determining the visual fatigue degree of a three-dimensional image or video |
CN109089111A (en) * | 2018-10-22 | 2018-12-25 | Guangdong Oppo Mobile Telecommunications Corp., Ltd. | Stereoscopic video comfort evaluation method, system and terminal device |
CN109429051A (en) * | 2017-07-12 | 2019-03-05 | Tianjin University | No-reference stereoscopic video quality objective evaluation method based on multi-view feature learning |
CN109905694A (en) * | 2017-12-08 | 2019-06-18 | China Mobile (Hangzhou) Information Technology Co., Ltd. | Stereoscopic video quality evaluation method, device and equipment |
CN110691236A (en) * | 2019-09-18 | 2020-01-14 | Ningbo University | Panoramic video quality evaluation method |
Families Citing this family (1)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
CN107909565A (en) * | 2017-10-29 | 2018-04-13 | Tianjin University | Stereoscopic image comfort evaluation method based on convolutional neural networks |
Citations (6)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
CN101582063A (en) * | 2008-05-13 | 2009-11-18 | Huawei Technologies Co., Ltd. | Video service system, video service device and key frame extraction method thereof |
KR20110139020A (en) * | 2010-06-22 | 2011-12-28 | Yonsei University Industry-Academic Cooperation Foundation | Fatigue evaluation method and apparatus for 3D video based on depth map |
US20120098823A1 (en) * | 2010-10-22 | 2012-04-26 | Samsung Electronics Co., Ltd. | Display apparatus and method |
CN103096106A (en) * | 2011-11-01 | 2013-05-08 | Samsung Electronics Co., Ltd. | Image processing apparatus and method |
CN103096122A (en) * | 2013-01-24 | 2013-05-08 | Shanghai Jiao Tong University | Stereoscopic visual comfort evaluation method based on motion features inside regions of interest |
CN103210652A (en) * | 2010-11-12 | 2013-07-17 | Sony Corporation | 3D video image encoding apparatus, decoding apparatus and method |
Family Cites Families (2)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
CN103096125B (en) * | 2013-02-22 | 2015-03-04 | Jilin University | Stereoscopic video visual comfort evaluation method based on region segmentation |
CN103595990B (en) * | 2013-10-30 | 2015-05-20 | Tsinghua University | Method for obtaining motion-perception-based binocular stereoscopic video comfort |
-
2013
- 2013-12-27 CN CN201310740605.XA patent/CN104754322B/en active Active
- 2013-12-27 CN CN201710203859.6A patent/CN107155106B/en active Active
- 2013-12-27 CN CN201710203858.1A patent/CN107155105B/en active Active
- 2013-12-27 CN CN201710204296.2A patent/CN106973288B/en active Active
- 2013-12-27 CN CN201710203860.9A patent/CN107181940B/en active Active
-
2014
- 2014-07-11 WO PCT/CN2014/082079 patent/WO2015096461A1/en active Application Filing
Cited By (15)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
CN106341677B (en) * | 2015-07-07 | 2018-04-20 | Shenzhen Institutes of Advanced Technology, Chinese Academy of Sciences | Virtual viewpoint video quality evaluation method |
CN106341677A (en) * | 2015-07-07 | 2017-01-18 | Shenzhen Institutes of Advanced Technology, Chinese Academy of Sciences | Virtual viewpoint video quality evaluation method |
CN105335992B (en) * | 2015-10-15 | 2020-02-04 | Beijing University of Posts and Telecommunications | Method and device for determining a scoring model for three-dimensional animation scene frames |
CN105335992A (en) * | 2015-10-15 | 2016-02-17 | Beijing University of Posts and Telecommunications | Method and device for determining a scoring model for three-dimensional animation scene frames |
CN106028025A (en) * | 2016-05-11 | 2016-10-12 | Jilin University | 3D video comfort evaluation method based on vergence-accommodation consistency |
CN106210710A (en) * | 2016-07-25 | 2016-12-07 | Ningbo University | Stereoscopic image visual comfort evaluation method based on multi-scale dictionaries |
CN106210710B (en) * | 2016-07-25 | 2018-01-30 | Ningbo University | Stereoscopic image visual comfort evaluation method based on multi-scale dictionaries |
US10277881B2 (en) | 2017-02-06 | 2019-04-30 | Boe Technology Group Co., Ltd. | Methods and devices for determining visual fatigue of three-dimensional image or video and computer readable storage medium |
CN106851246A (en) * | 2017-02-06 | 2017-06-13 | BOE Technology Group Co., Ltd. | Method and apparatus for determining the visual fatigue degree of a three-dimensional image or video |
CN109429051A (en) * | 2017-07-12 | 2019-03-05 | Tianjin University | No-reference stereoscopic video quality objective evaluation method based on multi-view feature learning |
CN109429051B (en) * | 2017-07-12 | 2020-08-18 | Tianjin University | No-reference stereoscopic video quality objective evaluation method based on multi-view feature learning |
CN109905694A (en) * | 2017-12-08 | 2019-06-18 | China Mobile (Hangzhou) Information Technology Co., Ltd. | Stereoscopic video quality evaluation method, device and equipment |
CN109905694B (en) * | 2017-12-08 | 2020-09-08 | China Mobile (Hangzhou) Information Technology Co., Ltd. | Stereoscopic video quality evaluation method, device and equipment |
CN109089111A (en) * | 2018-10-22 | 2018-12-25 | Guangdong Oppo Mobile Telecommunications Corp., Ltd. | Stereoscopic video comfort evaluation method, system and terminal device |
CN110691236A (en) * | 2019-09-18 | 2020-01-14 | Ningbo University | Panoramic video quality evaluation method |
Also Published As
Publication number | Publication date |
---|---|
CN107155105A (en) | 2017-09-12 |
CN106973288B (en) | 2019-08-13 |
CN107181940A (en) | 2017-09-19 |
CN106973288A (en) | 2017-07-21 |
CN107155106B (en) | 2019-03-01 |
CN107181940B (en) | 2019-05-03 |
WO2015096461A1 (en) | 2015-07-02 |
CN104754322B (en) | 2018-01-23 |
CN107155105B (en) | 2019-03-01 |
CN107155106A (en) | 2017-09-12 |
Similar Documents
Publication | Publication Date | Title |
---|---|---|
CN104754322A (en) | Stereoscopic video comfort evaluation method and device | |
US9215452B2 (en) | Stereoscopic video display apparatus and stereoscopic video display method | |
US9277207B2 (en) | Image processing apparatus, image processing method, and program for generating multi-view point image | |
EP3869797B1 (en) | Method for depth detection in images captured using array cameras | |
CN103096125B (en) | Stereoscopic video visual comfort evaluation method based on region segmentation | |
US20110032341A1 (en) | Method and system to transform stereo content | |
CN103986925B (en) | Stereoscopic video visual comfort evaluation method based on luminance compensation | |
US10298905B2 (en) | Method and apparatus for determining a depth map for an angle | |
CN102750695A (en) | Machine learning-based stereoscopic image quality objective assessment method | |
US20110267338A1 (en) | Apparatus and method for reducing three-dimensional visual fatigue | |
KR101393621B1 (en) | Method and system for analyzing a quality of three-dimensional image | |
EP2498502A2 (en) | Analysis of stereoscopic images | |
CN104601979A (en) | Multi view image display apparatus and control method thereof | |
US20150085087A1 (en) | Method and device for correcting distortion errors due to accommodation effect in stereoscopic display | |
CN108848365B (en) | Retargeted stereoscopic image quality evaluation method | |
US9082210B2 (en) | Method and apparatus for adjusting image depth | |
KR101889952B1 (en) | Stereo matching method and device based on confidence metric | |
Kim et al. | Quality assessment of perceptual crosstalk on two-view auto-stereoscopic displays | |
Chen et al. | Full-reference quality assessment of stereoscopic images by modeling binocular rivalry | |
Md et al. | Multiscale-SSIM index based stereoscopic image quality assessment | |
CN102821299B (en) | Semi-automatic 3D stereoscopic disparity cursor | |
KR20130057586A (en) | Apparatus and method for generating a depth map, and stereoscopic image conversion apparatus and method using the same | |
Boev et al. | Optimized single-viewer mode of multiview autostereoscopic display | |
KR101649185B1 (en) | Method and apparatus for calculating visual attention score | |
Kim et al. | Quality assessment of perceptual crosstalk in autostereoscopic display |
Legal Events
Date | Code | Title | Description |
---|---|---|---|
C06 | Publication | ||
PB01 | Publication | ||
C10 | Entry into substantive examination | ||
SE01 | Entry into force of request for substantive examination | ||
GR01 | Patent grant | ||
GR01 | Patent grant |