WO2005020592A1 - Video quality evaluation device, video quality evaluation method and video quality evaluation program, and video matching device, video matching method and video matching program - Google Patents
- Publication number
- WO2005020592A1 (PCT/JP2004/011992)
- Authority
- WO
- WIPO (PCT)
- Prior art keywords
- video
- video signal
- degraded
- frame
- reference video
- Prior art date
Classifications
- H—ELECTRICITY
- H04—ELECTRIC COMMUNICATION TECHNIQUE
- H04N—PICTORIAL COMMUNICATION, e.g. TELEVISION
- H04N17/00—Diagnosis, testing or measuring for television systems or their details
- H04N17/04—Diagnosis, testing or measuring for television systems or their details for receivers
Definitions
- Video quality evaluation device, video quality evaluation method, and video quality evaluation program
- Video matching device, video matching method, and video matching program
- The present invention relates to a video quality evaluation device, a video quality evaluation method, and a video quality evaluation program for estimating subjective quality from measurements of physical features of a video signal, without performing a subjective quality evaluation test in which human viewers watch the actual video and rate its quality.
- The present invention also relates to a video matching device, a video matching method, and a video matching program for matching the temporal and spatial positions of a reference video signal and a degraded video signal whose quality has been degraded through encoding, network transmission, or the like.
- The quality of video information is degraded through some processing, for example encoding or network transmission.
- the degree of deterioration that a person actually sees and perceives is called subjective quality.
- Conventional objective evaluation techniques can accurately estimate a subjective evaluation value only when the videos to be evaluated are limited in kind.
- ANSI T1.801.03-1996, "Digital Transport of One-Way Video Signals - Parameters for Objective Performance Assessment"; Okamoto, Naohashi, Okamoto, Kurita, and Takahashi, "A Study on the Application of Objective Video Quality Evaluation Techniques," Sep. 2002; 2003.
- However, because perceived quality depends greatly on the nature of the video content, the subjective quality judged by viewers often differs even for the same degree of physical deterioration.
- Such techniques are premised on comparing the physical features of the reference video signal and the degraded video signal, so the spatial and temporal positions of the two signals must be consistent. In other words, any temporal shift and spatial position shift between the reference video signal and the degraded video signal must be eliminated.
- An object of the present invention is to provide a video quality evaluation device, a video quality evaluation method, and a video quality evaluation program that can accurately and uniformly estimate the subjective quality of an arbitrary video.
- A further object of the present invention is to provide a video matching device, a video matching method, and a video matching program that can reliably match the spatial and temporal positions of the reference video signal and the degraded video signal when their physical feature amounts are compared to estimate subjective quality.
- The invention according to the first aspect is a video quality evaluation device comprising: a subjective quality estimating unit that receives a reference video signal, which is an undegraded video signal, and a degraded video signal in which the reference video signal has been degraded, calculates a video signal feature amount of each signal, and estimates the subjective quality of the degraded video signal based on the difference between the calculated feature amounts; a correction information storage unit that stores correction information for correcting the subjective quality in correspondence with the video signal feature amount; and a subjective quality correction unit that obtains from the storage unit the correction information corresponding to the calculated feature amount of the reference video signal and corrects the estimated subjective quality based on the obtained correction information.
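The estimate-then-correct flow of the first aspect can be sketched as follows. This is a minimal illustration, not the patent's method: the linear correction form Q = a·SQ + b, the SI/TI bucketing used as the lookup key, the clipping range, and all numeric coefficients are assumptions made for the example.

```python
# Hypothetical correction information storage unit:
# feature bucket -> (a, b) coefficients for Q = a*SQ + b.
CORRECTION_DB = {
    ("low_SI", "low_TI"): (1.10, -0.20),
    ("low_SI", "high_TI"): (0.95, 0.10),
    ("high_SI", "low_TI"): (1.05, 0.00),
    ("high_SI", "high_TI"): (0.90, 0.25),
}

def feature_bucket(si, ti, si_split=50.0, ti_split=30.0):
    """Map the reference video's SI/TI feature amounts to a lookup key."""
    return ("low_SI" if si < si_split else "high_SI",
            "low_TI" if ti < ti_split else "high_TI")

def correct_subjective_quality(primary_sq, si, ti):
    """Subjective quality correction unit: apply Q = a*SQ + b, then
    clip to a 5-grade MOS scale [1, 5]."""
    a, b = CORRECTION_DB[feature_bucket(si, ti)]
    q = a * primary_sq + b
    return max(1.0, min(5.0, q))
```

The correction database plays the role of the correction information storage unit: the primary estimate SQ is adjusted per content class, which is how the device compensates for the content dependence noted in the background.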
- The invention according to the second aspect is the invention according to the first aspect, wherein the gist is that the subjective quality estimating unit estimates the subjective quality based on the difference between the degraded video signal and the reference video signal in at least one of the spatial information, which indicates the video state within each frame of the video contained in the video signal, and the temporal information, which indicates the video change between frames.
- The gist is that the subjective quality estimating unit calculates, as the video signal feature difference between the degraded video signal and the reference video signal, at least one of an edge power amount (E), which indicates the amount of deterioration within each frame of the video, and a motion power amount (M), which indicates the amount of deterioration between frames.
- The gist is that the subjective quality estimating unit estimates the subjective quality based on the difference between the degraded video signal and the reference video signal in at least one of the spatial information (SI) and temporal information (TI) defined in ITU-T Recommendation P.910.
- The gist is that the correction information storage unit stores, as the correction information for correcting the subjective quality, correction coefficients corresponding to the spatial information indicating the video state within each frame and the temporal information indicating the video change between frames.
- The invention according to the sixth aspect is the invention according to the first aspect, wherein the subjective quality estimating unit comprises: an alignment information generating unit that receives the reference video signal and the degraded video signal and generates alignment information on the temporal shift between a reference video frame from the reference video signal and a degraded video frame from the degraded video signal, and on the spatial shift between the two frames; a spatial feature calculating unit that, after eliminating the spatial and temporal shifts based on the alignment information, calculates a spatial feature of each of the reference video frame and the degraded video frame based on the spatial information indicating their video states; a temporal feature calculating unit that, after eliminating the spatial and temporal shifts based on the alignment information, calculates a temporal feature of each frame based on the temporal information indicating the video change between frames; and an estimation unit that estimates the subjective quality of the degraded video signal based on the spatial features and the temporal features.
- The invention according to the seventh aspect is a video quality evaluation method in a video quality evaluation device that comprises a correction information storage unit storing, in correspondence with video signal feature amounts, correction information for correcting the estimated subjective quality of a degraded video signal in which a reference video signal, an undegraded video signal, has been degraded. In the method, the reference video signal and the degraded video signal are input, a video signal feature amount of each signal is calculated, the subjective quality of the degraded video signal is estimated based on the difference between the calculated feature amounts, the correction information corresponding to the calculated feature amount of the reference video signal is obtained from the correction information storage unit, and the estimated subjective quality is corrected based on the obtained correction information.
- The invention according to the eighth aspect is a video quality evaluation program for causing a computer to function as: subjective quality estimating means that receives a reference video signal, which is an undegraded video signal, and a degraded video signal in which the reference video signal has been degraded, calculates a video signal feature amount of each signal, and estimates the subjective quality of the degraded video signal based on the difference between the calculated feature amounts; correction information storing means that stores correction information in correspondence with the video signal feature amount; and subjective quality correction means that obtains from the correction information storing means the correction information corresponding to the calculated feature amount of the reference video signal and corrects the estimated subjective quality based on the acquired correction information.
- The invention according to the ninth aspect is a video quality evaluation device comprising: an alignment information generating unit that receives a reference video signal, which is an undegraded video signal, and a degraded video signal in which the reference video signal has been degraded, and generates alignment information on the temporal shift between a reference video frame from the reference video signal and a degraded video frame from the degraded video signal, and on the spatial shift between the two frames; a spatial feature calculating unit that, after eliminating the spatial and temporal shifts based on the alignment information, calculates a spatial feature of each of the reference video frame and the degraded video frame based on the spatial information indicating their video states; a temporal feature calculating unit that, after eliminating the spatial and temporal shifts based on the alignment information, calculates a temporal feature of each frame based on the temporal information indicating the video change between frames; and a subjective quality estimation unit that estimates the subjective quality of the degraded video signal based on the spatial features and the temporal features.
- The invention according to the tenth aspect is the invention according to the ninth aspect, further comprising: a format conversion unit that unifies the file format of the degraded video included in the degraded video signal with the file format of the reference video included in the corresponding reference video signal and outputs information related to the unified file format; and a correction coefficient storage unit that stores coefficients for estimating the subjective quality of the degraded video signal in correspondence with the information related to the file format. The subjective quality estimation unit obtains from the correction coefficient storage unit the coefficient corresponding to the information related to the unified file format input from the format conversion unit, and estimates the subjective quality of the degraded video signal based on the spatial feature, the temporal feature, and the obtained coefficient.
- The gist is that the format conversion unit outputs, as the information related to the unified file format, at least one of the signal format of the degraded video signal, the information amount of the degraded video carried by the video signal, and the encoding method of the degraded video signal, and that the correction coefficient storage unit stores optimal coefficients corresponding to at least one of the signal format of the degraded video signal, the information amount of the degraded video carried by the signal, and the encoding method of the degraded video signal.
- The gist is that the spatial feature calculation unit calculates, as the spatial feature, an index quantifying the deterioration occurring at boundaries where the luminance value changes abruptly within a frame, based on the reference video signal and the degraded video signal.
- The invention according to the thirteenth aspect is the invention according to the twelfth aspect, wherein the spatial feature calculation unit calculates, as a spatial feature, the edge power amount specified in ANSI T1.801.03-1996 based on the reference video signal and the degraded video signal.
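An edge-power comparison of the kind this aspect names can be sketched as the difference in gradient energy between the reference and degraded frames. This is a simplified stand-in, not the exact ANSI T1.801.03 edge-enhancement filter: the first-difference gradients and the absolute-difference aggregation are assumptions made for illustration.

```python
def edge_energy(frame):
    """Mean squared horizontal+vertical first-difference gradient of a
    luminance plane (a stand-in for the ANSI edge-enhancement filter)."""
    h, w = len(frame), len(frame[0])
    total, n = 0.0, 0
    for y in range(h - 1):
        for x in range(w - 1):
            gx = frame[y][x + 1] - frame[y][x]
            gy = frame[y + 1][x] - frame[y][x]
            total += gx * gx + gy * gy
            n += 1
    return total / n

def edge_power_difference(ref_frame, deg_frame):
    """Per-frame edge degradation: edge energy lost (blurring) or
    added (blocking/ringing) relative to the reference."""
    return abs(edge_energy(ref_frame) - edge_energy(deg_frame))
```

A blurred degraded frame loses gradient energy and a block-distorted one gains spurious energy, so the absolute difference rises in both failure modes, which is why edge power tracks deterioration at luminance boundaries.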
- The invention according to the fourteenth aspect is the invention according to the ninth, tenth, or twelfth aspect, wherein the spatial feature calculation unit calculates, as the spatial feature, an index quantifying the degree to which boundaries at which the luminance value changes abruptly occur in the horizontal and vertical directions in the degraded video frame corresponding to the reference video frame.
- The invention according to the fifteenth aspect is the invention according to the ninth, tenth, twelfth, or fourteenth aspect, wherein the temporal feature calculation unit calculates, as the temporal feature, the amount of change between frames of the video based on the difference between Temporal Information (TI) values, each computed over a set of one or more pixels in a frame.
- The invention according to the sixteenth aspect is the invention according to the ninth or tenth aspect, further comprising: a correction information storage unit that stores correction information for correcting the subjective quality in correspondence with the spatial feature and the temporal feature; and a subjective quality correction unit that obtains the correction information corresponding to the calculated features and corrects the estimated subjective quality based on the acquired correction information.
- The invention according to the seventeenth aspect is a video quality evaluation method in which: a reference video signal, which is an undegraded video signal, and a degraded video signal in which the reference video signal has been degraded are input; alignment information is generated on the temporal shift between a reference video frame from the reference video signal and a degraded video frame from the degraded video signal, and on the spatial shift between the two frames; after the spatial and temporal shifts are eliminated based on the alignment information, a spatial feature of each of the reference video frame and the degraded video frame is calculated based on the spatial information indicating their video states; a temporal feature of each frame is calculated based on the temporal information indicating the video change between frames; and the subjective quality of the degraded video signal is estimated based on the spatial features and the temporal features.
- The invention according to the eighteenth aspect is a video quality evaluation method in which: a reference video signal, which is an undegraded video signal, and a degraded video signal in which the reference video signal has been degraded are input; the file format of the degraded video included in the degraded video signal is unified with the file format of the reference video included in the corresponding reference video signal, and information related to the unified file format is generated; a spatial feature of each frame is calculated based on the spatial information indicating the video state; after the spatial and temporal shifts are eliminated based on the alignment information, a temporal feature of each of the reference video frame and the degraded video frame is calculated based on the temporal information indicating the video change between frames; and the subjective quality of the degraded video signal is estimated based on the spatial features, the temporal features, and a coefficient for subjective quality estimation corresponding to the information related to the unified file format.
- The invention according to the nineteenth aspect is a video quality evaluation program for causing a computer to function as: alignment information generating means that receives a reference video signal, which is an undegraded video signal, and a degraded video signal in which the reference video signal has been degraded, and generates alignment information on the temporal shift between a reference video frame from the reference video signal and a degraded video frame from the degraded video signal, and on the spatial shift between the two frames; spatial feature calculating means that, after eliminating the spatial and temporal shifts based on the alignment information, calculates a spatial feature of each frame based on the spatial information indicating the video states of the reference video frame and the degraded video frame; temporal feature calculating means that, after eliminating the spatial and temporal shifts based on the alignment information, calculates a temporal feature of each frame based on the temporal information indicating the video change between frames; and subjective quality estimating means that estimates the subjective quality of the degraded video signal based on the spatial features and the temporal features.
- The invention according to the twentieth aspect is the invention according to the nineteenth aspect, wherein the program further causes the computer to function as: format conversion means that unifies the file format of the degraded video included in the degraded video signal with the file format of the reference video included in the corresponding reference video signal and outputs information related to the unified file format; and correction coefficient storing means that stores coefficients for estimating the subjective quality of the degraded video signal in correspondence with the information related to the file format. The subjective quality estimating means obtains from the correction coefficient storing means the coefficient corresponding to the information related to the unified file format input from the format conversion means, and estimates the subjective quality of the degraded video signal based on the spatial feature, the temporal feature, and the obtained coefficient.
- The invention according to the twenty-first aspect is a video matching device comprising: a format conversion unit that receives a reference video signal, which is an undegraded video signal, and a degraded video signal in which the reference video signal has been degraded, and converts the file format of the degraded video included in the degraded video signal into the file format of the reference video included in the corresponding reference video signal; a display timing matching unit that matches the number and display timing of the reference video frames included in the reference video signal and the degraded video frames included in the degraded video signal; and a synchronization/position matching unit that, for a target frame of the reference video frame and the degraded video frame and several frames before and after it, establishes the frame-to-frame and pixel-to-pixel correspondence between the reference video frame and the degraded video frame.
- The invention according to the twenty-second aspect is the invention according to the twenty-first aspect, wherein the format conversion unit converts at least one of the data format, the size, and the aspect ratio of the degraded video so that it matches the reference video.
- The gist is that, when the frame rate of the reference video frame differs from that of the degraded video frame, the display timing matching unit matches the frame rates of the reference video frame and the degraded video frame by interpolating or deleting frames.
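The frame-rate matching described above can be sketched as a nearest-neighbor conversion that duplicates frames when upsampling and drops them when downsampling. The function name and the rounding of the output frame count are assumptions for illustration; the patent does not prescribe a specific interpolation rule.

```python
def match_frame_rate(frames, src_fps, dst_fps):
    """Convert a frame sequence from src_fps to dst_fps by duplicating
    (interpolating) or deleting frames, nearest-neighbor style."""
    duration = len(frames) / src_fps
    n_out = round(duration * dst_fps)
    out = []
    for i in range(n_out):
        # pick the source frame on screen at output frame i's display time
        src_idx = min(int(i * src_fps / dst_fps), len(frames) - 1)
        out.append(frames[src_idx])
    return out
```

After this step the reference and degraded sequences have equal frame counts and aligned display timings, which is the precondition the synchronization/position matching unit relies on.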
- the invention according to a twenty-fourth aspect is based on the invention according to the twenty-first aspect, wherein the display timing matching unit adjusts the display timing of the degraded video frame to the display timing of the reference video frame.
- The gist is that the display timing matching unit sets the display timings of both the reference video frame and the degraded video frame to a predetermined time interval.
- The gist is that the synchronization/position matching unit performs macro synchronization processing that, for a target frame of the reference video frame and the degraded video frame and several frames before and after it, compares the transitions of a feature amount of the entire frame or of a specific region thereof, and determines the temporal correspondence between the reference video frame and the degraded video frame at which the deviation of the feature amount is minimized.
- The gist is that the synchronization/position matching unit performs micro synchronization/position matching processing that, while shifting the temporal correspondence and the pixel correspondence between the reference video frame and the degraded video frame, compares the feature amounts of the entire frame or of a specific region thereof, and determines the temporal correspondence and the pixel correspondence at which the difference in the feature amounts is minimized.
- The gist is that the synchronization/position matching unit initially performs the macro synchronization processing and the micro synchronization/position matching processing.
- The invention according to the twenty-ninth aspect is the invention according to the twenty-eighth aspect, wherein the synchronization/position matching unit sums the duration of a freeze state by counting the number of frames while the degraded video frame is in the freeze state.
- The gist is that the synchronization/position matching unit derives, for a target frame of the reference video frame and the degraded video frame and several frames before and after it, the feature amount of the entire frame, and judges that the degraded video frame is frozen when the feature amount of the reference video frame is changing over time while that of the degraded video frame is not.
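The freeze judgment described above can be sketched with per-frame feature sequences: a frame is counted as frozen when the degraded feature stops changing while the reference feature keeps changing. The tolerance `eps` is an assumption; a real implementation would tune it to the feature's noise floor.

```python
def count_frozen_frames(ref_features, deg_features, eps=1e-6):
    """Freeze detection sketch: the degraded frame is judged frozen when
    its feature amount stops changing while the reference's keeps changing.
    Returns the total number of frozen frames (the freeze duration)."""
    frozen = 0
    for i in range(1, min(len(ref_features), len(deg_features))):
        deg_static = abs(deg_features[i] - deg_features[i - 1]) <= eps
        ref_moving = abs(ref_features[i] - ref_features[i - 1]) > eps
        if deg_static and ref_moving:
            frozen += 1
    return frozen
```

Requiring the reference to be changing avoids false positives on genuinely still scenes, where both sequences are static and no freeze should be reported.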
- The gist is that the synchronization/position matching unit performs the macro synchronization processing again when the degraded video frame is in a frozen state or when synchronization with the reference video frame is lost.
- The invention according to the thirty-second aspect is the invention according to the twenty-eighth aspect, wherein the synchronization/position matching unit outputs the number of shifted frames when the degraded video frame falls into a frame shift state.
- The invention according to the thirty-third aspect is the invention according to the twenty-first aspect, further comprising a luminance/color correction unit that receives the reference video signal and the degraded video signal from the synchronization/position matching unit, corrects the luminance and color information of the degraded video, and returns the corrected degraded video to the synchronization/position matching unit.
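One simple form the luminance/color correction above can take is a linear gain/offset fit that maps the degraded pixel distribution onto the reference's mean and spread. The mean/standard-deviation matching shown here is an assumption for illustration, not the patent's prescribed correction.

```python
def luminance_correction(ref_pixels, deg_pixels):
    """Luminance correction sketch: fit gain/offset so the degraded
    luminance distribution matches the reference's mean and spread,
    then return the corrected degraded pixels."""
    n = len(deg_pixels)
    mr = sum(ref_pixels) / len(ref_pixels)          # reference mean
    md = sum(deg_pixels) / n                        # degraded mean
    sr = (sum((p - mr) ** 2 for p in ref_pixels) / len(ref_pixels)) ** 0.5
    sd = (sum((p - md) ** 2 for p in deg_pixels) / n) ** 0.5
    gain = sr / sd if sd > 0 else 1.0
    return [gain * (p - md) + mr for p in deg_pixels]
```

Removing such global brightness and contrast offsets before feature comparison keeps them from being mistaken for spatial or temporal degradation; the same mapping can be applied per color channel.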
- The invention according to the thirty-fourth aspect is a video matching method in which: a reference video signal, which is an undegraded video signal, and a degraded video signal in which the reference video signal has been degraded are input; the file format of the degraded video included in the degraded video signal is converted into the file format of the reference video included in the corresponding reference video signal; the number and display timing of the reference video frames included in the reference video signal and the degraded video frames included in the degraded video signal are matched; and, for a target frame of the reference video frame and the degraded video frame and several frames before and after it, the frame-to-frame and pixel-to-pixel correspondence is matched while the frame shift between the reference video frame and the degraded video frame and the frozen state of the degraded video are monitored.
- The invention according to the thirty-fifth aspect is a video matching program for causing a computer to function as: format conversion means that receives a reference video signal, which is an undegraded video signal, and a degraded video signal in which the reference video signal has been degraded, and converts the file format of the degraded video included in the degraded video signal into the file format of the reference video included in the corresponding reference video signal; display timing matching means that matches the number and display timing of the reference video frames and the degraded video frames included in the degraded video signal; and synchronization/position matching means that matches the frame-to-frame and pixel-to-pixel correspondence while monitoring the frame shift between the reference video frame and the degraded video frame and the frozen state of the degraded video.
- The invention according to the thirty-sixth aspect is a video quality evaluation device comprising: a format converter (41) that receives a reference video signal (RI), which is an undegraded video signal, and a degraded video signal (PI) in which the reference video signal has been degraded, and converts the file format of the degraded video included in the degraded video signal into the file format of the reference video included in the corresponding reference video signal; a display timing matching unit (42) that matches the number and display timing of the reference video frames included in the reference video signal and the degraded video frames included in the degraded video signal; and a unit that processes a target frame of the reference video frame and the degraded video frame and several frames before and after it.
- FIG. 1 is a block diagram showing a configuration of a first embodiment according to a video quality evaluation device of the present invention.
- FIG. 2 is a diagram showing the correction information stored in the correction information database of FIG. 1.
- FIG. 3 is a diagram showing the relationship between the final estimated subjective quality Q after correction of the primary estimated subjective quality SQ by a correction formula and the actually measured subjective quality.
- FIG. 4 is a block diagram showing a configuration of a second embodiment of the video quality evaluation device of the present invention.
- FIG. 5 is a diagram for explaining the calculation of the horizontal and vertical edge amounts used in the spatial feature calculation unit in FIG. 4.
- FIG. 6 is a block diagram showing a configuration of a third embodiment of the video quality evaluation device of the present invention.
- FIG. 7 is a diagram showing a plurality of conditions and weighting factors corresponding to the conditions stored in the weighting factor database of FIG. 6;
- FIG. 8 is a diagram in which standard video data used for verification in the embodiment of the present invention are classified into learning data and verification data and enumerated.
- FIG. 9 is a distribution diagram of SI (spatial information) values and TI (time information) values calculated based on the learning data and the verification data of FIG.
- FIG. 10 is a diagram showing a result of estimation of learning data based on a conventional peak SN ratio (PSNR).
- FIG. 11 is a diagram showing a result of estimating learning data based on a conventional edge power (Ave-EE).
- FIG. 12 is a diagram showing estimation results of learning data by the video quality evaluation devices according to the second and third embodiments of the present invention.
- FIG. 13 is a diagram showing estimation results of verification data by the video quality evaluation devices of the second and third embodiments of the present invention.
- FIG. 14 is a diagram showing a result of estimating learning data based on only the conventional edge power (Ave_EE).
- FIG. 15 is a diagram showing the relationship between the minimum value of the horizontal and vertical edge amounts (Min_HV) and the subjective evaluation value.
- FIG. 16 is a diagram showing the relationship between the block average motion power (Ave_MEB) and subjective evaluation values.
- FIG. 17 is a block diagram showing a configuration of an embodiment according to a video matching device of the present invention.
- FIG. 18 is a flowchart showing an operation procedure of the embodiment of the video matching device of the present invention.
- FIG. 19 is a diagram for explaining processing in a display timing matching unit.
- FIG. 20 is a diagram for explaining macro time synchronization processing in the synchronization / position matching unit.
- FIG. 21 is a diagram for explaining the micro synchronization/position matching process in the synchronization/position matching unit.
- The first to third embodiments relate to the invention of the video quality evaluation device, the video quality evaluation method, and the video quality evaluation program; the fourth embodiment relates to the invention of the video matching device, the video matching method, and the video matching program.
- FIG. 1 is a block diagram showing the configuration of the first embodiment according to the video quality evaluation device of the present invention.
- the video quality evaluation device of the first embodiment includes at least a subjective quality estimation unit 11, a feature amount calculation unit 12, a correction information database 13, a correction calculation unit 14, and a correction unit 15.
- the subjective quality estimation unit 11 receives the reference video signal RI and the degraded video signal PI.
- The reference video signal RI is a video signal before degradation, and the degraded video signal PI is a video signal in which the reference video signal RI has been degraded, for example, by being coded or passed through a network.
- the subjective quality estimating unit 11 calculates a difference between each video signal feature amount, which is a physical feature amount, with respect to the reference video signal RI and the degraded video signal PI.
- The video signal feature quantities include, for example, spatial information (SI), which indicates the state of the video within a certain frame of the video included in the video signal, and temporal information (TI), which indicates the change of the video between certain frames.
- These spatial information SI and temporal information TI are, for example, the Spatial Information and Temporal Information defined in Annex A of ITU-T Recommendation P.910, "Subjective Video Quality Assessment Methods for Multimedia Applications."
- the subjective quality estimation unit 11 quantifies the deterioration of the degraded video signal PI from the calculated difference between the video signal feature amounts, and estimates the subjective quality based on the quantified deterioration. That is, the subjective quality estimating unit 11 quantifies the degradation of the degraded video signal PI from the reference video signal RI and the degraded video signal PI, and estimates the subjective quality based on the quantified degradation.
- The estimated subjective quality is output from the subjective quality estimating unit 11 as the primary estimated subjective quality SQ. If the primary estimated subjective quality SQ is determined by, for example, the edge power (E) and the motion power (M), it is generally represented by the following equation (1):
- SQ = F(E, M) ... (1)
- This function F is obtained in advance by subjective evaluation experiments.
- The edge power (E) and the motion power (M) are based on the objective video quality evaluation standard ANSI T1.801.03-1996, "Digital Transport of One-Way Video Signals - Parameters for Objective Performance Assessment."
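- As an illustrative sketch only (the concrete form of equation (1) does not survive in this text), the relationship SQ = F(E, M) might look as follows; the linear form and the coefficients a, b, c are assumptions for illustration, since in practice F is fitted in advance by subjective evaluation experiments.

```python
# Hypothetical sketch of equation (1): the primary estimated subjective
# quality SQ as a function F of edge power (E) and motion power (M).
# The linear form and the coefficients a, b, c are illustrative
# assumptions; the real F is fitted by subjective evaluation experiments.

def primary_estimated_subjective_quality(e, m, a=-0.5, b=-0.3, c=4.5):
    """SQ = F(E, M), with F assumed linear for illustration."""
    return a * e + b * m + c

sq = primary_estimated_subjective_quality(2.0, 1.0)  # a sample SQ value
```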
- the feature value calculation unit 12 receives the reference video signal RI, and calculates the video signal feature value FI based on the reference video signal RI.
- the video signal feature amount FI includes, for example, spatial information SI and time information TI.
- the feature value calculating unit 12 specifies at least one of the spatial information SI and the time information TI and quantitatively calculates the feature value.
- the correction information database 13 stores correction information corresponding to the video signal feature amount.
- the video signal feature amount is, for example, the spatial information SI and the time information TI as described above.
- the correction information is a correction formula or a correction coefficient for correcting the primary estimated subjective quality SQ output from the subjective quality estimating unit 11. The correction information will be described later in detail with reference to FIG.
- These correction formulas and correction coefficients are determined in advance by experiments and stored in the correction information database 13. Specifically, the characteristic of how strongly a human subjectively perceives deterioration when a video having the video signal feature amount FI is degraded is determined in advance through subjective evaluation experiments, and a correction formula and correction coefficients corresponding to the video signal feature amount FI are calculated from it.
- In other words, the correspondence between the physical feature amounts of the reference video signal RI and the subjective evaluation characteristics when a video having those features is degraded is determined in advance by subjective evaluation experiments; based on this correspondence, the primary estimated subjective quality SQ derived by the subjective quality estimating unit 11 is corrected using the video signal feature amount of the reference video signal RI, enabling conversion into a unified objective evaluation value with high accuracy.
- the correction calculator 14 receives the video signal feature FI from the feature calculator 12 and extracts a correction formula and a correction coefficient corresponding to the video signal feature FI from the correction information database 13. That is, the correction calculation unit 14 searches the correction information database 13 for a correction expression and a correction coefficient corresponding to the video signal feature value FI, and extracts the corresponding correction expression and correction coefficient from the correction information database 13. Then, the correction calculator 14 outputs these correction formulas and correction coefficients as correction information CI.
- The correcting unit 15 inputs the primary estimated subjective quality SQ from the subjective quality estimating unit 11 and the correction information CI from the correction calculating unit 14. Then, the correction unit 15 substitutes the primary estimated subjective quality SQ into the correction formula with the correction coefficients included in the correction information CI, and outputs the corrected value as the final estimated subjective quality Q.
- The final estimated subjective quality Q is a corrected version of the primary estimated subjective quality SQ and quantitatively indicates the subjective quality of the degraded video signal PI.
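- A minimal sketch of the correction step performed by the correction calculator 14 and the correction unit 15 is shown below. The database keys (SI/TI bins), the sample coefficients, and the linear correction form Q = a*SQ + b are all illustrative assumptions; the actual correction formulas and coefficients are obtained beforehand by subjective evaluation experiments.

```python
# Sketch of the correction lookup (correction calculator 14) and the
# correction step (correction unit 15). The keys, coefficients, and the
# linear form Q = a*SQ + b are illustrative assumptions.

CORRECTION_DB = {
    # (spatial-information bin, temporal-information bin) -> (a, b)
    ("low_SI", "low_TI"): (1.10, -0.20),
    ("high_SI", "low_TI"): (0.95, 0.10),
    ("low_SI", "high_TI"): (1.05, 0.00),
    ("high_SI", "high_TI"): (0.90, 0.30),
}

def final_estimated_subjective_quality(sq, si_bin, ti_bin):
    a, b = CORRECTION_DB[(si_bin, ti_bin)]  # correction information CI
    return a * sq + b                       # corrected quality Q

q = final_estimated_subjective_quality(3.0, "high_SI", "low_TI")
```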
- FIG. 2 is a diagram showing correction information stored in the correction information database 13 of FIG. 1.
- In the correction information database 13, a plurality of correction coefficients are stored in correspondence with each value of the video signal feature amount.
- the video signal features (spatial information SI and time information TI) of the reference video signal RI are calculated as feature values.
- the correction calculator 14 inputs the video signal feature FI from the feature calculator 12 and extracts a correction coefficient corresponding to the video signal feature FI from the database of the correction information database 13 together with the correction formula.
- The correction information database 13 is not limited to correction coefficients; it may also store correction formulas corresponding to the video signal feature amounts.
- FIG. 3 is a diagram showing the relationship between the final estimated subjective quality Q after correction by the correction formula for the primary estimated subjective quality SQ and the actually measured subjective quality.
- the correction formula is the following formula (2).
- FIG. 3 shows a graph based on three correction equations calculated for three types of input video signals input to the video quality evaluation device.
- The horizontal axis shows the primary estimated subjective quality SQ, which is the output of the subjective quality estimating unit 11, and the vertical axis shows the subjective quality calculated by a subjective evaluation quality test in which the video from the input video signal is actually viewed.
- the circles, squares, and triangles in FIG. 3 respectively indicate the subjective quality score relative to the primary estimated subjective quality SQ for each input video signal.
- The three line segments are the correction formulas according to the present embodiment corresponding to the three types of input video signals; according to the present embodiment, the primary estimated subjective quality SQ is thus corrected by the correction formula corresponding to each video.
- In this way, the human visual characteristics for the reference video are obtained from its physical feature amounts and stored in a database as correction information for those feature amounts, which allows the subjective quality of any video to be estimated uniformly with accuracy equivalent to that of the conventional subjective evaluation method.
- the video signal feature amount of the reference video signal RI is calculated.
- Although the configuration logically provides the feature amount calculation unit 12 separately, the video signal feature amount of the reference video signal RI derived by the subjective quality estimating unit 11 may be used as it is, without specially providing the feature amount calculation unit 12.
- The correction calculation unit 14 and the correction unit 15 may also be integrated, logically or physically. That is, the correction unit 15 may directly receive the video signal feature amount of the reference video signal RI and obtain the corresponding correction information from the correction information database 13.
- FIG. 4 is a block diagram showing the configuration of the second embodiment according to the video quality evaluation device of the present invention.
- As shown in FIG. 4, the video quality evaluation device according to the second embodiment includes an alignment information generation unit 21, a spatial feature calculating unit 22, a temporal feature calculating unit 23, and a subjective quality estimating unit 24.
- the alignment information generation unit 21 receives the reference video signal RI and the degraded video signal PI, receives a reference video frame from the reference video signal RI, and receives a degraded video frame from the degraded video signal PI, respectively. A temporal and spatial shift between the reference video frame and the degraded video frame is detected, and alignment information regarding the temporal and spatial shift between the frames is generated.
- The temporal shift between the reference video frame and the degraded video frame means that the video of the reference video frame and the video of the degraded video frame received at a certain time are temporally shifted from each other.
- For example, if the alignment information generation unit 21 receives a frame of a certain video A as the reference video frame at a certain time and, at the same time, receives the frame three frames before that frame of video A as the degraded video frame, the alignment information generation unit 21 detects that the degraded video frame is three frames behind the reference video frame, and generates this information as alignment information.
- The spatial shift between the reference video frame and the degraded video frame means that the spatial position of the video received at a certain time by the alignment information generation unit 21 is displaced.
- For example, suppose the alignment information generation unit 21 receives, as the reference video frame at a certain time, a frame in which a ball appears at the center of the video, and receives, as the degraded video frame, a frame in which that ball appears slightly displaced.
- In this case, the alignment information generation unit 21 detects, for example, that the degraded video frame is shifted one pixel to the right and two pixels upward with respect to the reference video frame, and generates this information as alignment information.
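- As an illustrative sketch of how a temporal shift of the kind described above might be detected (the patent does not disclose the search algorithm here), the degraded sequence can be slid against the reference sequence and the frame shift minimizing the mean frame difference selected. The search window and the forward-lag-only search are assumptions.

```python
# Hypothetical frame-shift detection for the alignment information
# generation unit 21: frames are flattened luminance lists; the degraded
# video is assumed to lag the reference by 0..max_shift frames.

def frame_diff(a, b):
    """Mean absolute luminance difference between two frames."""
    return sum(abs(x - y) for x, y in zip(a, b)) / len(a)

def detect_frame_shift(ref, deg, max_shift=5):
    best_shift, best_err = 0, float("inf")
    for s in range(0, max_shift + 1):   # degraded lags by s frames
        pairs = list(zip(ref[s:], deg))
        err = sum(frame_diff(r, d) for r, d in pairs) / len(pairs)
        if err < best_err:
            best_shift, best_err = s, err
    return best_shift
```

- With a degraded sequence that is simply the reference delayed by three frames, the detected shift is 3, matching the example in the text.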
- the spatial feature calculating unit 22 receives the reference video signal RI, the degraded video signal PI, and the alignment information, and eliminates a spatial shift and a temporal shift between the reference video frame and the degraded video frame. After that, the spatial feature is calculated based on the reference video signal RI and the degraded video signal PI.
- an index of an edge power amount (Ave_EE) and a minimum value of the horizontal and vertical edge amounts (Min_HV), which will be described in detail below, are used as the spatial feature amounts.
- This index quantifies, based on the reference video signal RI and the degraded video signal PI, the deterioration (for example, the degree of blurring) occurring at boundaries (called edges) where the luminance value changes sharply within a frame.
- the deterioration on the edge is quantified by using a Sobel filter to emphasize the edge from the luminance value of the pixel.
- The edge power (Ave_EE) quantified here is specified in ANSI T1.801.03-1996, "Digital Transport of One-Way Video Signals - Parameters for Objective Performance Assessment."
- SI_h(i, j, m) and SI_v(i, j, m) represent the horizontal and vertical Sobel filter outputs at position (i, j) of the m-th frame, and are given by equations (5) and (6), respectively:
- SI_h(i, j, m) = -Y(i-1, j-1, m) - 2Y(i, j-1, m) - Y(i+1, j-1, m) + Y(i-1, j+1, m) + 2Y(i, j+1, m) + Y(i+1, j+1, m) ... (5)
- SI_v(i, j, m) = -Y(i-1, j-1, m) + Y(i+1, j-1, m) - 2Y(i-1, j, m) + 2Y(i+1, j, m) - Y(i-1, j+1, m) + Y(i+1, j+1, m) ... (6)
- Here, Y(i, j, m) is the luminance value of the pixel at position (i, j) of the m-th frame; the filters are applied both to the reference video frames and to the degraded video frames.
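- The Sobel edge enhancement referred to in equations (5) and (6) can be sketched as follows; the 3x3 example frame is illustrative only.

```python
# Sketch of the Sobel edge enhancement used when quantifying edge power
# (Ave_EE). y is a 2-D luminance array (rows i, columns j) for one frame;
# the index pattern follows equations (5) and (6) in the text.

def sobel(y, i, j):
    sih = (-y[i-1][j-1] - 2*y[i][j-1] - y[i+1][j-1]
           + y[i-1][j+1] + 2*y[i][j+1] + y[i+1][j+1])
    siv = (-y[i-1][j-1] + y[i+1][j-1] - 2*y[i-1][j] + 2*y[i+1][j]
           - y[i-1][j+1] + y[i+1][j+1])
    return (sih ** 2 + siv ** 2) ** 0.5  # edge magnitude at (i, j)

# A vertical luminance step produces a strong response:
frame = [[0, 0, 10],
         [0, 0, 10],
         [0, 0, 10]]
mag = sobel(frame, 1, 1)
```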
- Minimum horizontal/vertical edge amount (Min_HV):
- This index quantifies the degree to which a boundary (edge) in which the luminance value sharply changes occurs in the horizontal and vertical directions in the deteriorated video frame corresponding to the reference video frame, as compared with the reference video frame. Things.
- Specifically, a feature amount is used that captures the amount of distortion from the ratio of the edge amount occurring in the horizontal/vertical directions of a frame to the edge amount occurring in other directions.
- Min_HV quantifies the degree to which edges occur in the horizontal and vertical directions. This degree is shown in Figure 5.
- The edge angle is obtained as θ(i, j, m) = tan⁻¹[SI_v(i, j, m) / SI_h(i, j, m)].
- P in equation (9) is the number of pixels in the shaded range in FIG. 5.
- P in Expression (12) is the number of pixels in a range satisfying Expressions (13) and (14).
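- The orientation classification underlying Min_HV can be sketched as follows. This is an illustrative reading only: the angle tolerance delta, the use of squared edge magnitude as "power," and the simple ratio are assumptions, not the exact equations (9)-(14).

```python
# Hypothetical sketch of classifying edge orientation for the Min_HV
# index: each edge pixel's angle decides whether its energy counts as a
# horizontal/vertical (HV) edge or an oblique edge. delta (radians) is an
# assumed tolerance around multiples of 90 degrees.

import math

def hv_ratio(edges, delta=0.05236):  # edges: list of (sih, siv) pairs
    hv, other = 0.0, 0.0
    for sih, siv in edges:
        theta = math.atan2(siv, sih) % (math.pi / 2)
        power = sih * sih + siv * siv
        # near a multiple of 90 degrees -> horizontal/vertical edge
        if theta < delta or (math.pi / 2 - theta) < delta:
            hv += power
        else:
            other += power
    return hv / other if other else float("inf")

r = hv_ratio([(1.0, 0.0), (0.0, 2.0), (1.0, 1.0)])
```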
- Note that the edge power (Ave_EE) is uniquely improved, based on the index specified by ANSI, so as to capture only the occurrence of new edges by taking a minimum value, as shown in equation (7).
- the temporal feature amount calculating unit 23 inputs the reference video signal RI, the degraded video signal PI, and the alignment information, and detects a spatial shift and a temporal shift between the reference video frame and the degraded video frame. After that, the time feature is calculated based on the reference video signal RI and the degraded video signal PI.
- a block average motion power amount (Ave_MEB), which is an index based on a difference between TI (time information) values described below, is used as the time feature amount.
- The TI value is the difference in pixel luminance values between video frames.
- The block average motion power (Ave_MEB) is derived as follows: the difference between the TI value of the reference video frame and that of the degraded video frame is derived for each block (a set of several pixels in the frame), and the difference is normalized by the TI value of the corresponding block of the reference video frame.
- TI (k, l, m) is represented by equation (16).
- The block average motion power (Ave_MEB) is a feature amount that captures deterioration that the edge power (Ave_EE) cannot, specifically the occurrence of deterioration due to motion in each region. As will be described later with reference to the drawings, such deterioration is sensitively detected by this index.
- This index is a unique measure that derives the TI value for each block in order to capture the motion for each area, and further normalizes it with the value of the reference video to improve the sensitivity.
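- The per-block derivation and normalization described above can be sketched as follows. The block representation, the mean-squared TI definition, and the averaging are illustrative assumptions standing in for equations (15) and (16).

```python
# Hypothetical sketch of the block average motion power (Ave_MEB) idea:
# derive the TI value (inter-frame luminance difference) per block for
# the reference and degraded videos, then normalize the difference by the
# reference block's TI and average over blocks.

def block_ti(prev, curr):
    """Mean squared inter-frame luminance difference over one block."""
    n = len(prev)
    return sum((c - p) ** 2 for p, c in zip(prev, curr)) / n

def ave_meb(ref_blocks, deg_blocks):
    """ref_blocks/deg_blocks: lists of (prev_block, curr_block) pairs."""
    total = 0.0
    for (rp, rc), (dp, dc) in zip(ref_blocks, deg_blocks):
        r_ti = block_ti(rp, rc)
        d_ti = block_ti(dp, dc)
        total += (d_ti - r_ti) / r_ti  # normalized per-block difference
    return total / len(ref_blocks)
```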
- The subjective quality estimating unit 24 inputs the spatial feature amount calculated by the spatial feature amount calculating unit 22 and the temporal feature amount calculated by the temporal feature amount calculating unit 23, and estimates from these the subjective evaluation value, which is the subjective quality of the degraded video signal PI with respect to the reference video signal RI. This subjective evaluation value (Y) is calculated by equation (17):
- Y = αX1 + βX2 + γX3 + δ ... (17)
- X1: edge power (Ave_EE)
- X2: minimum horizontal/vertical edge amount (Min_HV)
- X3: block average motion power (Ave_MEB)
- α, β, γ, and δ are weighting coefficients obtained in advance, from the relationship with subjective evaluation values for degraded videos obtained by subjective evaluation experiments, so as to determine the correspondence between the temporal and spatial feature amounts and the subjective evaluation value. These coefficients depend on, for example, the signal format of the degraded video signal PI, the information amount (size) of the degraded video transmitted by the degraded video signal PI, and the encoding method of the degraded video signal.
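- Equation (17) reduces to a weighted sum, which can be sketched as follows; the sample values of α, β, γ, and δ are placeholders, since the real coefficients are fitted beforehand by subjective evaluation experiments and depend on the conditions listed above.

```python
# Sketch of equation (17): subjective evaluation value Y as a weighted
# sum of the three feature amounts. The sample weights alpha..delta are
# illustrative assumptions; real values come from subjective experiments.

def subjective_evaluation(ave_ee, min_hv, ave_meb,
                          alpha=-0.6, beta=-0.2, gamma=-0.4, delta=4.8):
    x1, x2, x3 = ave_ee, min_hv, ave_meb  # X1, X2, X3 in the text
    return alpha * x1 + beta * x2 + gamma * x3 + delta

y = subjective_evaluation(1.0, 0.5, 0.25)
```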
- Note that the video quality evaluation device shown in FIG. 4 described in the second embodiment may be incorporated into the subjective quality estimation unit 11 of the video quality evaluation device shown in FIG. 1 and used as one video quality evaluation device. That is, the video quality evaluation device shown in FIG. 4 inputs the reference video signal RI and the degraded video signal PI, and the value output by the subjective quality estimating unit 24 is passed to the correcting unit 15 as the primary estimated subjective quality SQ.
- Then, the correction unit 15 receives the correction information CI from the correction calculation unit 14 and the primary estimated subjective quality SQ, and calculates the final estimated subjective quality Q.
- FIG. 6 is a block diagram showing the configuration of the third embodiment according to the video quality evaluation device of the present invention.
- The video quality evaluation device of the third embodiment differs from that of the second embodiment in that the reference video signal RI and the degraded video signal PI input to the device may have different file formats, and in that the signal format of the degraded video signal PI, the information amount (size) of the degraded video sent by the degraded video signal PI, and the encoding method of the degraded video signal are unknown. The same parts as those of the video quality evaluation device of the second embodiment are therefore denoted by the same reference numerals, and their description is omitted.
- The video quality evaluation device of the third embodiment includes a format conversion unit 35, an alignment information generation unit 21, a spatial feature calculation unit 22, a temporal feature calculation unit 23, a subjective quality estimating unit 34, and a weighting coefficient database 36.
- The format conversion unit 35 receives the reference video file and the degraded video file and, if the file format of the degraded video file differs from that of the reference video file, converts the degraded video file format into the same file format as the reference video file. More specifically, for example, when the signal format, color distribution, size, aspect ratio, or encoding method of the degraded video file differs from that of the reference video file, the degraded video file is converted to the same format as the reference video file. Then, the format converter 35 outputs a degraded video signal PI from the degraded video file converted into the same file format as the reference video file. The degraded video signal PI from the format conversion unit 35 is output to the alignment information generation unit 21, the spatial feature calculation unit 22, and the temporal feature calculation unit 23.
- For example, if the degraded video is in RGB format, the format conversion unit 35 converts it in accordance with Rec. ITU-R BT.601, "Studio encoding parameters of digital television for standard 4:3 and wide-screen 16:9 aspect ratios."
- The format conversion unit 35 also converts the size and aspect ratio of the reference video and the degraded video so that they are the same. Simple conversion by an integer multiple may suffice; otherwise, conversion to an arbitrary size is performed by a known method (for example, Muramatsu S. and Kiya H.: "Scale Factor of Resolution Conversion Based on Orthogonal Transforms," IEICE Trans. Fundamentals).
- So that a bias is not imposed on the objective evaluation value, distributions such as the luminance values of the degraded video are normalized. That is, for both the reference video and the degraded video, statistics such as the maximum value, minimum value, average value, and variance are derived from the pixel value distribution of a specific frame for each luminance, color-difference signal, or RGB value.
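- One way to realize the normalization just described is to map the degraded video's pixel statistics onto the reference's; the linear mean/standard-deviation mapping below is an illustrative assumption, not the patent's prescribed method.

```python
# Hypothetical sketch of the distribution normalization: derive mean and
# standard deviation of the pixel value distribution for both videos, and
# linearly map the degraded video onto the reference's statistics.

def normalize_to_reference(deg_pixels, ref_pixels):
    def stats(p):
        mean = sum(p) / len(p)
        var = sum((x - mean) ** 2 for x in p) / len(p)
        return mean, var ** 0.5
    r_mean, r_std = stats(ref_pixels)
    d_mean, d_std = stats(deg_pixels)
    scale = r_std / d_std if d_std else 1.0
    return [(x - d_mean) * scale + r_mean for x in deg_pixels]
```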
- The format conversion unit 35 outputs to the subjective quality estimation unit 34 information relating to the file format of the converted degraded video signal PI, that is, the file format matching the reference video signal RI.
- Information related to this file format includes, for example, the signal format, color distribution, size, aspect ratio, and encoding method of the deteriorated video file.
- In the present embodiment, the signal format, size, and encoding method of the degraded video file are output to the subjective quality estimating unit 34.
- The weighting coefficient database 36 stores, for each of a plurality of condition sets such as the signal format, size, and encoding method of the degraded video file, weighting coefficients α, β, γ, and δ set in advance. These weighting coefficients are obtained in advance from the relationship with subjective evaluation values for degraded videos obtained by subjective evaluation experiments, so as to determine the correspondence between the temporal and spatial feature amounts and the subjective evaluation value.
- FIG. 7 shows a plurality of conditions stored in the weighting coefficient database 36 and weighting coefficients corresponding to these conditions.
- The subjective quality estimating unit 34 differs from the subjective quality estimating unit 24 of the second embodiment in that it obtains the weighting coefficients α, β, γ, and δ from the weighting coefficient database 36 according to the signal format, size, and encoding method of the degraded video file input from the format conversion unit 35. In other respects, it is the same as the subjective quality estimating unit 24.
- In the third embodiment, the format conversion unit 35 and the alignment information generation unit 21 are arranged separately; however, they may be configured as a single component and incorporated into the video quality evaluation device shown in FIG. 4.
- According to the third embodiment, the subjective quality of the degraded video can be estimated even if the file format of the degraded video file differs from that of the reference video file. Furthermore, since many patterns of the signal format, size, encoding method, and the like of the degraded video file can be handled, the subjective quality of various degraded video files can be estimated.
- The subjective evaluation data are the standard videos selected in ITU-R, shown by video sequence name in FIG. 8 (ITU-R BT.802-1, "Test Pictures and Sequences for Subjective Assessments of Digital Codecs Conveying Signals Produced According to Recommendation ITU-R BT.601," 1994, and ITU-R BT.1201-2, "Test materials to be used in subjective assessment," 2001). As shown in FIG. 8, the 36 types of reference videos are divided into verification data for verifying the estimation accuracy of the video quality evaluation value and learning data used for deriving the coefficients in advance.
- The subjective evaluation data used to verify the estimation accuracy are selected so as to reduce the influence of bias in the characteristics of the reference videos. That is, based on the distribution of spatial information (SI) and temporal information (TI) defined in ITU-T P.910 (see ITU-T P.910, "Subjective Video Quality Assessment Methods for Multimedia Applications," Aug. 1996), the same number of videos are selected from each of areas A to D in FIG. 9.
- videos having various SI values and TI values can be used as reference videos.
- The videos were degraded by coding with bit rates assigned in four steps within the range of 256 kbps to 8 Mbps.
- The subjective quality evaluation method uses the DSCQS method (ITU-R BT.500-10, "Methodology for the subjective assessment of the quality of television pictures," March 2000), which is often used in codec performance evaluation tests such as MPEG verification.
- the target estimation accuracy of the video quality evaluation device is equal to the degree of variation in the score in the subjective evaluation.
- The degree of variation in the subjective evaluation values was derived as a one-sided 95% confidence interval, and was 7.24 for all data. Therefore, the target estimation accuracy of the video quality evaluation device is that the root mean square error (RMSE) does not exceed this value.
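- The accuracy measure used throughout this verification can be sketched as follows; the sample data are illustrative, while the 7.24 target is the confidence interval stated above.

```python
# Root mean square error (RMSE) between estimated and measured subjective
# quality, compared against the one-sided 95% confidence interval of the
# subjective scores (7.24 for all data, per the text).

def rmse(estimated, measured):
    n = len(estimated)
    return (sum((e - m) ** 2 for e, m in zip(estimated, measured)) / n) ** 0.5

TARGET = 7.24  # one-sided 95% confidence interval of subjective scores
err = rmse([1.0, 2.0, 3.0], [1.0, 2.0, 5.0])  # illustrative data
```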
- As conventional video quality evaluation measures, the generally used PSNR (Peak Signal-to-Noise Ratio) and the edge power (Ave_EE) known as an ANSI parameter are used.
- Figures 10 and 11 show the estimation results for the learning data, respectively.
- FIG. 10 shows the estimation result when PSNR is used
- FIG. 11 shows the estimation result when edge power (Ave_EE) is used.
- the RMSE for PSNR is 9.57
- the RMSE for edge power (Ave_EE) is 7.47.
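- For reference, the conventional PSNR baseline compared above can be sketched as follows, for 8-bit luminance (peak value 255); the two-pixel sample frames are illustrative.

```python
# Sketch of the conventional PSNR baseline: peak signal-to-noise ratio
# between reference and degraded luminance values, for 8-bit video.

import math

def psnr(ref, deg):
    n = len(ref)
    mse = sum((r - d) ** 2 for r, d in zip(ref, deg)) / n
    if mse == 0:
        return float("inf")  # identical frames
    return 10 * math.log10(255 ** 2 / mse)

value = psnr([0, 0], [0, 255])  # one pixel fully inverted
```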
- the edge power (Ave_EE) has better characteristics than PSNR. However, the RMSE exceeds the target value and the estimation accuracy is insufficient.
- For the video quality evaluation devices of the second and third embodiments, which use the edge power (Ave_EE), the minimum horizontal/vertical edge amount (Min_HV), and the block average motion power (Ave_MEB), FIGS. 12 and 13 show the estimation results for the learning data and the verification data, respectively.
- the RMSE for the training data is 6.43
- the RMSE for the verification data is 6.49.
- Thus, the target estimation accuracy is sufficiently met. The video quality evaluation devices of the second and third embodiments therefore have an estimation accuracy usable in place of a subjective evaluation quality test in which a human views the actual video and evaluates its quality, and are at a practically usable level.
- FIG. 14 is a diagram showing an estimation result of a part of the learning data based only on the amount of edge power (Ave_EE).
- FIG. 15 shows the relationship between the minimum value of the horizontal and vertical edge amounts (Min HV) and the subjective evaluation value.
- FIG. 16 is a diagram showing the relationship between the block average motion power (Ave_MEB) and the subjective evaluation value. As shown in FIG. 16, the block average motion power (Ave_MEB) sensitively detects the degradation of the input videos [6] and [9].
- In this way, the minimum horizontal/vertical edge amount (Min_HV) and the block average motion power (Ave_MEB), as physical feature amounts of the video, compensate for the lack of accuracy of the edge power alone.
- FIG. 17 is a block diagram showing the configuration of an embodiment according to the video matching device of the present invention.
- The video matching device includes a format conversion unit 41, a display timing matching unit 42, a synchronization/position matching unit 43, a luminance/color correction unit 44, and a deterioration amount deriving unit 45.
- The format conversion unit 41 matches the file format of the degraded video, degraded by encoding or loss in the network, with the file format of the reference video included in the reference video signal.
- the display timing matching unit 42 matches the video display timing of the reference video signal and the deteriorated video signal.
- The synchronization/position matching unit 43 matches the reference video signal and the degraded video signal in the temporal and spatial directions while acquiring the deterioration amount and synchronization shift information from the deterioration amount deriving unit 45. Further, if necessary, the luminance/color correction unit 44 performs correction based on differences in luminance and color distribution between the reference video signal and the degraded video signal.
- The reference video signal and the degraded video signal include frame rate information or frame display time/capture time information, and also include signal format information as needed.
- The video matching device advances non-real-time processing of the reference video and the degraded video while storing the target frame and several frames before and after it in memory.
- FIG. 18 is a flowchart showing an operation procedure of the video matching device according to the embodiment of the present invention.
- the format conversion unit 41 converts the signal format of the degraded video (step S1). For example, if the data format of the reference video is the uncompressed YUV format and the data format of the degraded video is the uncompressed RGB format, the degraded video is converted to the YUV format in accordance with Rec. ITU-R BT.601 "Studio encoding parameters of digital television for standard 4:3 and wide-screen 16:9 aspect ratios".
- if the degraded video is in a compressed format, the format conversion unit 41 converts it to an uncompressed format in advance. The format conversion unit 41 also converts the size and aspect ratio so that they match if they differ. Here, for example, simple conversion by an integral multiple may suffice, but when that alone is not enough, conversion to an arbitrary size is performed by a known method (for example, Muramatsu S. and Kiya H.: "Scale Factor of Resolution Conversion Based on Orthogonal Transforms", IEICE Trans. Fundamentals, vol. E76-A, no. 7, pp. 1150-1153, July 1993).
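The RGB-to-YUV conversion mentioned above can be sketched as follows. This is a minimal illustration of the BT.601 luma/chroma matrix in its full-range form (the studio-swing 16-235/16-240 scaling that the recommendation also defines is omitted for brevity); it is not the patent's actual implementation.

```python
def rgb_to_ycbcr(r, g, b):
    """Convert one R'G'B' pixel (0-255) to Y'CbCr using the BT.601
    luma coefficients (full-range variant; studio-swing scaling omitted)."""
    y = 0.299 * r + 0.587 * g + 0.114 * b
    cb = -0.168736 * r - 0.331264 * g + 0.5 * b + 128.0
    cr = 0.5 * r - 0.418688 * g - 0.081312 * b + 128.0
    return y, cb, cr
```

A neutral gray maps to Cb = Cr = 128, which is a quick sanity check for the matrix coefficients.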
- the format conversion unit 41 passes the reference video signal and the converted degraded video signal to the display timing matching unit 42.
- the display timing matching unit 42 performs processing such as frame interpolation in order to match the display timing of the degraded video signal format-converted by the format conversion unit 41 with that of the reference video signal (step S2).
- as shown in the lower part of FIG. 19B, the display timing matching unit 42 supplements the degraded video with the frame displayed immediately before it. Rather than always supplementing with the immediately preceding frame, it is also possible to supplement the degraded video with the temporally closest frame. For example, if the degraded video is supplemented with its temporally closest frame, frame I is allocated as the second frame of the converted degraded video.
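The frame-allocation rule above can be sketched as follows. This is a hedged illustration that assumes per-frame display timestamps are available; the function and parameter names (`align_display_timing`, `mode`) are illustrative, not from the patent.

```python
def align_display_timing(ref_times, deg_times, mode="previous"):
    """For each reference-frame display time, choose the index of the
    degraded frame to pair with it: either the frame displayed immediately
    before (or at) that time, or the temporally closest frame."""
    aligned = []
    for t in ref_times:
        if mode == "previous":
            # indices of degraded frames already displayed at time t
            prev = [i for i, dt in enumerate(deg_times) if dt <= t]
            aligned.append(prev[-1] if prev else 0)
        else:  # "nearest": a slightly later frame may be closer in time
            aligned.append(min(range(len(deg_times)),
                               key=lambda i: abs(deg_times[i] - t)))
    return aligned
```

With a 30 fps reference and a degraded stream that dropped frames, "nearest" can select a frame displayed just after the reference instant when that frame is temporally closer than the preceding one.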
- the display timing matching unit 42 passes the reference video signal and the deteriorated video signal to the synchronization / position matching unit 43.
- the synchronization/position matching unit 43 assumes three states, (1) an evaluation start state, (2) a synchronized state, and (3) a frozen state, and defines its operation for each.
- in order to perform macro matching in the time direction, the synchronization/position matching unit 43 derives one feature value per frame for the reference video signal and the degraded video signal over a certain period of time or over all frames, obtains the frame correspondence with the highest consistency, and derives the macro shift in the time direction (step S3).
- specifically, the feature value series is shifted in the time direction, and the shift in the time direction (frame difference) is derived from the condition that the difference between the time-series values is minimum or that the cross-correlation coefficient is maximum.
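The macro time-direction alignment can be sketched by sliding one per-frame feature series against the other and taking the shift that minimizes the mean difference (the cross-correlation criterion works analogously). The feature here is an arbitrary per-frame scalar, and all names are illustrative assumptions.

```python
def macro_time_shift(ref_feat, deg_feat, max_shift=10):
    """Slide the degraded feature time series against the reference series
    and return the frame shift minimizing the mean absolute difference."""
    best_shift, best_cost = 0, float("inf")
    for s in range(-max_shift, max_shift + 1):
        # pair up frames that overlap under shift s
        pairs = [(ref_feat[i], deg_feat[i + s])
                 for i in range(len(ref_feat)) if 0 <= i + s < len(deg_feat)]
        if not pairs:
            continue
        cost = sum(abs(r - d) for r, d in pairs) / len(pairs)
        if cost < best_cost:
            best_cost, best_shift = cost, s
    return best_shift
```

A returned shift of 2, for example, means the degraded stream lags the reference by two frames, which is the macro offset applied before the micro search.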
- in order to perform matching in the micro spatio-temporal direction, the synchronization/position matching unit 43 moves the pixels of the degraded video up, down, left, and right, as shown in FIG. 21, with respect to the frame of the reference video matched by the deterioration amount deriving unit 45 and several frames before and after it, and receives the position with the smallest difference value as the pixel position information giving the best match (step S4).
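The up/down/left/right pixel search can be sketched as an exhaustive search over small offsets that minimizes the mean absolute pixel difference. This is a pure-Python illustration on small 2-D luminance arrays; the search radius is an assumed parameter, not a value from the patent.

```python
def best_spatial_offset(ref, deg, search=2):
    """Shift the degraded frame up/down/left/right within +/-search pixels
    and return the (dy, dx) offset with the smallest mean absolute
    difference over the overlapping region."""
    h, w = len(ref), len(ref[0])
    best, best_cost = (0, 0), float("inf")
    for dy in range(-search, search + 1):
        for dx in range(-search, search + 1):
            total, n = 0, 0
            for y in range(h):
                for x in range(w):
                    yy, xx = y + dy, x + dx
                    if 0 <= yy < h and 0 <= xx < w:
                        total += abs(ref[y][x] - deg[yy][xx])
                        n += 1
            if n and total / n < best_cost:
                best_cost, best = total / n, (dy, dx)
    return best
```

Running the same search over several neighboring degraded frames and keeping the global minimum gives the combined micro spatio-temporal alignment described in the text.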
- the synchronization/position matching unit 43 passes the matched reference video and degraded video to the luminance/color correction unit 44 for luminance/color correction.
- the luminance/color correction unit 44 matches the average value, minimum value, maximum value, and distribution of the luminance and color information of the degraded video to those of the reference video. For example, when the luminance distributions of the reference video and the degraded video differ, the luminance/color correction unit 44 performs linear normalization on the luminance distribution of the degraded video based on the average values and variances of the luminance distributions of the two videos, and passes information on the conversion formula used for the normalization to the synchronization/position matching unit 43 as correction information.
- the synchronization / position matching unit 43 receives the correction information, and performs a luminance / color correction process based on the received correction information (Step S5).
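The linear normalization described above can be sketched from the means and variances alone. This pure-Python illustration assumes the conversion formula takes a gain/offset form; the function name and return shape are assumptions for the example.

```python
import statistics

def luminance_correction(ref_lum, deg_lum):
    """Derive a linear conversion (gain, offset) that maps the degraded
    luminance distribution onto the reference one by matching mean and
    standard deviation, then apply it to the degraded values."""
    r_mean, d_mean = statistics.mean(ref_lum), statistics.mean(deg_lum)
    r_std, d_std = statistics.pstdev(ref_lum), statistics.pstdev(deg_lum)
    gain = r_std / d_std if d_std else 1.0   # avoid division by zero on flat frames
    offset = r_mean - gain * d_mean
    corrected = [gain * v + offset for v in deg_lum]
    return corrected, (gain, offset)
```

The (gain, offset) pair plays the role of the correction information that is passed back to the synchronization/position matching unit 43.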
- the synchronization/position matching unit 43 then performs the processing of steps S6 to S22 shown in FIG.
- in step S6, while the reference video target frame number i is smaller than the reference video final frame number N or the degraded video target frame number j is smaller than the degraded video final frame number M, the following processing of steps S7 to S22 is performed.
- in step S7, it is determined whether or not F1 is "1", that is, whether or not the reference video and the degraded video are in an asynchronous state. If the reference video and the degraded video are synchronized (F1 = 0), the process proceeds to step S8.
- the synchronization/position matching unit 43 passes to the deterioration amount deriving unit 45 the reference video and the degraded video in which the shifts in the temporal and spatial directions and the luminance/color information have been corrected, as obtained in the evaluation start state described above.
- the degradation amount deriving unit 45 performs micro synchronization processing on the corresponding frame of the reference video and several frames of the degraded video before and after it (see FIG. 21), and derives the degradation amount obtained thereby. At the same time, the inter-frame difference value between each frame and its immediately preceding frame is derived as a degradation amount for both the reference video and the degraded video (step S8).
- the synchronization/position matching unit 43 receives the degradation amounts derived by the degradation amount deriving unit 45, and judges whether or not the degraded video is in a frozen state based on the inter-frame difference values among the degradation amounts (step S9). That is, if the inter-frame difference value of the reference video indicates a certain value while that of the degraded video indicates almost 0, the synchronization/position matching unit 43 determines that the degraded video is in a frozen state.
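The freeze judgment above (reference moving while the degraded video stays still) can be sketched over a sequence of inter-frame difference values. The thresholds are illustrative assumptions, not values from the patent, and the returned run lengths mirror the role of the Count value.

```python
def detect_freezes(ref_diffs, deg_diffs, motion_thresh=5.0, still_thresh=0.5):
    """Scan per-frame inter-frame difference values and record runs of
    frames where the reference video moves but the degraded video shows
    almost no change (i.e. it is judged frozen)."""
    freezes, count = [], 0
    for r, d in zip(ref_diffs, deg_diffs):
        if r > motion_thresh and d < still_thresh:
            count += 1             # degraded video frozen on this frame
        elif count:
            freezes.append(count)  # freeze run ended; record its length
            count = 0
    if count:
        freezes.append(count)      # sequence ended while still frozen
    return freezes
```

Each recorded run length corresponds to one frozen interval whose end triggers the macro re-synchronization of step S19.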
- the synchronization/position matching unit 43 judges whether or not the degradation amount of the target frame of the degraded video obtained by the micro synchronization processing of the degradation amount deriving unit 45 in step S8 is minimum (step S11).
- in step S14, the reference video and the degraded video are output together with the F2 value and the Count value. After that, the Count value is reset to 0, and the process proceeds to step S22.
- in step S16, the synchronization/position matching unit 43 receives, as the degradation amounts, the degradation amount of the target frame of the degraded video obtained by the micro synchronization processing performed in step S8 on the corresponding reference video frame and several frames before and after it, together with the inter-frame difference values between each frame of the reference video and the degraded video and its immediately preceding frame.
- the synchronization / position matching unit 43 determines whether or not the deteriorated video is in a frozen state based on the inter-frame difference value among the deterioration amounts (step S17).
- in step S18, the synchronization/position matching unit 43 increments the number of freezes (Count value), and proceeds to step S22.
- the synchronization/position matching unit 43 determines that the frozen state has ended, performs the same macro time synchronization processing as in step S3 (step S19), and outputs the number of freezes (Count value) (step S20). After that, the F1 value and the Count value are reset to 0, and the process proceeds to step S22.
- in step S22, the reference video target frame number i and the degraded video target frame number j are incremented, and when the i value has reached the reference video final frame number N and the j value has reached the degraded video final frame number M, the processing ends.
- in this way, the format of the video is converted, and synchronization is constantly maintained based on the macro matching processing and the micro matching processing.
- therefore, even when the size and aspect ratio of the received video differ, when the spatial positions of the reference video and the degraded video cannot be matched because a certain amount of information is lost through, for example, packet loss, or when fluctuations in the IP packet arrival interval or the occurrence of packet loss cause phenomena that did not occur conventionally, such as a shift of the video display timing on the time axis, fluctuation of the video display timing, or freezing of the video, the reference video and the degraded video can be properly matched in the temporal and spatial directions.
- the reference video, the degraded video and the accompanying information (state in the time direction) output from the synchronization / position matching unit 43 are input to the video quality evaluation device.
- if the deterioration amount deriving unit 45 derives an objective evaluation value instead of the deterioration amount and the result is output from the synchronization/position matching unit 43, the device can also be used as a video quality evaluation device.
- the instructions shown in the above processing procedure can be executed based on a program implemented as software.
- by storing this program in advance and reading it, a general-purpose computer system can function as the video quality evaluation device and the video matching device.
- the instructions described in each of the above embodiments are recorded, as a program executable by a computer, on a magnetic disk (flexible disk, hard disk, etc.), an optical disk (CD-ROM, CD-R, CD-RW, DVD-ROM, DVD±R, DVD-RW, etc.), a semiconductor memory, or a similar recording medium.
- when the computer reads the program from the recording medium and the CPU executes the instructions described in the program, the computer operates as the video quality evaluation device and the video matching device of the above-described embodiment.
- the present invention is not limited to the above-described embodiments as they are, and can be embodied by modifying the constituent elements without departing from the scope of the invention at the implementation stage. Further, various inventions can be formed by appropriately combining a plurality of the constituent elements disclosed in the above embodiments. For example, some components may be deleted from all the components shown in an embodiment. Furthermore, constituent elements from different embodiments may be appropriately combined.
- according to the video quality evaluation device, even if the reference video is unknown, subjective quality can be accurately and uniformly estimated for any video.
- that is, the human visual characteristics for a reference video are obtained from its physical features, a database of correction information for the feature amounts of reference videos is created, and the estimated subjective quality derived from the difference between the physical features of the reference video and the degraded video is weighted with this correction information, so that the subjective quality of any video can be derived uniformly with the same accuracy as the subjective evaluation method.
- according to the video matching device, when the physical feature amounts of the reference video signal and the degraded video signal are compared in estimating the subjective quality, their spatial and temporal positions can be reliably matched.
- that is, the video format is converted, and synchronization matching is constantly performed based on the macro matching processing and the micro matching processing. Therefore, even when the size and aspect ratio of the received video differ, when the spatial positions of the reference video and the degraded video cannot be matched because a certain amount of information is lost through, for example, packet loss, or when fluctuations in the IP packet arrival interval or the occurrence of packet loss cause phenomena that did not occur conventionally, such as a shift of the video display timing on the time axis, fluctuation of the video display timing, or freezing, the reference video and the degraded video can be properly matched in the temporal and spatial directions.
Landscapes
- Engineering & Computer Science (AREA)
- Health & Medical Sciences (AREA)
- Biomedical Technology (AREA)
- General Health & Medical Sciences (AREA)
- Multimedia (AREA)
- Signal Processing (AREA)
- Testing, Inspecting, Measuring Of Stereoscopic Televisions And Televisions (AREA)
- Image Analysis (AREA)
- Data Exchanges In Wide-Area Networks (AREA)
Priority Applications (5)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
JP2005513306A JP4257333B2 (ja) | 2003-08-22 | 2004-08-20 | 映像品質評価装置、映像品質評価方法及び映像品質評価プログラム、並びに映像整合装置、映像整合方法及び映像整合プログラム |
CA2525812A CA2525812C (en) | 2003-08-22 | 2004-08-20 | Video quality assessing apparatus, video quality assessing method, video quality assessing program, video aligning apparatus, video aligning method, and video aligning program |
EP04771953.9A EP1622395B1 (en) | 2003-08-22 | 2004-08-20 | Device, method and program for quality evaluation of a video signal after its transmission or encoding |
US10/556,103 US7705881B2 (en) | 2003-08-22 | 2004-08-20 | Video quality assessing apparatus, video quality assessing method, and video quality assessing program |
US12/717,983 US8253803B2 (en) | 2003-08-22 | 2010-03-05 | Video quality assessing apparatus |
Applications Claiming Priority (4)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
JP2003-298864 | 2003-08-22 | ||
JP2003298864 | 2003-08-22 | ||
JP2004035434 | 2004-02-12 | ||
JP2004-035434 | 2004-02-12 |
Related Child Applications (2)
Application Number | Title | Priority Date | Filing Date |
---|---|---|---|
US10556103 A-371-Of-International | 2004-08-20 | ||
US12/717,983 Division US8253803B2 (en) | 2003-08-22 | 2010-03-05 | Video quality assessing apparatus |
Publications (1)
Publication Number | Publication Date |
---|---|
WO2005020592A1 true WO2005020592A1 (ja) | 2005-03-03 |
Family
ID=34220712
Family Applications (1)
Application Number | Title | Priority Date | Filing Date |
---|---|---|---|
PCT/JP2004/011992 WO2005020592A1 (ja) | 2003-08-22 | 2004-08-20 | 映像品質評価装置、映像品質評価方法及び映像品質評価プログラム、並びに映像整合装置、映像整合方法及び映像整合プログラム |
Country Status (6)
Country | Link |
---|---|
US (2) | US7705881B2 (ja) |
EP (1) | EP1622395B1 (ja) |
JP (4) | JP4257333B2 (ja) |
KR (2) | KR100798834B1 (ja) |
CA (3) | CA2646808C (ja) |
WO (1) | WO2005020592A1 (ja) |
Cited By (15)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
JP2007043642A (ja) * | 2005-03-04 | 2007-02-15 | Nippon Telegr & Teleph Corp <Ntt> | 映像品質評価装置、方法およびプログラム |
JP2008005107A (ja) * | 2006-06-21 | 2008-01-10 | Nippon Telegr & Teleph Corp <Ntt> | 映像整合方法 |
JP2008035357A (ja) * | 2006-07-31 | 2008-02-14 | Kddi Corp | 映像品質の客観評価装置 |
JP2008066856A (ja) * | 2006-09-05 | 2008-03-21 | Nippon Telegr & Teleph Corp <Ntt> | 映像品質客観評価装置、方法およびプログラム |
JPWO2007007750A1 (ja) * | 2005-07-11 | 2009-01-29 | 日本電信電話株式会社 | 映像整合装置、方法、およびプログラム |
JP2009027432A (ja) * | 2007-07-19 | 2009-02-05 | Nippon Telegr & Teleph Corp <Ntt> | 映像品質客観評価方法、映像品質客観評価装置およびプログラム |
JP2011015165A (ja) * | 2009-07-01 | 2011-01-20 | Nippon Telegr & Teleph Corp <Ntt> | 映像品質推定装置、システム、方法およびプログラム |
US8064638B2 (en) * | 2005-06-29 | 2011-11-22 | Ntt Docomo, Inc. | Video evaluation apparatus, spatio-temporal frequency analyzing apparatus, video evaluation method, spatio-temporal frequency analyzing method, video evaluation program, and spatio-temporal frequency analyzing program |
JP2013102353A (ja) * | 2011-11-08 | 2013-05-23 | Nippon Hoso Kyokai <Nhk> | 映像符号化方式変換装置 |
JP2015035696A (ja) * | 2013-08-08 | 2015-02-19 | 株式会社リコー | 通信端末、通信システムおよび通信方法並びにプログラム |
JP2016005111A (ja) * | 2014-06-17 | 2016-01-12 | 日本電信電話株式会社 | 評価映像分析装置及び方法及びプログラム |
US9407934B2 (en) | 2013-06-20 | 2016-08-02 | Fujitsu Limited | Image evaluation apparatus and method |
JP2020191681A (ja) * | 2016-10-08 | 2020-11-26 | 華為技術有限公司Huawei Technologies Co.,Ltd. | 映像品質評価方法および装置 |
JPWO2021181724A1 (ja) * | 2020-03-13 | 2021-09-16 | ||
CN113992943A (zh) * | 2021-10-25 | 2022-01-28 | 上海佰贝科技发展股份有限公司 | 一种监测播出服务器信号异态或劣化的方法及系统 |
Families Citing this family (54)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
KR100541526B1 (ko) * | 2004-01-30 | 2006-01-10 | 에스케이 텔레콤주식회사 | 멀티미디어 데이터의 전송품질 측정방법 및 장치 |
US20090040303A1 (en) * | 2005-04-29 | 2009-02-12 | Chubb International Holdings Limited | Automatic video quality monitoring for surveillance cameras |
US8107540B2 (en) * | 2005-07-11 | 2012-01-31 | Cheetah Technologies, L.P. | Image complexity computation in packet based video broadcast systems |
KR100771616B1 (ko) * | 2005-09-09 | 2007-10-31 | 엘지전자 주식회사 | 투사형 디스플레이 장치 및 그 제어방법 |
EP1865730A3 (en) * | 2006-06-06 | 2010-01-20 | Sony Corporation | Video-signal processing method, program of video-signal processing method, recording medium having recorded thereon program of video-signal processing method, and video-signal processing apparatus |
US7756136B2 (en) * | 2006-07-10 | 2010-07-13 | Cheetah Technologies, L.P. | Spatial and temporal loss determination in packet based video broadcast system in an encrypted environment |
DE102006044929B4 (de) * | 2006-09-22 | 2008-10-23 | Opticom Dipl.-Ing. Michael Keyhl Gmbh | Vorrichtung zum Bestimmen von Informationen zur zeitlichen Ausrichtung zweier Informationssignale |
US8711926B2 (en) * | 2007-02-08 | 2014-04-29 | Qualcomm Incorporated | Distortion estimation for quantized data |
WO2008128249A1 (en) * | 2007-04-16 | 2008-10-23 | Tektronix, Inc. | Systems and methods for robust video temporal registration |
KR100893609B1 (ko) * | 2007-06-05 | 2009-04-20 | 주식회사 케이티 | 인간 시각 특성을 이용한 영상 품질 측정 장치 및 방법 |
KR100922898B1 (ko) * | 2007-12-17 | 2009-10-20 | 한국전자통신연구원 | IP 미디어의 QoE 보장형 영상품질 측정장치 및측정방법 |
US20090161011A1 (en) * | 2007-12-21 | 2009-06-25 | Barak Hurwitz | Frame rate conversion method based on global motion estimation |
US20090185076A1 (en) * | 2008-01-18 | 2009-07-23 | Po-Jui Chen | VGA port signal examining apparatus and method thereof |
JP2009260941A (ja) * | 2008-03-21 | 2009-11-05 | Nippon Telegr & Teleph Corp <Ntt> | 映像品質客観評価方法、映像品質客観評価装置、及びプログラム |
EP2114080A1 (en) * | 2008-04-30 | 2009-11-04 | Thomson Licensing | Method for assessing the quality of a distorted version of a frame sequence |
EP2114079B2 (en) | 2008-05-02 | 2018-01-24 | Psytechnics Ltd | Method and apparatus for aligning signals |
US8754947B2 (en) * | 2008-05-07 | 2014-06-17 | Evertz Microsystems Ltd. | Systems and methods for comparing media signals |
US8780209B2 (en) * | 2008-05-07 | 2014-07-15 | Evertz Microsystems Ltd. | Systems and methods for comparing media signals |
US20090309977A1 (en) * | 2008-06-12 | 2009-12-17 | Microsoft Corporation | Benchmarking and calibrating video quality assessment tools |
CN101685591B (zh) * | 2008-09-26 | 2011-06-22 | 鸿富锦精密工业(深圳)有限公司 | 自动检测显示装置所支持图片格式的检测装置及方法 |
US8718404B2 (en) * | 2009-02-06 | 2014-05-06 | Thomson Licensing | Method for two-step temporal video registration |
US8422795B2 (en) | 2009-02-12 | 2013-04-16 | Dolby Laboratories Licensing Corporation | Quality evaluation of sequences of images |
KR101101139B1 (ko) * | 2009-11-23 | 2012-01-05 | 서강대학교산학협력단 | 모티브 스캔을 이용한 화질 평가방법 및 장치 |
US8456531B2 (en) * | 2010-01-14 | 2013-06-04 | Cheetah Technologies, L.P. | Video alignment and calibration for video quality measurement |
GB2477956B (en) * | 2010-02-19 | 2014-11-05 | Snell Ltd | Objective picture quality measurement |
JP5350300B2 (ja) * | 2010-03-24 | 2013-11-27 | 日本電信電話株式会社 | トランスコード映像品質客観評価装置及び方法及びプログラム |
JP5450279B2 (ja) * | 2010-06-16 | 2014-03-26 | 日本電信電話株式会社 | 映像品質客観評価装置及び方法及びプログラム |
JP5467030B2 (ja) * | 2010-11-29 | 2014-04-09 | 日本電信電話株式会社 | 映像品質客観評価装置及びプログラム |
US8581987B1 (en) * | 2011-03-21 | 2013-11-12 | Marvell International Ltd. | Systems and methods for evaluating video quality |
JPWO2012140783A1 (ja) * | 2011-04-15 | 2014-07-28 | 富士通株式会社 | 半導体集積回路の対向ポートの自律初期化方法および半導体集積回路 |
US8520075B2 (en) * | 2011-06-02 | 2013-08-27 | Dialogic Inc. | Method and apparatus for reduced reference video quality measurement |
CA2839778C (en) * | 2011-06-26 | 2019-10-29 | Universite Laval | Quality control and assurance of images |
CN102426019B (zh) * | 2011-08-25 | 2014-07-02 | 航天恒星科技有限公司 | 一种无人机景象匹配辅助导航方法及系统 |
US8867013B2 (en) * | 2012-01-26 | 2014-10-21 | Avaya Inc. | System and method for measuring video quality degradation using face detection |
JP2013186427A (ja) * | 2012-03-09 | 2013-09-19 | Ricoh Co Ltd | 映像処理装置 |
KR101327709B1 (ko) * | 2012-03-23 | 2013-11-11 | 한국전자통신연구원 | 비디오 품질 측정 장치 및 그 방법 |
DE102012102886B4 (de) * | 2012-04-03 | 2016-07-14 | DREFA Media Service GmbH | Verfahren und System zur automatischen Korrektur eines Videosignals |
US9369742B2 (en) * | 2012-12-06 | 2016-06-14 | Avaya Inc. | System and method to estimate end-to-end video frame delays |
US8994828B2 (en) * | 2013-02-28 | 2015-03-31 | Apple Inc. | Aligned video comparison tool |
DE102013211571B4 (de) * | 2013-06-19 | 2016-02-11 | Opticom Dipl.-Ing. Michael Keyhl Gmbh | Konzept zur bestimmung der qualität eines mediadatenstroms mit variierender qualität-zu-bitrate |
US9131214B2 (en) * | 2013-06-21 | 2015-09-08 | Zenith Electronics Llc | Diagnostics system for remotely located televisions |
US10674180B2 (en) | 2015-02-13 | 2020-06-02 | Netflix, Inc. | Techniques for identifying errors introduced during encoding |
JP6431449B2 (ja) * | 2015-07-01 | 2018-11-28 | 日本電信電話株式会社 | 映像整合装置、映像整合方法、及びプログラム |
JP6034529B1 (ja) * | 2016-06-14 | 2016-11-30 | 九州電力株式会社 | 表面状態診断装置 |
US10728427B2 (en) * | 2016-12-15 | 2020-07-28 | Disney Enterprises, Inc. | Apparatus, systems and methods for nonlinear synchronization of action videos |
US10735742B2 (en) | 2018-11-28 | 2020-08-04 | At&T Intellectual Property I, L.P. | Adaptive bitrate video testing |
US11205257B2 (en) * | 2018-11-29 | 2021-12-21 | Electronics And Telecommunications Research Institute | Method and apparatus for measuring video quality based on detection of change in perceptually sensitive region |
KR102401340B1 (ko) * | 2018-11-29 | 2022-05-25 | 한국전자통신연구원 | 인지 민감 영역의 변화의 검출에 기반하는 비디오 화질 측정 방법 및 장치 |
CN111277894B (zh) * | 2020-03-02 | 2021-08-27 | 四川长虹电器股份有限公司 | 一种自动检测视频播放画面流畅性的方法 |
CN112770105B (zh) * | 2020-12-07 | 2022-06-03 | 宁波大学 | 一种基于结构特征的重定位立体图像质量评价方法 |
CN112714309A (zh) * | 2020-12-22 | 2021-04-27 | 北京百度网讯科技有限公司 | 视频质量评估方法、装置、设备、介质及程序产品 |
KR20230076112A (ko) | 2021-11-23 | 2023-05-31 | 이화여자대학교 산학협력단 | 인공지능 기반의 영상 화질 평가장치, 방법 및 이를 위한 컴퓨터 판독가능 프로그램 |
CN114567780B (zh) * | 2022-02-25 | 2024-08-23 | 杭州当虹科技股份有限公司 | 一种视频帧对齐方法 |
KR102680344B1 (ko) | 2022-11-25 | 2024-07-02 | 주식회사 와이즈오토모티브 | 영상 시험 장치 및 방법 |
Citations (3)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
EP0789497A2 (en) | 1996-02-12 | 1997-08-13 | Tektronix, Inc. | Programmable instrument for automatic measurement of compressed video quality
JP2000032496A (ja) * | 1998-06-23 | 2000-01-28 | Tektronix Inc | ビデオ・シ―ケンスの空間―時間アライメント方法 |
JP2003009186A (ja) * | 2001-04-16 | 2003-01-10 | Kddi Corp | 伝送画質監視装置 |
Family Cites Families (24)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
JPH0258993A (ja) * | 1988-08-25 | 1990-02-28 | Fujitsu Ltd | 立体テレビジョン信号処理装置 |
US5550937A (en) * | 1992-11-23 | 1996-08-27 | Harris Corporation | Mechanism for registering digital images obtained from multiple sensors having diverse image collection geometries |
US5446492A (en) * | 1993-01-19 | 1995-08-29 | Wolf; Stephen | Perception-based video quality measurement system |
JP3025415B2 (ja) * | 1995-01-20 | 2000-03-27 | ケイディディ株式会社 | ディジタル圧縮・再生画像の画質評価装置 |
JP3458600B2 (ja) | 1996-05-10 | 2003-10-20 | Kddi株式会社 | ディジタル画像品質評価装置 |
US6295083B1 (en) * | 1998-02-27 | 2001-09-25 | Tektronix, Inc. | High precision image alignment detection |
DE69803830T2 (de) | 1998-03-02 | 2002-09-12 | Koninklijke Kpn N.V., Groningen | Verfahren, Vorrichtung, ASIC und deren Benutzung zur objektiven Videoqualitätbewertung |
JP3494894B2 (ja) | 1998-07-06 | 2004-02-09 | 矢崎総業株式会社 | 電気接続箱 |
JP3501954B2 (ja) | 1998-07-21 | 2004-03-02 | 日本放送協会 | 画質評価装置 |
US6496221B1 (en) * | 1998-11-02 | 2002-12-17 | The United States Of America As Represented By The Secretary Of Commerce | In-service video quality measurement system utilizing an arbitrary bandwidth ancillary data channel |
US6483538B2 (en) * | 1998-11-05 | 2002-11-19 | Tektronix, Inc. | High precision sub-pixel spatial alignment of digital images |
JP3747662B2 (ja) * | 1998-12-07 | 2006-02-22 | トヨタ自動車株式会社 | 車輌の運動制御装置 |
US6285797B1 (en) * | 1999-04-13 | 2001-09-04 | Sarnoff Corporation | Method and apparatus for estimating digital video quality without using a reference video |
CA2403665C (en) * | 2000-03-31 | 2007-12-04 | British Telecommunications Public Limited Company | Image processing |
JP3739274B2 (ja) * | 2000-10-31 | 2006-01-25 | Kddi株式会社 | 2系統映像の位置ずれ補正装置 |
US6876381B2 (en) * | 2001-01-10 | 2005-04-05 | Koninklijke Philips Electronics N.V. | System and method for providing a scalable objective metric for automatic video quality evaluation employing interdependent objective metrics |
WO2002080563A2 (en) * | 2001-03-29 | 2002-10-10 | Koninklijke Philips Electronics N.V. | Scalable expandable system and method for optimizing a random system of algorithms for image quality |
JP2002342218A (ja) * | 2001-05-16 | 2002-11-29 | Nippon Telegr & Teleph Corp <Ntt> | コンテンツ提供方法及びシステム |
US7020093B2 (en) * | 2001-05-30 | 2006-03-28 | Intel Corporation | Delivery of streaming media |
US6577764B2 (en) * | 2001-08-01 | 2003-06-10 | Teranex, Inc. | Method for measuring and analyzing digital video quality |
JP2004080177A (ja) | 2002-08-13 | 2004-03-11 | Nippon Telegr & Teleph Corp <Ntt> | 映像品質評価装置、映像品質評価方法、映像品質評価プログラム及びそのプログラムを記録した記録媒体 |
GB0314161D0 (en) * | 2003-06-18 | 2003-07-23 | British Telecomm | Edge analysis in video quality assessment |
JP3838513B2 (ja) * | 2004-03-02 | 2006-10-25 | Kddi株式会社 | 伝送画質監視装置 |
JP3838516B2 (ja) * | 2004-05-28 | 2006-10-25 | Kddi株式会社 | 伝送画質監視装置 |
-
2004
- 2004-08-20 CA CA2646808A patent/CA2646808C/en not_active Expired - Lifetime
- 2004-08-20 KR KR1020057021820A patent/KR100798834B1/ko active IP Right Grant
- 2004-08-20 WO PCT/JP2004/011992 patent/WO2005020592A1/ja active Application Filing
- 2004-08-20 US US10/556,103 patent/US7705881B2/en active Active
- 2004-08-20 KR KR1020077017814A patent/KR100824711B1/ko active IP Right Grant
- 2004-08-20 EP EP04771953.9A patent/EP1622395B1/en not_active Expired - Lifetime
- 2004-08-20 CA CA2525812A patent/CA2525812C/en not_active Expired - Lifetime
- 2004-08-20 JP JP2005513306A patent/JP4257333B2/ja not_active Expired - Lifetime
- 2004-08-20 CA CA2646805A patent/CA2646805C/en not_active Expired - Lifetime
-
2008
- 2008-12-05 JP JP2008311492A patent/JP4800376B2/ja not_active Expired - Lifetime
- 2008-12-05 JP JP2008311495A patent/JP4901848B2/ja not_active Expired - Lifetime
-
2010
- 2010-03-05 US US12/717,983 patent/US8253803B2/en active Active
-
2011
- 2011-12-02 JP JP2011264610A patent/JP5347012B2/ja not_active Expired - Lifetime
Patent Citations (3)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
EP0789497A2 (en) | 1996-02-12 | 1997-08-13 | Tektronix, Inc. | Programmable instrument for automatic measurement of compressed video quality
JP2000032496A (ja) * | 1998-06-23 | 2000-01-28 | Tektronix Inc | ビデオ・シ―ケンスの空間―時間アライメント方法 |
JP2003009186A (ja) * | 2001-04-16 | 2003-01-10 | Kddi Corp | 伝送画質監視装置 |
Non-Patent Citations (2)
Title |
---|
MURAMATSU S.; KIYA H: "Scale Factor of Resolution Conversion Based on Orthogonal Transforms", IEICE TRANS. FUNDAMENTALS, vol. E76-A, no. 7, July 1993 (1993-07-01), pages 1150 - 1153 |
OKAMOTO, J. ET AL.: "Eizo Hunshitsu Kyakkan Hyoka no Seino Kojo ni Kansuru Ichikento", 2003 NEN THE INSTITUTE OF ELECTRONICS, INFORMATION AND COMMUNICATION ENGINEERS SOGO TAIKAI KOEN RONBUNSHU, 3 March 2003 (2003-03-03), pages 600, XP002995671 * |
Cited By (25)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
JP2007043642A (ja) * | 2005-03-04 | 2007-02-15 | Nippon Telegr & Teleph Corp <Ntt> | 映像品質評価装置、方法およびプログラム |
JP4514155B2 (ja) * | 2005-03-04 | 2010-07-28 | 日本電信電話株式会社 | 映像品質評価装置、方法およびプログラム |
US8462985B2 (en) * | 2005-06-29 | 2013-06-11 | Ntt Docomo, Inc. | Video evaluation apparatus, spatio-temporal frequency analyzing apparatus, video evaluation method, spatio-temporal frequency analyzing method, video evaluation program, and spatio-temporal frequency analyzing program |
US20120020574A1 (en) * | 2005-06-29 | 2012-01-26 | Ntt Docomo, Inc. | Video evaluation apparatus, spatio-temporal frequency analyzing apparatus, video evaluation method, spatio-temporal frequency analyzing method, video evaluation program, and spatio-temporal frequency analyzing program |
US8064638B2 (en) * | 2005-06-29 | 2011-11-22 | Ntt Docomo, Inc. | Video evaluation apparatus, spatio-temporal frequency analyzing apparatus, video evaluation method, spatio-temporal frequency analyzing method, video evaluation program, and spatio-temporal frequency analyzing program |
JP4482031B2 (ja) * | 2005-07-11 | 2010-06-16 | 日本電信電話株式会社 | 映像整合装置、方法、およびプログラム |
US8094196B2 (en) | 2005-07-11 | 2012-01-10 | Nippon Telegraph And Telephone Corporation | Video matching device, method, and program |
JPWO2007007750A1 (ja) * | 2005-07-11 | 2009-01-29 | 日本電信電話株式会社 | 映像整合装置、方法、およびプログラム |
JP2008005107A (ja) * | 2006-06-21 | 2008-01-10 | Nippon Telegr & Teleph Corp <Ntt> | 映像整合方法 |
JP2008035357A (ja) * | 2006-07-31 | 2008-02-14 | Kddi Corp | 映像品質の客観評価装置 |
JP2008066856A (ja) * | 2006-09-05 | 2008-03-21 | Nippon Telegr & Teleph Corp <Ntt> | 映像品質客観評価装置、方法およびプログラム |
JP4527699B2 (ja) * | 2006-09-05 | 2010-08-18 | 日本電信電話株式会社 | 映像品質客観評価装置、方法およびプログラム |
JP2009027432A (ja) * | 2007-07-19 | 2009-02-05 | Nippon Telegr & Teleph Corp <Ntt> | 映像品質客観評価方法、映像品質客観評価装置およびプログラム |
JP2011015165A (ja) * | 2009-07-01 | 2011-01-20 | Nippon Telegr & Teleph Corp <Ntt> | 映像品質推定装置、システム、方法およびプログラム |
JP2013102353A (ja) * | 2011-11-08 | 2013-05-23 | Nippon Hoso Kyokai <Nhk> | 映像符号化方式変換装置 |
US9407934B2 (en) | 2013-06-20 | 2016-08-02 | Fujitsu Limited | Image evaluation apparatus and method |
JP2015035696A (ja) * | 2013-08-08 | 2015-02-19 | 株式会社リコー | 通信端末、通信システムおよび通信方法並びにプログラム |
JP2016005111A (ja) * | 2014-06-17 | 2016-01-12 | 日本電信電話株式会社 | 評価映像分析装置及び方法及びプログラム |
JP2020191681A (ja) * | 2016-10-08 | 2020-11-26 | 華為技術有限公司Huawei Technologies Co.,Ltd. | 映像品質評価方法および装置 |
JP7105838B2 (ja) | 2016-10-08 | 2022-07-25 | 華為技術有限公司 | 映像品質評価方法および装置 |
JPWO2021181724A1 (ja) * | 2020-03-13 | 2021-09-16 | ||
WO2021181724A1 (ja) * | 2020-03-13 | 2021-09-16 | 日本電信電話株式会社 | 数理モデル導出装置、数理モデル導出方法及びプログラム |
JP7380832B2 (ja) | 2020-03-13 | 2023-11-15 | 日本電信電話株式会社 | 数理モデル導出装置、数理モデル導出方法及びプログラム |
CN113992943A (zh) * | 2021-10-25 | 2022-01-28 | 上海佰贝科技发展股份有限公司 | 一种监测播出服务器信号异态或劣化的方法及系统 |
CN113992943B (zh) * | 2021-10-25 | 2024-01-30 | 上海佰贝科技发展股份有限公司 | 一种监测播出服务器信号异态或劣化的方法及系统 |
Also Published As
Publication number | Publication date |
---|---|
EP1622395A1 (en) | 2006-02-01 |
US8253803B2 (en) | 2012-08-28 |
KR20060033717A (ko) | 2006-04-19 |
JP5347012B2 (ja) | 2013-11-20 |
US20100157144A1 (en) | 2010-06-24 |
KR100824711B1 (ko) | 2008-04-24 |
JP4257333B2 (ja) | 2009-04-22 |
US7705881B2 (en) | 2010-04-27 |
EP1622395A4 (en) | 2011-06-08 |
EP1622395B1 (en) | 2015-03-18 |
JP2012060672A (ja) | 2012-03-22 |
JP4800376B2 (ja) | 2011-10-26 |
CA2525812C (en) | 2011-11-22 |
CA2646805C (en) | 2012-04-24 |
JP2009095046A (ja) | 2009-04-30 |
JP2009077433A (ja) | 2009-04-09 |
US20060276983A1 (en) | 2006-12-07 |
CA2646808A1 (en) | 2005-03-03 |
JPWO2005020592A1 (ja) | 2007-11-01 |
KR100798834B1 (ko) | 2008-01-28 |
CA2646805A1 (en) | 2005-03-03 |
JP4901848B2 (ja) | 2012-03-21 |
CA2525812A1 (en) | 2005-03-03 |
KR20070092320A (ko) | 2007-09-12 |
CA2646808C (en) | 2013-01-22 |
Similar Documents
Publication | Publication Date | Title |
---|---|---|
WO2005020592A1 (ja) | Video quality evaluation device, video quality evaluation method and video quality evaluation program, and video matching device, video matching method and video matching program | |
JP4482031B2 (ja) | Video matching device, method, and program | |
Huynh-Thu et al. | The accuracy of PSNR in predicting video quality for different video scenes and frame rates | |
Pinson et al. | A new standardized method for objectively measuring video quality | |
US8184164B2 (en) | Method for measuring multimedia video communication quality | |
Barkowsky et al. | Temporal trajectory aware video quality measure | |
KR20080029371A (ko) | Video quality evaluation system and method | |
JP5450279B2 (ja) | Objective video quality evaluation device, method, and program | |
Konuk et al. | A spatiotemporal no-reference video quality assessment model | |
WO2012000136A1 (en) | Method for measuring video quality using a reference, and apparatus for measuring video quality using a reference | |
WO2010103112A1 (en) | Method and apparatus for video quality measurement without reference | |
Ndjiki-Nya et al. | Efficient full-reference assessment of image and video quality | |
WO2009007133A2 (en) | Method and apparatus for determining the visual quality of processed visual information | |
Zerman et al. | A parametric video quality model based on source and network characteristics | |
Lee et al. | Hybrid no-reference video quality models for H. 264 with encrypted payload | |
Zhang et al. | Overview of full-reference video quality metrics and their performance evaluations for videoconferencing application | |
Uddina et al. | Subjective video quality evaluation of H. 265/HEVC encoded low resolution videos for ultra-low band transmission system | |
Farias | Visual‐quality estimation using objective metrics | |
JP2007519335A (ja) | Device for generating a 3:2 pull-down switch-off signal for a video compression encoder | |
JP5467030B2 (ja) | Objective video quality evaluation device and program | |
Lee et al. | Video calibration for spatial-temporal registration with gain and offset adjustments |
Legal Events
Date | Code | Title | Description |
---|---|---|---|
AK | Designated states |
Kind code of ref document: A1 Designated state(s): AE AG AL AM AT AU AZ BA BB BG BR BW BY BZ CA CH CN CO CR CU CZ DE DK DM DZ EC EE EG ES FI GB GD GE GH GM HR HU ID IL IN IS JP KE KG KP KR KZ LC LK LR LS LT LU LV MA MD MG MK MN MW MX MZ NA NI NO NZ OM PG PH PL PT RO RU SC SD SE SG SK SL SY TJ TM TN TR TT TZ UA UG US UZ VC VN YU ZA ZM ZW |
AL | Designated countries for regional patents |
Kind code of ref document: A1 Designated state(s): GM KE LS MW MZ NA SD SL SZ TZ UG ZM ZW AM AZ BY KG KZ MD RU TJ TM AT BE BG CH CY CZ DE DK EE ES FI FR GB GR HU IE IT LU MC NL PL PT RO SE SI SK TR BF BJ CF CG CI CM GA GN GQ GW ML MR NE SN TD TG |
121 | Ep: the epo has been informed by wipo that ep was designated in this application | ||
WWE | Wipo information: entry into national phase |
Ref document number: 2005513306 Country of ref document: JP |
WWE | Wipo information: entry into national phase |
Ref document number: 2525812 Country of ref document: CA Ref document number: 2006276983 Country of ref document: US Ref document number: 10556103 Country of ref document: US Ref document number: 20048131394 Country of ref document: CN |
WWE | Wipo information: entry into national phase |
Ref document number: 2004771953 Country of ref document: EP Ref document number: 1020057021820 Country of ref document: KR |
WWP | Wipo information: published in national office |
Ref document number: 2004771953 Country of ref document: EP |
WWP | Wipo information: published in national office |
Ref document number: 1020057021820 Country of ref document: KR |
WWP | Wipo information: published in national office |
Ref document number: 10556103 Country of ref document: US |