WO2006043500A1 - Video quality objective evaluation apparatus, evaluation method, and program - Google Patents
Video quality objective evaluation apparatus, evaluation method, and program
- Publication number
- WO2006043500A1 PCT/JP2005/019019
- Authority
- WO
- WIPO (PCT)
- Prior art keywords
- degradation
- amount
- deterioration
- video signal
- video
- Prior art date
Classifications
-
- H—ELECTRICITY
- H04—ELECTRIC COMMUNICATION TECHNIQUE
- H04N—PICTORIAL COMMUNICATION, e.g. TELEVISION
- H04N17/00—Diagnosis, testing or measuring for television systems or their details
- H04N17/04—Diagnosis, testing or measuring for television systems or their details for receivers
-
- H—ELECTRICITY
- H04—ELECTRIC COMMUNICATION TECHNIQUE
- H04N—PICTORIAL COMMUNICATION, e.g. TELEVISION
- H04N17/00—Diagnosis, testing or measuring for television systems or their details
-
- H—ELECTRICITY
- H04—ELECTRIC COMMUNICATION TECHNIQUE
- H04N—PICTORIAL COMMUNICATION, e.g. TELEVISION
- H04N17/00—Diagnosis, testing or measuring for television systems or their details
- H04N17/004—Diagnosis, testing or measuring for television systems or their details for digital television systems
Definitions
- the present invention relates to a video quality objective evaluation apparatus, evaluation method, and program that evaluate the quality of a video by measuring physical features of video signals or video files, without humans performing a subjective quality evaluation test.
- the present invention has been made to solve the above-described problem, and makes it possible to estimate the subjective quality of a video even when degradation occurs locally in the spatio-temporal direction.
- An object of the present invention is to provide a video quality objective evaluation apparatus, evaluation method, and program. Means for solving the problem
- the present invention comprises a spatio-temporal feature quantity deriving unit that derives, from a degraded video signal to be evaluated and a reference video signal that is the signal before degradation, a spatio-temporal feature quantity characterizing the degradation occurring in the degraded video signal, and
- a subjective quality estimation unit that estimates subjective quality by weighting the spatio-temporal feature quantity based on the relationship, obtained in advance, between degraded video and users' subjective evaluation values. Thus, even for video in which degradation due to packet loss on the communication network occurs locally in the time and space directions,
- the subjective quality can be estimated, and by using the video quality objective evaluation apparatus of the present invention in place of conventional subjective evaluation, the great labor and time required for performing subjective evaluation are eliminated.
- the spatio-temporal feature quantity deriving unit is provided with first deriving means for deriving a spatial feature quantity that takes into account the spatial locality of degradation occurring in the evaluation target frame of the degraded video signal. Quality evaluation can therefore be performed in consideration of the spatial locality of degradation, and the estimation accuracy of the evaluation value can be increased.
- the spatio-temporal feature quantity deriving unit is also provided with second deriving means for deriving the temporal feature quantity of degradation occurring in the evaluation target frame of the degraded video signal, and derives the spatio-temporal feature quantity from the spatial feature quantity and the temporal feature quantity.
- FIG. 1 is a diagram showing an example of an image in which degradation occurs locally in a space.
- FIG. 2 is a diagram showing an example of the relationship between video frame numbers and video degradation amounts.
- FIG. 3 is a block diagram showing a configuration of a video quality objective evaluation apparatus according to the first embodiment of the present invention.
- FIG. 4 is a flowchart showing the operation of the video quality objective evaluation apparatus of the first embodiment of the present invention.
- FIG. 5 is a flowchart showing a method for deriving a spatial feature amount in consideration of local video degradation in the space in the first embodiment of the present invention.
- FIG. 6 is a diagram showing a histogram of deterioration amounts for each block in the first embodiment of the present invention.
- FIG. 7 is a diagram for explaining how to grasp local video degradation on the time axis in the first embodiment of the present invention.
- FIG. 8 is a flowchart showing a method for deriving a spatio-temporal feature amount considering local video degradation on the time axis in the first embodiment of the present invention.
- FIG. 9 is a diagram showing an example of setting a unit measurement period in derivation of the spatio-temporal feature amount according to the first embodiment of the present invention.
- FIG. 10 is a diagram showing another example of the unit measurement period setting in the derivation of the spatio-temporal feature amount in the first embodiment of the present invention.
- FIG. 11 is a diagram showing a steady average deterioration amount, a deterioration variation amount of local video deterioration, and a duration.
- FIG. 12 is a diagram showing a derivation function of the local deterioration determination threshold in the first embodiment of the present invention.
- FIG. 13 is a diagram showing the structure of a deterioration intensity database table in the first example of the present invention.
- FIG. 14 is a diagram for explaining a method of total deterioration intensity in the first embodiment of the present invention.
- FIG. 15 is a diagram showing a configuration of a first deterioration amount addition table in the first example of the present invention.
- FIG. 16 is a diagram showing a configuration of a second deterioration amount addition table in the first example of the present invention.
- FIG. 17 is a diagram showing another example of an image in which degradation has locally occurred in a space.
- FIG. 18 is a flowchart showing a method for deriving a spatial feature amount in consideration of local video degradation in the space in the second embodiment of the present invention.
- FIG. 19 is a diagram for explaining motion vectors.
- FIG. 20 is a diagram showing a weighting coefficient with respect to the speed of motion of local video degradation in the second embodiment of the present invention.
- FIG. 21 is a diagram showing a weighting coefficient for the attention level of local video degradation in the second embodiment of the present invention.
- FIG. 1 is a diagram showing an example of an image in which deterioration occurs locally in space. Video degradation due to packet loss or code errors in the communication network occurs locally, mainly in areas with video motion, so it is necessary to consider spatial locality. P in FIG. 1 indicates such a locally degraded portion.
- the subjective evaluation value is estimated by weighting based on the relationship, obtained in advance, between actual degraded video and subjective evaluation values. Thereby, the estimation accuracy of the subjective evaluation value is increased.
- FIG. 2 is a diagram showing an example of video degradation that occurs locally on the time axis, and is a diagram showing an example of the relationship between the video frame number and the video degradation amount.
- when packet loss or a code error occurs in the communication network, a one-frame freeze failure (J in FIG. 2) suddenly occurs due to frame skipping, or degradation continues until the next I (Intra) frame is decoded (K in FIG. 2); local video degradation occurs when the magnitude of such degradation is large.
- the amount of degradation when local degradation does not occur, the increment of the degradation amount when local degradation occurs (degradation fluctuation amount), and the duration of the local degradation are weighted in consideration of subjective evaluation characteristics obtained in advance, and the subjective evaluation value is thereby estimated. This increases the estimation accuracy of the subjective evaluation value.
- FIG. 3 is a block diagram showing the configuration of the video quality objective evaluation apparatus according to the first embodiment of the present invention. The outline of the operation will be described below.
- the degraded video signal PI to be evaluated and the reference video signal RI, which is the signal before degradation, are used.
- the alignment unit 11 matches the frame display interval and format of the reference video signal RI and the degraded video signal PI, and searches them for matching points in time and position.
- the aligned reference video signal RI and degraded video signal PI are output to the spatio-temporal feature quantity deriving unit 12.
- the spatio-temporal feature quantity deriving unit 12 uses the reference video signal RI and the degraded video signal PI aligned by the alignment unit 11, referring to a degradation intensity database (hereinafter abbreviated as degradation intensity DB) 13 as necessary,
- to derive the spatio-temporal feature quantity PC, which is the physical feature quantity of degradation, and passes it to the subjective quality estimation unit 14.
- the spatio-temporal feature quantity deriving unit 12 includes first deriving means 121 for deriving the spatial feature quantity of degradation occurring in the evaluation target frame of the degraded video signal PI, and second deriving means 122 for deriving the temporal feature quantity of degradation occurring in that frame.
- the subjective quality estimation unit 14 derives the objective evaluation value from the spatio-temporal feature quantity PC received from the spatio-temporal feature quantity deriving unit 12,
- by performing a weighting operation using an objective evaluation value derivation function obtained in advance from the relationship between users' subjective evaluation values for degraded video and the spatio-temporal feature quantities of that video.
- FIG. 4 is a flowchart showing the operation of the video quality objective evaluation apparatus of FIG. 3.
- the alignment unit 11 matches the frame display interval and format of the degraded video signal PI and the reference video signal RI, and searches the reference video signal RI in the time direction in units of frames so that it aligns with the degraded video signal PI.
- the degraded video signal PI and the reference video signal RI are then aligned in units of pixels.
- the aligned reference video signal RI and degraded video signal PI are passed to the spatio-temporal feature quantity deriving unit 12 (step S1 in FIG. 4).
- the spatio-temporal feature quantity deriving unit 12 performs the following processing on the reference video signal RI and the degraded video signal PI received from the alignment unit 11 to derive a plurality of spatio-temporal feature quantity PCs. To the subjective quality estimation unit 14 (step S2).
- FIG. 5 is a flowchart showing a method for deriving the spatial feature DS.
- the first deriving means 121 of the spatio-temporal feature quantity deriving unit 12 calculates and stores the degradation amount S for each block obtained by dividing the evaluation target frame, from the reference video signal RI and the degraded video signal PI received from the alignment unit 11 (step S10 in FIG. 5).
- Degradation amount S includes parameters such as PSNR (Peak Signal to Noise Ratio), which is the signal-to-noise ratio, and Average Edge Energy, as defined by ANSI (American National Standards Institute).
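As an illustration of one such per-block degradation measure, a minimal sketch of block-wise PSNR (one of the parameters named above) might look like the following; the 8 × 8 block size and the use of PSNR itself as the degradation amount S are assumptions made for the example, not choices mandated by the patent.

```python
import numpy as np

def block_psnr(ref, deg, block=8, max_val=255.0):
    """Per-block PSNR between a reference and a degraded frame.

    A higher PSNR means less degradation; identical blocks yield
    infinity. Average Edge Energy (also mentioned in the text) would
    be an alternative per-block measure.
    """
    h, w = ref.shape
    rows, cols = h // block, w // block
    s = np.empty((rows, cols))
    for r in range(rows):
        for c in range(cols):
            rb = ref[r*block:(r+1)*block, c*block:(c+1)*block].astype(float)
            db = deg[r*block:(r+1)*block, c*block:(c+1)*block].astype(float)
            mse = np.mean((rb - db) ** 2)
            s[r, c] = 10 * np.log10(max_val**2 / mse) if mse > 0 else np.inf
    return s
```

A frame is divided into a grid of blocks and one value is produced per block, which is the form in which the histogram of FIG. 6 is then built.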
- the first deriving means 121 calculates and stores the frame average degradation amount Xave-all, which is the average of the calculated per-block degradation amounts S over the entire evaluation target frame, and
- the local degradation area average degradation amount Xave-bad, which is the average of the degradation amounts S within the area of the evaluation target frame where degradation is strong (step S11).
- FIG. 6 shows a histogram of the degradation amount S for each block; the horizontal axis represents the degradation amount S, and the vertical axis represents the number of blocks accumulated for each degradation amount. In FIG. 6, it is assumed that the video degradation becomes greater toward the right.
- the local deterioration area average deterioration amount Xave-bad is an average value of deterioration amounts S included in a predetermined deterioration strength range (shaded area in Fig. 6).
- the top 10% of blocks with the largest amount of deterioration out of the total number of blocks are within the specified deterioration strength range.
- the first deriving means 121 calculates and stores the spatial feature quantity DS, which takes local video degradation into account, using the following equation (1) with coefficients A and B determined in advance by subjective evaluation experiments (step S12).
- In equation (1), A is a coefficient obtained in advance from the subjective evaluation characteristics when local video degradation does not occur in the space, and B is a coefficient obtained in advance from the subjective evaluation characteristics when local video degradation does occur in the space.
- the spatiotemporal feature quantity deriving unit 12 performs the above processing for each frame in accordance with the transition of time.
- the spatial feature quantity DS is calculated here from the frame average degradation amount Xave-all and the local degradation area average degradation amount Xave-bad, but other statistics of the degradation amount distribution of the evaluation target frame could also be used. For example, in the degradation amount distribution of the evaluation target frame shown in FIG. 6, the occurrence frequency, area, or number of blocks of each degradation amount may be used for calculating the spatial feature quantity DS. Alternatively, the standard deviation or variance of the degradation amounts may be used, as may the difference between the frame average degradation amount Xave-all and the local degradation area average degradation amount Xave-bad. The spatial feature quantity DS may also be calculated by combining these statistics.
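The text does not reproduce equation (1) itself, so the sketch below assumes the simplest combination of the two statistics, a weighted sum with the coefficients A and B; the top-10% rule for Xave-bad follows the example given above. The form of the combination is an assumption for illustration only.

```python
import numpy as np

def spatial_feature_ds(block_degradations, a=1.0, b=1.0, worst_fraction=0.1):
    """Spatial feature DS from per-block degradation amounts
    (larger value = worse, as in the histogram of FIG. 6).

    Xave_all : mean degradation over all blocks of the frame.
    Xave_bad : mean over the worst `worst_fraction` of blocks.
    DS = a * Xave_all + b * Xave_bad is an assumed form; the patent
    only states that coefficients A and B obtained from subjective
    experiments are used in equation (1).
    """
    s = np.sort(np.ravel(block_degradations))      # ascending
    n_bad = max(1, int(len(s) * worst_fraction))   # top-10% rule
    xave_all = s.mean()
    xave_bad = s[-n_bad:].mean()                   # largest degradations
    return a * xave_all + b * xave_bad
```

With b > 0 this makes DS more sensitive to spatially local degradation than a plain frame average would be, which is the stated intent of the design.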
- the spatio-temporal feature quantity PC is derived by taking into account the effects of both local degradation and steady degradation.
- specifically, the influence of local degradation alone in the unit measurement period ut and the influence of the average degradation amount Q2 in the unit measurement period ut are calculated, and the effects of both degradations are taken into account.
- the unit measurement period ut and the frame have the relationship: unit measurement period ut ≥ 1 frame period.
- Q1 in FIG. 7 is the amount of local degradation.
- FIG. 8 is a flowchart showing a method for deriving the spatiotemporal feature quantity PC.
- the spatio-temporal feature quantity deriving unit 12 calculates and stores a degradation amount C for each unit measurement period ut (a frame or a constant measurement interval) from the reference video signal RI and the degraded video signal PI received from the alignment unit 11 (step S20 in FIG. 8).
- the second deriving unit 122 derives a temporal feature amount.
- Temporal feature quantities include the frame rate, the number of frame skips, the TI value specified in ITU-T Rec. P.910, and feature quantities specified by ANSI.
- the temporal feature quantity derived by the second deriving means 122 can be used as the degradation amount C; the spatial feature quantity DS previously derived by the first deriving means 121, or
- the degradation amount S used when deriving the spatial feature quantity DS, can also be used as the degradation amount C. A value (objective evaluation value) obtained by converting such a degradation amount into an estimated subjective evaluation value for each frame in advance may also be used as the degradation amount C.
- the degradation amount C is thus derived in time series.
- the third deriving means 123 of the spatio-temporal feature quantity deriving unit 12 calculates, from the degradation amount C, the steady average degradation amount Dcons, the degradation fluctuation amount d of local video degradation, and its duration t, for each unit measurement period ut (step S21 in FIG. 8).
- the unit measurement periods ut may be set so as not to overlap each other as shown in FIG. 9, or may be set so as to overlap each other as shown in FIG.
- FIG. 11 shows the steady-state average degradation amount Dcons, the degradation fluctuation amount d of local video degradation, and the duration t.
- the steady average degradation amount Dcons is the average of the steady degradation amounts C, excluding occurrences of local video degradation, in the unit measurement period ut, and is calculated for each unit measurement period ut. While a unit measurement period ut is still in progress, the steady average degradation amount Dcons calculated in the immediately preceding unit measurement period ut is used.
- the degradation fluctuation amount d of local video degradation is defined as the difference between the local video degradation amount and the steady average degradation amount Dcons.
- the degradation amount C observed when local video degradation first occurs is taken as the local video degradation amount, and its difference from the steady average degradation amount Dcons at that time is the degradation fluctuation amount d.
- the duration t of local video degradation is the time during which the difference between the degradation amount C and the steady average degradation amount Dcons remains
- within the range from (d − Δ) to (d + Δ), where Δ is a predetermined allowable variation range.
- the local degradation determination threshold for determining whether local video degradation has occurred is determined from a local degradation determination threshold derivation function, as shown in FIG. 12, as the value corresponding to the current steady average degradation amount Dcons.
- the local degradation determination threshold derivation function is determined so that the determination of local video degradation made subjectively by users corresponds well with the determination based on the threshold, and may be stored in the third deriving means 123. Note that local video degradation may occur multiple times within the unit measurement period ut; every time it occurs, a pair of degradation fluctuation amount d and duration t is obtained and retained.
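Step S21 can be sketched as follows for a per-frame series of degradation amounts C. The constant `threshold` stands in for the threshold-derivation function of FIG. 12, and the band `delta` around d follows the allowable-variation-range description above; both values, and the event-scanning logic itself, are illustrative assumptions.

```python
def local_degradation_events(c_series, dcons, threshold, delta):
    """Extract (d, t) pairs for local video degradation events.

    c_series  : degradation amount C per frame (larger = worse)
    dcons     : steady average degradation amount of the period
    threshold : local degradation determination threshold
    delta     : allowed fluctuation band around d
    Returns a list of (degradation fluctuation d, duration t) pairs,
    one per local degradation event in the unit measurement period.
    """
    events, i, n = [], 0, len(c_series)
    while i < n:
        diff = c_series[i] - dcons
        if diff >= threshold:               # onset of local degradation
            d, t = diff, 0                  # first excess defines d
            while i < n and abs((c_series[i] - dcons) - d) <= delta:
                t += 1                      # stay inside (d-delta, d+delta)
                i += 1
            events.append((d, t))
        else:
            i += 1
    return events
```

Each (d, t) pair found here is what the next step converts to a degradation strength D via the degradation intensity DB 13.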
- the third deriving means 123 refers to the degradation intensity DB 13 using the degradation fluctuation amount d and duration t calculated in step S21, and
- obtains and stores the degradation strength D, which accounts for the effect of the degradation fluctuation amount d and duration t in the unit measurement period ut on the user's subjective evaluation (step S22 in FIG. 8).
- the degradation intensity DB 13 has a duration degradation strength table 130 in which degradation strength curves indicating the relationship between duration t and degradation strength D are registered for each degradation fluctuation amount d; it is prepared in advance.
- the third deriving unit 123 refers to the deterioration strength DB 13 and converts the set of the deterioration variation amount d and the duration t to the deterioration strength D.
- the duration degradation strength curves are determined by examining users' subjective evaluation characteristics for videos subjected to local video degradation, while varying the degradation fluctuation amount d and duration t, so that the users' subjective evaluation and the degradation strength D correspond well.
- the third deriving means 123 performs the process of step S22 for each set when a plurality of sets of the deterioration variation amount d and the duration t are obtained within the unit measurement period ut.
- the third deriving unit 123 sums the deterioration strength D for each unit measurement period ut and stores the total value (step S23 in FIG. 8).
- the degradation strength D derived in step S22 may be simply added, but the following points are considered in order to match the user's subjective characteristics.
- when both strong and weak local degradations exist in the video, the user's subjective evaluation is dominated by the local degradation with the stronger degradation strength;
- when multiple local degradations of comparable strength exist, the user's subjective evaluation is influenced by the total of those degradations.
- the degradation strengths D1, D2, D3, …, DN−1, DN of the multiple local degradations occurring within the unit measurement period ut are arranged in ascending order, and, starting from the smallest degradation strength, they are added in order by referring to the first degradation amount addition table 124 shown in FIG. 15.
- the first degradation amount addition table 124 stores degradation strengths Da and Db in association with the total degradation strength Dsum, and is prepared in advance in the third deriving means 123.
- the degradation strengths D1, D2, D3, …, DN−1, DN rearranged in ascending order at step 201 in FIG. 14 are denoted D′1, D′2, D′3, …, D′N−1, D′N. In the first addition, as shown in step 202, the minimum degradation strength D′1 is set to Da and the next smallest degradation strength D′2 to Db, and the total degradation strength Dsum corresponding to Da and Db is obtained by referring to the first degradation amount addition table 124 (step 203).
- in step 204, the previously derived total degradation strength Dsum is set to Da, and the smallest of the unprocessed degradation strengths is set to Db.
- the total degradation strength Dsum corresponding to Da and Db is then obtained (step 205). The processing of steps 204 and 205 is repeated up to the degradation strength D′N.
- the third deriving means 123 uses the finally obtained total degradation strength Dsum as the total degradation strength value Dpart in the unit measurement period ut.
- the first degradation amount addition table 124 is determined by examining users' subjective evaluation characteristics for video in which local video degradation has occurred, while varying the two degradation strengths Da and Db, so that the users' subjective evaluation and the total degradation strength Dsum correspond well. According to the first degradation amount addition table 124, when the degradation strength Db is much larger than the degradation strength Da, the total degradation strength Dsum is a value close to Db; when Da and Db are about the same value, the total degradation strength Dsum is close to the sum of Da and Db. As a result, as described above, the total value of the degradation strengths D can be matched to the user's subjective characteristics.
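The iterative pairwise lookup of steps 201 to 205 can be sketched as below. Since the table's contents are not given, a hypothetical combining rule with the qualitative behaviour just described (Db dominates when it is much larger than Da; the result approaches Da + Db when they are comparable) stands in for the first degradation amount addition table 124.

```python
def total_degradation_strength(strengths, combine=None):
    """Aggregate multiple local-degradation strengths into Dpart.

    The patent looks Dsum up in the first degradation amount addition
    table for each pair (Da, Db); `combine` stands in for that table.
    The default rule is illustrative, not the patent's actual table.
    """
    if combine is None:
        def combine(da, db):
            # Hypothetical rule: near max(Da, Db) when one dominates,
            # near Da + Db when the two strengths are comparable.
            lo, hi = min(da, db), max(da, db)
            return hi + lo * (lo / hi) if hi > 0 else 0.0
    ordered = sorted(strengths)      # ascending: add weakest first
    dsum = ordered[0]
    for d in ordered[1:]:
        dsum = combine(dsum, d)      # Da = running total, Db = next
    return dsum
```

The running total plays the role of Da at each step, exactly as in steps 204 and 205 of FIG. 14.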
- the third deriving means 123 refers to the second degradation amount addition table 125 shown in FIG. 16, based on the total degradation strength value Dpart in the unit measurement period ut and the steady average degradation amount Dcons,
- and acquires and stores the spatio-temporal feature quantity PC, which takes into account local video degradation on the time axis (step S24 in FIG. 8).
- the second degradation amount addition table 125 stores the total degradation strength value Dpart, the steady average degradation amount Dcons, and the spatio-temporal feature quantity PC in association with each other, and is prepared in advance. It is determined by examining users' subjective evaluation characteristics for video in which local video degradation has occurred, while varying the total degradation strength value Dpart and the steady average degradation amount Dcons, so that the users' subjective evaluation and the spatio-temporal feature quantity PC correspond well.
- the process of the spatiotemporal feature quantity deriving unit 12 is thus completed.
- there are a plurality of types of degradation amount C obtained in step S20, such as the frame rate and the number of frame skips.
- the spatio-temporal feature quantity deriving unit 12 performs the processes of steps S21 to S24 for each type of degradation amount C when obtaining a plurality of types of degradation amounts C in step S20. Therefore, there are multiple spatio-temporal feature values PC obtained for each unit measurement period ut.
- the subjective quality estimation unit 14 calculates an objective evaluation value by performing a weighting operation such as the following equation based on the plurality of spatio-temporal feature amounts PC received from the spatio-temporal feature amount deriving unit 12. (Step S3 in Fig. 4).
- in equation (2), O is the objective evaluation value, X1, X2, … are the spatio-temporal feature quantities PC, and F is the objective evaluation value derivation function.
- X1 is, for example, the spatio-temporal feature quantity PC obtained by the processing of steps S21 to S24 when the spatial feature quantity DS is used as the degradation amount C.
- X2 is, for example, the spatio-temporal feature quantity PC obtained when the frame rate is used as the degradation amount C.
- α, β, and γ are predetermined coefficients.
- the coefficients α, β, and γ are determined as the optimal combination of values, by examining users' subjective evaluation characteristics for video in which local video degradation occurs while varying the degradation amount, so that the users' subjective evaluation and the objective evaluation value O correspond well.
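Since equation (2) only specifies that F is a function fitted in advance to subjective scores, a minimal linear stand-in for the objective evaluation value derivation function might look like the following; the linear form and the role of the constant term are assumptions for illustration.

```python
def objective_evaluation_value(features, coeffs, bias=0.0):
    """Objective evaluation value O = F(X1, X2, ...).

    features : spatio-temporal feature quantities PC (X1, X2, ...)
    coeffs   : fitted weights standing in for alpha, beta, ...
    bias     : constant term standing in for gamma
    A linear weighted sum is the simplest assumed form of F; the
    patent leaves F's shape to the fitting procedure.
    """
    return sum(c * x for c, x in zip(coeffs, features)) + bias
```

In practice the coefficients would be fitted so that O tracks subjective evaluation scores over a training set of degraded videos.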
- FIG. 17 is a diagram showing another example of an image in which degradation has locally occurred in the space.
- FIG. 17 shows an image whose background moves at high speed from right to left because the camera is following the movement of the subject vehicle 170.
- the user's subjective evaluation differs depending on the speed of movement.
- between the local video degradation 171 occurring in the background area and the local video degradation 172 occurring in the subject area, the degradation 172 in the subject area has a greater effect on the user's subjective evaluation.
- the user's subjective evaluation differs depending on the level of attention (attention level) of the user.
- therefore, weighting is performed in consideration of the difference in subjective evaluation depending on the speed of motion of the video, since degradation is more easily noticed at certain motion speeds, and in consideration of the difference in subjective evaluation depending on the user's level of attention to the video, such as whether the area where local video degradation occurs is an area of interest like the subject. This increases the estimation accuracy of the evaluation value.
- FIG. 18 is a flowchart showing a method for deriving the spatial feature DS in this embodiment.
- the first deriving means 121 of the spatio-temporal feature quantity deriving unit 12 calculates and stores the motion vector for each block obtained by dividing the evaluation target frame of the reference video signal RI received from the alignment unit 11 (step S30 in FIG. 18).
- FIG. 19 is a diagram for explaining a motion vector.
- a motion vector is a vector indicating the amount of movement (direction and distance) between frames of, for example, an 8 × 8 pixel block. To find the amount of movement of a block, the block in the previous frame with the smallest difference from the block in the current frame is found, as in the example of FIG. 19.
- the first deriving means 121 calculates a motion vector amount for each block for one frame of the reference video signal RI, and calculates its direction and length (norm) for each block.
- the first deriving means 121 derives, in accordance with the motion vector distribution characteristics of the reference video signal RI calculated in step S30, the thresholds for each attention level needed to classify the blocks of the evaluation target frame (step S31).
- for example, if there is a region consisting of a plurality of blocks having the same motion vector, and the number of blocks belonging to that region is equal to or greater than a constant, the first deriving means 121 derives a threshold for classifying the blocks into two classes, treating this region as the background region (attention level 2) and the rest as the subject region (attention level 1). There may be two or more levels of attention.
- the first case is when the camera moves up, down, left, or right (pan, tilt) as the subject moves. In this case, the background area moves in the direction opposite to the camera movement. Therefore, if there is a region consisting of a plurality of blocks whose motion vectors have the same direction and length, and the number of blocks belonging to that region is equal to or greater than a constant, the first deriving means 121 treats this region as the background region. According to this attention level determination method, the background region is correctly determined even when the subject is not moving.
- the second case is a case where the camera performs a zoom operation (enlargement / reduction) on the subject.
- when zooming in, motion vectors are generated radially outward from the subject position (for example, the center of the image) in all directions,
- and when zooming out, motion vectors are generated from the periphery toward the subject position.
- the motion vectors in the background area at the periphery are longer than those of the subject near the center of the video.
- therefore, when there is a region consisting of blocks whose motion vectors are evenly distributed in all directions and whose motion vector lengths are equal to or greater than a threshold, the first deriving means 121 treats this region as the background region.
- a predetermined constant value may be used as the threshold at this time,
- or it may be obtained from the distribution of motion vectors as follows.
- the first deriving means 121 obtains a motion vector histogram with the horizontal axis representing the length of the motion vector and the vertical axis representing the frequency of occurrence (number of blocks).
- the first deriving means 121 selects a boundary value on the horizontal axis of this histogram and obtains the occurrence frequency of motion vectors longer than this boundary value; when that frequency reaches, for example, 80% or more of the total number of blocks, the boundary value is set as the threshold.
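The 80% rule can be sketched as follows; treating the threshold as the k-th largest vector length with k = ⌈0.8N⌉ is an assumption about how the boundary is scanned, since the text only says the boundary is lowered until enough vectors exceed it.

```python
import numpy as np

def motion_vector_threshold(vector_lengths, target_fraction=0.8):
    """Boundary length such that at least `target_fraction` of all
    blocks have a motion vector at least that long.

    Scans candidate boundaries from long to short and returns the
    first (largest) one meeting the target, approximating the
    histogram-based procedure in the text.
    """
    lengths = np.sort(np.asarray(vector_lengths))[::-1]  # descending
    n = len(lengths)
    for k in range(1, n + 1):
        if k / n >= target_fraction:
            return lengths[k - 1]   # k-th largest length
    return lengths[-1]
```

A region whose vectors are at or above this threshold (and evenly spread in direction) would then be classified as background under the zoom case described above.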
- the first deriving means 121 calculates a degradation amount S for each block obtained by dividing the evaluation target frame, from the reference video signal RI and the degraded video signal PI, and stores the value for each block position in the frame (step S32).
- the amount of degradation S includes parameters such as PSNR, which is the signal-to-noise ratio, and Average Edge Energy, as defined by ANSI.
- The first deriving means 121 then uses the results of steps S30 to S32 to calculate a spatial feature DS that takes into account local video degradation within the evaluation-target frame, as in the following equation, and stores it (step S33).
- Equation (4) can be written as DS = (1/N) Σ_{i=1..N} F1i · F2i · Si, where N is the number of target blocks, F1i is a weighting coefficient that depends on the direction and length of the motion vector of block i (i = 1, ..., N), F2i is a weighting coefficient that depends on the attention level of block i, and Si is the degradation amount of block i.
- Equation (4) means that the degradation amount S is weighted by the coefficients F1 and F2 for each block, and that the per-block results are averaged over the entire evaluation-target frame to give the spatial feature DS.
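The weighted average described for Equation (4) can be written directly; this sketch assumes the per-block values S, F1, and F2 have already been derived:

```python
def spatial_feature_ds(S, F1, F2):
    """Spatial feature DS of Equation (4): the degradation amount S of each
    block, weighted by the motion-dependent factor F1 and the
    attention-dependent factor F2, averaged over all N target blocks.

    S, F1, F2: equal-length sequences indexed by block.
    """
    n = len(S)
    return sum(F1[i] * F2[i] * S[i] for i in range(n)) / n
```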
- A specific method for deriving the weighting coefficients F1 and F2 is as follows.
- The weighting coefficient F1 is derived, for each target block of Equation (4), from a correspondence, obtained in advance, between the length of the motion vector of the degraded video and the weighting coefficient F1. As shown in Fig. 20, F1 gives a weak weight when there is little motion in the video (the motion vector is short) or when the motion is too fast for the eye to follow (the motion vector is long), and a strong weight when the motion is moderate. Note that the correspondence between the motion-vector length of the degraded video and the weighting coefficient F1 is derived from subjective evaluation characteristics (the average value of the spatial feature DS) obtained by applying specific local degradation to regions with different motion-vector lengths.
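The Fig. 20 curve (weak weight for very short or very long motion vectors, strong weight for medium motion) could be approximated by, for example, a piecewise-linear stand-in. All breakpoint and weight values below are assumptions for illustration; in the patent, the actual correspondence is fit to subjective evaluation data:

```python
def weight_f1(mv_length, rise_end=4.0, fall_start=12.0, fall_end=24.0,
              w_min=0.2, w_max=1.0):
    """Illustrative piecewise-linear stand-in for the Fig. 20 curve:
    weak weight for little motion, strong weight for medium motion, and
    weak weight again when motion is too fast for the eye to follow.
    Breakpoint values are assumptions, not taken from the patent.
    """
    if mv_length <= 0:
        return w_min
    if mv_length < rise_end:          # ramp up from w_min to w_max
        return w_min + (w_max - w_min) * mv_length / rise_end
    if mv_length <= fall_start:       # plateau: medium-speed motion
        return w_max
    if mv_length < fall_end:          # ramp back down to w_min
        return w_max - (w_max - w_min) * (mv_length - fall_start) / (fall_end - fall_start)
    return w_min
```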
- The weighting coefficient F2 is derived, for each target block of Equation (4), by determining the attention level of each block from the length and direction of its motion vector using the threshold derived in step S31, and then referring to a correspondence, obtained in advance, between the attention level and the weighting coefficient F2.
- As shown in Fig. 21, this weighting coefficient F2 is a coefficient that gives a strong weight to a region with a high attention level, such as the subject region, and a weak weight to a region with a low attention level, such as the background region.
- Note that the correspondence between the attention level and the weighting coefficient F2 is derived as the optimal correspondence from subjective evaluation characteristics (the average value of the spatial feature DS) obtained by applying specific local degradation to videos classified in advance (classified, as described above, by using motion vectors according to the camera work that follows the subject's movement), also taking the influence of the weighting coefficient F1 into account.
- Note that the weighting value F1 for each block may be derived from the motion vector at the stage of step S30 rather than at step S33 and stored as a table, and the weighting value F2 for each block may likewise be obtained as a table from the motion vectors in one frame; these tables can then be referenced when Equation (4) is calculated in step S33.
- The first deriving means 121 of the spatiotemporal feature quantity deriving unit 12 performs the above processing for each frame as time advances.
- The processing in step S2 other than the derivation of the spatial feature DS, and the processing in steps S1 and S3, are the same as in the first embodiment.
- In this embodiment, weighting is performed in consideration of the difference in subjective evaluation caused by the speed of motion in the video, and in consideration of the difference in subjective evaluation caused by the user's level of attention to each region of the video, so the estimation accuracy of the subjective evaluation value can be increased.
- Note that, in a video communication service performed in a fixed place (an environment where the background is fixed), the processing in steps S31 and S32 need only be performed on the subject part; it is also possible simply to treat regions that differ between frames as the subject region and regions with no inter-frame difference as the background region.
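The simplified fixed-background case above can be sketched as a per-pixel inter-frame difference; the difference threshold is an illustrative value:

```python
def split_subject_background(prev_frame, cur_frame, diff_threshold=10):
    """For a fixed camera and fixed background, classify each pixel by
    inter-frame difference: pixels that change between frames form the
    subject region, pixels that do not form the background region.

    Frames are 2-D lists of grayscale values; returns (subject, background)
    as sets of (y, x) coordinates.
    """
    subject, background = set(), set()
    for y, (prow, crow) in enumerate(zip(prev_frame, cur_frame)):
        for x, (p, c) in enumerate(zip(prow, crow)):
            (subject if abs(c - p) > diff_threshold else background).add((y, x))
    return subject, background
```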
- This embodiment combines the method for deriving the spatial feature DS described in the first embodiment with the method described in the second embodiment.
- The first deriving means 121 of the spatiotemporal feature quantity deriving unit 12 first calculates the degradation amount for each block, taking the motion vector into account, based on steps S30 to S32 of the second embodiment. Subsequently, the first deriving means 121 calculates the spatial feature DS by Equation (1), based on steps S11 and S12 of the first embodiment, which take into account the average degradation amount over the entire frame and the average degradation amount in the region where the degradation intensity is strong.
- In this way, the derivation methods of the first and second embodiments can be combined.
- The video quality objective evaluation devices of the first to third embodiments can be realized by a computer having a CPU, a storage device, and an interface with external devices, together with a program that controls these hardware resources.
- A video quality objective evaluation program for realizing the video quality objective evaluation method of the present invention is provided recorded on a recording medium such as a flexible disk, Blu-ray Disc, CD-ROM, DVD-ROM, or memory card.
- The CPU writes the program read from the recording medium into the storage device and executes the processing described in the first to third embodiments in accordance with the program.
- The present invention can be applied to video quality objective evaluation techniques that estimate subjective quality from measurement of physical feature amounts of a video signal.
Priority Applications (5)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
CN2005800335748A CN101036399B (zh) | 2004-10-18 | 2005-10-17 | 视频质量客观评价设备及评价方法 |
JP2006542965A JP4733049B2 (ja) | 2004-10-18 | 2005-10-17 | 映像品質客観評価装置、評価方法およびプログラム |
CA2582531A CA2582531C (en) | 2004-10-18 | 2005-10-17 | Video quality objective evaluation device, evaluation method, and program |
EP05793444A EP1804519A4 (en) | 2004-10-18 | 2005-10-17 | OBJECTIVE VIDEO QUALITY EVALUATION DEVICE, EVALUATION METHOD AND PROGRAM |
US11/663,679 US8130274B2 (en) | 2004-10-18 | 2005-10-17 | Video quality objective assessment device, assessment method, and program |
Applications Claiming Priority (4)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
JP2004-303204 | 2004-10-18 | ||
JP2004303204 | 2004-10-18 | ||
JP2005-201916 | 2005-07-11 | ||
JP2005201916 | 2005-07-11 |
Publications (1)
Publication Number | Publication Date |
---|---|
WO2006043500A1 true WO2006043500A1 (ja) | 2006-04-27 |
Family
ID=36202916
Family Applications (1)
Application Number | Title | Priority Date | Filing Date |
---|---|---|---|
PCT/JP2005/019019 WO2006043500A1 (ja) | 2004-10-18 | 2005-10-17 | 映像品質客観評価装置、評価方法およびプログラム |
Country Status (7)
Country | Link |
---|---|
US (1) | US8130274B2 (ja) |
EP (1) | EP1804519A4 (ja) |
JP (1) | JP4733049B2 (ja) |
KR (1) | KR100858999B1 (ja) |
CN (1) | CN101036399B (ja) |
CA (1) | CA2582531C (ja) |
WO (1) | WO2006043500A1 (ja) |
Cited By (12)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
KR100811835B1 (ko) | 2006-10-25 | 2008-03-10 | 주식회사 에스원 | 동영상 특징량 추출방법 및 이를 이용한 내용 기반 동영상검색방법 |
JP2008131658A (ja) * | 2006-11-22 | 2008-06-05 | Tektronix Inc | ビデオ・フレーム測定装置及び方法 |
KR100893609B1 (ko) | 2007-06-05 | 2009-04-20 | 주식회사 케이티 | 인간 시각 특성을 이용한 영상 품질 측정 장치 및 방법 |
WO2009116666A1 (ja) * | 2008-03-21 | 2009-09-24 | 日本電信電話株式会社 | 映像品質客観評価方法、映像品質客観評価装置、およびプログラム |
JP2009273127A (ja) * | 2008-04-30 | 2009-11-19 | Thomson Licensing | フレーム系列の歪んだバージョンの品質を評価する方法 |
JP2010206790A (ja) * | 2009-03-05 | 2010-09-16 | Tektronix Inc | 信号フィルタ処理方法及び装置並びにイメージ整列方法 |
JP2011015165A (ja) * | 2009-07-01 | 2011-01-20 | Nippon Telegr & Teleph Corp <Ntt> | 映像品質推定装置、システム、方法およびプログラム |
JP2011510562A (ja) * | 2008-01-18 | 2011-03-31 | トムソン ライセンシング | 知覚上の品質を評価する方法 |
JPWO2009133879A1 (ja) * | 2008-04-30 | 2011-09-01 | 日本電気株式会社 | 画像評価方法、画像評価システム及びプログラム |
US8422795B2 (en) | 2009-02-12 | 2013-04-16 | Dolby Laboratories Licensing Corporation | Quality evaluation of sequences of images |
JP2015520548A (ja) * | 2012-04-23 | 2015-07-16 | 華為技術有限公司Huawei Technologies Co.,Ltd. | マルチメディア品質を評価する方法及び装置 |
US10116929B2 (en) | 2012-08-22 | 2018-10-30 | Huawei Technologies Co., Ltd. | Multimedia quality monitoring method, and device |
Families Citing this family (30)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
US20090097546A1 (en) * | 2007-10-10 | 2009-04-16 | Chang-Hyun Lee | System and method for enhanced video communication using real-time scene-change detection for control of moving-picture encoding data rate |
JP2009260941A (ja) * | 2008-03-21 | 2009-11-05 | Nippon Telegr & Teleph Corp <Ntt> | 映像品質客観評価方法、映像品質客観評価装置、及びプログラム |
CN101616315A (zh) * | 2008-06-25 | 2009-12-30 | 华为技术有限公司 | 一种视频质量评价方法、装置和系统 |
CN101626506B (zh) | 2008-07-10 | 2011-06-01 | 华为技术有限公司 | 一种视频码流的质量评估方法、装置及系统 |
EP2383665B1 (en) | 2009-01-23 | 2017-10-25 | Nec Corporation | Matching weighting information extracting device |
FR2947069A1 (fr) * | 2009-06-19 | 2010-12-24 | Thomson Licensing | Procede de selection de versions d'un document parmi une pluralite de versions recues a la suite d'une recherche, et recepteur associe |
CN101827280B (zh) * | 2010-02-04 | 2012-03-21 | 深圳市同洲电子股份有限公司 | 视频输出质量的检测方法和装置 |
CN102223565B (zh) * | 2010-04-15 | 2013-03-20 | 上海未来宽带技术股份有限公司 | 一种基于视频内容特征的流媒体视频质量评估方法 |
EP2564592A4 (en) * | 2010-04-30 | 2015-06-17 | Thomson Licensing | METHOD AND APPARATUS FOR MEASURING VIDEO QUALITY USING AT LEAST ONE SEMI-SUPERVISED LEARNING RECTIFIER FOR PREDICTING AN AVERAGE OBSERVATION NOTE |
PL2649801T3 (pl) * | 2010-12-10 | 2015-08-31 | Deutsche Telekom Ag | Sposób i urządzenie do obiektywnej oceny jakościowej wideo na podstawie ciągłych oszacowań widoczności utraty pakietów |
CN102209257B (zh) * | 2011-06-17 | 2013-11-20 | 宁波大学 | 一种立体图像质量客观评价方法 |
US9202269B2 (en) * | 2011-06-21 | 2015-12-01 | Thomson Licensing | User terminal device, server device, system and method for assessing quality of media data |
CN102271279B (zh) * | 2011-07-22 | 2013-09-11 | 宁波大学 | 一种立体图像的最小可察觉变化步长的客观分析方法 |
EP2745518B1 (en) * | 2011-09-26 | 2017-06-14 | Telefonaktiebolaget LM Ericsson (publ) | Estimating user-perceived quality of an encoded video stream |
US9203708B2 (en) | 2011-09-26 | 2015-12-01 | Telefonaktiebolaget L M Ericsson (Publ) | Estimating user-perceived quality of an encoded stream |
FR2982449A1 (fr) * | 2011-11-07 | 2013-05-10 | France Telecom | Procede d'evaluation d'au moins un defaut de qualite dans un signal de donnees, dispositif et programme d'ordinateurs associes |
KR20140101745A (ko) * | 2011-11-28 | 2014-08-20 | 톰슨 라이센싱 | 다수 아티팩트를 고려한 비디오 품질 측정 |
CN103152599A (zh) * | 2013-02-01 | 2013-06-12 | 浙江大学 | 基于有序回归的移动视频业务用户体验质量评估方法 |
US10346680B2 (en) * | 2013-04-12 | 2019-07-09 | Samsung Electronics Co., Ltd. | Imaging apparatus and control method for determining a posture of an object |
CN103281555B (zh) * | 2013-04-24 | 2015-06-10 | 北京邮电大学 | 基于半参考评估的视频流业务QoE客观评估方法 |
CN104463339A (zh) * | 2014-12-23 | 2015-03-25 | 合一网络技术(北京)有限公司 | 多媒体资源制作者的评估方法及其装置 |
US10674180B2 (en) * | 2015-02-13 | 2020-06-02 | Netflix, Inc. | Techniques for identifying errors introduced during encoding |
DE102016201987A1 (de) * | 2015-02-16 | 2016-08-18 | Robert Bosch Engineering and Business Solutions Ltd. | Ein Verfahren zum Testen einer durch eine Displayeinrichtung angezeigten Grafik |
CN105163106B (zh) * | 2015-07-22 | 2017-04-12 | 天津科技大学 | 一种多重数据处理的视频质量评价系统 |
CN106341683A (zh) * | 2016-08-24 | 2017-01-18 | 乐视控股(北京)有限公司 | 全景视频质量判断方法及系统 |
US10798387B2 (en) | 2016-12-12 | 2020-10-06 | Netflix, Inc. | Source-consistent techniques for predicting absolute perceptual video quality |
KR101899070B1 (ko) | 2017-07-19 | 2018-09-14 | 국방과학연구소 | 랜덤신호에 대한 강건성 정량화를 통한 단독 영상 품질 평가 방법 및 장치 |
CN109255389B (zh) * | 2018-09-28 | 2022-03-25 | 中国科学院长春光学精密机械与物理研究所 | 一种装备评价方法、装置、设备及可读存储介质 |
KR102192017B1 (ko) * | 2019-07-16 | 2020-12-16 | 연세대학교 산학협력단 | 인간의 시각 특성을 반영한 비디오의 화질 평가 장치 및 방법 |
CN113473117B (zh) * | 2021-07-19 | 2022-09-02 | 上海交通大学 | 一种基于门控循环神经网络的无参考音视频质量评价方法 |
Citations (9)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
JPH01298890A (ja) | 1988-05-27 | 1989-12-01 | Nippon Telegr & Teleph Corp <Ntt> | 画像誤差評価方法 |
JPH0690468A (ja) | 1992-09-09 | 1994-03-29 | Nippon Telegr & Teleph Corp <Ntt> | 伝送路バースト符号誤り評価装置 |
US5446492A (en) | 1993-01-19 | 1995-08-29 | Wolf; Stephen | Perception-based video quality measurement system |
JPH09200805A (ja) * | 1996-01-11 | 1997-07-31 | Kokusai Denshin Denwa Co Ltd <Kdd> | ディジタル画像品質評価装置 |
JPH09307930A (ja) * | 1996-05-10 | 1997-11-28 | Kokusai Denshin Denwa Co Ltd <Kdd> | ディジタル画像品質評価装置 |
US6239834B1 (en) | 1996-01-11 | 2001-05-29 | Kokusai Denshin Denwa Co., Ltd. | Apparatus for evaluating digital picture quality |
US6704451B1 (en) | 1998-03-02 | 2004-03-09 | Koninklijke Kpn N.V. | Method and arrangement for objective assessment of video quality |
JP2004080177A (ja) | 2002-08-13 | 2004-03-11 | Nippon Telegr & Teleph Corp <Ntt> | 映像品質評価装置、映像品質評価方法、映像品質評価プログラム及びそのプログラムを記録した記録媒体 |
JP2004172753A (ja) * | 2002-11-18 | 2004-06-17 | Nippon Telegr & Teleph Corp <Ntt> | 映像・音声品質客観評価方法及び装置 |
Family Cites Families (7)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
US7010159B2 (en) * | 2001-04-25 | 2006-03-07 | Koninklijke Philips Electronics N.V. | Apparatus and method for combining random set of video features in a non-linear scheme to best describe perceptual quality of video sequences using heuristic search methodology |
US6822675B2 (en) * | 2001-07-03 | 2004-11-23 | Koninklijke Philips Electronics N.V. | Method of measuring digital video quality |
CN1157059C (zh) * | 2002-01-29 | 2004-07-07 | 北京工业大学 | 一种结合运动特征的视频质量评价方法 |
US7038710B2 (en) * | 2002-07-17 | 2006-05-02 | Koninklijke Philips Electronics, N.V. | Method and apparatus for measuring the quality of video data |
EP1445958A1 (en) * | 2003-02-05 | 2004-08-11 | STMicroelectronics S.r.l. | Quantization method and system, for instance for video MPEG applications, and computer program product therefor |
US7558320B2 (en) * | 2003-06-13 | 2009-07-07 | Microsoft Corporation | Quality control in frame interpolation with motion analysis |
GB0314162D0 (en) * | 2003-06-18 | 2003-07-23 | British Telecomm | Edge analysis in video quality assessment |
-
2005
- 2005-10-17 WO PCT/JP2005/019019 patent/WO2006043500A1/ja active Application Filing
- 2005-10-17 CA CA2582531A patent/CA2582531C/en active Active
- 2005-10-17 EP EP05793444A patent/EP1804519A4/en not_active Ceased
- 2005-10-17 KR KR1020077007552A patent/KR100858999B1/ko active IP Right Grant
- 2005-10-17 CN CN2005800335748A patent/CN101036399B/zh active Active
- 2005-10-17 US US11/663,679 patent/US8130274B2/en active Active
- 2005-10-17 JP JP2006542965A patent/JP4733049B2/ja active Active
Patent Citations (9)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
JPH01298890A (ja) | 1988-05-27 | 1989-12-01 | Nippon Telegr & Teleph Corp <Ntt> | 画像誤差評価方法 |
JPH0690468A (ja) | 1992-09-09 | 1994-03-29 | Nippon Telegr & Teleph Corp <Ntt> | 伝送路バースト符号誤り評価装置 |
US5446492A (en) | 1993-01-19 | 1995-08-29 | Wolf; Stephen | Perception-based video quality measurement system |
JPH09200805A (ja) * | 1996-01-11 | 1997-07-31 | Kokusai Denshin Denwa Co Ltd <Kdd> | ディジタル画像品質評価装置 |
US6239834B1 (en) | 1996-01-11 | 2001-05-29 | Kokusai Denshin Denwa Co., Ltd. | Apparatus for evaluating digital picture quality |
JPH09307930A (ja) * | 1996-05-10 | 1997-11-28 | Kokusai Denshin Denwa Co Ltd <Kdd> | ディジタル画像品質評価装置 |
US6704451B1 (en) | 1998-03-02 | 2004-03-09 | Koninklijke Kpn N.V. | Method and arrangement for objective assessment of video quality |
JP2004080177A (ja) | 2002-08-13 | 2004-03-11 | Nippon Telegr & Teleph Corp <Ntt> | 映像品質評価装置、映像品質評価方法、映像品質評価プログラム及びそのプログラムを記録した記録媒体 |
JP2004172753A (ja) * | 2002-11-18 | 2004-06-17 | Nippon Telegr & Teleph Corp <Ntt> | 映像・音声品質客観評価方法及び装置 |
Non-Patent Citations (1)
Title |
---|
See also references of EP1804519A4 * |
Cited By (15)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
KR100811835B1 (ko) | 2006-10-25 | 2008-03-10 | 주식회사 에스원 | 동영상 특징량 추출방법 및 이를 이용한 내용 기반 동영상검색방법 |
JP2008131658A (ja) * | 2006-11-22 | 2008-06-05 | Tektronix Inc | ビデオ・フレーム測定装置及び方法 |
KR100893609B1 (ko) | 2007-06-05 | 2009-04-20 | 주식회사 케이티 | 인간 시각 특성을 이용한 영상 품질 측정 장치 및 방법 |
JP2011510562A (ja) * | 2008-01-18 | 2011-03-31 | トムソン ライセンシング | 知覚上の品質を評価する方法 |
KR101188833B1 (ko) | 2008-03-21 | 2012-10-08 | 니폰덴신뎅와 가부시키가이샤 | 비디오 품질을 객관적으로 평가하기 위한 방법, 장치, 및 프로그램 |
WO2009116666A1 (ja) * | 2008-03-21 | 2009-09-24 | 日本電信電話株式会社 | 映像品質客観評価方法、映像品質客観評価装置、およびプログラム |
JP2009273127A (ja) * | 2008-04-30 | 2009-11-19 | Thomson Licensing | フレーム系列の歪んだバージョンの品質を評価する方法 |
JPWO2009133879A1 (ja) * | 2008-04-30 | 2011-09-01 | 日本電気株式会社 | 画像評価方法、画像評価システム及びプログラム |
US8699818B2 (en) | 2008-04-30 | 2014-04-15 | Nec Corporation | Method, system, and program for determining image quality based on pixel changes between image frames |
JP5708916B2 (ja) * | 2008-04-30 | 2015-04-30 | 日本電気株式会社 | 画像評価方法、画像評価システム及びプログラム |
US8422795B2 (en) | 2009-02-12 | 2013-04-16 | Dolby Laboratories Licensing Corporation | Quality evaluation of sequences of images |
JP2010206790A (ja) * | 2009-03-05 | 2010-09-16 | Tektronix Inc | 信号フィルタ処理方法及び装置並びにイメージ整列方法 |
JP2011015165A (ja) * | 2009-07-01 | 2011-01-20 | Nippon Telegr & Teleph Corp <Ntt> | 映像品質推定装置、システム、方法およびプログラム |
JP2015520548A (ja) * | 2012-04-23 | 2015-07-16 | 華為技術有限公司Huawei Technologies Co.,Ltd. | マルチメディア品質を評価する方法及び装置 |
US10116929B2 (en) | 2012-08-22 | 2018-10-30 | Huawei Technologies Co., Ltd. | Multimedia quality monitoring method, and device |
Also Published As
Publication number | Publication date |
---|---|
EP1804519A4 (en) | 2010-01-06 |
US8130274B2 (en) | 2012-03-06 |
CA2582531A1 (en) | 2006-04-27 |
CA2582531C (en) | 2013-03-12 |
CN101036399A (zh) | 2007-09-12 |
EP1804519A1 (en) | 2007-07-04 |
CN101036399B (zh) | 2010-05-05 |
JPWO2006043500A1 (ja) | 2008-05-22 |
US20080143837A1 (en) | 2008-06-19 |
JP4733049B2 (ja) | 2011-07-27 |
KR20070061855A (ko) | 2007-06-14 |
KR100858999B1 (ko) | 2008-09-18 |
Similar Documents
Publication | Publication Date | Title |
---|---|---|
WO2006043500A1 (ja) | 映像品質客観評価装置、評価方法およびプログラム | |
US7733372B2 (en) | Method and system for video quality measurements | |
US8442318B2 (en) | Method and apparatus for modifying a moving image sequence | |
US9672636B2 (en) | Texture masking for video quality measurement | |
US6931065B2 (en) | Apparatus and method for motion detection of image in digital video recording system using MPEG video compression | |
US20070263897A1 (en) | Image and Video Quality Measurement | |
JP4104593B2 (ja) | 画像処理システムの損失ブロック復旧装置及びその方法 | |
US9232118B1 (en) | Methods and systems for detecting video artifacts | |
KR101037940B1 (ko) | 압축 영상의 화질 검출장치 및 방법 | |
JP2004096752A (ja) | 動きベクトルを近似する方法、該方法を実行するコンピュータプログラム、該プログラムを記憶するデータ記憶媒体、該方法を実行するように適合した装置、及び該装置を備える受信機 | |
JP4514155B2 (ja) | 映像品質評価装置、方法およびプログラム | |
WO2010103112A1 (en) | Method and apparatus for video quality measurement without reference | |
Ong et al. | Perceptual quality metric for H. 264 low bit rate videos | |
KR20140101745A (ko) | 다수 아티팩트를 고려한 비디오 품질 측정 | |
Gurav et al. | Full-reference video quality assessment using structural similarity (SSIM) index | |
Narwaria et al. | Video quality assessment using temporal quality variations and machine learning | |
JP2004080177A (ja) | 映像品質評価装置、映像品質評価方法、映像品質評価プログラム及びそのプログラムを記録した記録媒体 | |
JP4837909B2 (ja) | 画像処理システムの損失ブロック特性判断装置及びその方法 | |
Cheng et al. | Reference-free objective quality metrics for MPEG-coded video | |
KR100608048B1 (ko) | 움직임 벡터 오류 정정 방법 및 그 장치와 이를 구현하기위한 프로그램이 기록된 기록 매체 | |
Yang et al. | Spatial-temporal video quality assessment based on two-level temporal pooling | |
WO2013159275A1 (en) | Perceived video quality estimation considering visual attention | |
Leszczuk et al. | Study of No-Reference Video Quality Metrics for HEVC Compression | |
Pelagotti et al. | Scalable motion compensated scan-rate upconversion | |
Yang et al. | A New Objective Quality Metric for Frame Interpolation using in Video Compression |
Legal Events
Date | Code | Title | Description |
---|---|---|---|
AK | Designated states |
Kind code of ref document: A1 Designated state(s): AE AG AL AM AT AU AZ BA BB BG BR BW BY BZ CA CH CN CO CR CU CZ DE DK DM DZ EC EE EG ES FI GB GD GE GH GM HR HU ID IL IN IS JP KE KG KM KP KR KZ LC LK LR LS LT LU LV LY MA MD MG MK MN MW MX MZ NA NG NI NO NZ OM PG PH PL PT RO RU SC SD SE SG SK SL SM SY TJ TM TN TR TT TZ UA UG US UZ VC VN YU ZA ZM ZW |
|
AL | Designated countries for regional patents |
Kind code of ref document: A1 Designated state(s): BW GH GM KE LS MW MZ NA SD SL SZ TZ UG ZM ZW AM AZ BY KG KZ MD RU TJ TM AT BE BG CH CY CZ DE DK EE ES FI FR GB GR HU IE IS IT LT LU LV MC NL PL PT RO SE SI SK TR BF BJ CF CG CI CM GA GN GQ GW ML MR NE SN TD TG |
|
121 | Ep: the epo has been informed by wipo that ep was designated in this application | ||
WWE | Wipo information: entry into national phase |
Ref document number: 11663679 Country of ref document: US Ref document number: 2582531 Country of ref document: CA |
|
WWE | Wipo information: entry into national phase |
Ref document number: 2005793444 Country of ref document: EP |
|
WWE | Wipo information: entry into national phase |
Ref document number: 200580033574.8 Country of ref document: CN Ref document number: 1020077007552 Country of ref document: KR Ref document number: 2006542965 Country of ref document: JP |
|
NENP | Non-entry into the national phase |
Ref country code: DE |
|
WWP | Wipo information: published in national office |
Ref document number: 2005793444 Country of ref document: EP |