JP5450279B2 - Image quality objective evaluation apparatus, method and program - Google Patents


Info

Publication number
JP5450279B2
JP5450279B2 (application JP2010137758A)
Authority
JP
Japan
Prior art keywords
video
feature amount
image
difference
frames
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Active
Application number
JP2010137758A
Other languages
Japanese (ja)
Other versions
JP2012004840A (en)
Inventor
Jun Okamoto (淳 岡本)
Original Assignee
Nippon Telegraph and Telephone Corporation (日本電信電話株式会社)
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Nippon Telegraph and Telephone Corporation
Priority to JP2010137758A
Publication of JP2012004840A
Application granted
Publication of JP5450279B2
Application status: Active
Anticipated expiration

Description

  The present invention relates to a video quality objective evaluation apparatus, method, and program, and more particularly to a video quality objective evaluation apparatus, method, and program for objectively evaluating video quality in services such as IPTV (Internet Protocol TeleVision), video distribution to mobile phones, and videophone services.

  With higher-speed, higher-bandwidth communication lines and a growing variety of terminals that receive broadcast waves, an environment in which video services can be enjoyed on many kinds of terminals is becoming available.

  In video services, degradation such as block distortion and blur occurs in the video. This is because the video is encoded to fit the generally limited transmission band, and because bit errors and packet loss occur when the quality of the transmission path deteriorates.

  Therefore, in order to confirm that a video service is provided with good quality, it is important to measure the quality of the video experienced by the user before or during service provision, and to confirm and monitor that the quality of the video provided to the user is good.

  Therefore, a video quality evaluation technique that can appropriately evaluate the video quality experienced by the user is necessary.

  Methods for evaluating video quality include subjective evaluation methods (for example, see Non-Patent Document 1) and objective evaluation methods (for example, see Non-Patent Document 2).

  In the subjective evaluation method, a large number of users must evaluate videos in a facility where the evaluation environment (room illumination, room noise, etc.) can be reproduced. This makes evaluation costly and time-consuming, so the method is not suitable for evaluating quality in real time. It is therefore desirable to develop a video quality evaluation method that derives physical feature amounts from the video signal and outputs a video quality evaluation value.

  Conventional objective evaluation methods include techniques that take the video signals (pixel information) of the reference video and the degraded video as input, compare them, and derive video quality from features that affect it (for example, see Non-Patent Document 2).

  However, the technique of Non-Patent Document 2 presupposes that the resolution and format of the reference video and the degraded video match, and quality evaluation cannot be performed when they do not. Specifically, in the 1Seg digital terrestrial television broadcasting service for mobile terminals, the source video is in HDTV (High Definition TeleVision) format but is broadcast at QVGA (Quarter Video Graphics Array) size (320 x 240 pixels), and the resolution and aspect ratio of the displayed video further differ by terminal manufacturer and model. Because the resolution of the reference video thus differs from that of the degraded video, it is difficult to evaluate the quality of the video displayed on the mobile terminal.

  Here, one could also consider matching the reference video and the degraded video by applying signal processing that aligns their resolution and format. However, this approach has the problem that the applied signal processing itself affects the result, so proper matching cannot be achieved and quality cannot be estimated.

  The present invention has been made in view of the above points, and its object is to provide a reference type video quality objective evaluation apparatus, method, and program capable of estimating a highly accurate video quality evaluation value even when the resolution, format, and display rate of the reference video and the degraded video differ.

  To achieve this object, the present invention does not directly compare the pixel values of the reference video and the degraded video as in the prior art. Instead, it derives a relative degradation feature amount that extracts the difference in the amount of video change between preceding and succeeding frames of the reference video and the degraded video, and uses this feature amount, thereby providing a video quality objective evaluation apparatus, method, and program capable of evaluating video quality even when the resolution and the format differ.

The present invention (Claim 1) is a video quality objective evaluation apparatus for objectively evaluating video quality, comprising:
relative degradation feature amount extraction means for extracting a relative degradation feature amount obtained by normalizing, as a pixel average, each of the inter-frame difference feature amount of the reference video and the inter-frame difference feature amount of the degraded video; and
video quality estimation means for deriving a video quality evaluation value based on the relative degradation feature amount extracted by the relative degradation feature amount extraction means.

Further, in the present invention (Claim 2), the relative degradation feature amount extraction means of Claim 1 has any one of:
first degradation feature amount comparison means for synchronizing the reference video and the degraded video using the timing at which the changes in the inter-frame difference feature amounts of the reference video and the degraded video both increase;
or
second degradation feature amount comparison means for comparing the amount of change for each corresponding frame when the resolutions of the reference video and the degraded video differ;
or
third degradation feature amount comparison means for comparing the amount of change per unit time when the display frame rates of the reference video and the degraded video differ;
or
fourth degradation feature amount comparison means that uses, as the inter-frame difference feature amount of the reference video and the degraded video, the square of the difference between the pixel average of the inter-frame difference feature amount of the reference video and the pixel average of the inter-frame difference feature amount of the degraded video, the ratio of these pixel values, a value obtained by normalizing the pixel difference amount by the pixel value of the reference video, or a value obtained by statistically processing these.

Further, according to the present invention (Claim 3), the relative degradation feature amount extraction means of Claim 1 includes:
reference video feature amount extraction means for deriving the inter-frame difference feature amount of the reference video; and
degraded video feature amount extraction means for deriving the inter-frame difference feature amount of the degraded video,
and the reference video feature amount extraction means and the degraded video feature amount extraction means
use pixel thinning or information on a specific area when deriving the inter-frame difference feature amounts of the reference video and the degraded video.

Further, according to the present invention (Claim 4), the relative degradation feature amount extraction means of Claim 1 includes
encoding processing means for outputting, as a new reference video, a video obtained by applying encoder processing to the input reference video.

Further, in the present invention (Claim 5), the video quality estimation means of Claim 1
uses information obtained either by reading from storage means that stores parameters of the evaluation target video extracted in advance, or by directly comparing pixel information between the reference video and the degraded video.

The present invention (Claim 6) is a video quality objective evaluation method for objectively evaluating video quality, comprising:
a relative degradation feature amount extraction step of extracting a relative degradation feature amount obtained by normalizing, as a pixel average, each of the inter-frame difference feature amount of the reference video and the inter-frame difference feature amount of the degraded video; and
a video quality estimation step of deriving a video quality evaluation value based on the relative degradation feature amount extracted in the relative degradation feature amount extraction step.

Further, according to the present invention (Claim 7), the relative degradation feature amount extraction step of Claim 6 performs any one of:
a first degradation feature amount comparison step of synchronizing the reference video and the degraded video using the timing at which the changes in the inter-frame difference feature amounts of the reference video and the degraded video both increase;
or
a second degradation feature amount comparison step of comparing the amount of change for each corresponding frame when the resolutions of the reference video and the degraded video differ;
or
a third degradation feature amount comparison step of comparing the amount of change per unit time when the display frame rates of the reference video and the degraded video differ;
or
a fourth degradation feature amount comparison step that uses, as the inter-frame difference feature amount of the reference video and the degraded video, the square of the difference between the pixel average of the inter-frame difference feature amount of the reference video and the pixel average of the inter-frame difference feature amount of the degraded video, the ratio of these pixel values, a value obtained by normalizing the pixel difference amount by the pixel value of the reference video, or a value obtained by statistically processing these.

The present invention (Claim 8) is characterized in that the relative degradation feature amount extraction step of Claim 6 performs:
a reference video feature amount extraction step of deriving the inter-frame difference feature amount of the reference video; and
a degraded video feature amount extraction step of deriving the inter-frame difference feature amount of the degraded video,
and that the reference video feature amount extraction step and the degraded video feature amount extraction step
use pixel thinning or information on a specific area when deriving the inter-frame difference feature amounts of the reference video and the degraded video.

Further, according to the present invention (Claim 9), the relative degradation feature amount extraction step of Claim 6
performs a step of outputting, as a new reference video, a video obtained by applying encoder processing to the input reference video.

Further, in the present invention (Claim 10), the video quality estimation step of Claim 6
uses information obtained either by reading from storage means that stores parameters of the evaluation target video extracted in advance, or by directly comparing pixel information between the reference video and the degraded video.

  Moreover, the present invention (Claim 11) is a video quality objective evaluation program for causing a computer to function as each means constituting the video quality objective evaluation apparatus of any one of Claims 1 to 5.

  As described above, according to the present invention, even when the resolution, format, and display rate of the reference video and the degraded video differ, using the relative degradation feature amount that extracts the difference in the amount of video change between preceding and succeeding frames of the reference video and the degraded video overcomes the difficulty of evaluating quality by direct pixel comparison, and makes it possible to estimate the video quality value of a video service simply and with high accuracy.

  Accordingly, since the present invention allows a video service provider to monitor the video quality of the video that users actually view, the provider can check whether the service is being provided to users with appropriate quality.

  Thereby, the provider of the video service can appropriately grasp and manage the actual quality of the service before and during its provision.

FIG. 1 is a block diagram of the video quality objective evaluation apparatus in the first embodiment of the present invention.
FIG. 2 is an example of the steady quality DB in the first embodiment of the present invention.
FIG. 3 is a diagram showing the inter-frame difference feature amount in the first embodiment of the present invention.
FIG. 4 shows a method for synchronizing the reference/degraded inter-frame difference feature amounts in the first embodiment of the present invention.
FIG. 5 shows a unit-time comparison method for the reference/degraded inter-frame difference feature amounts in the first embodiment of the present invention.
FIG. 6 is a block diagram of the video quality objective evaluation apparatus in the second embodiment of the present invention.

  Embodiments of the present invention will be described below with reference to the drawings.

[First Embodiment]
As shown in FIG. 1, the video quality objective evaluation apparatus according to this embodiment derives a video quality evaluation value 60 using a reference video signal 10 and a degraded video signal 20, realizing a full-reference type video quality objective evaluation apparatus.

  FIG. 1 shows a video quality objective evaluation apparatus according to the first embodiment of the present invention, which includes a relative deterioration feature quantity extraction unit 100A, a quality estimation unit 200, and a steady quality database 300.

  As shown in FIG. 2, the steady quality database 300 stores steady quality data including encoder conditions, steady quality values, coefficients, and evaluation formulas.

  The relative deterioration feature amount extraction unit 100A includes a reference image feature amount extraction unit 110, a deterioration image feature amount extraction unit 120, and a deterioration feature amount comparison unit 130.

  When the reference video signal 10 is input, the reference video feature amount extraction unit 110 extracts inter-frame difference information as illustrated in FIG. 3 and outputs the reference inter-frame difference feature amount 15 to the degradation feature amount comparison unit 130. FIG. 3A shows the difference between reference video frames, and FIG. 3B shows the difference between degraded video frames. Here, the following difference frame using luminance values is used as the reference inter-frame difference feature amount 15, although color difference information or RGB information may be used instead.

ΔR(i, j, k) represents the difference luminance value of the reference video at pixel (i, j) in the k-th frame, and R(i, j, k) represents the luminance value of the reference video at pixel (i, j) in the k-th frame.

  When the degraded video signal 20 is input, the degraded video feature amount extraction unit 120 outputs the degraded inter-frame difference feature amount 25 to the degradation feature amount comparison unit 130 by the same processing as the reference video feature amount extraction unit 110. Here, the following difference frame using luminance values is used as the degraded inter-frame difference feature amount 25, although color difference information or RGB information may be used instead.

Here, ΔP(i, j, k) represents the difference luminance value of the degraded video at pixel (i, j) in the k-th frame, and P(i, j, k) represents the luminance value of the degraded video at pixel (i, j) in the k-th frame.
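The difference-frame equations themselves appear only as images in the published patent; a plausible reconstruction, consistent with the definitions of ΔR and ΔP above, is:

```latex
\Delta R(i,j,k) = \left| R(i,j,k) - R(i,j,k-1) \right|, \qquad
\Delta P(i,j,k) = \left| P(i,j,k) - P(i,j,k-1) \right|
```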

  Note that, in the derivation of inter-frame difference information performed by the reference video feature quantity extraction unit 110 and the degraded video feature quantity extraction unit 120, it is assumed that all pixels are used, but the ease of implementation can be reduced by reducing the processing amount. In order to promote the securing and the improvement of the processing speed and to achieve real time, pixel thinning or pixels in a specific video area can be used.
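As a minimal sketch of this feature extraction (function name, the use of NumPy, and the thinning-by-stride scheme are my own illustration, not from the patent):

```python
import numpy as np

def interframe_diff_feature(frames, step=1):
    """Mean absolute luminance difference between consecutive frames.

    frames: sequence of 2-D arrays (luminance planes).
    step:   pixel-thinning stride; step=1 uses all pixels, a larger
            stride reduces the processing amount as described above.
    """
    feats = []
    for prev, cur in zip(frames, frames[1:]):
        diff = np.abs(cur[::step, ::step].astype(float)
                      - prev[::step, ::step].astype(float))
        feats.append(float(diff.mean()))  # pixel average: size-independent
    return feats
```

Because each value is a pixel average, the same function applies to reference and degraded videos of different resolutions.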

  The deterioration feature amount comparison unit 130 outputs the relative deterioration feature amount 30 by comparing the reference inter-frame difference feature amount 15 and the deteriorated inter-frame difference feature amount 25.

  The degradation feature amount comparison unit 130 needs to synchronize the reference video with the degraded video in order to compare the reference inter-frame difference feature amount 15 and the degraded inter-frame difference feature amount 25. For this reason, as shown in FIG. 4, synchronization is performed using the timing at which a large change occurs in the degradation feature amounts of the preceding and succeeding frames of the reference video and the degraded video, for example due to a scene change. FIG. 4A shows peak A of the feature amount of the reference video, and FIG. 4B shows peak B of the feature amount of the degraded video. In the figure, peak A of the feature amount caused by a scene change in the reference video is synchronized with peak B of the feature amount caused by the corresponding scene change in the degraded video.
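The peak-based synchronization can be sketched as follows (the function name and the single-largest-peak heuristic are illustrative assumptions; the patent only specifies aligning on timings where both feature sequences change sharply):

```python
def align_by_peak(ref_feat, deg_feat):
    """Return the frame offset of the degraded sequence relative to the
    reference, aligning their largest peaks (e.g. a scene change)."""
    ref_peak = max(range(len(ref_feat)), key=lambda k: ref_feat[k])
    deg_peak = max(range(len(deg_feat)), key=lambda k: deg_feat[k])
    return deg_peak - ref_peak
```

A positive return value means the degraded sequence lags the reference by that many frames.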

  As shown in FIG. 5, the reference inter-frame difference feature amount 15 and the degraded inter-frame difference feature amount 25 are compared in units of the amount of change per unit time, for example every 1/15 second. Accordingly, evaluation can be performed even when the display frame rates of the reference video and the degraded video differ, or when frames are lost due to packet loss or the like.
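The per-unit-time comparison amounts to binning each feature sequence by wall-clock time before comparing. A sketch under that assumption (names are mine):

```python
def change_per_unit_time(feat, fps, window=1.0 / 15.0):
    """Sum inter-frame difference features over each `window`-second bin
    so sequences with different frame rates become comparable."""
    per_bin = max(1, round(fps * window))  # frames falling in one bin
    return [sum(feat[i:i + per_bin]) for i in range(0, len(feat), per_bin)]
```

Two videos at, say, 30 fps and 15 fps then yield bin sequences of the same length that can be compared bin by bin.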

  Further, as the relative degradation feature amount 30, the following feature amounts (1) to (3) and combinations thereof are derived. These feature amounts are normalized by taking the inter-frame difference values as pixel averages, so that the influence of the difference in image size between the reference video and the degraded video is taken into account.

  (1) Difference amount between previous and next frame pixel values (MSE):

However, the target is the kth frame. Further, the resolution of the reference image is X pixels in the horizontal direction and Y pixels in the vertical direction, and the resolution of the deteriorated image is X ′ pixels in the horizontal direction and Y ′ pixels in the vertical direction.
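The equation for feature (1) is an image in the published patent; a plausible reconstruction from the description (the square of the difference between the pixel averages of the reference and degraded inter-frame differences) is:

```latex
\alpha_{\mathrm{MSE}}(k) =
\left(
  \frac{1}{XY}   \sum_{i=1}^{X}  \sum_{j=1}^{Y}  \Delta R(i,j,k)
  \;-\;
  \frac{1}{X'Y'} \sum_{i=1}^{X'} \sum_{j=1}^{Y'} \Delta P(i,j,k)
\right)^{2}
```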

  (2) Ratio of pixel values of previous and next frames:

However, the target is the kth frame. Further, the resolution of the reference image is X pixels in the horizontal direction and Y pixels in the vertical direction, and the resolution of the deteriorated image is X ′ pixels in the horizontal direction and Y ′ pixels in the vertical direction.
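The equation for feature (2) is likewise an image in the published patent; a plausible form per the description (the ratio of the pixel-averaged frame differences) is:

```latex
\alpha_{\mathrm{ratio}}(k) =
\frac{\dfrac{1}{X'Y'} \sum_{i=1}^{X'} \sum_{j=1}^{Y'} \Delta P(i,j,k)}
     {\dfrac{1}{XY}   \sum_{i=1}^{X}  \sum_{j=1}^{Y}  \Delta R(i,j,k)}
```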

  (3) Signal to noise ratio (PSNR) of the front and rear frames:

Here, PSNR is a value obtained by normalizing the pixel difference amount by the pixel value of the reference video, and MAX is the maximum value that ΔR can take, which depends on how many bits represent one luminance value. The signal-to-noise ratio over multiple frames is obtained as the average of the per-frame signal-to-noise ratios. Alternatively, instead of deriving it for each frame as in Equation (5), the multi-frame PSNR can be derived by first computing the difference sum over the multiple frames, as in Equation (6).

Here, M is the number of frames to be measured, and MAX is the maximum value that ΔR can take, which depends on how many bits represent one luminance value.
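Equations (5) and (6) appear only as images in the published patent; a plausible reconstruction consistent with the surrounding description (per-frame PSNR of the pixel-averaged frame-difference error, and its M-frame variant computed from the pooled error) is:

```latex
\mathrm{PSNR}(k) = 10 \log_{10}
\frac{\mathrm{MAX}^{2}}
     {\left(
        \frac{1}{XY}   \sum_{i,j} \Delta R(i,j,k)
        - \frac{1}{X'Y'} \sum_{i,j} \Delta P(i,j,k)
      \right)^{2}}
\tag{5}
```

```latex
\mathrm{PSNR} = 10 \log_{10}
\frac{\mathrm{MAX}^{2}}
     {\dfrac{1}{M} \sum_{k=1}^{M}
      \left(
        \frac{1}{XY}   \sum_{i,j} \Delta R(i,j,k)
        - \frac{1}{X'Y'} \sum_{i,j} \Delta P(i,j,k)
      \right)^{2}}
\tag{6}
```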

  Further, as the feature amount, statistics of these values, such as the variance, maximum value, or minimum value, can also be used.

  The quality estimation unit 200 receives the relative degradation feature amount 30 from the degradation feature amount comparison unit 130, sends the known encoding conditions and service conditions of the degraded video signal to the steady quality database 300, receives the steady quality data 40 including the steady quality value and coefficients, and performs the following calculation.

Here, Q is the video quality evaluation value 60, α_S is the S-th relative degradation feature amount, and the maximum value of S is P. β is the steady quality value, and a, b, and c are coefficients.
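The evaluation formula itself is an image in the published patent. One illustrative form consistent with the description is sketched below; the exact arrangement of the coefficients a, b, and c is an assumption, not the published equation:

```latex
Q = \beta + a \sum_{s=1}^{P} \alpha_{s}^{\,b} + c
```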

  Further, the steady quality database 300 may return only the steady quality value, or it may pass the evaluation formula itself. Furthermore, as long as the encoding scheme remains unchanged, the request for the steady quality data 40 need not be made every time.

[Second Embodiment]
Hereinafter, a second embodiment will be described with reference to the drawings.

  In the following, only the parts different from the first embodiment will be described.

  FIG. 6 shows the configuration of the video quality objective evaluation apparatus in the second embodiment of the present invention. In the figure, the same components as those in FIG. 1 are denoted by the same reference numerals, and their description is omitted.

  The video quality objective evaluation apparatus according to the present embodiment includes a relative deterioration feature amount extraction unit 100B, a quality estimation unit 200, and a steady quality extraction unit 400.

  The relative deterioration feature amount extraction unit 100B includes an encoding processing unit 105, a reference image feature amount extraction unit 110, a deterioration image feature amount extraction unit 120, and a deterioration feature amount comparison unit 130.

  When the reference video signal 10 is input, the encoding processing unit 105 outputs an encoded reference video signal 12 to the reference video feature amount extraction unit 110. This addresses the problem that, if the difference feature amount of the reference video itself is used as the comparison target for the degraded video, the influence of degradation due to encoding may not be captured appropriately. Although the encoding processing unit 105 performs an encoding process here, an appropriate filtering process corresponding to the encoding may be used instead.

  When the encoding process reference video signal 12 is input, the reference video feature quantity extraction unit 110 outputs the reference interframe difference feature quantity 15 by performing the same processing as in the first embodiment.

  The steady quality extraction unit 400 derives the steady quality data 50 by directly comparing the pixel information of the reference video signal 10 and the degraded video signal 20, in order to derive a reference level for the relative degradation feature amount. Although direct quality estimation by this method is difficult because of the influence of video resolution, average quality information can still be used. The feature amounts used by the steady quality extraction unit 400 are not the reference inter-frame difference feature amount 15 and the degraded inter-frame difference feature amount 25 that are the original inputs of the degradation feature amount comparison unit 130, but the reference video signal 10 and the degraded video signal 20 themselves, from which the steady quality extraction unit 400 directly derives the pixel difference. In addition, when the video resolution and the display frame rate differ, the steady quality extraction unit 400 performs processing to match the resolution and the rate.
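The resolution-matching step followed by direct pixel comparison can be sketched as below. The average-pooling downscale, the PSNR pooling, and all names are my own illustration under the assumption of integer scale factors; the patent does not prescribe a specific resampling method:

```python
import numpy as np

def block_downscale(img, fy, fx):
    """Average-pool a luminance plane by integer factors (fy, fx)."""
    h, w = img.shape
    img = img[:h - h % fy, :w - w % fx]  # crop so factors divide evenly
    return img.reshape(h // fy, fy, w // fx, fx).mean(axis=(1, 3))

def steady_psnr(ref, deg, max_val=255.0):
    """Steady quality by direct pixel comparison after matching resolutions."""
    fy = ref.shape[0] // deg.shape[0]
    fx = ref.shape[1] // deg.shape[1]
    ref_small = block_downscale(ref, fy, fx)
    mse = float(np.mean((ref_small - deg) ** 2))
    return float("inf") if mse == 0.0 else 10.0 * np.log10(max_val ** 2 / mse)
```

Averaging the per-frame result over a sequence would then give the kind of average quality information the text describes.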

  Note that the components of the video quality objective evaluation apparatus in the first and second embodiments described above can be constructed as a program and installed in a computer used as the video quality objective evaluation apparatus, or distributed via a network.

  Further, the constructed program can be stored in a storage medium such as a hard disk, a flexible disk, or a CD-ROM, and installed in a computer or distributed.

  The present invention is not limited to the above-described embodiment, and various modifications and applications can be made within the scope of the claims.

  The present invention can be applied to estimating the video quality of transmitted video at the head end of a video distribution network, to quality confirmation before transmission in a VOD service, and to estimation and confirmation after reception on the terminal side.

10 Reference video signal
12 Encoded reference video signal
15 Reference inter-frame difference feature amount
20 Degraded video signal
25 Degraded inter-frame difference feature amount
30 Relative degradation feature amount
40 Steady quality data request
50 Steady quality data
60 Video quality evaluation value
100A, 100B Relative degradation feature amount extraction unit
105 Encoding processing unit
110 Reference video feature amount extraction unit
120 Degraded video feature amount extraction unit
130 Degradation feature amount comparison unit
200 Quality estimation unit
300 Steady quality DB
400 Steady quality extraction unit

Claims (11)

  1. A video quality objective evaluation apparatus for objectively evaluating video quality, comprising:
    relative degradation feature amount extraction means for extracting a relative degradation feature amount obtained by normalizing, as a pixel average, each of the inter-frame difference feature amount of a reference video and the inter-frame difference feature amount of a degraded video; and
    video quality estimation means for deriving a video quality evaluation value based on the relative degradation feature amount extracted by the relative degradation feature amount extraction means.
  2. The video quality objective evaluation apparatus according to claim 1, wherein the relative degradation feature amount extraction means has any one of:
    first degradation feature amount comparison means for synchronizing the reference video and the degraded video using the timing at which the changes in the inter-frame difference feature amounts of the reference video and the degraded video both increase;
    second degradation feature amount comparison means for comparing the amount of change for each corresponding frame when the resolutions of the reference video and the degraded video differ;
    third degradation feature amount comparison means for comparing the amount of change per unit time when the display frame rates of the reference video and the degraded video differ; and
    fourth degradation feature amount comparison means that uses, as the inter-frame difference feature amount of the reference video and the degraded video, the square of the difference between the pixel average of the inter-frame difference feature amount of the reference video and the pixel average of the inter-frame difference feature amount of the degraded video, the ratio of these pixel values, a value obtained by normalizing the pixel difference amount by the pixel value of the reference video, or a value obtained by statistically processing these.
  3. The video quality objective evaluation apparatus according to claim 1, wherein the relative degradation feature amount extraction means includes:
    reference video feature amount extraction means for deriving the inter-frame difference feature amount of the reference video; and
    degraded video feature amount extraction means for deriving the inter-frame difference feature amount of the degraded video,
    and the reference video feature amount extraction means and the degraded video feature amount extraction means use pixel thinning or information on a specific area when deriving the inter-frame difference feature amounts of the reference video and the degraded video.
  4. The video quality objective evaluation apparatus according to claim 1, wherein the relative degradation feature amount extraction means further comprises encoding processing means for outputting, as a new reference video, a video obtained by applying encoder processing to the input reference video.
  5. The video quality objective evaluation apparatus according to claim 1, wherein the video quality estimation means uses information obtained either by reading from storage means that stores parameters of the evaluation target video extracted in advance, or by directly comparing pixel information between the reference video and the degraded video.
  6. A video quality objective evaluation method for objectively evaluating video quality, comprising:
    a relative degradation feature amount extraction step of extracting a relative degradation feature amount obtained by normalizing, as a pixel average, each of the inter-frame difference feature amount of a reference video and the inter-frame difference feature amount of a degraded video; and
    a video quality estimation step of deriving a video quality evaluation value based on the relative degradation feature amount extracted in the relative degradation feature amount extraction step.
  7. The video quality objective evaluation method according to claim 6, wherein the relative degradation feature amount extraction step performs any one of:
    a first degradation feature amount comparison step of synchronizing the reference video and the degraded video using the timing at which the inter-frame difference feature amounts of both videos increase;
    a second degradation feature amount comparison step of comparing the amount of change for each corresponding frame when the reference video and the degraded video differ in resolution;
    a third degradation feature amount comparison step of comparing the amount of change per unit time when the reference video and the degraded video differ in display frame rate; or
    a fourth degradation feature amount comparison step that uses, as the inter-frame difference feature amount of the reference video and the degraded video, the square of the difference between the pixel average of the inter-frame difference feature amount of the reference video and that of the degraded video, a ratio of the pixel values of the feature amounts, a value obtained by averaging the pixel differences by the pixel values of the reference video, or a value obtained by statistically processing these.
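The third comparison step, which handles videos with different display frame rates, can be read as aggregating per-frame-pair changes over a common time window so the totals become comparable. The sketch below is one plausible interpretation; the window length, the use of a sum as the aggregate, and the function name are assumptions not specified by the claims.

```python
import numpy as np

def change_per_unit_time(diff_feature, fps, window_s=1.0):
    """Aggregate per-frame-pair change amounts into fixed-duration totals.

    `diff_feature` holds one change amount per consecutive frame pair;
    `fps` is the video's display frame rate. Summing over `window_s`
    seconds makes, e.g., a 30 fps reference comparable to a 15 fps
    degraded video over the same wall-clock interval.
    """
    per_window = int(round(fps * window_s))
    n = (len(diff_feature) // per_window) * per_window  # drop the tail
    return diff_feature[:n].reshape(-1, per_window).sum(axis=1)
```

With this aggregation, the two videos yield one value per second each, and those per-second values can be compared directly regardless of frame rate.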
  8. The video quality objective evaluation method according to claim 6, wherein the relative degradation feature amount extraction step performs:
    a reference video feature amount extraction step of deriving a difference feature amount between frames of the reference video; and
    a degraded video feature amount extraction step of deriving a difference feature amount between frames of the degraded video,
    and wherein the reference video feature amount extraction step and the degraded video feature amount extraction step use pixel decimation or information on a specific region when deriving the inter-frame difference feature amounts of the reference video and the degraded video.
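The pixel decimation mentioned in claim 8 can be sketched as computing the inter-frame difference feature on a subsampled pixel grid. The stride value and function name below are illustrative assumptions; the claims do not fix a decimation factor.

```python
import numpy as np

def decimated_diff_feature(frames, step=4):
    """Inter-frame difference feature using only every `step`-th pixel
    in each spatial dimension (pixel decimation), cutting the cost of
    the feature computation by roughly a factor of step**2.
    """
    sub = frames[:, ::step, ::step].astype(np.float64)
    return np.abs(np.diff(sub, axis=0)).mean(axis=(1, 2))
```

Restricting the slices to a fixed region of interest instead of a stride would implement the "information on a specific region" alternative in the same spirit.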
  9. The video quality objective evaluation method according to claim 6, wherein the relative degradation feature amount extraction step performs a step of outputting, as a new reference video, a video obtained by applying encoder processing to the input reference video.
  10. The video quality objective evaluation method according to claim 6, wherein the video quality estimation step uses information obtained either by reading, from a storage means, previously extracted parameters of the video under evaluation, or by directly comparing pixel information between the reference video and the degraded video.
  11. A video quality objective evaluation program for causing a computer to function as each means constituting the video quality objective evaluation apparatus according to any one of claims 1 to 5.
JP2010137758A 2010-06-16 2010-06-16 Image quality objective evaluation apparatus, method and program Active JP5450279B2 (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
JP2010137758A JP5450279B2 (en) 2010-06-16 2010-06-16 Image quality objective evaluation apparatus, method and program

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
JP2010137758A JP5450279B2 (en) 2010-06-16 2010-06-16 Image quality objective evaluation apparatus, method and program

Publications (2)

Publication Number Publication Date
JP2012004840A JP2012004840A (en) 2012-01-05
JP5450279B2 true JP5450279B2 (en) 2014-03-26

Family

ID=45536324

Family Applications (1)

Application Number Title Priority Date Filing Date
JP2010137758A Active JP5450279B2 (en) 2010-06-16 2010-06-16 Image quality objective evaluation apparatus, method and program

Country Status (1)

Country Link
JP (1) JP5450279B2 (en)

Families Citing this family (4)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
JP6236906B2 (en) * 2013-06-20 2017-11-29 富士通株式会社 Evaluation apparatus, evaluation method, and evaluation program
JP6078431B2 (en) * 2013-07-24 2017-02-08 日本電信電話株式会社 Video quality estimation apparatus, video quality estimation method and program
JP5860506B2 (en) * 2014-06-17 2016-02-16 日本電信電話株式会社 Evaluation video analysis apparatus, method and program
JP6431449B2 (en) * 2015-07-01 2018-11-28 日本電信電話株式会社 Video matching apparatus, video matching method, and program

Family Cites Families (6)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
EP1622395B1 (en) * 2003-08-22 2015-03-18 Nippon Telegraph And Telephone Corporation Device, method and program for quality evaluation of a video signal after its transmission or encoding
WO2006099743A1 (en) * 2005-03-25 2006-09-28 Algolith Inc. Apparatus and method for objective assessment of dct-coded video quality with or without an original video sequence
JP2007110189A (en) * 2005-10-11 2007-04-26 Shinano Kenshi Co Ltd Image quality evaluation apparatus and image quality evaluation method
JP4594389B2 (en) * 2006-05-09 2010-12-08 日本電信電話株式会社 Video quality estimation apparatus, method, and program
WO2009133879A1 (en) * 2008-04-30 2009-11-05 日本電気株式会社 Image evaluation method, image evaluation system and program
JP4787303B2 (en) * 2008-09-25 2011-10-05 日本電信電話株式会社 Video quality estimation apparatus, method, and program

Also Published As

Publication number Publication date
JP2012004840A (en) 2012-01-05

Similar Documents

Publication Publication Date Title
US8428342B2 (en) Apparatus and method for providing three dimensional media content
Chikkerur et al. Objective video quality assessment methods: A classification, review, and performance comparison
JP5698318B2 (en) Feature optimization and reliability prediction for audio and video signature generation and detection
US8253803B2 (en) Video quality assessing apparatus
CN100461864C (en) Multimedia video communication objective quality appraising method based on digital watermark
Susstrunk et al. Color image quality on the internet
Yasakethu et al. Quality analysis for 3D video using 2D video quality models
Winkler et al. The evolution of video quality measurement: From PSNR to hybrid metrics
US20100053300A1 (en) Method And Arrangement For Video Telephony Quality Assessment
US7593061B2 (en) Method and apparatus for measuring and/or correcting audio/visual synchronization
Winkler et al. Perceptual video quality and blockiness metrics for multimedia streaming applications
US8531531B2 (en) Audio-visual quality estimation
US20080025400A1 (en) Objective perceptual video quality evaluation apparatus
JP4486130B2 (en) Video communication quality estimation apparatus, method, and program
US7873727B2 (en) System and method for evaluating streaming multimedia quality
KR100935650B1 (en) Video Quality Estimation Apparatus, Method, and Computer-Readable Recording Medium for Recording Program
KR100977694B1 (en) Temporal quality metric for video coding
EP2564590B1 (en) Method and apparatus for assessing quality of video stream
Ries et al. Video Quality Estimation for Mobile H.264/AVC Video Streaming.
Huynh-Thu et al. The accuracy of PSNR in predicting video quality for different video scenes and frame rates
Raake et al. TV-model: Parameter-based prediction of IPTV quality
KR101188833B1 (en) Video Quality Objective Assessment Method, Video Quality Objective Assessment Apparatus, and Program
Kuipers et al. Techniques for measuring quality of experience
Barkowsky et al. Temporal trajectory aware video quality measure
Joveluro et al. Perceptual video quality metric for 3d video quality assessment

Legal Events

Date Code Title Description
A621 Written request for application examination

Effective date: 20121005

Free format text: JAPANESE INTERMEDIATE CODE: A621

A977 Report on retrieval

Effective date: 20130426

Free format text: JAPANESE INTERMEDIATE CODE: A971007

A131 Notification of reasons for refusal

Free format text: JAPANESE INTERMEDIATE CODE: A131

Effective date: 20130521

A521 Written amendment

Free format text: JAPANESE INTERMEDIATE CODE: A523

Effective date: 20130709

A131 Notification of reasons for refusal

Effective date: 20130924

Free format text: JAPANESE INTERMEDIATE CODE: A131

RD02 Notification of acceptance of power of attorney

Free format text: JAPANESE INTERMEDIATE CODE: A7422

Effective date: 20131004

A521 Written amendment

Effective date: 20131108

Free format text: JAPANESE INTERMEDIATE CODE: A523

TRDD Decision of grant or rejection written
A01 Written decision to grant a patent or to grant a registration (utility model)

Free format text: JAPANESE INTERMEDIATE CODE: A01

Effective date: 20131217

A61 First payment of annual fees (during grant procedure)

Effective date: 20131225

Free format text: JAPANESE INTERMEDIATE CODE: A61

R150 Certificate of patent (=grant) or registration of utility model

Free format text: JAPANESE INTERMEDIATE CODE: R150

S531 Written request for registration of change of domicile

Free format text: JAPANESE INTERMEDIATE CODE: R313531

R350 Written notification of registration of transfer

Free format text: JAPANESE INTERMEDIATE CODE: R350