CN101919255B - Reference selection for video interpolation or extrapolation - Google Patents
- Publication number
- CN101919255B (granted publication; application numbers CN200880125042A and CN2008801250420A)
- Authority
- CN
- China
- Prior art date
- Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
- Active
Classifications
- H—ELECTRICITY; H04—ELECTRIC COMMUNICATION TECHNIQUE; H04N—PICTORIAL COMMUNICATION, e.g. TELEVISION
- H04N19/112—Selection of coding mode or of prediction mode according to a given display mode, e.g. for interlaced or progressive display mode
- H04N7/0132—Conversion of standards, the field or frame frequency of the incoming video signal being multiplied by a positive integer, e.g. for flicker reduction
- H04N19/105—Selection of the reference unit for prediction within a chosen coding or prediction mode, e.g. adaptive choice of position and number of pixels used for prediction
- H04N19/132—Sampling, masking or truncation of coding units, e.g. adaptive resampling, frame skipping, frame interpolation or high-frequency transform coefficient masking
- H04N19/139—Analysis of motion vectors, e.g. their magnitude, direction, variance or reliability
- H04N19/154—Measured or subjectively estimated visual quality after decoding, e.g. measurement of distortion
- H04N19/156—Availability of hardware or computational resources, e.g. encoding based on power-saving criteria
- H04N19/172—Adaptive coding characterised by the coding unit, the unit being an image region, e.g. a picture, frame or field
- H04N19/51—Motion estimation or motion compensation
- H04N19/521—Processing of motion vectors for estimating the reliability of the determined motion vectors or motion vector field, e.g. for smoothing the motion vector field or for correcting motion vectors
- H04N19/573—Motion compensation with multiple frame prediction using two or more reference frames in a given prediction direction
- H04N19/577—Motion compensation with bidirectional frame interpolation, i.e. using B-pictures
- H04N19/587—Predictive coding involving temporal sub-sampling or interpolation, e.g. decimation or subsequent interpolation of pictures in a video sequence
- H04N19/60—Coding using transform coding
- H04N19/61—Transform coding in combination with predictive coding
- H04N5/145—Movement estimation
- H04N7/147—Communication arrangements, e.g. identifying the communication as a video-communication, intermediate storage of the signals
Abstract
This disclosure describes selection of reference video units to be used for interpolation or extrapolation of a video unit, such as a video frame. A decoder may apply a quality-focused mode to select a reference frame based on quality criteria. The quality criteria may indicate a level of quality likely to be produced by a reference frame. If no reference frames satisfy the quality criteria, interpolation or extrapolation may be disabled. Display of an interpolated or extrapolated frame may be selectively enabled based on a quality analysis. A decoder may apply a resource-focused frame interpolation mode to enable or disable frame interpolation or extrapolation for some frames based on power and quality considerations. In one mode, frame interpolation may be disabled to conserve power when reference frames are not likely to produce satisfactory quality. In another mode, the threshold may be adjustable as a function of power saving requirements.
Description
This application claims the benefit of U.S. Provisional Application No. 61/012,703, filed December 10, 2007, the entire content of which is incorporated herein by reference.
Technical field
This disclosure relates to digital video coding and, more particularly, to techniques for video frame interpolation or extrapolation.
Background
Many video coding techniques have been developed for encoding digital video sequences. The Moving Picture Experts Group (MPEG), for example, has developed several techniques including MPEG-1, MPEG-2 and MPEG-4. Other examples include the International Telecommunication Union (ITU)-T H.263 standard, and the ITU-T H.264 standard and its counterpart, ISO/IEC MPEG-4 Part 10, i.e., Advanced Video Coding (AVC). These video coding standards support efficient transmission of video sequences by encoding data in a compressed manner. Compression reduces the overall amount of data that needs to be transmitted.
Video compression may involve spatial and/or temporal prediction to reduce redundancy inherent in video sequences. Intra-coding uses spatial prediction to reduce spatial redundancy between video blocks within the same video frame. Inter-coding uses temporal prediction to reduce temporal redundancy between video blocks in successive video frames. For inter-coding, a video encoder performs motion estimation to generate motion vectors indicating the displacement of video blocks relative to corresponding predictive video blocks in one or more reference frames. The video encoder performs motion compensation to generate a predictive video block from a reference frame, and forms a residual video block by subtracting the predictive video block from the original video block being coded.
To meet low bandwidth requirements, some video applications may encode video at a reduced frame rate and/or skip the coding of some frames. Unfortunately, low frame rate video can produce temporal artifacts in the form of motion jerkiness. Frame interpolation or extrapolation may be employed at the decoder side to approximate the content of frames skipped by the encoder, or of frames beyond the basic frame rate produced by the encoder. Frame interpolation or extrapolation may be referred to generally as frame substitution. In effect, frame substitution may be used to upconvert the actual frame rate to provide the perception of smoother motion. Frame substitution may be used to support a process often referred to as frame rate up-conversion (FRUC). Although FRUC may enhance temporal quality, substitution of some frames may introduce undesirable spatial artifacts that undermine visual quality.
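As a non-normative illustration of the frame substitution (FRUC) idea described above, the sketch below doubles the frame rate of a decoded sequence by inserting a simple bidirectionally averaged frame between each pair of decoded frames. Real FRUC implementations use motion-compensated interpolation rather than plain averaging; the function names and the averaging scheme here are illustrative assumptions, not part of the patent.

```python
import numpy as np

def interpolate_midpoint_frame(prev_frame: np.ndarray, next_frame: np.ndarray) -> np.ndarray:
    """Synthesize a substitute frame halfway between two decoded reference
    frames by bidirectional averaging (a crude stand-in for
    motion-compensated interpolation). Widen to uint16 to avoid overflow."""
    return ((prev_frame.astype(np.uint16) + next_frame.astype(np.uint16)) // 2).astype(np.uint8)

def upconvert_2x(frames):
    """Double the frame rate of a decoded sequence by inserting one
    interpolated frame between each pair of consecutive frames."""
    out = []
    for prev_frame, next_frame in zip(frames, frames[1:]):
        out.append(prev_frame)
        out.append(interpolate_midpoint_frame(prev_frame, next_frame))
    out.append(frames[-1])
    return out
```

Plain averaging ghosts moving objects, which is one reason the quality-aware reference selection described in this disclosure matters.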
Summary of the invention
This disclosure is directed to techniques for selecting one or more reference video units to be used for substitution (e.g., by interpolation or extrapolation) of a video unit. A video unit may be a video frame, a slice, a block, or another unit. A video decoder may apply a quality-focused video mode to select reference video units based on analysis of one or more quality criteria. The quality criteria may indicate, for example, the level of interpolation or extrapolation quality likely to be produced by a selected reference video unit, and may include spatial and/or temporal visual quality. If none of the reference video units satisfies the applicable quality criteria, substitution may be disabled for the particular video unit to be added.
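A minimal sketch of the quality-criteria-based selection described above, under the assumption that each candidate carries a precomputed scalar quality score (the scoring itself, e.g. derived from quantization parameters or error-concealment state, is outside this sketch); the names and threshold are hypothetical:

```python
def select_reference_frame(candidates, quality_threshold=0.5):
    """Pick the candidate reference frame with the highest estimated quality
    score, provided it meets the threshold; otherwise return None to signal
    that substitution should be disabled for this video unit.

    Each candidate is a dict with a precomputed 'quality' score in [0, 1].
    """
    best = max(candidates, key=lambda c: c["quality"], default=None)
    if best is None or best["quality"] < quality_threshold:
        return None  # no candidate satisfies the quality criteria
    return best
```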
A video decoder may apply a resource-focused video mode to selectively enable or disable substitution of some video units based on the motion level of the video sequence. If one or more reference video units are substantially static, the video decoder or another device may disable substitution to conserve resources, e.g., power, computation, and/or memory. The motion level may be compared to a threshold, which may be fixed or may be adjusted as a function of the level of available resources. If the reference video units contain substantial motion, the video decoder or another device may enable substitution, for example, applying the quality criteria to the selection of reference video units.
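The resource-focused behavior described above, in which a motion threshold tightens as available power drops, might be sketched as follows; the constants and the battery-fraction input are illustrative assumptions, not values from the disclosure:

```python
def should_substitute(motion_level: float, battery_fraction: float) -> bool:
    """Decide whether to run frame substitution in a resource-focused mode.

    The motion threshold grows as the battery drains, so nearly-static
    content (little perceptual benefit, cheap to skip) is dropped first.
    """
    base_threshold = 0.05  # illustrative constant
    # Scale the threshold inversely with available power: a full battery
    # keeps the base threshold; a low battery demands much more motion
    # to justify the interpolation cost.
    threshold = base_threshold / max(battery_fraction, 0.1)
    return motion_level > threshold
```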
For delay-sensitive video applications (e.g., video telephony), a video decoder or another device may be configured to select reference video units so as to reduce processing and presentation delay. For example, when selecting future reference video units, the video decoder may be configured to select reference video units based on their distance from the video unit to be added. The video decoder may also be configured to analyze one or more quality characteristics associated with an interpolated or extrapolated video unit, and selectively enable or disable display of that video unit based on the analysis, thereby conserving resources that would otherwise be needed to display the extra video unit.
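For the delay-sensitive case, choosing the future reference closest in time to the unit being substituted can be sketched as below (hypothetical helper; times are in arbitrary display-order units):

```python
def select_future_reference(candidate_times, current_time):
    """Choose the future reference closest in time to the video unit being
    substituted, minimizing how long the decoder must buffer before
    interpolation can start. Returns None if no future candidate exists."""
    future = [t for t in candidate_times if t > current_time]
    return min(future, default=None)
```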
In one aspect, the invention provides a method comprising: analyzing at least one characteristic of one or more candidate reference video units; and selecting, based at least in part on the analysis, one or more of the candidate reference video units as reference video units for interpolation or extrapolation of an additional video unit.
In another aspect, the invention provides a device comprising: an analysis unit that analyzes at least one characteristic of one or more candidate reference video units; and a selection unit that selects, based at least in part on the analysis, one or more of the candidate reference video units as reference video units for interpolation or extrapolation of an additional video unit.
In an additional aspect, the invention provides a method comprising: analyzing a motion level of one or more candidate reference video units for interpolation or extrapolation of an additional video unit; determining a resource level for interpolation or extrapolation of the additional video unit; and selectively disabling interpolation or extrapolation of the additional video unit based on the motion level and the resource level.
In another aspect, the invention provides a device comprising: a motion analyzer configured to analyze a motion level of one or more candidate reference video units for interpolation or extrapolation of an additional video unit; a resource monitor configured to determine a resource level for interpolation or extrapolation of the additional video unit; and a selection unit that selectively disables interpolation or extrapolation of the additional video unit based on the motion level and the resource level.
In another aspect, the invention provides a video decoder comprising: an analysis unit that analyzes one or more characteristics associated with an interpolated or extrapolated video unit produced by a frame rate up-conversion process; and a control unit that, based on the analysis, selectively disables presentation of the interpolated or extrapolated video unit on a display.
In an additional aspect, the invention provides a method comprising: analyzing one or more characteristics associated with an interpolated or extrapolated video unit produced by a frame rate up-conversion process; and selectively disabling presentation of the interpolated or extrapolated video unit on a display based on the analysis.
The techniques described in this disclosure may be implemented in hardware, software, firmware, or any combination thereof. If implemented in software, the software may be executed by one or more processors. The software may initially be stored in a computer-readable medium and loaded by a processor for execution. Accordingly, this disclosure contemplates a computer-readable medium comprising instructions that cause one or more processors to perform techniques as described in this disclosure.
For example, in some aspects, the invention provides a computer-readable medium comprising instructions that cause one or more processors to: analyze at least one characteristic of one or more candidate reference video units; and select, based at least in part on the analysis, one or more of the candidate reference video units as reference video units for interpolation or extrapolation of an additional video unit.
In other aspects, the invention provides a computer-readable medium comprising instructions that cause one or more processors to: analyze a motion level of one or more candidate reference video units for interpolation or extrapolation of an additional video unit; determine a resource level for interpolation or extrapolation of the additional video unit; and selectively disable interpolation or extrapolation of the additional video unit based on the motion level and the resource level.
In another aspect, the invention provides a computer-readable medium comprising instructions that cause one or more processors to: analyze one or more characteristics associated with an interpolated or extrapolated video unit produced by a frame rate up-conversion process; and selectively enable or disable presentation of the interpolated or extrapolated video unit on a display based on the analysis.
The details of one or more aspects of the disclosed techniques are set forth in the accompanying drawings and the description below. Other features, objects, and advantages will be apparent from the description and drawings, and from the claims.
Description of drawings
FIG. 1 is a block diagram illustrating a video encoding and decoding system configured to select reference video units for use in video unit substitution.
FIG. 2A is a diagram illustrating an example technique for interpolation of a video unit in a video decoder.
FIG. 2B is a diagram illustrating interpolation of a video unit using selected reference video units.
FIG. 2C is a diagram illustrating an example technique for extrapolation of a video unit in a video decoder.
FIG. 2D is a diagram illustrating extrapolation of a video unit using selected reference video units.
FIG. 3 is a block diagram illustrating an example of a video decoder configured to select reference frames for use in frame substitution.
FIG. 4 is a block diagram illustrating another example of a video decoder configured to select reference frames for use in frame substitution.
FIG. 5 is a block diagram illustrating an analysis unit that may be used with a video decoder such as those shown in FIG. 3 or FIG. 4.
FIG. 6 is a flow chart illustrating an example technique by which a video decoder selects reference video units for video unit substitution.
FIG. 7 is a flow chart illustrating an example technique for reference video unit selection in greater detail.
FIG. 8 is a flow chart illustrating an example technique for quality analysis of reference video units to support reference video unit selection for video unit substitution.
FIG. 9 is a flow chart illustrating an example technique for generating quality scores for reference video units to support reference video unit selection for video unit substitution.
FIG. 10 is a flow chart illustrating an example technique for selective substitution based on motion analysis in a resource-focused mode.
FIG. 11 is a flow chart illustrating another example technique for selective substitution based on motion analysis in a resource-focused mode.
FIG. 12 is a block diagram illustrating an example of a video decoder configured to selectively enable or disable display of substituted frames.
FIG. 13 is a flow chart illustrating an example technique for selectively displaying substituted frames based on quality analysis.
FIG. 14A is a block diagram illustrating an analysis unit that may be used with a video decoder such as that shown in FIG. 12.
FIG. 14B is a block diagram illustrating another analysis unit that may be used with a video decoder such as that shown in FIG. 12.
FIG. 15 is a flow chart illustrating an example technique for generating quality scores to support selective display of substituted frames.
FIG. 16 is a flow chart illustrating an example technique for quality analysis of reference video units to support reference video unit selection for video unit substitution when the video unit supports a delay-sensitive video application.
Detailed Description
Fig. 1 is a block diagram illustrating a video encoding and decoding system 10 configured to select reference video units for use in video unit substitution. In various aspects, the substituted video units and the selected reference video units may be, for example, video frames, video slices, or video blocks. As shown in Fig. 1, system 10 may include a video encoder 12 and a video decoder 14, each of which may generally be referred to as a video coder. In the example of Fig. 1, video encoder 12 encodes input video frames 16 to produce encoded video frames 18. Video encoder 12 may transmit encoded video frames 18 to video decoder 14 via a communication channel 19.
Although the techniques described in this disclosure may be applicable to a variety of video units (e.g., frames, slices, blocks, or sub-blocks), for purposes of illustration this disclosure will generally describe application of the techniques to video frames, without limitation of the aspects of such techniques as broadly described in this disclosure.
To reduce the amount of data that must be transmitted between encoder 12 and decoder 14, and thereby comply with reduced bandwidth requirements of channel 19, video encoder 12 may operate at a basic video unit coding rate that is less than the source video unit rate. For instance, video encoder 12 may operate at a reduced video frame rate, e.g., 15, 30, or 60 frames per second (fps).
Alternatively or additionally, in some cases, video encoder 12 may operate at a given video frame rate but optionally include, or selectively activate, a video unit skipping unit 20 that causes encoder 12 to skip the coding of some video units. For instance, video unit skipping unit 20 may be configured to cause encoder 12 to skip the coding of some frames, thereby reducing the effective frame rate of video encoder 12 relative to the source video frame rate, for example. In Fig. 1, skipped video frames are illustrated by shaded frames among encoded frames 18.
In the case of skipped frames, or in the case of a coding rate that can be up-converted, it may be desirable to substitute additional video units at the decoder side (e.g., by interpolation or extrapolation) to convert the actual video frame rate to an increased video frame rate. This process is sometimes referred to as frame substitution in support of frame rate up-conversion (FRUC). In effect, decoder 14 may increase the actual frame rate produced by video encoder 12 to an up-converted frame rate.
As an example, if the actual frame rate produced by encoder 12 (with or without frame skipping) is 30 fps, then decoder 14 may be configured to substitute additional frames (e.g., by interpolation or extrapolation) so as to increase the effective frame rate from 30 fps to 60 fps or 120 fps. In effect, the additional frames may take the place of skipped frames, or supplement a low basic frame coding rate of video encoder 12. As mentioned above, the frame rate produced by video encoder 12 may be less than desired due to a basic frame rate less than the source video rate and/or optional skipping of some frames (e.g., by optional skipping unit 20).
By skipping some frames or coding at a reduced frame rate, video encoder 12 can code fewer frames, as described above. However, low frame rate video can produce temporal artifacts in the form of motion jerkiness. Frame substitution may be used by decoder 14 to approximate the content of skipped frames or other excluded frames and, in effect, up-convert the actual frame rate to provide a perception of smoother motion. For instance, video decoder 14 may include a frame rate up-conversion (FRUC) unit 22 that interpolates or extrapolates at least some additional video frames to increase the effective frame rate of the decoded video.
Again, although FRUC unit 22 is described with respect to up-conversion of the effective coding rate of frames, the techniques described in this disclosure may be applied to other video units, e.g., slices, blocks, or sub-blocks. Video decoder 14 may decode received frames 24 and approximate additional video frames via FRUC unit 22 to produce output video frames 26. The decoded output video frames 26 may be used to drive a display device. In Fig. 1, an example of a skipped frame is illustrated by the shaded video frame among received video frames 24.
In the example of Fig. 1, FRUC unit 22 is shown within video decoder 14. In other implementations, FRUC unit 22 may form part of a video post-processing module. A video post-processing module may process the output of video decoder 14 and may perform a variety of processing operations, such as smoothing, sharpening, brightness control, and/or contrast enhancement, as well as the FRUC operation. As another alternative, FRUC unit 22 may form part of a video display processor or mobile display processor (MDP) device, e.g., for a mobile multimedia device. Accordingly, the depiction of FRUC unit 22 within video decoder 14 in Fig. 1 and other figures is provided for purposes of illustration and should not be considered limiting of the techniques broadly described in this disclosure.
Motion-compensated (MC) video frame interpolation (VFI) is an example of a technique used to enhance the temporal perceptual quality of video in applications such as FRUC at the decoder side. Other interpolation and extrapolation techniques may be applied to approximate additional frames in support of the FRUC process. Although FRUC techniques can enhance temporal quality by approximating skipped frames or generating additional frames beyond the basic frame rate of video encoder 12, the interpolation or extrapolation of some frames can introduce undesirable spatial artifacts that degrade visual quality.
For instance, the visual quality of a substituted video frame may not be guaranteed, and may depend highly on the particular reference frames used for interpolation or extrapolation. In addition, VFI methods can be very complex and can consume substantial power and other resources, which can hinder use of VFI for video applications on some devices, e.g., mobile devices with limited power, computing, and/or memory resources. Other frame substitution techniques can present similar quality and resource concerns.
FRUC unit 22 may be configured to analyze at least one characteristic associated with one or more reference video frames received by video decoder 14, and to select, based on the analysis, one or more of the reference video frames for use in substitution of a video frame by video decoder 14. The reference video frames may be selected from received frames 24 that reside in time before or after the frame to be substituted. In other words, FRUC unit 22 may select one or more previous or future frames 24 for use in approximating the additional frame to be substituted.
A previous video frame may comprise a frame immediately preceding the frame to be substituted, or one or more previous frames near the frame to be substituted. A future frame may comprise a frame immediately following the frame to be substituted, or one or more frames near the frame to be substituted. In the case of substitution by interpolation, one or more previous frames and one or more future frames may be used to interpolate the additional, intermediate frame. In the case of substitution by extrapolation, one or more previous frames or one or more future frames may be used to extrapolate the additional previous or future frame.
In some aspects, FRUC unit 22 may analyze the quality of the reference video frames to select one or more reference frames for use in substitution of the additional video frame. In this manner, FRUC unit 22 determines which frames are used as reference frames for video frame substitution (e.g., by interpolation or extrapolation). In this case, FRUC unit 22 may select reference video frames that enhance the spatio-temporal video quality of output video frames 26. In other aspects, FRUC unit 22 may analyze both the quality of the reference frames and the device resource constraints within which video decoder 14 resides. In this case, FRUC unit 22 may enhance the spatio-temporal video quality of output video frames 26 while balancing the benefits of reduced power consumption, conserved computational resources, and/or conserved memory resources. FRUC unit 22 may enhance the quality of interpolated or extrapolated frames and the temporal quality of the video sequence. In general, consumption of computational and memory resources can contribute to increased power consumption and, in some cases, latency.
In addition, in some aspects, FRUC unit 22 may be configured to select reference video frames with a bias toward reducing end-to-end processing and/or presentation delay. Such delay may be particularly undesirable for some real-time or near-real-time applications, e.g., video telephony, which can be delay-sensitive. For instance, when a future reference video frame is used to approximate the substituted frame, FRUC unit 22 may be configured to favor selection of future reference video frames that are relatively close in time to the frame to be approximated. Alternatively, FRUC unit 22 may disable video frame substitution for such delay-sensitive applications.
In a video telephony application, as an illustration, selecting frames for extrapolation based on future frames may be undesirable; or, in the case of video frame interpolation, it may be necessary to select nearer future frames rather than more distant future frames in order to reduce end-to-end delay, and thereby preserve temporal quality for the user. In particular, relying on a future reference frame that resides far in the future can cause delay, because the decoder must wait for such future frames to be decoded. The farther in the future the frame resides, the longer the wait can be, which can cause delay that disrupts video telephony presentation.
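The delay-biased selection described above can be sketched as a simple search over candidate future frames. This is a hypothetical illustration, not the patent's actual implementation: the candidate list, quality scores, and offset limit are all assumptions made for the example.

```python
def pick_future_reference(candidates, quality_threshold, max_offset):
    """Return the temporal offset of the nearest acceptable future
    reference frame, or None to disable substitution.

    candidates: list of (offset, quality) tuples, where offset counts
    frame intervals after the frame to be substituted, and quality is
    a precomputed quality score for that candidate.
    """
    # Sort by temporal distance so the first acceptable hit is the
    # nearest future frame, minimizing decode-and-wait delay.
    for offset, quality in sorted(candidates):
        if offset <= max_offset and quality >= quality_threshold:
            return offset
    # No acceptable near-term candidate: the caller may disable
    # substitution for this delay-sensitive stream.
    return None
```

Under this sketch, a distant high-quality frame is passed over in favor of a nearer frame that still clears the quality threshold, reflecting the trade of some spatial quality for lower end-to-end delay.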
In some aspects, video decoder 14 may provide a quality-focused frame substitution mode, as a first mode of operation, in which reference frames are selected based on analysis of one or more reference frame quality criteria. In addition, video decoder 14 may provide a resource-focused frame substitution mode, as a second mode of operation, in which frame substitution is selectively enabled or disabled for some frames based on a combination of resource and quality considerations. In some aspects, the quality-focused and resource-focused modes may be referred to as quality-optimized and power-optimized modes. Hence, in some aspects, video decoder 14 may determine which frames to use as reference frames for video frame interpolation or extrapolation, and also which frames to interpolate or extrapolate, so as to conserve power while enhancing the quality of interpolated frames and the temporal quality of the video. Alternatively or additionally, video decoder 14 may be configured to disable, based on quality criteria, the transmission of substitute frames from a video buffer to a display, even after interpolation or extrapolation has been performed.
In some aspects, this resource-focused mode may be considered a power-optimized mode, as discussed above. For instance, video decoder 14 may be configured to balance visual quality against power savings and/or computational load. In some cases, video decoder 14 may selectively switch between the quality-focused mode and the resource-focused mode, e.g., according to the resources available to the video decoder when FRUC operations using frame substitution are applied.
In the quality-focused mode and/or the resource-focused mode, the quality criteria may include, for example, one or more characteristics indicating the likely level of quality of the substituted frame that would be produced using the selected reference frames. In other words, the characteristics may be selected as indications of the likely quality of a frame interpolated or extrapolated using the reference frame. If none of the reference frames satisfies the quality criteria, video decoder 14 may disable frame substitution for the particular frame. Hence, in the quality-focused mode, when none of the reference frames is likely to produce satisfactory (e.g., above-threshold) interpolation or extrapolation quality, video decoder 14 may disable frame interpolation or extrapolation to conserve power.
In some aspects, the quality criteria may be applied before or after the substituted frame is actually produced by decoder 14. For instance, a quality analysis may be applied after frame interpolation or extrapolation, in which case the substitute frame may be selectively applied to a display device based on the result. If the quality of the interpolated or extrapolated frame does not satisfy a threshold quality level, FRUC unit 22 may discard the interpolated or extrapolated frame rather than send it from an output video frame buffer to drive the display device.
In this case, even though interpolation or extrapolation has been performed, discarding the frame may still be advantageous if the quality level does not justify the additional resources required to display it. The process of sending a frame from a video buffer to a display buffer to drive the display can consume a considerable amount of power. Hence, discarding a substitute frame (even after interpolation or extrapolation has been performed) can additionally save the power that would be consumed by video data traffic between the video buffer and the display.
In the resource-focused mode, if the motion level of one or more candidate reference frames is less than a threshold, video decoder 14 may disable frame substitution. In this case, when the video scene is substantially static, the difference between an interpolated or extrapolated frame and a repeated frame may be negligible. Hence, frame repetition may be used instead of interpolation or extrapolation to conserve resources, e.g., power. The motion-level threshold may be fixed, or may be adjusted according to the resource levels of the video decoder or resource conservation requirements. In either case, whether the quality-focused or resource-focused mode is activated, if frame substitution is enabled, one or more quality criteria may be used to select the reference frames.
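The resource-focused gating described above amounts to a simple decision on measured motion. The following is a minimal sketch under assumed conventions (motion vectors as (dx, dy) pairs, a mean-magnitude motion level); the patent does not specify how the motion level is computed.

```python
import math

def substitution_decision(motion_vectors, motion_threshold):
    """Decide between frame repetition and interpolation in the
    resource-focused mode, based on the average motion magnitude of
    the candidate reference frame's motion vectors."""
    if not motion_vectors:
        # No motion data at all: treat the scene as static.
        return "repeat"
    avg = sum(math.hypot(dx, dy) for dx, dy in motion_vectors) / len(motion_vectors)
    # Below the threshold, a repeated frame is visually close to an
    # interpolated one, so skip the costly interpolation.
    return "repeat" if avg < motion_threshold else "interpolate"
```

The threshold could also be raised as battery or compute headroom shrinks, matching the adjustable threshold the text describes.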
The one or more quality criteria used to analyze the candidate reference frames may be selected as indications of the likely quality of the interpolation or extrapolation that the candidate reference frames would produce. For instance, if the candidate reference frame under consideration were used for interpolation or extrapolation of the additional frame, the quality criteria may indicate the likely quality of the interpolated or extrapolated frame. FRUC unit 22 may also analyze motion vector reliability as another indication of reference frame quality. Examples of quality criteria analyzed by FRUC unit 22 may include quantization parameter (QP) values, coded block pattern (CBP) values, and the number of nonzero transform coefficients associated with a reference video frame. If the CBP value is not equal to zero, the QP value and the number of nonzero coefficients may be considered jointly to judge the reliability of the motion vectors provided by the reference frame.
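One way to couple QP, CBP, and the nonzero-coefficient count into a motion-vector reliability check is sketched below. The specific thresholds and the exact combination rule are assumptions for illustration; the patent only states that the values may be considered jointly when CBP is nonzero.

```python
def motion_vectors_reliable(qp, cbp, nonzero_coeffs,
                            qp_max=40, coeff_max=256):
    """Heuristic motion-vector reliability check for one reference
    frame, based on QP, CBP, and nonzero transform coefficients."""
    if cbp == 0:
        # No coded residual: prediction matched well, so the motion
        # vectors are taken as reliable regardless of QP.
        return True
    # With residual present, coarse quantization (high QP) combined
    # with many nonzero coefficients suggests a poor prediction
    # match, and hence unreliable motion vectors.
    return not (qp > qp_max and nonzero_coeffs > coeff_max)
```

A frame failing this check could simply be scored lower rather than excluded outright, depending on how the decoder weighs the criteria.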
FRUC unit 22 may also consider objective quality metrics (e.g., a structural similarity metric (SSIM), blockiness, and/or blurriness) to determine the quality of candidate reference frames for use in interpolation or extrapolation. In addition, the frame type of the reference frame, as well as intra-mode and motion vector counts, may be considered.
FRUC unit 22 may analyze other quality criteria, such as evidence of loss of all or part of a video unit, which may generally be referred to as video unit loss. For instance, FRUC unit 22 may analyze slice or frame losses of a reference video frame in conjunction with the availability, or lack of availability, of error concealment mechanisms for the frame. For instance, FRUC unit 22 may estimate the level of errors and the quality of the error concealment mechanisms. Other quality criteria may be used in addition to, or as alternatives to, the types of quality criteria described above.
In some cases, FRUC unit 22 may select a reference video frame that satisfies one or more quality thresholds. In other cases, FRUC unit 22 may score and rank the quality of multiple reference video frames, and select the one or more reference frames that produce the best scores. If two frames (e.g., two previous frames or two future frames) rank substantially the same, it may be desirable to select the frame nearer in time to the skipped frame to be interpolated.
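The score-and-rank selection with a proximity tie-break can be sketched as follows. The tie margin and the (distance, score) candidate representation are assumptions for this example only.

```python
def select_reference(candidates, tie_margin=0.05):
    """Select a reference frame by quality score, breaking near-ties
    in favor of the candidate closest in time to the skipped frame.

    candidates: list of (temporal_distance, score) tuples.
    Returns the temporal_distance of the chosen candidate.
    """
    best_score = max(score for _, score in candidates)
    # Candidates scoring within tie_margin of the best are treated as
    # ranked "substantially the same".
    near_best = [c for c in candidates if best_score - c[1] <= tie_margin]
    # Among those, prefer the frame nearest the skipped frame.
    return min(near_best, key=lambda c: c[0])[0]
```

With a clear score gap the higher-scoring frame wins outright; only near-ties fall back to temporal proximity.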
For purposes of illustration, this disclosure relates generally to selection of reference frames for interpolation or extrapolation of additional frames. In some implementations, however, this disclosure more broadly contemplates selection of reference video units for approximating additional video units other than frames. For instance, the techniques described in this disclosure may be adapted to analyze and select among a variety of reference video units, e.g., video frames, video slices, or video blocks such as macroblocks.
When a video unit such as a frame, slice, or block is skipped by video encoder 12, the techniques may be used to interpolate or extrapolate a corresponding frame, slice, or block from among various candidate reference frames. Alternatively, even when no frame, slice, or block is skipped, the techniques may be used to approximate additional frames, e.g., for frame rate conversion to increase the basic frame coding rate of encoder 12.
Even when an entire frame needs to be approximated (due to skipping or a low frame rate), it may be advantageous to select individual slices or blocks for use in interpolating or extrapolating slices or blocks within the frame to be approximated. In this case, corresponding slices or blocks in the frame to be approximated may be interpolated or extrapolated from slices or blocks selected from different candidate frames. For instance, quality analyses similar to those described in this disclosure may be applied on a slice-by-slice or block-by-block basis to select reference video units for interpolation or extrapolation of additional video units. Accordingly, although this disclosure focuses on selection of reference frames for interpolation or extrapolation of skipped frames to be substituted, these aspects should not be considered limiting of the techniques as broadly described.
Referring further to Fig. 1, video encoder 12 and video decoder 14 may be connected by transmission channel 19. Transmission channel 19 may be a wired or wireless medium, or a combination of both, capable of conveying video frames within a bit stream. Channel 19 may support bidirectional or unidirectional video transmission. System 10 may be configured for video telephony, video streaming, video broadcasting, or the like. Accordingly, reciprocal encoding, decoding, multiplexing (MUX), and demultiplexing (DEMUX) components may be provided at opposite ends of channel 19. In some implementations, encoder 12 and decoder 14 may be provided within video communication devices, such as mobile wireless terminals equipped for video streaming, video broadcast reception, and/or video telephony, e.g., so-called wireless videophones or camera phones.
Such wireless communication devices include various components to support wireless communication, audio coding, video coding, and user interface features. For instance, a wireless communication device may include one or more processors, audio/video coder/decoders (CODECs), memory, one or more modems, and transmit-receive (TX/RX) circuitry such as amplifiers, frequency converters, filters, and the like. In addition, a wireless communication device may include image and audio capture devices, image and audio output devices, associated drivers, user input media, and the like.
As mentioned above, encoder 12 may apply frame skipping to reduce the frame rate of the data transmitted over transmission channel 19. In particular, encoder 12 may be configured to intentionally skip selected frames, e.g., by not coding the selected frames or by not transmitting selected coded frames. Alternatively, encoder 12 may produce frames at a basic frame coding rate that is less than a desired frame rate, with or without frame skipping. Frame skipping or reduced frame rate coding may permit encoder 12 to comply with reduced transmission rate requirements of channel 19.
In the case of frame skipping, frames may be skipped by frame skipping unit 20 at a fixed rate such that alternating frames, or every n-th frame, are skipped. Alternatively, frames may be skipped at a varying rate, e.g., based on intelligent frame skipping criteria. Also, encoder 12 may code frames at a fixed frame rate or adaptively, such that the frame rate changes according to considerations such as channel conditions or other requirements. In either case, the frame rate may be effectively up-converted by decoder 14 to produce an increased frame rate, e.g., from 30 fps to 60 fps or 120 fps.
If none of the reference video frames is likely to support frame substitution with a sufficient quality level, FRUC unit 22 may disable frame substitution and apply frame repetition instead. When frame substitution is disabled, decoder 14 may simply repeat a previous or future frame rather than interpolating or extrapolating a frame between the previous and future frames. In this case, decoder 14 may use a duplicate version of the previous or future frame in place of the skipped frame, or as the additional frame used for frame rate conversion.
By applying frame repetition, decoder 14 can avoid undesirable spatial artifacts that could be introduced by interpolation or extrapolation. Because frame repetition can reduce the perceived temporal quality of the video, frame substitution by interpolation or extrapolation will ordinarily be preferred when sufficient quality can be achieved. However, excessive consumption of power, computing, and/or memory resources can reduce the overall value of the substitution technique. A resource-focused mode, as described in this disclosure, may be used to balance quality against resource consumption.
In the example of Fig. 1, video encoder 12 receives input frames 16 of video information (F_{t-2}, F_{t-1}, F_t, F_{t+1}, F_{t+2}). F_t represents a frame that is not coded at time t, either because it was skipped by optional frame skipping unit 20 or because of the basic frame rate produced by encoder 12. Hence, it should be noted that frame substitution as described in this disclosure generally refers to interpolation of a frame F_t' approximating a frame F_t that is not provided among the frames received by decoder 14 (due to frame skipping, channel loss, or the basic frame rate of encoder 12).
If frame skipping unit 20 is applicable, frames may be skipped according to a fixed, adjustable, or dynamic frame skipping process, as described above. F_{t-2} and F_{t-1} represent past frames preceding frame F_t in time, and F_{t+1} and F_{t+2} are future frames following frame F_t in time. The reference frames usable for interpolation or extrapolation of frame F_t may include many frames both before and after frame F_t. For ease of illustration, however, only two frames before F_t and two frames after F_t are shown in Fig. 1.
In general, video encoder 12 encodes input frames 16 to produce encoded frames 18 as I, P, or B frames, as mentioned above. Again, frame F_t resides in time between previous frames F_{t-2}, F_{t-1} and future frames F_{t+1}, F_{t+2}. Video encoder 12 transmits encoded frames 18 (which include encoded frames F_{t-2}, F_{t-1}, F_{t+1}, F_{t+2}, but not frame F_t) to video decoder 14 via transmission channel 19. Typically, encoder 12 transmits these frames in a predefined sequence, e.g., IBBPBBPBBPBBI, where I, B, and P refer to I frames, B frames, and P frames, respectively.
Encoded frames 18 may be intra-coded or inter-coded, and may be decoded to reconstruct the video content present in input frames 16. In addition, encoded frames 18 may serve as reference frames for decoding other inter-coded frames of the video sequence, i.e., as references for the motion estimation and motion compensation of predicted frames. As is well known in the art of predictive coding, an encoded frame may be characterized by motion vectors indicating the displacement of blocks in the encoded frame relative to corresponding, similar blocks in a different encoded frame, which serves as a reference frame. In addition, an encoded frame may be characterized by residual information indicating differences between video blocks in the encoded frame and corresponding video blocks in the reference frame.
The coding of input frames 16 and the decoding of received frames 24 may rely on reference frames, as described above for predictive coding. In the case of a frame to be substituted, however, a reference frame as described in this disclosure generally refers to a frame used for interpolation or extrapolation to provide an additional frame at the decoder side. Hence, it should be noted that reference frames for interpolation or extrapolation differ in use from reference frames for predictive coding, even though in some instances a given frame may serve as a reference frame for both interpolation and predictive coding. Reference frames for predictive coding are designated at the encoder side and used for predictive coding. In contrast, reference frames for interpolation or extrapolation may be selected at the decoder side and used for substitution of additional frames (e.g., by interpolation or extrapolation).
In the example of Fig. 1, frames 16 illustrate five frames in a video sequence containing many frames, and are used to describe interpolation or extrapolation of an extra (i.e., additional) frame F_t' approximating a frame F_t that resides in time between the encoded frames 18 (e.g., between previous frames F_{t-2}, F_{t-1} and future frames F_{t+1}, F_{t+2}). In some aspects, it may be necessary to add multiple frames, in which case more than two frames residing between transmitted frames may require interpolation or extrapolation. For ease of illustration, this disclosure will refer to the example case of interpolating a single frame F_t' between a previous frame and a future frame using selected reference frames.
FRUC unit 22 receives successive frames in the video sequence. For each additional frame to be interpolated, there is at least one previous frame and at least one future frame that can serve as reference frames for the interpolation. The frame to be interpolated resides in time between multiple previous frames and multiple future frames. If extrapolation (rather than interpolation) is used for the approximation, the frame to be extrapolated may reside in time after one or more reference frames, or in time before one or more reference frames. Some of the previous and future frames may produce better frame substitution results than others. Interpolation may be performed by any of a variety of techniques, e.g., motion-compensated interpolation (MCI), linear interpolation, bilinear interpolation, bicubic interpolation, spline interpolation, nearest-neighbor interpolation, nonlinear interpolation, or linear or nonlinear filtering of candidate frames. Interpolation may use a single reference frame for unidirectional interpolation, or two or more frames for bidirectional interpolation. Similarly, extrapolation may use a single reference frame, or two or more frames, for unidirectional extrapolation.
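The simplest of the listed techniques, bidirectional linear interpolation, can be sketched as a time-weighted pixel average of one previous and one future reference frame. This is a minimal illustration (frames as 2-D lists of luma values), not the MCI technique the patent emphasizes.

```python
def interpolate_frame(prev, fut, n=1, m=1):
    """Linearly interpolate frame F_t' from references F_{t-N} and
    F_{t+M}, where n and m are the temporal distances to the previous
    and future reference frames, respectively.

    Frames are 2-D lists of pixel values of equal dimensions."""
    w_prev = m / (n + m)  # the nearer reference gets the larger weight
    w_fut = n / (n + m)
    return [[w_prev * p + w_fut * f for p, f in zip(row_p, row_f)]
            for row_p, row_f in zip(prev, fut)]
```

With n = m = 1 this reduces to a plain average of the two neighbors; unequal distances shift the weighting toward the closer reference.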
If the analysis shows that none of the candidate reference frames has a sufficient quality level for frame substitution, FRUC unit 22 may disable frame substitution. In this case, FRUC unit 22 may detect a likely FRUC failure, in terms of the quality of the frame substitution result. Rather than wasting power and computing resources to produce a low-quality interpolation result, FRUC unit 22 may, for instance, apply frame repetition to produce an approximation of the skipped frame. In frame repetition, as described previously, decoder 14 uses a copied version of one of the previous or future frames in place of the skipped frame. Hence, in the quality-focused mode, FRUC unit 22 may be configured to select particular reference frames for interpolation or extrapolation, and to disable frame substitution when an acceptable quality level may not be possible. When decoder 14 operates in the resource-focused mode, FRUC unit 22 may require a higher quality level to justify frame substitution.
In some configurations, system 10 may provide one or more benefits. For instance, in some configurations, system 10 may reduce power consumption in video decoder 14 by disabling frame substitution when a favorable result is unlikely. In addition, in some configurations, system 10 may enhance the quality of substituted frames by selecting particular reference frames for use in frame substitution. Selection of higher-quality reference frames for interpolation or extrapolation of substituted frames can be useful in a variety of video applications, e.g., applications in which encoded video is compressed using variable bit rate (VBR) rate control techniques. With VBR, quality can vary among different frames, such that some frames may be better than others when used as reference frames for interpolation or extrapolation.
In addition, video decoder 14 may be configured to detect candidate reference frames with slice, frame, or block losses and, if no error concealment mechanism has been applied, or if the available error concealment mechanism is unlikely to yield a good-quality frame, eliminate such frames from consideration. Eliminating candidate reference frames with inadequate error concealment can be useful when significant transmission losses occur, for example, in video telephony applications. A resource-focused mode can be useful in reducing power consumption while maintaining reasonable objective and subjective video quality, for example, in low-motion video clips.
FIG. 2A is a diagram illustrating an example of a simple technique for interpolation of an extra frame to support the FRUC techniques of video decoder 14. In general, to interpolate a macroblock (MB) 28 in an interpolated frame F_t' between a selected previous frame F_t-N and a selected future frame F_t+M, video decoder 14 may rely on a motion vector v_NM extending between an MB 30 in previous frame F_t-N and a corresponding MB 32 in future frame F_t+M. In this example, time t indicates the temporal position, i.e., the time at which the extra frame to be interpolated will appear in the video sequence. Frames F_t-N and F_t+M are frames that temporally precede (t-N) and follow (t+M), respectively, the extra frame F_t' to be interpolated. In the example of FIG. 2A, frames F_t-N and F_t+M serve as the reference frames for interpolation of extra frame F_t'.
N and M indicate temporal offsets relative to time t, and may be equal or unequal to one another. For example, if N=1 and M=2, frame F_t-N may be the frame immediately preceding the interpolated frame, and frame F_t+M may be the second frame following the interpolated frame. In the simplified case of N=1 and M=1, for interpolation, the vector v_NM extending between frame F_t-N and frame F_t+M may be divided substantially in two (for 1:2 frame rate conversion) to produce motion vectors v_NM/2 and -v_NM/2 that identify the corresponding MB 28 in the frame F_t' to be interpolated. Hence, in this simplified example, the position of MB 28 is a function of motion vectors v_NM/2 and -v_NM/2, where N=1 and M=1 for purposes of this example. MB 28 may be assigned a set of pixel values corresponding to the pixel values of MB 30 or MB 32, or the average of the pixel values of MBs 30 and 32. For higher or lower frame rate up-conversion (e.g., 1:X conversion), the motion vectors may be scaled accordingly. For other cases, e.g., where at least one of N and M is not equal to one, different motion vectors obtained via motion estimation and motion vector processing may be used.
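As a rough sketch of the motion-compensated interpolation described above, the following Python fragment (a simplified illustration, not the decoder's actual implementation; the block size, rounding, and averaging choices are assumptions) splits a motion vector between the reference frames in proportion to the temporal offsets N and M and averages the two motion-compensated blocks:

```python
import numpy as np

def interpolate_block(prev_frame, future_frame, block_pos, mv_nm,
                      block_size=16, n=1, m=1):
    """Interpolate one block of the inserted frame F_t' by averaging the
    motion-compensated blocks from the previous and future reference frames.

    mv_nm is the (dy, dx) motion vector measured from the block in F_t-N to
    the corresponding block in F_t+M; it is split in proportion to the
    temporal offsets N and M (for N = M = 1 this is the 1:2 division in the
    text that yields v_NM/2 and -v_NM/2)."""
    y, x = block_pos
    alpha = n / float(n + m)  # fraction of the vector before the inserted frame
    dy_p, dx_p = int(round(-mv_nm[0] * alpha)), int(round(-mv_nm[1] * alpha))
    dy_f, dx_f = int(round(mv_nm[0] * (1 - alpha))), int(round(mv_nm[1] * (1 - alpha)))
    prev_blk = prev_frame[y + dy_p:y + dy_p + block_size,
                          x + dx_p:x + dx_p + block_size]
    fut_blk = future_frame[y + dy_f:y + dy_f + block_size,
                           x + dx_f:x + dx_f + block_size]
    # Average the two motion-compensated blocks, as in the FIG. 2A example.
    return ((prev_blk.astype(np.uint16) + fut_blk) // 2).astype(np.uint8)
```

For 1:X conversion, the same split would simply use different alpha values per inserted frame, mirroring the motion vector scaling mentioned above.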
In addition, for some types of interpolation, FRUC unit 22 may rely on multiple reference frames, e.g., two or more previous frames and two or more future frames. In general, a reference frame refers to a frame that is used, alone or in combination with one or more other reference frames, to interpolate a frame, such as a skipped frame. In the interpolation process, pixel values associated with macroblocks present in one or more reference frames may be used to interpolate pixel values in corresponding macroblocks in the extra frame to be interpolated, e.g., as shown in FIG. 2A. The pixel values may include luminance and/or chrominance pixel values.
As an example, an interpolated macroblock may include pixel values equal to the pixel values of a macroblock in the previous frame, the pixel values of a macroblock in the future frame, or the average of the pixel values of corresponding macroblocks in the previous and future frames. The macroblocks in the interpolated frame may be motion compensated relative to corresponding blocks in the reference video frames, as shown in FIG. 2A. The macroblocks may be identified by motion vectors extending between the previous and future frames, as shown in FIG. 2A. The interpolation shown in FIG. 2A is provided as an example and should not be considered limiting of the techniques broadly described in this disclosure. A wide variety of different interpolation techniques may be used for frame substitution in accordance with this disclosure.
FIG. 2B is a diagram illustrating interpolation of an additional video frame using selected reference frames. In the example of FIG. 2B, FRUC unit 22 selects reference frames F_t-1 and F_t+2 for use in interpolating extra frame F_t'. FRUC unit 22 may analyze one or more characteristics of multiple previous frames F_t-1, F_t-2, and F_t-3 and multiple future frames F_t+1, F_t+2, and F_t+3. In the example of FIG. 2B, for purposes of illustration, FRUC unit 22 analyzes three previous reference frames and three future reference frames. In this example, FRUC unit 22 may, based on this analysis, select one previous reference frame and one future reference frame for use in interpolating the frame F_t'. The actual numbers of previous and future reference frames may differ from the example of FIG. 2B, however. In addition, the number of previous frames analyzed by FRUC unit 22 may differ from the number of future frames analyzed by the FRUC unit. In general, FRUC unit 22 may select, based on the quality analysis, a previous frame and a future frame that will produce an interpolation result of acceptable quality. In the example of FIG. 2B, the selected reference frames F_t-1 and F_t+2 are indicated by cross-hatching.
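The selection step sketched for FIG. 2B can be illustrated as follows; the score dictionaries, threshold, and tie-breaking rule are hypothetical stand-ins for the output of the analysis unit, not the patent's actual selection logic:

```python
def select_reference_frames(prev_scores, future_scores, threshold):
    """Pick one previous and one future reference frame for interpolation.

    prev_scores / future_scores map candidate frame offsets (e.g. -1, -2, -3
    and +1, +2, +3 relative to the frame to be added) to quality scores from
    the analysis unit.  Returns (prev_offset, future_offset), or None to
    signal that substitution should be disabled because no candidate on one
    side reaches the acceptable quality level."""
    def best(scores):
        ok = {k: v for k, v in scores.items() if v >= threshold}
        if not ok:
            return None
        # Prefer the highest score; break ties toward the temporally nearest frame.
        return max(ok, key=lambda k: (ok[k], -abs(k)))
    p, f = best(prev_scores), best(future_scores)
    if p is None or f is None:
        return None
    return p, f
```

With scores favoring F_t-2 and a tie between F_t+2 and F_t+3, the rule returns (-2, +2), matching the idea that the nearest frame is preferred only all else being equal.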
FIG. 2C is a diagram illustrating an example of a technique for extrapolating a video unit in video decoder 14. In the example of FIG. 2C, two previous reference frames F_t-M and F_t-N are used to extrapolate an extra frame F_t' to support frame substitution. In general, to extrapolate an MB 31 in a frame F_t' following selected previous frame F_t-N and selected previous frame F_t-M, video decoder 14 may rely on a vector extending between a corresponding MB 33 in previous frame F_t-N and a corresponding MB 35 in previous frame F_t-M. In this example, time t indicates the temporal position, i.e., the time at which the extra frame to be extrapolated will appear in the video sequence. Frames F_t-N and F_t-M temporally precede, at times (t-N) and (t-M), respectively, the extra frame F_t' to be extrapolated. In the example of FIG. 2C, previous reference frames F_t-N and F_t-M serve as the reference frames for extrapolation of extra frame F_t'. In general, however, one or more previous reference frames or one or more future reference frames may be used to extrapolate the extra frame F_t'. In other words, the extra frame may be extrapolated forward from previous frames or backward from future frames, respectively.
As in the example of FIG. 2A, N and M in FIG. 2C indicate temporal offsets relative to time t, and may be equal or unequal to one another. For example, if N=2 and M=1, frame F_t-M may be the frame immediately preceding the extrapolated frame, and frame F_t-N may be the frame two frames before the extrapolated frame. MB 31 may be assigned a set of pixel values corresponding to the pixel values of MB 33 or MB 35, or the average of the pixel values of MBs 33 and 35. The extrapolation process may utilize motion-compensated extrapolation. As in the case of interpolation, for extrapolation, a reference frame may refer to a frame that is used, alone or in combination with one or more other reference frames, to extrapolate an additional frame that is added to the decoded video frames.
The extrapolation may be motion compensated by extrapolating a motion vector v from corresponding blocks in the reference frames, as shown in FIG. 2C. In the extrapolation process, pixel values associated with MBs present in one or more reference frames may be used to extrapolate pixel values in the corresponding MB in the extra frame to be extrapolated. The extrapolation shown in FIG. 2C is provided as an example and should not be considered limiting of the techniques broadly described in this disclosure. A wide variety of different extrapolation techniques may be used for frame substitution in accordance with this disclosure.
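A minimal sketch of motion-compensated extrapolation from two previous frames, under the assumptions of linear motion and a small full-search motion estimator; the search range, SAD matching, and projection formula are illustrative choices, not the patent's method:

```python
import numpy as np

def extrapolate_block(frame_tn, frame_tm, block_pos, block_size=16,
                      n=2, m=1, search=4):
    """Extrapolate one block of frame F_t' from previous frames F_t-N and
    F_t-M (N > M): estimate the block's motion between the two reference
    frames, then extend it forward M more intervals to time t."""
    y, x = block_pos
    ref = frame_tm[y:y + block_size, x:x + block_size].astype(np.int32)
    best_sad, best_mv = None, (0, 0)
    h, w = frame_tn.shape
    for dy in range(-search, search + 1):      # trivial full search in F_t-N
        for dx in range(-search, search + 1):
            yy, xx = y + dy, x + dx
            if yy < 0 or xx < 0 or yy + block_size > h or xx + block_size > w:
                continue
            cand = frame_tn[yy:yy + block_size, xx:xx + block_size].astype(np.int32)
            sad = int(np.abs(ref - cand).sum())  # sum of absolute differences
            if best_sad is None or sad < best_sad:
                best_sad, best_mv = sad, (dy, dx)
    # The content moved by -best_mv over the (N - M) intervals between the
    # two reference frames; scale by M / (N - M) to reach time t.
    scale = m / float(n - m)
    sy = y + int(round(best_mv[0] * scale))
    sx = x + int(round(best_mv[1] * scale))
    return frame_tm[sy:sy + block_size, sx:sx + block_size].copy()
```

For content translating uniformly between F_t-N and F_t-M, the projected block matches the true continuation of that motion at time t.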
FIG. 2D is a diagram illustrating extrapolation of an additional video frame using selected reference frames. In the example of FIG. 2D, FRUC unit 22 selects reference frames F_t-1 and F_t-2 for use in extrapolating extra frame F_t'. Depending on whether previous or future frames are to be used for extrapolation, FRUC unit 22 may analyze one or more characteristics of multiple previous frames F_t-1, F_t-2, and F_t-3 or multiple future frames. In the example of FIG. 2D, for purposes of illustration, FRUC unit 22 analyzes four previous reference frames F_t-1, F_t-2, F_t-3, and F_t-4. In this example, FRUC unit 22 may, based on this analysis, select two previous reference frames for use in extrapolating frame F_t'. The actual number of reference frames used for extrapolation may differ from the example of FIG. 2D, however. In general, FRUC unit 22 may select, based on the quality analysis, reference frames that will produce an extrapolation result of acceptable quality. In FIG. 2D, the selected reference frames F_t-1 and F_t-2 are indicated by cross-hatching.
FIG. 3 is a block diagram illustrating the video decoder 14 of FIG. 1 in greater detail. In the example of FIG. 3, video decoder 14 includes a received frame buffer 34, a decoding unit 36, a frame substitution unit 38, an output frame buffer 40, a FRUC analysis unit 42, and a selection unit 44. Frame substitution unit 38, FRUC analysis unit 42, and selection unit 44 may form part of FRUC unit 22 of video decoder 14. In the example of FIG. 3, FRUC unit 22 resides within video decoder 14. As mentioned above, however, in other implementations FRUC unit 22 may reside outside of video decoder 14, e.g., within a video post-processor module or a video display processor or MDP device. Received frame buffer 34 receives and stores encoded frames transmitted from video encoder 12 over channel 19. Decoding unit 36 decodes the received frames using an applicable decoding process and places the decoded frames in output frame buffer 40.
The frames in received frame buffer 34 may exclude various frames to be interpolated or extrapolated. Such frames may include frames skipped by encoder 12, frames or portions of frames lost during transmission over channel 19, and frames not supported by the basic frame rate of encoder 12. To promote spatio-temporal quality, frame substitution unit 38 may be configured to interpolate or extrapolate extra frames (as applicable) based on analysis and selection of particular received frames for use in the interpolation or extrapolation.
As previously described, the interpolation performed by frame substitution unit 38 may include any of a variety of interpolation techniques, such as motion-compensated interpolation (MCI), linear interpolation, bilinear interpolation, bicubic interpolation, spline interpolation, nearest-neighbor interpolation, or the like. Interpolation may utilize a single reference frame for unidirectional interpolation, or two or more reference frames for bidirectional interpolation. Similarly, extrapolation may rely on one or more frames. In some cases, frame substitution unit 38 may disable frame substitution and instead apply frame repetition, e.g., by repeating a previous or future frame in place of the frame to be added.
Frame substitution unit 38 adds the substituted or repeated frames to video output frame buffer 40. The decoded frames and the substituted or repeated frames in video output frame buffer 40 may be used to drive a video output device, such as a display. As an example, video decoder 14 may form part of any of a variety of devices that include digital video capabilities, including wireless communication devices such as mobile radiotelephones, digital media players, personal digital assistants (PDAs), digital televisions, or the like. Alternatively, the frames in output frame buffer 40 may be transmitted to one or more other devices for archival or display. In each case, the substituted or repeated frames produced by frame substitution unit 38 supplement the frames decoded by decoding unit 36, e.g., to enhance the temporal visual quality of a video clip.
As further shown in FIG. 3, frame substitution unit 38 may receive decoded frames output by decoding unit 36 for use as reference frames in interpolating or extrapolating a frame. The decoded frames received from decoding unit 36 may be pixel-domain frames produced by decoding unit 36 based on encoded frames from received frame buffer 34. Frame substitution unit 38 may use the decoded frames as reference frames for interpolation or extrapolation of an extra frame. The particular reference frames used for interpolation or extrapolation may be identified by selection unit 44 based on analysis of candidate reference frames by FRUC analysis unit 42.
The video frames obtained from decoding unit 36 may be considered candidate reference frames in the sense that they may be used by substitution unit 38 as reference frames for substitution of an extra frame. For each frame to be added, analysis unit 42 may analyze pixel-domain information and/or bitstream information for a subset of previous and/or future frames (relative to the frame to be added), and provide output to selection unit 44 identifying the frames that should be selected for use in the frame interpolation or extrapolation performed by substitution unit 38. Selection unit 44 may be configured to select one or more frames for use in interpolation or extrapolation of the extra frame based on the analysis output by analysis unit 42. In addition, in some cases, selection unit 44 may be configured to direct substitution unit 38 to enable or disable frame substitution, e.g., when the analysis indicates that none of the candidate reference frames is suitable for use in frame substitution at an acceptable quality level.
As shown in FIG. 3, selection unit 44 may generate a frame selection signal or command that directs frame substitution unit 38 to select one or more frames from received frame buffer 34 for use in interpolation or extrapolation of a frame to be added. For example, for each frame to be added, selection unit 44 may direct frame substitution unit 38 to select a previous frame and a future frame for use in interpolation. Frame substitution unit 38 may then use the selected frames as reference frames for interpolation of a skipped frame. As an illustration, the previous and future frames may be used to interpolate a block in the frame to be added based on a motion vector extending between corresponding blocks in the previous and future frames, e.g., as shown in FIG. 2A. As another illustration, selection unit 44 may direct frame substitution unit 38 to select a pair of previous frames for use in extrapolation of an extra frame.
The reference frames selected for frame substitution may be the frames closest to the frame to be added, or frames further away from the frame to be added. In some cases, even though they would ordinarily be selected in a general FRUC process, the closest frames (i.e., the nearest previous and nearest future frames) may actually have characteristics that make them less suitable than other reference frames for use in interpolation or extrapolation, e.g., due to VBR coding or other factors. Analysis unit 42 analyzes the candidate reference frames to provide an indication of the suitability of the frames for frame substitution. Instead of simply using the frames adjacent to the skipped frame by default, analysis unit 42 and selection unit 44 may permit frame substitution unit 38 to use reference frames that provide better results.
FIG. 4 is a block diagram illustrating another example of the video decoder 14 of FIG. 1 in greater detail. In the example of FIG. 4, video decoder 14 corresponds substantially to the video decoder of FIG. 3. However, FRUC unit 22 of the video decoder 14 of FIG. 4 further includes a mode selection unit 46 and a resource monitor 48. In addition, FRUC unit 22 may optionally include a delay detection unit 51. Mode selection unit 46 may support two or more operating modes for FRUC unit 22. For example, FRUC unit 22 may be configured to operate in a quality-focused mode as a first operating mode, or in a resource-focused mode as a second operating mode. In the example of FIG. 4, FRUC unit 22 resides within video decoder 14. In other implementations, FRUC unit 22 may reside outside of video decoder 14, e.g., within a video post-processor module or a video display processor or mobile display processor.
Resource monitor 48 may be configured to monitor, detect, estimate, or otherwise determine available power, computing, and/or memory resources in the device in which video decoder 14 is provided. In some cases, resource monitor 48 may be configured to monitor a resource budget applicable to the processing of frames. Hence, resource monitor 48 may be configured to monitor actual resources available within video decoder 14 at a given time at which a frame is processed, or to monitor estimated resource consumption relative to a resource budget applicable to the processing of the frame. In response to different resource levels, resource monitor 48 may trigger mode selection unit 46 to select different modes (e.g., quality-focused or resource-focused) for operation of FRUC unit 22. Mode selection unit 46 may communicate the mode selection to FRUC analysis unit 42, which modifies its analysis of candidate frames for reference frame selection accordingly. Alternatively, as will be described, mode selection unit 46 may communicate a mode selection to enable or disable substitution (e.g., interpolation or extrapolation).
As an example, resource monitor 48 may be configured to determine a power level by monitoring or estimating consumption of processing resources in decoder 14, relative to a budget for processing resource consumption. In general, there is a correspondence between the MIPS (millions of instructions per second) expended in a processor, such as a digital signal processor (DSP), and the power consumed by the DSP operations. There is also a correspondence between the amount of data fetched from external memory and the power consumed in fetching that data. In addition, there is a correspondence between the amount of frame data sent to a display and the power expended for that operation. For a given device or chipset, these correspondences can be established reliably and then represented by entries in a look-up table. For example, MIPS, data fetch amounts, and display amounts may be used as indices that map to power consumption values in a look-up table.
For a FRUC application in a given chipset, it is possible to determine how many MIPS each operation requires, how much data is fetched from external memory, and how much data is sent to the display. In one scenario, resource monitor 48 may be configured to compute the power consumed in generating and displaying each interpolated or extrapolated frame, and compare the level of power consumed against a power budget assigned to the frame or set of frames. The power budget may be specified as a predetermined, fixed, or adjustable power level forming a design requirement for a functional unit of a device, e.g., the video decoder (or overall CODEC) in a mobile wireless handset or other device.
A power budget may be allocated for a series of frames, e.g., a group of pictures (GOP) or other video sequence. As frames are processed and power is consumed, the available power in the power budget is reduced. Resource monitor 48 may be configured, for example, to estimate how much power will be needed to interpolate or extrapolate a new FRUC frame based on the known correspondences (which may be stored in a look-up table) between power and the MIPS, data fetching, and display associated with the interpolated or extrapolated frame. As mentioned above, the power budget may be fixed (i.e., not adjusted by a feedback loop from the device), or it may be adjusted based on a feedback loop from the device.
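The look-up-based power estimate can be sketched as follows; the per-unit energy constants are invented placeholders standing in for calibrated look-up table entries for a hypothetical chipset:

```python
# Assumed per-unit energy costs for a hypothetical chipset; a real decoder
# would use calibrated look-up table entries as described in the text.
MJ_PER_MIP = 0.002           # millijoules per million instructions executed
MJ_PER_KB_FETCHED = 0.01     # millijoules per KB fetched from external memory
MJ_PER_KB_DISPLAYED = 0.005  # millijoules per KB of frame data sent to display

def estimate_fruc_energy_mj(mips, kb_fetched, kb_displayed):
    """Estimate the energy (in mJ) needed to produce and display one
    interpolated or extrapolated frame from the three measured quantities."""
    return (mips * MJ_PER_MIP
            + kb_fetched * MJ_PER_KB_FETCHED
            + kb_displayed * MJ_PER_KB_DISPLAYED)

def interpolation_enabled(remaining_budget_mj, mips, kb_fetched, kb_displayed):
    """Enable interpolation only if the estimated cost fits the remaining budget."""
    return estimate_fruc_energy_mj(mips, kb_fetched, kb_displayed) <= remaining_budget_mj
```

Subtracting each frame's estimate from the sequence budget yields the running available-power figure that the mode selection below compares against its thresholds.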
For a given frame, if sufficient power remains in the power budget to perform interpolation, the interpolation process may be enabled. Alternatively, if insufficient power remains in the power budget, the interpolation process may be disabled. In accordance with an aspect of this disclosure, instead of simply enabling and disabling interpolation, resource monitor 48 and mode selection unit 46 may be configured to select different interpolation/extrapolation modes. In particular, based on the available resource level determined by resource monitor 48, mode selection unit 46 may select the quality-focused mode or the resource-focused mode.
For example, if the MIPS, data fetching, and/or display for previous frames in a video sequence have reduced the available power budget below a first predetermined threshold, resource monitor 48 may indicate to mode selection unit 46 that the resource-focused mode should be selected. Alternatively, if the available power budget is above the first predetermined threshold, resource monitor 48 may indicate to mode selection unit 46 that the quality-focused mode should be selected. In some aspects, if the available power budget is below a second predetermined threshold that is less than the first predetermined threshold, resource monitor 48 may indicate that interpolation/extrapolation should be disabled in favor of frame repetition, or possibly that neither interpolation, extrapolation, nor frame repetition should be enabled, in order to conserve power in the system.
In monitoring the available power in the power budget, resource monitor 48 may track the MIPS expended per frame, the data fetched per frame, and the amount of frame data sent for display over a given video sequence. For example, resource monitor 48 may maintain running totals of the MIPS, fetched data, and displayed data accumulated for a particular GOP or other video sequence, and map those values to corresponding power consumption values. In particular, resource monitor 48 may map each of the values to a corresponding power consumption value (e.g., in one or more look-up tables), and then sum the values to produce a total power consumption figure. As an alternative, resource monitor 48 may access a multi-dimensional look-up table that maps a combination of MIPS, fetched-data, and displayed-data indices to a total power consumption figure for a frame or other video unit.
As another alternative, resource monitor 48 may map the values to power consumption values on a frame-by-frame basis, and cumulatively add the power consumption values to produce a running total of power consumption over the video sequence. As an illustration, if a video sequence (e.g., a sequence of 30 frames) has a power budget X, and at the nth frame in the sequence the MIPS, fetched data, and displayed data are estimated to have consumed an amount of power Y, such that the available power budget is X-Y=Z, then resource monitor 48 may compare the available power budget Z to the first predetermined threshold to select the quality-focused mode or the resource-focused mode.
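The X-Y=Z comparison against the two predetermined thresholds might look like the following; the threshold values and mode names are illustrative:

```python
def select_fruc_mode(budget_x, consumed_y, first_threshold, second_threshold):
    """Select the FRUC operating mode from the available budget Z = X - Y.

    Per the two-threshold scheme in the text (second_threshold is less than
    first_threshold): above the first threshold run quality-focused, between
    the two run resource-focused, below the second disable substitution in
    favor of frame repetition."""
    available_z = budget_x - consumed_y
    if available_z >= first_threshold:
        return "quality-focused"
    if available_z >= second_threshold:
        return "resource-focused"
    return "substitution-disabled"
```

Resource monitor 48 would re-evaluate this as the running total Y grows over the sequence, so the mode can change mid-sequence as the budget is spent.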
The power budget for the sequence may be divided among the frames in the sequence, and may be updated over the course of the sequence in various ways, to provide a more even distribution of power across the frames of the entire sequence, or to provide additional power budget for frames in the sequence that justify it, so that the power budget is not expended too unevenly within the video sequence. Various alternatives may be used to estimate power consumption relative to a power budget in order to support selection of the quality-focused mode or the resource-focused mode. Accordingly, the examples above are provided for purposes of illustration and should not be considered limiting of the techniques broadly described in this disclosure.
In the quality-focused mode, FRUC unit 22 may perform reference frame selection exclusively, primarily, or substantially based on one or more reference frame quality criteria. The quality criteria may indicate, for example, the level of interpolation or extrapolation quality likely to be produced by a selected reference frame. If none of the reference frames satisfies the quality criteria, then frame substitution may be disabled by selection unit 44 for the particular frame to be added. In that case, frame substitution unit 38 applies frame repetition instead of frame interpolation to approximate the frame to be added.
In the resource-focused mode, FRUC unit 22 may selectively enable or disable frame substitution for some frames based on a combination of resource and quality considerations. For example, when FRUC analysis unit 42 determines that the video sequence is substantially static in the vicinity of the frame to be substituted, selection unit 44 may disable frame substitution. If the video sequence contains a low level of motion, there may be little or no advantage in performing substitution, because the visual difference between interpolation or extrapolation and frame repetition is likely to be small or even imperceptible. In that case, by disabling substitution, decoder 14 can avoid expending power and resources on interpolation or extrapolation when doing so would have minimal effect on visual quality.
When there is sufficient motion to justify substitution in the resource-focused mode, analysis unit 42 may perform a quality analysis identical or similar to the quality analysis applied in the quality-focused mode. Hence, the resource-focused and quality-focused modes are not necessarily fully independent modes. Rather, when the resource-focused mode is activated, the quality-focused mode may proceed only if the motion level is sufficient to justify interpolation or extrapolation, as applicable. The quality-focused mode may be activated independently, or the resource-focused mode may operate to disable frame substitution, thereby overriding the quality-focused mode, or to enable frame substitution, thereby activating the quality-focused mode. Notably, even in the quality-focused mode, substitution may be disabled if none of the candidate reference frames satisfies the quality criteria.
As an option, in the resource-focused mode, when substitution is enabled, the quality threshold used for the quality analysis may, for example, be adjustable according to the resource-conservation requirements of video decoder 14. For example, the threshold may be adjusted based on the available interpolation resources of video decoder 14, e.g., available power, computing, and/or memory resources. In some implementations, the quality threshold may be adjusted based on the level of available power resources, e.g., the level of power resources available to video decoder 14, a video post-processor, and/or a video display processor such as a mobile display processor (MDP). In either case, whether the threshold is fixed or adjustable, if substitution is enabled, analysis unit 42 may apply one or more quality criteria to select reference frames or to disable substitution.
As an example, if the video sequence in the vicinity of the frame to be added is characterized by very low-motion video content, the benefit of video frame interpolation or extrapolation may not be very significant. Hence, FRUC unit 22 may determine not only which frames to use as reference frames for video frame substitution, but also for which frames substitution should be performed by frame interpolation, i.e., which frames to interpolate or extrapolate. For some frames, the cost of interpolation or extrapolation may not be justified by a sufficient enhancement in temporal visual quality. For example, by avoiding interpolation for some skipped frames, FRUC unit 22 operating in the resource-focused mode can balance enhancement of interpolated frame quality and the temporal visual quality of the video sequence against savings in computing resources and associated power consumption. A zero motion vector count and/or a small motion vector count may be used as the decision criterion for determining whether to interpolate or extrapolate a particular frame based on motion content. The count thresholds may be derived in different ways. For example, the thresholds may be fixed for one or both of the zero motion vector count and the small motion vector count used to judge motion activity. Alternatively, one or both of the thresholds may be adjusted, e.g., based on the resource level of decoder 14.
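A possible form of the zero/small motion vector decision rule; the fractions and magnitude threshold are assumed values chosen for illustration, not taken from the text:

```python
def substitution_worthwhile(motion_vectors, small_mag=1.0,
                            zero_frac_thresh=0.7, small_frac_thresh=0.9):
    """Decide whether interpolating/extrapolating a skipped frame is
    justified by its motion content, using zero and small motion vector
    counts as described in the text.

    motion_vectors: list of (dy, dx) vectors for the blocks of the nearest
    reference frame.  If most vectors are zero or small, the scene is nearly
    static and frame repetition gives a visually similar result for less power."""
    if not motion_vectors:
        return False  # no motion information: treat as static
    n = len(motion_vectors)
    zero = sum(1 for dy, dx in motion_vectors if dy == 0 and dx == 0)
    small = sum(1 for dy, dx in motion_vectors
                if (dy * dy + dx * dx) ** 0.5 <= small_mag)
    if zero / n >= zero_frac_thresh or small / n >= small_frac_thresh:
        return False  # nearly static: repeat the frame instead
    return True
```

Making `zero_frac_thresh` and `small_frac_thresh` functions of the decoder's resource level would implement the adjustable-threshold alternative mentioned above.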
In the examples of FIG. 3 and FIG. 4, for quality-focused selection of reference frames for substitution, video decoder 14 may rely on a variety of quality-related criteria. For example, FRUC unit 22 may select reference frames for video frame substitution based on slice and/or frame loss information for the reference frames, in conjunction with the type and reliability of the error concealment method provided to reconstruct the reference frames. FRUC unit 22 may be configured to analyze the level of error due to losses, and the quality of the error concealment mechanisms available to correct the errors. Alternatively or additionally, FRUC unit 22 may analyze QP and CBP values associated with the candidate frames as indications of the relative quality levels of the frames. For example, a combination of QP and CBP values may be used to judge the quality of a reference frame.
FRUC unit 22 may also apply objective visual quality metrics. The objective visual spatial quality metrics may be non-reference metrics such as a structural similarity metric (SSIM), blockiness, and/or blurriness. The objective quality metrics may alternatively or additionally include color bleeding. The objective quality metrics may be used to produce quality scores for the candidate reference frames. Other examples of quality criteria may include the types of intra modes used in a candidate reference frame, or the intra mode and motion vector counts in a candidate reference frame. Additional criteria may also be utilized as desired.
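One hedged way to fold such criteria into a single quality score for ranking candidate reference frames; the weights and normalizations are illustrative assumptions, and a real implementation would compute SSIM, blockiness, and blurriness from pixel data rather than take them as inputs:

```python
def reference_quality_score(avg_qp, loss_fraction, blockiness, blurriness):
    """Fold bitstream indicators (average QP, fraction of lost slices or
    blocks) and no-reference spatial metrics (blockiness, blurriness, each
    normalized to [0, 1]) into a single score in [0, 1]."""
    qp_term = 1.0 - min(avg_qp, 51) / 51.0   # H.264-style QP range 0..51
    loss_term = 1.0 - min(max(loss_fraction, 0.0), 1.0)
    artifact_term = 1.0 - min(max((blockiness + blurriness) / 2.0, 0.0), 1.0)
    # Illustrative weighting of coarse quantization, losses, and artifacts.
    return 0.4 * qp_term + 0.3 * loss_term + 0.3 * artifact_term
```

Scores like this could feed directly into the threshold-based reference selection described for FIG. 2B, with a candidate eliminated when its losses are unconcealed.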
In some aspects, as described previously, when a given video application requires shorter delay in end-to-end processing, FRUC unit 22 may be configured to disable frame substitution or to bias the selection of reference video frames toward previous frames rather than future frames. In video telephony applications, for example, users may be engaged in real-time or near-real-time video conferencing, which may require reduction or elimination of delay in the processing and presentation of video. If a frame substitution technique relies on future frames (e.g., for interpolation or extrapolation), the delay incurred in decoding and processing those future frames to produce an extra frame may not be tolerable. In that case, when the video application imposes delay requirements, FRUC unit 22 may be configured to disable frame substitution, disallow the selection of future candidate reference frames, or disallow the selection of future candidate reference frames that are not within a predetermined number of frames of the frame to be added.
For example, as shown in Fig. 4, FRUC unit 22 may optionally include a delay detection unit 51, which detects a maximum delay or other quality of service requirement imposed by a given video application, and directs selection unit 44 to disable frame substitution or to require selection of past frames or near future frames as reference frames for interpolation or extrapolation. If delay detection unit 51 detects that a video telephony application or other delay-sensitive video application requires minimal delay, delay detection unit 51 may direct selection unit 44 to disable frame substitution. Delay-sensitive applications, or applications likely to be delay-sensitive, may be detected by delay detection unit 51 based on receipt of a particular signal from the device in which decoder 14 is embedded, receipt of a particular signal provided with side information associated with the video data received by decoder 14, or in any of a variety of other ways. Upon detection of a delay-sensitive application, frame substitution unit 38 may apply frame repetition, or cancel FRUC altogether, in order to avoid introducing delay that could degrade the quality of a video conference for the user.
Alternatively, delay detection unit 51 may direct selection unit 44 (and/or analysis unit 42) to enable frame substitution, but require that past reference frames or near future frames be used for interpolation or extrapolation (e.g., based on a required temporal distance). Hence, delay detection unit 51 may impose constraints on selection unit 44 to avoid selecting future reference frames that are identified as having sufficient quality but are located at an excessive temporal distance from the frame to be added.
For example, frames that are one or two frames in the future relative to the frame to be added may be acceptable for use as reference frames for frame substitution. However, if a future frame is at a temporal distance of several frames from the frame to be added, the processing and presentation delay resulting from waiting for that frame to be decoded and analyzed may be unacceptable. The distance analysis may be applied to candidate reference frames after the quality analysis or before the quality analysis. If the distance analysis is applied before the quality analysis, and a candidate reference frame is a future frame at an excessive distance from the frame to be added, delay detection unit 51 may direct analysis unit 42 to suspend the quality analysis of that particular candidate reference frame.
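The temporal distance constraint described above can be sketched as a simple gating function. The function name and the default distance limit of two frames are illustrative assumptions consistent with the example given, not fixed values from the patent.

```python
def future_candidate_allowed(candidate_idx, target_idx, max_future_dist=2):
    """Gate a candidate reference frame by temporal distance.

    candidate_idx: frame number of the candidate reference frame.
    target_idx: frame number of the frame to be added.
    Past candidates (distance <= 0) are always allowed; future candidates
    are allowed only within max_future_dist frames of the target, so that
    waiting for distant future frames does not add unacceptable delay.
    """
    dist = candidate_idx - target_idx
    return dist <= 0 or dist <= max_future_dist
```

A delay detection unit could lower `max_future_dist` to zero for a delay-sensitive application such as video telephony, effectively restricting selection to past reference frames.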
As an alternative, when delay is a concern, delay detection unit 51 may direct analysis unit 42 to adjust its quality analysis so that future reference frames located at an excessive distance from the frame to be added are given lower quality scores, or are excluded from consideration in the quality analysis. In either case, the effect may be that some future reference frames are excluded from selection as reference frames, so that frame substitution can proceed without adversely affecting the delay characteristics of a delay-sensitive video application (e.g., video telephony). In some cases, for video telephony or other delay-sensitive video applications, delay requirements may eliminate frame substitution entirely, e.g., due to the real-time or near-real-time service requirements of a video telephony application. For video streaming and playback, however, delay is ordinarily less of a concern.
As mentioned above, in some implementations, one or more of the candidate reference video units for interpolation or extrapolation of the extra video unit may be selected based at least in part on the temporal distance of one or more of the candidate reference video units from the extra video frame. When a delay-sensitive application is detected, candidate reference video units may be selected based on temporal distance. Alternatively, candidate reference video units may be selected based on temporal distance on a regular basis or in response to some other triggering event. Hence, in some aspects, temporal distance may form part of the quality analysis used to select reference frames for use in interpolation or extrapolation.
Fig. 5 is a block diagram illustrating a reference frame analysis unit 42 for use with the video decoder 14 of Fig. 3 or Fig. 4. As previously noted, although the analysis and selection of reference video frames is described for purposes of illustration, the structure and functionality of analysis unit 42 is adaptable to the analysis and selection of other reference video units, such as slices or blocks (e.g., macroblocks or smaller blocks). In the example of Fig. 5, analysis unit 42 includes an objective metric checker 50, an error concealment (EC) checker 52, a quantization parameter (QP) checker 54, a coded block pattern (CBP) checker 56, a quality score calculator 58, a comparator unit 59, and a motion vector (MV) checker 60.
The various units shown in Fig. 5 may be used in the quality-focused operating mode and, when interpolation is enabled, in the resource-focused operating mode. When the resource-focused operating mode is selected, e.g., by mode selection unit 46, analysis unit 42 may activate motion analyzer 64. In addition, analysis unit 42 may optionally activate mode adjustment unit 62 in the resource-focused mode. Selection unit 44 may consider the output of MV checker 60 and motion analyzer 64 to determine whether to perform frame substitution for the frame to be added and, if so, whether to select the frame being analyzed as a reference frame for the frame substitution.
Objective metric checker 50 may be configured to analyze a candidate reference frame to determine the SSIM value and/or the degree of blockiness, blurriness, or color bleeding associated with the frame, and to produce a quality score based on that determination. The quality score produced by objective metric checker 50 for a particular candidate reference frame may be low when substantial blockiness, blurriness, and/or color bleeding is detected, and high when there is substantially no blockiness, blurriness, and/or color bleeding. The quality scores of different candidate reference frames may vary between high and low according to these objective visual quality metric characteristics. Alternatively, the quality scores may be expressed as high or low based on comparison with predetermined thresholds.
When analysis unit 42 utilizes objective metric checker 50, the objective metric checker may be applied to decoded frames that have been reconstructed by decoding unit 36. Hence, analysis unit 42 may analyze encoded frames obtained from received frame buffer 34 and, via objective metric checker 50, may receive and analyze reconstructed frames obtained by decoding the frames from received frame buffer 34 via decoding unit 36. Objective metric checker 50 may analyze the reconstructed candidate reference frames for SSIM values, blockiness, blurriness, color bleeding, or other objective quality metrics to produce quality scores.
For example, each error concealment mechanism may be evaluated for its effectiveness in handling different numbers of losses, different positions of losses, different numbers of affected blocks or units, different error detection times, and the like. EC checker 52 may determine one or more of the above characteristics associated with a particular slice or frame loss (i.e., the number of losses, positions, and error detection times), and then determine whether the available error concealment mechanism will effectively handle those characteristics, given a priori knowledge of the performance of that error concealment mechanism under similar conditions.
As an illustration, if error concealment mechanism X is used in decoder 14, and mechanism X is known to be effective for losses of fewer than Y1 in number, losses located in particular regions, and losses characterized by fewer than Y2 affected blocks/units, given an error detection and processing time greater than time Z, then losses complying with the above characteristics should be concealable by applying mechanism X, in which case EC checker 52 may produce a high EC score.
In various cases, error concealment may fail if any one of the characteristics is not satisfied, or if certain combinations of the characteristics are present in a slice or frame loss. If a loss does not comply with the above characteristics, EC checker 52 may produce a low EC score. Hence, EC checker 52 may rely on knowledge of the type of error concealment mechanism to determine its effectiveness in concealing slice or frame losses with different characteristics.
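The EC scoring rule can be sketched along the lines of the mechanism-X illustration above. The parameter names mirror the placeholders Y1, Y2, and Z; the binary 1.0/0.0 scoring and the omission of the loss-region criterion are simplifying assumptions for illustration only.

```python
def ec_score(num_losses, affected_units, detect_time,
             max_losses, max_affected, min_detect_time):
    """High EC score when the loss pattern matches conditions the available
    error concealment mechanism is known to handle effectively.

    max_losses, max_affected, min_detect_time correspond to the Y1, Y2, and Z
    limits characterizing that mechanism's a priori effective operating range.
    """
    concealable = (num_losses < max_losses
                   and affected_units < max_affected
                   and detect_time > min_detect_time)
    return 1.0 if concealable else 0.0
```

In practice the score could be graded rather than binary, e.g., degrading smoothly as the loss characteristics approach the mechanism's limits.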
QP checker 54 analyzes the quantization parameter (QP) values associated with each candidate reference frame. In general, a QP value indicates the quantization step size for the transform coefficients in an associated block of an encoded video frame. The QP value may be the same for all blocks in a frame, or may vary for different blocks. As an example, QP checker 54 may analyze the average QP value of the blocks forming a reference frame. Alternatively, QP checker 54 may analyze the maximum or minimum QP value of the frame. If the average QP value of a frame is high (indicating coarse quantization in the case of video data coded according to H.264), QP checker 54 may produce a relatively low score. In H.264 coding, a smaller QP value indicates a finer quantization step size and generally higher video quality. Hence, for H.264 coding, if the average QP value is low, QP checker 54 may produce a higher quality score.
If the QP value is below a predetermined QP threshold and the CBP value is approximately zero (indicating that substantially none of the blocks has nonzero coefficients), the quality of the frame may be indicated by a high score. A CBP value that can generally be regarded as zero may be determined when the CBP values indicate that the number of blocks having at least one nonzero coefficient is below a CBP threshold. Alternatively, if the QP value is above the QP threshold, the score may be low. A very high QP with a zero or medium CBP value may indicate a low score. If the CBP is zero and the QP value is below the threshold, the prediction of the blocks is very accurate, which should yield a high quality score.
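The QP/CBP rule above can be sketched as a small scoring function. The threshold defaults and the three score levels are illustrative assumptions; the patent specifies only the qualitative relationships (low QP with near-zero CBP is high quality, high QP is low quality).

```python
def qp_cbp_score(avg_qp, nonzero_cbp_fraction, qp_thresh=30, cbp_thresh=0.05):
    """Map the QP/CBP characteristics of a candidate frame to a score in [0, 1].

    avg_qp: average quantization parameter over the frame's blocks.
    nonzero_cbp_fraction: fraction of blocks with at least one nonzero
    coefficient (the CBP is "approximately zero" below cbp_thresh).
    """
    if avg_qp < qp_thresh and nonzero_cbp_fraction < cbp_thresh:
        return 1.0   # fine quantization and accurate prediction: high quality
    if avg_qp >= qp_thresh:
        return 0.0   # coarse quantization: low quality regardless of CBP
    return 0.5       # fine quantization but substantial coded residual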
As an illustration, in some implementations, quality score calculator 58 may apply larger weights to the scores output by QP checker 54 and CBP checker 56 than to the scores output by EC checker 52 and objective metric checker 50. In other implementations, the output of EC checker 52 may be more important. As stated above, quality score calculator 58 may weight the outputs. Alternatively, each of checkers 50, 52, 54, and 56 may be configured to individually produce weighted scores using pre-assigned weights. Quality score calculator 58 outputs a total score (PTS) indicating the quality of the candidate reference frame for interpolation of the skipped frame.
The PTS may indicate whether a candidate reference frame is likely to produce a frame substitution result having an acceptable quality level, given the quality characteristics (e.g., objective metrics, EC characteristics, QP and CBP characteristics) used to compute the score. Comparator unit 59 may compare the PTS to a quality threshold. If the PTS is satisfactory, e.g., above the quality threshold, comparator unit 59 indicates that the candidate reference frame has an acceptable quality level for selection as a reference frame for interpolation or extrapolation (as applicable) of the extra (i.e., additional) frame. If the PTS is less than the quality threshold, comparator unit 59 determines that the candidate reference frame is not yet acceptable for selection as a reference frame for interpolation or extrapolation (as applicable). In either case, analysis unit 42 may then proceed to analyze the next candidate reference frame for the current extra frame under consideration, or proceed to analyze reference frames for the next frame to be added. The number of candidate reference frames considered for a particular frame may vary according to design considerations.
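The weighted total score and threshold comparison described above can be sketched as follows. The weight values shown are hypothetical, chosen to reflect the example in which QP/CBP scores receive larger weights than the EC and objective metric scores.

```python
# Hypothetical per-checker weights; in one example above, QP and CBP scores
# are weighted more heavily than the EC and objective metric scores.
WEIGHTS = {"objective": 0.2, "ec": 0.2, "qp": 0.3, "cbp": 0.3}

def total_score(scores, weights=WEIGHTS):
    """Combine per-checker scores (each in [0, 1]) into a total score (PTS)."""
    return sum(weights[k] * scores[k] for k in scores)

def passes(scores, quality_threshold, weights=WEIGHTS):
    """True when the candidate's PTS meets or exceeds the quality threshold."""
    return total_score(scores, weights) >= quality_threshold
```

A candidate that fails the comparison would simply be skipped, and analysis would continue with the next candidate reference frame.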
If the PTS is satisfactory (e.g., meets or exceeds the quality threshold), comparator unit 59 may indicate that the candidate reference frame under consideration is suitable, on a quality basis, for selection as a reference frame. Alternatively, comparator unit 59 may select the highest ranked reference frames, as described above. In either case, in some implementations, analysis unit 42 may further include a motion vector (MV) checker 60 to approve reference frames for selection. MV checker 60 may analyze the reliability of the motion vectors in the candidate reference frame, to ensure that, in the case of frame substitution utilizing motion-compensated prediction methods for interpolation or extrapolation, the selected reference frame will produce a quality frame substitution result.
If the motion vectors are reliable, MV checker 60 may indicate to selection unit 44 that the candidate reference frame under analysis may be selected as a reference frame for frame substitution. However, if the motion vectors are not reliable, MV checker 60 may reject the candidate reference frame, even if it has satisfied the quality requirements of comparator unit 59. In that case, MV checker 60 may indicate to selection unit 44 that the candidate reference frame under analysis should not be selected as a reference frame for frame substitution.
In some implementations, comparator unit 59 may indicate frames suitable for selection as reference frames as they are considered. In other words, as analysis unit 42 analyzes each candidate reference frame, if it identifies a suitable candidate reference frame, it may indicate that the candidate reference frame should be selected. The process may continue until analysis unit 42 identifies a sufficient number and type of candidate reference frames, at which point analysis unit 42 may terminate the analysis of candidate reference frames for interpolation or extrapolation of the current frame, and proceed to the analysis of candidate reference frames for interpolation of the next frame to be added in the video sequence.
As a simple illustration, analysis unit 42 may identify a single previous reference frame and a single future reference frame for interpolation of a frame that resides temporally between the previous and future frames. Alternatively, for more complex types of interpolation, analysis unit 42 may identify multiple previous and future reference frames that may be selected for interpolation of the frame to be added. In each case, analysis unit 42 may analyze a subset of the previous and future frames adjacent to the frame to be added until a sufficient number and type (e.g., previous and future, if necessary) are identified for selection. A finite number of frames may be analyzed. If none of the analyzed frames produces a sufficient quality score, analysis unit 42 may indicate that no frame is selected, and frame substitution unit 38 should apply frame repetition rather than frame substitution.
In some implementations, analysis unit 42 may use other types of quality criteria, for example, the types of intra coding modes used in a candidate reference frame and/or the intra mode and motion vector counts of the candidate reference frame. As an example, if the number of intra-coded blocks (e.g., macroblocks) in a particular candidate reference frame exceeds a mode decision threshold, analysis unit 42 may indicate a low interpolation quality for that reference frame, e.g., such that the candidate reference frame should not be used for interpolation of the extra frame. The mode decision threshold may be static or dynamically adjusted, e.g., based on the number of blocks in the candidate reference frame having a number of coded bits above a threshold. A wide variety of other quality criteria may be used. The described information for quality analysis may be considered as an alternative to, or in addition to, the other information described in this disclosure (e.g., objective quality metrics, QP and CBP characteristics, and EC characteristics).
Some extra frames may be interpolated or extrapolated using some of the same reference frames as other nearby frames to be interpolated or extrapolated. For this reason, in some cases, once a candidate reference frame has been analyzed for quality, it may be desirable to store information relating to the quality of that candidate reference frame. In this way, if the particular candidate reference frame is later considered as a candidate reference frame again, its quality can be determined quickly without performing the analysis anew. A candidate reference frame may remain relevant as a potential reference frame for interpolation or extrapolation of several frames over a portion of the video sequence.
As the video sequence proceeds, the stored information may become outdated, or at least less relevant, due to the increasing temporal remoteness of the candidate reference frame relative to the frames to be interpolated or extrapolated. Accordingly, the information may be discarded at some point, e.g., after the video sequence has proceeded to a point more than a predetermined number of frames away from the candidate reference frame. By storing the quality information, it may be necessary to perform the quality analysis only once for each candidate reference frame in analysis unit 42. Alternatively, the quality analysis may be performed each time a frame is identified as a candidate reference frame for a frame to be added.
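The store-and-expire behavior described above can be sketched as a small cache keyed by frame number. The class name and the default age limit are illustrative assumptions; the patent specifies only that stored quality information is discarded once the sequence has advanced more than a predetermined number of frames past the candidate.

```python
class QualityCache:
    """Stores per-frame quality scores; entries expire once the sequence has
    advanced more than max_age frames past the candidate reference frame."""

    def __init__(self, max_age=8):
        self.max_age = max_age
        self.scores = {}  # frame_number -> quality score

    def store(self, frame_number, score):
        self.scores[frame_number] = score

    def lookup(self, frame_number):
        # None means the frame must be (re)analyzed.
        return self.scores.get(frame_number)

    def advance(self, current_frame_number):
        """Discard stale entries as the video sequence proceeds."""
        self.scores = {n: s for n, s in self.scores.items()
                       if current_frame_number - n <= self.max_age}
```

With such a cache, a candidate analyzed once for one skipped frame need not be reanalyzed when considered for a nearby skipped frame.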
In some implementations, analysis unit 42 may be configured to rank the analyzed candidate reference frames by relative quality level. For example, analysis unit 42 may include a ranking unit (67A or 67B) that ranks the candidate reference frames. In this case, instead of terminating when a suitable number and type of frames having quality scores that satisfy the quality threshold are found, analysis unit 42 may rank the candidate reference frames to identify the frames producing the best quality scores. For example, analysis unit 42 may rank, in order of total quality score, the candidate reference frames that satisfy the quality level (e.g., meet or exceed the quality threshold), as indicated by comparator unit 59, and select the highest ranked candidate reference frames.
As one option, referring to Fig. 5, analysis unit 42 may include ranking unit 67A, which evaluates and ranks the candidate reference frames identified as satisfactory (i.e., as meeting or exceeding the quality threshold) by comparator unit 59. Ranking unit 67A may select a plurality of the highest ranked reference frames, consistent with the number of reference frames needed for the type of frame substitution applied by frame substitution unit 38. The selected (i.e., highest ranked) candidate reference frames may then be passed from ranking unit 67A to MV checker 60 to determine whether they have reliable MV content. If so, MV checker 60 may indicate to selection unit 44 that the frames should be selected for frame substitution. If the highest ranked candidate reference frames do not have reliable MV content, MV checker 60 may indicate to selection unit 44 that frame substitution unit 38 should apply frame repetition rather than frame substitution.
As another option, because the rankings may be affected by unreliable MV content, analysis unit 42 may pass all ranked candidate reference frames having satisfactory quality scores (e.g., meeting or exceeding the quality threshold) to MV checker 60. In particular, comparator unit 59 may provide all candidate reference frames having passing scores to MV checker 60. MV checker 60 identifies the candidate reference frames having both satisfactory quality scores (e.g., meeting or exceeding the quality threshold, as indicated by comparator unit 59) and reliable MV content. Analysis unit 42 may optionally include ranking unit 67B, which receives the output of MV checker 60 and ranks the candidate reference frames by quality score. Ranking unit 67B may then select a plurality of the highest ranked candidate reference frames and communicate the selected reference frames to selection unit 44. Selection unit 44 may notify frame substitution unit 38 of the selected reference frames for interpolation of the video frame in the frame substitution operation.
MV checker 60 may disregard candidate reference frames without reliable MV content, leaving the remaining candidate reference frames for ranking by ranking unit 67B and selection as reference frames. In particular, ranking unit 67B may identify to selection unit 44 the highest ranked candidate reference frames having reliable MV content, for selection as reference frames for interpolation. Again, the number of selected reference frames may vary according to the type of frame substitution applied by frame substitution unit 38 (i.e., the type of interpolation or extrapolation) and the number of reference frames needed to support the frame substitution process.
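The combined threshold/MV-check/ranking flow of this option can be sketched as follows. The function signature is a hypothetical simplification in which the MV reliability check is passed in as a predicate rather than modeled in detail.

```python
def select_references(candidates, quality_threshold, mv_reliable, num_needed):
    """Rank-and-select sketch of the second option described above.

    candidates: dict mapping frame id -> total quality score (PTS).
    mv_reliable: predicate returning True when a frame's MV content is reliable.
    Returns up to num_needed frame ids, highest quality first, keeping only
    frames that pass both the quality threshold and the MV reliability check.
    """
    passing = [(score, fid) for fid, score in candidates.items()
               if score >= quality_threshold and mv_reliable(fid)]
    passing.sort(reverse=True)  # best quality score first
    return [fid for _, fid in passing[:num_needed]]
```

Note that a high-scoring candidate with unreliable motion vectors is dropped before ranking, which is precisely why this option defers ranking until after the MV check.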
MV checker 60 may analyze motion vector reliability using any of a variety of different techniques. As one example, MV checker 60 may be configured to use a motion difference-based method. MV checker 60 may operate to analyze the motion vector information in both the X (horizontal) and Y (vertical) directions of a video frame. In this case, MV checker 60 may determine whether the difference between the X-direction motion vector of a block (e.g., a macroblock) in the candidate reference frame and the X-direction motion vector of the co-located block in a previous frame exceeds a threshold. If so, MV checker 60 may determine that the motion vectors in the candidate reference frame are unreliable. For example, MV checker 60 may count the number of unreliable motion vectors in the candidate reference frame, or determine the overall average difference between the motion vectors in the candidate reference frame and the previous reference frame.
In addition to determining motion vector reliability in the X (e.g., horizontal) direction, MV checker 60 may determine motion vector reliability in the Y (e.g., vertical) direction in a similar manner. If MV checker 60 detects X-direction MV unreliability or Y-direction MV unreliability, the MV checker may disqualify the current candidate reference frame, and indicate to selection unit 44 that the candidate reference frame should not be selected as a reference frame for interpolation of the skipped frame. Alternatively, in some implementations, both the magnitude and direction of the motion vectors may be assessed using angle information. However, if the MVs of the candidate reference frame are reliable, MV checker 60 indicates to selection unit 44 that the candidate frame may be selected as a reference frame for interpolation or extrapolation of the additional frame.
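The motion difference-based reliability check can be sketched as follows. The count-based rejection rule and the default thresholds are illustrative assumptions; the patent also mentions an overall-average-difference variant, which is not shown.

```python
def mv_unreliable(cand_mvs, prev_mvs, diff_thresh=8, max_bad_fraction=0.25):
    """Motion difference-based MV reliability check (sketch).

    cand_mvs, prev_mvs: lists of (mvx, mvy) for co-located blocks in the
    candidate reference frame and a previous frame. A block's vector is
    counted as unreliable when its X- or Y-component differs from the
    co-located block's by more than diff_thresh; the frame as a whole is
    rejected when the fraction of unreliable vectors is too large.
    """
    bad = 0
    for (cx, cy), (px, py) in zip(cand_mvs, prev_mvs):
        if abs(cx - px) > diff_thresh or abs(cy - py) > diff_thresh:
            bad += 1
    return bad / len(cand_mvs) > max_bad_fraction
```

A frame whose motion field closely tracks the previous frame passes, while a frame with erratic, discontinuous vectors is disqualified even if its quality score was satisfactory.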
Alternatively, MV checker 60 may analyze MV reliability using a frame-to-frame motion change detection technique. According to this technique, MV checker 60 may be configured to detect when the motion in the current candidate reference frame changes substantially from the motion in another nearby frame (e.g., a previous or future frame). If the magnitude of the change is greater than a threshold, MV checker 60 may determine that the MVs of the candidate reference frame are unreliable, and that the frame should not be selected as a reference frame for frame substitution.
For the frame-to-frame motion change detection technique, the following two methods may be used to detect whether the motion in the candidate reference frame and a nearby frame is continuous. First, motion change detection may be based on motion statistics. In this case, motion vector statistics are computed for the two frames (i.e., the candidate reference frame and the nearby frame). The statistics may include the mean and standard deviation of the motion vectors (magnitude and angle). Second, motion change detection may be based on motion vector labeling. Statistics-based motion change detection makes its decision based on frame-level motion, and may fail to detect the motion difference for each pair of co-located macroblocks in the two frames. To address this problem, motion change detection based on motion vector labeling may be used.
As another alternative, MV checker 60 may analyze MV reliability using a motion trajectory-based technique. In this case, the motion trajectory-based method determines whether the motion vector from a nearby frame should be used by checking where a macroblock would land in the candidate frame if the macroblock followed the same trajectory as in the nearby frame. If the object carried by the macroblock has significant overlap with the region of interest in the candidate frame (i.e., the position of the lost macroblock), its MV may be considered reliable, and the candidate frame may be used for frame substitution. Otherwise, if it moves away from the region of interest, its MV is unreliable, and the candidate frame should not be used for frame substitution.
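The trajectory overlap test can be sketched geometrically. The rectangle-intersection formulation and the 50% overlap default are assumptions introduced for illustration; the patent says only that the projected block must overlap the region of interest "significantly."

```python
def trajectory_mv_reliable(block_x, block_y, mv, block_size, roi,
                           min_overlap=0.5):
    """Project a block along its motion vector and measure how much of it
    lands inside the region of interest (the lost block's position).

    roi: (x, y, width, height) of the region of interest.
    Returns True when the overlapping area is at least min_overlap of the
    block's area, i.e., the motion vector is considered reliable.
    """
    nx, ny = block_x + mv[0], block_y + mv[1]  # projected block position
    rx, ry, rw, rh = roi
    # Axis-aligned rectangle intersection (clamped to zero when disjoint).
    ox = max(0, min(nx + block_size, rx + rw) - max(nx, rx))
    oy = max(0, min(ny + block_size, ry + rh) - max(ny, ry))
    return (ox * oy) / (block_size * block_size) >= min_overlap
```

A block whose trajectory carries it directly onto the lost region passes the test; one whose trajectory carries it away fails, disqualifying that motion vector.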
The quality threshold used by comparator unit 59 may be predetermined and selected to represent a quality level believed to correlate with an acceptable frame substitution result. The quality threshold may be fixed or adjustable. In some implementations, for example, the quality threshold may be adjustable according to changes in the operating mode of decoder 14. In operation, analysis unit 42 may first check whether the resource-focused mode is on or off. If mode selection unit 46 indicates the quality-focused mode, mode adjustment unit 62 may leave the existing quality threshold unadjusted. However, if mode selection unit 46 indicates the resource-focused mode, mode adjustment unit 62 may adjust the quality threshold. For example, in some implementations, mode adjustment unit 62 may increase the quality threshold to require that higher quality frames be selected as reference frames for interpolation or extrapolation of a frame.
In some implementations, when the resource-focused mode is selected, the quality threshold may be increased from a first value to a second, increased value. In other implementations, when the resource-focused mode is selected, the quality threshold may be increased to a value computed according to changes in available resource levels. For example, as resources such as available power or available computing resources become lower, the quality threshold may be made higher. In this way, the quality threshold may be inversely related to the available resource level, so that when resource levels are low, higher quality is required to justify the cost of frame substitution, i.e., the cost of frame interpolation or extrapolation. Hence, in some examples, in the resource-focused mode, analysis unit 42 may use a motion activity threshold to determine whether to enable or disable frame substitution and, when frame substitution is enabled, use a fixed quality threshold or a quality threshold adjusted according to changes in available resource levels.
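The inverse relationship between available resources and the quality threshold can be sketched as a simple adjustment function. The linear form and the [0, 1] resource scale are illustrative assumptions; the patent requires only that the threshold rises as resources fall in the resource-focused mode.

```python
def quality_threshold(base_thresh, resource_level, resource_focused):
    """Adjust the quality threshold according to the operating mode.

    resource_level: available power/compute in [0, 1] (1 = plentiful).
    In the quality-focused mode the base threshold is used unchanged; in the
    resource-focused mode the threshold rises toward 1.0 as resources drop,
    so that scarce resources demand higher quality to justify substitution.
    """
    if not resource_focused:
        return base_thresh
    # Inverse relation: threshold grows as the resource level shrinks.
    return base_thresh + (1.0 - base_thresh) * (1.0 - resource_level)
```

With this shape, a fully resourced decoder in the resource-focused mode behaves like the base threshold, while a nearly depleted one effectively disables substitution unless a near-perfect reference frame is available.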
If none of the candidate reference frames for interpolation or extrapolation of a particular frame produces a quality score satisfying the increased quality threshold applied in the resource-focused mode, then no frame is selected for interpolation or extrapolation of the frame. In that case, selection unit 44 disables interpolation, whereupon substitution unit 38 suspends frame substitution and instead applies frame repetition rather than frame substitution. By requiring the increased quality threshold in the resource-focused mode, mode adjustment unit 62 in effect forces frame substitution unit 38 to disable frame substitution unless higher quality reference frames are available for the frame substitution.
Mode selection unit 46 may select the resource-focused mode when resources are limited, e.g., when power, computing, and/or memory resources are scarce. In the resource-focused mode, video frame interpolation may be disabled when the benefit of video frame interpolation is not very significant, for example, in the case of very low motion video content. In this way, when resources are limited, mode selection unit 46 and mode adjustment unit 62 require higher quality for frame interpolation or extrapolation to be justified. In other words, to justify the cost of frame substitution (in terms of resource consumption), the actual result of the frame substitution should have relatively high visual quality. When resources are not scarce, such that mode selection unit 46 selects the quality-focused mode, mode adjustment unit 62 may decrease the quality threshold. The decreased quality threshold permits frame substitution to proceed more often, because more candidate frames will likely be able to satisfy the decreased quality threshold.
In the resource-focused mode, mode selection unit 46 may also activate motion analyzer 64. In some embodiments, in addition to activating mode adjustment unit 62 to adjust the quality threshold, motion analyzer 64 may analyze the motion activity of one or more candidate reference frames to determine whether the relevant video scene is relatively static. Motion analyzer 64 may analyze the motion vectors from the candidate reference frames and determine from them whether the video scene is characterized by very little motion or by significant motion. For instance, motion analyzer 64 may analyze the motion vectors of the current anchor frame to make the decision to enable or disable video frame substitution. The anchor frame may be a frame adjacent to (for example, before or after) the skipped frame.
If the anchor frame indicates very little motion, so that the scene is relatively static, motion analyzer 64 may produce an output to disable frame substitution. If the scene is relatively static, interpolation of the temporally skipped frame ordinarily will not be justifiable (relative to frame repetition), even if the selected reference frames would produce the highest-quality frame substitution result. As an illustration, when a scene is relatively static, the difference in visual quality produced by frame interpolation versus frame repetition is relatively imperceptible. For this reason, interpolation, and the cost of the power consumed by the video data traffic between the video buffer and the display, is not justified by any significant gain in quality, and should be disabled. In that case, from the standpoint of quality and resource savings, frame repetition is more desirable. When motion analyzer 64 indicates that frame substitution should be disabled, the quality-focused analysis of the reference frames currently under consideration may likewise be disabled.
In some embodiments, motion analyzer 64 may compare the motion activity to a threshold. The motion threshold may be fixed or adjustable. For instance, motion analyzer 64 may adjust the motion threshold based on the level of available power, computing and/or memory resources. In the adjustable case, for instance, if power resources are at a relatively high level, the motion activity threshold may be relatively low. Conversely, if power resources are at a relatively low level, the motion threshold may be relatively high. In either case, motion activity at or above the threshold may trigger interpolation, subject to the selection of one or more reference frames for use in frame substitution, consistent with the quality-focused operating mode. For a higher power level, a low threshold means that less motion may be needed to trigger frame substitution. For a low power level, a higher threshold means that more motion may be needed to trigger frame substitution.
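An adjustable motion-activity threshold of this kind might be sketched as a simple interpolation between a low threshold (used at high power) and a high threshold (used at low power). The linear interpolation and the [0, 1] power level are assumptions for illustration only:

```python
def motion_threshold(low_th, high_th, power_level):
    """Interpolate the motion-activity threshold between low_th (applied
    at full power) and high_th (applied at low power). power_level is an
    assumed fraction of available power in [0, 1]."""
    power_level = min(max(power_level, 0.0), 1.0)  # clamp to valid range
    return high_th - (high_th - low_th) * power_level

def frame_substitution_triggered(motion_activity, low_th, high_th, power_level):
    # Motion at or above the threshold triggers interpolation, subject
    # to later quality-based reference frame selection.
    return motion_activity >= motion_threshold(low_th, high_th, power_level)
```

At full power the low threshold applies, so moderate motion already triggers substitution; at low power the same motion level falls short and frame repetition remains in effect.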
When mode selection unit 46 selects the resource-focused mode, motion analyzer 64 and (optionally) mode adjustment unit 62 may be activated; when mode selection unit 46 selects the quality-focused mode, motion analyzer 64 and (optionally) mode adjustment unit 62 may be deactivated. In the quality-focused mode, decoder 14 may operate to produce desirable visual quality. In the resource-focused mode, however, decoder 14 may combine quality and resource-saving objectives. Mode selection unit 46 may select the resource-focused mode in response to detection of limited resources, for example, by comparing the available resources to one or more resource thresholds. Thus, in some embodiments, mode selection unit 46 may select the quality-focused mode by default, and select the resource-focused mode based on the level of available resources. Alternatively, the resource-based mode may be the default mode, in which case mode selection unit 46 selects the quality-focused mode when the available resources are above one or more resource thresholds.
In effect, in the resource-focused mode, mode selection unit 46 may direct selection unit 44 to selectively enable or disable interpolation or extrapolation of additional video units, whereas in the quality-focused mode it directs selection unit 44 to enable interpolation or extrapolation of additional video units. In particular, in the resource-focused mode, mode selection unit 46 may direct selection unit 44 to selectively enable or disable interpolation or extrapolation by triggering motion analyzer 64. Selection unit 44 may then selectively enable or disable interpolation or extrapolation based on the output of motion analyzer 64. Alternatively, mode selection unit 46 may direct selection unit 44 to enable interpolation or extrapolation, for example, by not triggering motion analyzer 64. Enabling of interpolation by selection unit 44 may still be subject to the quality analysis performed by analysis unit 42, so that either a reference frame is selected or interpolation or extrapolation is disabled when no suitable reference frame is available.
As an additional optional feature, in some aspects, analysis unit 42 may include distance unit 63. As described above with reference to Fig. 4, delay detection unit 51 may detect operation of a delay-sensitive video application (for example, a video telephony application). Delay detection unit 51 may direct selection unit 44 to avoid selection of future reference frames that are too far in the future relative to the frame to be added, in order to avoid processing and presentation delays that would undermine the visual quality of video telephony. Accordingly, selection unit 44 may reject some candidate reference frames based at least in part on an analysis of the temporal distance of those frames from the frame to be added, even if analysis unit 42 indicates that those frames have relatively high quality. Although temporal-distance-based selection of reference frames may be triggered by detection of a delay-sensitive application, in some embodiments temporal distance may be used for regular reference frame selection, that is, with or without detection of a delay-sensitive application.
As shown in Fig. 5, the delay feature may be built into analysis unit 42. In particular, instead of rejecting, via selection unit 44, future candidate reference frames that reside too far in the future relative to the frame to be added (that is, based on excessive temporal distance), analysis unit 42 may be configured to assign relatively low quality scores to such frames. Selection unit 44 may then reject such frames not directly based on temporal distance, but indirectly based on the low scores resulting from temporal distance. In either case, selection unit 44 can avoid selection of temporally remote reference frames that would introduce excessive latency into the frame substitution process. Again, temporal distance may be used to selectively choose reference frames when operation of a delay-sensitive application (for example, video telephony) is detected, or regularly (for example, regardless of whether a delay-sensitive application is detected). In some cases, reference frames may be regularly selected based at least in part on temporal distance, without involving detection of operation of a delay-sensitive application.
As one of the various frame characteristics, distance unit 63 may determine the distance of a future candidate reference frame from the frame to be added. Distance unit 63 may be configured to produce progressively lower scores for future candidate reference frames that are farther from the frame to be added. In some cases, if a candidate reference frame is more than a maximum number of future frames away from the frame to be added, distance unit 63 may produce a score of zero.
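A distance score with these two properties (monotonically decreasing, zero beyond the look-ahead window) might look like the following. The linear falloff is an assumption; the description only requires the score to decrease with temporal distance and to be zero past the maximum.

```python
def distance_score(candidate_index, max_future_frames, max_score=1.0):
    """Progressively lower score for future candidates farther from the
    frame to be added; zero beyond the allowed look-ahead window.
    candidate_index: 1 for the nearest future frame, 2 for the next, ..."""
    if candidate_index > max_future_frames:
        return 0.0
    # Linear falloff is an illustrative choice, not mandated by the text.
    return max_score * (max_future_frames - candidate_index + 1) / max_future_frames
```

Under a delay-sensitive application this score (or an increased weighting of it) would pull the overall quality score of distant future frames down, steering selection toward temporally nearby references.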
Distance unit 63 may be activated by delay detection unit 51 when a delay-sensitive video application is detected. Alternatively, when a delay-sensitive application is detected, delay detection unit 51 may cause the output of distance unit 63 to carry an increased weighting. When an application such as video playback is not significantly delay-sensitive, distance unit 63 may be deactivated, or its output score may carry a reduced weighting in the overall quality score.
The various components, units or modules included in system 10, encoder 12 and decoder 14 of Figs. 1 through 5, as well as other components described throughout this disclosure, may be realized by any suitable combination of hardware and/or software. In Figs. 1 through 5, various components are depicted as separate components, units or modules. However, all or some of the various components described with reference to Figs. 1 through 5 may be integrated into combined units or modules within common hardware and/or software. Accordingly, the representation of features as components, units or modules is intended to highlight particular functional features for ease of illustration, and does not necessarily require realization of such features by separate hardware or software components. In some cases, various units may be implemented as programmable processes executed by one or more processors. For instance, in various aspects, the motion analyzer, resource monitor and selection unit may be realized by separate hardware and/or software units, by the same hardware and/or software unit, or by a combination thereof.
Fig. 6 is a flow chart illustrating an exemplary technique by which video decoder 14 selects reference frames for video frame substitution. The process shown in Fig. 6 may be performed in the quality-focused mode or the resource-focused mode when frame substitution is enabled. As shown in Fig. 6, video decoder 14 receives input video frames (68). In the case of decoder 14, the input video frames may be received as encoded frames in an incoming bit stream, in which some frames are missing due to frame skipping or the basic frame rate of encoder 12. In the case of encoder 12, the input video frames may be source video frames to be encoded by the encoder. Temporally excluded frames may be frames intentionally skipped by video encoder 12, frames lost in transmission across channel 19, or frames not supported by the basic frame rate of encoder 12 that need to be additionally formed for frame rate conversion. In the example of Fig. 6, analysis unit 42 analyzes one or more characteristics of a set of candidate reference frames for use in frame substitution (70).
The candidate reference frames may be selected from N previous frames and M future frames relative to the frame to be added, where N and M may be equal or unequal. As an illustration, three previous frames and three future frames may be considered, but any given number of frames is for purposes of illustration and should not be considered limiting. Both previous and future frames may be considered for bidirectional interpolation. In some aspects, for unidirectional interpolation or extrapolation, only previous frames or only future frames may be considered.
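Construction of such a candidate set might be sketched as follows, assuming frames are identified by display-order indices; the function name and the mode labels are hypothetical:

```python
def candidate_window(frame_index, n_previous, m_future, mode="bidirectional"):
    """Build the candidate reference set around the frame to be added at
    frame_index: N previous and M future frame indices (N and M need not
    be equal). Unidirectional modes restrict the set to one side."""
    previous = [frame_index - k for k in range(1, n_previous + 1)]
    future = [frame_index + k for k in range(1, m_future + 1)]
    if mode == "forward":      # use past frames only (e.g., extrapolation)
        return previous
    if mode == "backward":     # use future frames only
        return future
    return previous + future   # bidirectional interpolation
```

For the illustrative N = M = 3 case around a skipped frame, the window holds the three nearest frames on each side.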
The one or more characteristics of a frame may relate to the quality of the frame, and may be analyzed using pixel-domain values, transform-domain values, bit stream data, or the like. Based on the analysis, analysis unit 42 selects one or more of the candidate frames as reference frames (72). Using the selected candidate frames as reference frames, frame substitution unit 38 performs frame substitution by interpolation (or extrapolation, if applicable) of the frame to be added (74). The process outlined in Fig. 6 may be repeated substantially continuously for interpolation (or extrapolation) of additional video frames in the bit stream received by decoder 14.
Fig. 7 is a flow chart illustrating an exemplary technique for reference frame selection in greater detail. In general, analysis unit 42 may determine whether a resource-focused mode, such as a power conservation mode, is on or off. If the resource-focused mode is on, motion analyzer 64 analyzes the motion activity of the candidate reference frames. For instance, the received motion vectors of the candidate reference frames are analyzed. A decision whether to interpolate (or extrapolate) the frame to be added is made based on the motion vector analysis. If the decision is to interpolate, then, consistent with the quality-focused mode, the quality of the reference frames is analyzed next. Based on this analysis, a further decision whether to interpolate or extrapolate is made. If interpolation or extrapolation is selected, reference frames are selected based on the quality analysis. Video frame interpolation or extrapolation is then performed using at least the selected reference frame, and possibly multiple reference frames.
The process outlined in Fig. 7 may be repeated substantially continuously for each frame, or selected frames, to be added in the video sequence received by video decoder 14. In the example of Fig. 7, decoder 14 receives an incoming bit stream (76) containing the encoded input video frames of a video sequence, with some frames excluded. For interpolation or extrapolation of a missing frame, video decoder 14 may determine whether the resource-focused mode is on or off (78). For instance, mode selection unit 46 may indicate whether the frame substitution process of decoder 14 is to operate in the resource-focused mode or the quality-focused mode.
If the resource-focused mode is activated (78), analysis unit 42 may activate motion analyzer 64 to analyze the motion indicated for each of the candidate reference frames (80). For instance, motion analyzer 64 may analyze the motion vectors from the candidate reference frames and determine from them whether the video scene is relatively static or includes significant movement. Motion analyzer 64 may analyze the motion vectors of an anchor frame (for example, a frame temporally adjacent to the substitute frame to be added) to make the decision to enable or disable video frame substitution.
If the motion level indicated by the anchor frame is greater than or equal to a threshold motion level (82), motion analyzer 64 may indicate to selection unit 44 that the level of motion is sufficient to justify frame substitution by interpolation (or extrapolation) of the frame to be added. In that case, analysis unit 42 may proceed to analyze the quality of the candidate reference frames (84). If the motion level is below the threshold motion level, however, motion analyzer 64 indicates to selection unit 44 that frame substitution should not be used, in which case decoder 14 may repeat a reference frame (92) to substitute for the excluded frame, rather than interpolating the frame. In that case, a previous or future frame may be repeated to effectively up-convert the frame rate of the video sequence.
In some embodiments, for frame repetition, analysis unit 42 may simply select a previous or future frame and repeat that frame in place of the frame to be added. Because frame repetition has been selected, the various quality analysis operations of the quality-focused mode need not be performed for the current frame. Alternatively, however, analysis unit 42 may apply the quality analysis to select a reference frame from the multiple candidate reference frames, and then use the selected reference frame as the frame to be repeated. In particular, analysis unit 42 may select a reference frame having an overall quality score that satisfies (for example, meets or exceeds) the quality threshold, and use the selected frame for frame repetition. Thus, in some cases, quality-based selection of reference frames, consistent with the quality-focused mode, may be used not only for frame interpolation or extrapolation, but also for frame repetition.
If, as discussed above, the motion level is at or above the threshold level (82), analysis unit 42 may proceed with the quality-focused operating mode, analyzing the quality of the candidate reference frames and comparing the resulting quality scores to the quality threshold (86). As described above with reference to Fig. 5, for instance, quality score calculator 58 may compute an overall quality score based on the quality scores generated by one or more of objective metric checker 50, EC checker 52, QP checker 54 and CBP checker 56. Comparison unit 59 may then compare the overall quality score to the quality threshold (86).
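The combination performed by the quality score calculator might be sketched as a weighted average of the per-checker scores. The weighted-sum form, equal default weights, and all names are assumptions; Fig. 5 only says the calculator combines the checker outputs:

```python
def overall_quality_score(qp_cbp_score, ec_score, objective_score,
                          weights=(1.0, 1.0, 1.0)):
    """Combine per-checker scores (QP/CBP checker, EC checker, objective
    no-reference metric checker) into one overall score in [0, 1],
    assuming each input score is already in [0, 1]."""
    w_qp, w_ec, w_obj = weights
    total = w_qp * qp_cbp_score + w_ec * ec_score + w_obj * objective_score
    return total / (w_qp + w_ec + w_obj)

def passes_threshold(score, quality_threshold):
    # Comparison unit role: a frame is usable if its overall score
    # meets or exceeds the quality threshold.
    return score >= quality_threshold
```

The weights also give a natural place to fold in an optional distance score with increased weighting when a delay-sensitive application is detected.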
If the quality score is satisfactory, for example, greater than or equal to the threshold (86), analysis unit 42 may indicate selection of the reference frame to frame substitution unit 38. For instance, selection unit 44 may identify the reference frames selected for use by frame substitution unit 38 in interpolation (or extrapolation) of the frame to be added. Frame substitution unit 38 may then proceed with frame substitution (90), for example, by interpolating or extrapolating the frame using the selected reference frames. If none of the candidate reference frames has a satisfactory overall quality score (for example, one meeting or exceeding the quality threshold (86)), comparison unit 59 may indicate to selection unit 44 that no reference frame should be selected. In that case, selection unit 44 may indicate to frame substitution unit 38 that frame substitution should be disabled, and that frame repetition (92) should be applied instead of frame substitution.
As shown in Fig. 7, analysis unit 42 may optionally be configured to rank the candidate reference frames having overall quality scores that satisfy the quality threshold (93). For instance, for bidirectional interpolation, analysis unit 42 may rank the previous candidate reference frames in order of quality level and rank the future candidate reference frames in order of quality level, where previous frames temporally precede the frame to be added and future frames temporally follow it, and then select the highest-ranked previous frame and the highest-ranked future frame for use in frame substitution. Where rankings are substantially equal, analysis unit 42 may select the frame temporally closest to the frame to be added. For unidirectional interpolation, analysis unit 42 may select the highest-ranked previous frame or the highest-ranked future frame, depending on whether previous or future frames are used as reference frames. As another example, in some cases, the ranking may be used to select the highest-ranked frame for both frame repetition and frame substitution.
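The bidirectional ranking rule, including the temporal-proximity tie-break, might be sketched as follows; the tuple representation and function names are illustrative assumptions:

```python
def pick_bidirectional_references(candidates, quality_threshold):
    """candidates: list of (quality_score, temporal_offset) tuples, where
    temporal_offset < 0 marks a previous frame and > 0 a future frame.
    Returns (best_previous, best_future) among frames meeting the
    threshold; ties in quality go to the temporally closest frame."""
    def best(frames):
        if not frames:
            return None
        # Highest quality first; among equals, smallest |offset| wins.
        return max(frames, key=lambda f: (f[0], -abs(f[1])))

    qualified = [f for f in candidates if f[0] >= quality_threshold]
    prev = best([f for f in qualified if f[1] < 0])
    fut = best([f for f in qualified if f[1] > 0])
    return prev, fut
```

For unidirectional interpolation or extrapolation, only one of the two returned frames would be used.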
In some embodiments, the selection of frames may be subject to the MV reliability analysis performed by MV reliability checker 60, as described with reference to Fig. 5. In addition, in some embodiments, the quality threshold and/or other criteria or operations may differ between the quality-focused mode and the resource-focused mode. For instance, when resources are limited, mode adjustment unit 62 may increase the quality threshold in the resource-focused mode so that higher interpolation quality is needed to justify interpolation.
Fig. 8 is a flow chart illustrating an exemplary technique for supporting quality analysis of reference frames for reference frame selection for video frame interpolation according to a quality-focused mode. In general, the quality of the candidate reference frames is estimated or analyzed. A quality score may be provided for each candidate reference frame. The candidate reference frames may be previous or future frames relative to the skipped frame. If the quality score of a particular reference frame is not greater than or equal to the threshold, that frame is not selected for use in video frame substitution. If the quality score of a reference frame is sufficient, the reference frame may be selected for use in video frame interpolation or extrapolation. In some embodiments, analysis unit 42 may also require that the motion vectors associated with a candidate reference frame be reliable. For instance, if frame substitution uses a motion-compensated prediction method for interpolation or extrapolation, the motion vectors may be checked.
As shown in the example of Fig. 8, analysis unit 42 may analyze multiple candidate reference frames, including previous and future frames, in succession to identify reference frames for interpolation or extrapolation of a frame. After retrieving the next candidate reference frame from received frame buffer 34 (94), analysis unit 42 estimates the quality of the candidate reference frame (96) and computes a quality score (98), for example, as described above with reference to Fig. 5. If the score is satisfactory (for example, greater than or equal to the quality threshold (100)), comparison unit 59 passes the candidate reference frame to MV reliability checker 60 to determine whether the MV content of the frame is reliable (102).
If the overall quality score is not greater than or equal to the quality threshold (100), analysis unit 42 may set the candidate reference frame to "off" for purposes of selection for frame substitution (104). In that case, selection unit 44 does not select the "off" frame for use in interpolation or extrapolation of the substitute frame by frame substitution unit 38. If the overall quality score is greater than or equal to the quality threshold (100) and MV reliability checker 60 determines that the MV content is reliable (106), analysis unit 42 may set the candidate reference frame to "on" for purposes of selection for frame substitution (106).
If analysis unit 42 has not yet considered all of the candidate reference frames, that is, it has not yet reached the end of a predefined range of candidate reference frames (108), analysis unit 42 retrieves the next candidate reference frame for analysis. The number of candidate reference frames analyzed by analysis unit 42 may be preselected, and may include one or more previous frames and one or more future frames relative to the frame to be interpolated or extrapolated, for example, as described with reference to Figs. 2A through 2D. When the end of the candidate reference frames is reached (108), selection unit 44 may select the candidate reference frames set to "on" (110) and communicate the selected frames to frame substitution unit 38. Frame substitution unit 38 then interpolates or extrapolates the frame to be added to perform frame substitution using the selected reference frames (112).
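The Fig. 8 candidate scan, with its "on"/"off" marking, might be sketched as the following loop; the callback parameters stand in for the quality analysis and MV reliability check, and all names are assumptions:

```python
def scan_candidates(frames, quality_threshold, score_fn, mv_reliable_fn):
    """Mark each candidate reference frame 'on' or 'off' per Fig. 8:
    a frame is 'on' only if its quality score meets the threshold AND
    its motion-vector content is judged reliable. Returns the 'on'
    frames, which a selection unit would pass on for substitution."""
    flags = {}
    for frame in frames:
        if score_fn(frame) >= quality_threshold and mv_reliable_fn(frame):
            flags[frame] = "on"
        else:
            flags[frame] = "off"
    return [f for f in frames if flags[f] == "on"]
```

If the returned list is empty (no "on" frames), substitution would be disabled and frame repetition applied instead, matching the fallback described below.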
If no reference frame is set to "on", or an insufficient number of "on" frames exists, selection unit 44 may indicate that frame substitution should be disabled, and frame substitution unit 38 should instead apply frame repetition for the frame to be added. In some embodiments, FRUC unit 22 may be selectively enabled or disabled in the FRUC process. When FRUC is enabled, the process outlined in Fig. 8 may be repeated substantially continuously for each frame, or selected frames, to be added in the video sequence received by video decoder 14.
Again, as described with reference to Fig. 7, the process shown in Fig. 8 may include a ranking operation in which the candidate reference frames that satisfy the quality threshold are ranked. In that case, the highest-ranked frame may be selected as the reference frame for frame substitution.
Fig. 9 is a flow chart illustrating an exemplary technique for generating quality scores for reference frames to support reference frame selection for video frame interpolation. The quality scores may be used to produce an overall quality score, for example, as described with reference to Fig. 5. In general, as an initial step, the average QP value and the CBP value of each reference frame (future or previous reference frames relative to the skipped frame) may be checked. If the QP value is smaller than a threshold, a high quality score may be given to the reference frame (for example, a smaller QP value in H.264 coding corresponds to a finer quantization step size). For some coding processes other than H.264, in which a smaller QP value corresponds to a coarser quantization step size, the reverse may be true.
In addition, a determination may be made as to whether one or more slice losses are present in the candidate reference frame. If losses exist and error concealment cannot adequately be applied, the quality score of the reference frame may be reduced. If losses exist but the errors have been concealed, a higher quality score may be set for the reference frame. In some embodiments, objective no-reference visual quality metrics (for example, blockiness, blurriness and/or color bleeding) may be applied to the reconstructed candidate reference frames. If the metrics yield high results, the overall quality score of the reference frame may be increased. If a reference frame has a high overall quality score, it may be used in interpolating the temporally skipped frame.
As shown in Fig. 9, analysis unit 42 retrieves the next candidate reference frame from received frame buffer 34 (114), and analyzes the QP and CBP values of the frame. In the example of Fig. 9, analysis unit 42 produces a combined QP-based score based on the QP and CBP values. If the QP value is less than an applicable QP threshold (QP_th) and the CBP value is not equal to zero (116), analysis unit 42 sets the QP-based score of the frame to "high" (118). If the QP value is greater than or equal to the QP threshold (QP_th), or the CBP value is substantially equal to zero (116), analysis unit 42 sets the QP-based score of the frame to "low" (120).
As further shown in Fig. 9, analysis unit 42 may also be configured to check the candidate reference frame for slice losses (122). Referring to Fig. 5, the slice loss check may be performed by EC checker 52. Slice losses may result from losses across channel 19 or some other loss or degradation of data. If a slice loss exists, analysis unit 42 may determine whether a sufficient error concealment (EC) mechanism is available to correct the slice loss (124). If not, analysis unit 42 sets the EC-based score of the candidate reference frame to "low" (126). If a sufficient EC mechanism is available (124), analysis unit 42 sets the EC-based score to "high" (128). In that case, if the lost slices can be reproduced using the EC mechanism, the quality of the candidate reference frame may be suitable for use as a reference frame for frame substitution. If the slices cannot be reproduced, however, the candidate reference frame should not be used for frame substitution.
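The two Fig. 9 scoring rules can be condensed into two small predicates; a sketch under the H.264 convention that a smaller QP means finer quantization, with all names assumed:

```python
def qp_based_score(avg_qp, cbp, qp_threshold):
    """Fig. 9, steps 116-120 (H.264 convention: lower QP = finer
    quantization). CBP of zero means no coded residual was signaled,
    so a low QP alone is not taken as evidence of high quality."""
    return "high" if (avg_qp < qp_threshold and cbp != 0) else "low"

def ec_based_score(has_slice_loss, ec_sufficient):
    """Fig. 9, steps 122-128: no slice loss, or a loss that a
    sufficient error-concealment mechanism can correct, keeps the
    EC-based score high."""
    if has_slice_loss and not ec_sufficient:
        return "low"
    return "high"
```

For codecs where a smaller QP corresponds to a coarser step size, the QP comparison would be reversed, as noted above.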
Figs. 10 and 11 are flow charts illustrating exemplary techniques for selective frame substitution in the resource-focused mode. In the resource-focused mode, analysis unit 42 may use motion activity as a measure of whether the video scene represented by the reference frames is static. A zero motion vector count and/or a small motion vector count may be used as the decision criteria. In general, the thresholds may be derived in two ways. In the non-adaptive case, for instance, a fixed threshold for the zero motion vector count and a fixed threshold for the small motion vector count may be used to judge motion activity. In the adaptive case, the thresholds may be adjusted based on one or both of the resource levels of, for example, decoder 14 (for example, the level of available power, computing resources or memory).
In general, the resource-focused mode may be an optional selective frame substitution mode that turns off (that is, disables) interpolation or extrapolation when there is no substantial perceptible difference between the results of frame substitution and frame repetition, for example, when the difference between interpolation and frame repetition is insufficient to justify the use of interpolation (for example, in view of power, computing or memory constraints in the device associated with decoder 14). When motion activity is significant, however, decoder 14 may effectively revert to the quality-focused mode to select reference frames for frame substitution.
In some embodiments, the resource-focused mode may be characterized as a power conservation mode or a power-optimized selective FRUC mode. In the resource-focused mode, motion activity may be relied upon as a measure of whether the video scene is static. If the algorithm determines that the scene is static, a simple frame repetition technique, rather than frame substitution, which is substantially more computationally intensive and consumes more power, may be used for FRUC. If the video scene is not substantially static, however, frame substitution may be more desirable than frame repetition.
To determine whether the scene is static, the motion vectors of the current anchor frame may be analyzed to make the decision to enable or disable video frame interpolation or extrapolation. The motion vectors may be used directly from the bit stream, obtained after processing of the bit stream motion vectors by decoder 14, or obtained from a motion estimation module of decoder 14. In some cases, some of the motion vectors used for the motion analysis may be the same motion vectors used for interpolation or extrapolation. The anchor frame may be a frame adjacent to the frame to be added, for example, a previous or future frame immediately next to, or close to, the frame to be added. On the basis of this analysis, analysis unit 42 can make the decision to enable or disable video frame interpolation for the frame currently under consideration. In one example, the number of zero motion vectors present in the frame may be used as the decision criterion.
A zero motion vector is a motion vector having a value of zero or substantially zero. For instance, in some embodiments, a motion vector having a value below a threshold may be considered a zero motion vector. In some embodiments, the motion vectors may be processed values of the motion vectors embedded in the bit stream. If the zero motion vector count (that is, the number of zero-value motion vectors) is greater than a threshold, the scene may be determined to be static, in which case video frame interpolation is disabled. For a static scene, frame repetition may be used. A further enhancement may be added by using mode decision information (for example, intra- or inter-coding mode decision information). For instance, the number of zero motion vectors may be counted only for non-intra-coded macroblocks to obtain a more accurate zero motion vector count.
In another example, in addition to the zero motion vector count, a small motion vector count may be used as a decision criterion. A small motion vector may be a nonzero motion vector having a magnitude below a predetermined threshold. One reason for adding the small motion vector count is that, for example, some scenes, even though they may have a large number of static macroblocks indicated by zero motion vectors, may also contain a relatively small number of fast-moving objects, for example, a thrown ball, a bird in flight, or a passing vehicle. New objects that quickly enter or leave the video scene over a series of video frames can produce significant motion activity in a frame.
Although the object of fast moving can occupy the fraction of whole frame, can be important with the retention time quality with inserting in it in the frame that FRUC produces.Therefore, can add look little motion vector counting (for example, little motion vector count threshold) and fixed the second criterion to guarantee that when the inactive frame interpolation, scene is static state definitely.In addition, this criterion can impliedly be taken into account the small articles problem of fast moving.
In the example of FIG. 10, motion analyzer 64 counts the number of zero-valued motion vectors in a frame to produce a zero motion vector count (Zmv_c) (140). In general, each macroblock in the frame has a motion vector. If a macroblock's motion vector has a zero value (indicating no movement), the motion vector is treated as a zero motion vector. In some cases, a motion vector having a small nonzero value below a threshold may also be treated as a zero-valued motion vector. In either case, once the number of zero motion vectors has been counted, motion analyzer 64 determines whether the number of zero motion vectors (i.e., the zero MV count Zmv_c) is greater than or equal to an applicable threshold Th (142). If so, the video scene is relatively static. In that case, motion analyzer 64 communicates to selection unit 44 that interpolation or extrapolation should be disabled for the frame to be added and that frame substitution unit 38 should instead apply frame repetition (144). As mentioned previously, frame repetition may use a simple scheme of selecting the closest previous or future frame, or a quality-based scheme of selecting a higher-quality reference frame as the frame to repeat.
If the zero MV count Zmv_c is less than the threshold (142), motion analyzer 64 communicates to selection unit 44 that frame substitution should be performed (146). In this case, interpolation (or extrapolation) may or may not be performed, subject to the selection of one or more suitable reference frames as a result of the quality analysis performed by analysis unit 42. In effect, when the scene is not static, analysis unit 42 can revert to a quality-focused mode to select one or more reference frames, if frames of sufficient quality for interpolation are available. If the quality analysis yields one or more suitable reference frames, selection unit 44 can communicate the selected frames to frame substitution unit 38 for interpolation of the frame to be added.
The threshold Th (142) may be fixed. Alternatively, the threshold Th may be adjusted based on the frame substitution resources available to video decoder 14 (e.g., power level, computational resource level, and/or memory resource level). As shown in FIG. 10, in a resource-focused mode, analysis unit 42 optionally detects a resource level (143) and adjusts the threshold Th based on the detected resource level (145). For example, analysis unit 42 may determine available power resources (e.g., battery level), available computational resources, the number of instructions per second that can be used, and/or available memory resources. Resource levels may be detected directly, or estimated, e.g., based on known relationships between computing operations and resource consumption. In the case of a mobile device, if battery resources are low, analysis unit 42 may reduce the threshold Th so that frame substitution is enabled only when there is a large amount of motion. In either case, whether the threshold is fixed or adjustable, if frame substitution is enabled, analysis unit 42 can apply one or more quality criteria to select reference frames or to disable frame substitution.
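A resource-based threshold adjustment of this kind might look like the following sketch. The linear battery rule, the cutoff percentage, and the scaling factor are assumptions for illustration only:

```python
def adjusted_static_threshold(base_th, battery_pct, low_batt_pct=20, scale=0.5):
    # Hypothetical rule: when battery is low, shrink the zero-MV count
    # threshold Th, so the Zmv_c < Th test (which triggers costly frame
    # substitution) passes only for scenes with a large amount of motion.
    if battery_pct <= low_batt_pct:
        return int(base_th * scale)
    return base_th
```

With these example numbers, a decoder at 10% battery would halve the threshold, while a decoder at 80% battery would leave it unchanged.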
In the example of FIG. 11, motion analyzer 64 analyzes motion based on both a zero MV count and a small MV count. As shown in FIG. 11, motion analyzer 64 counts the number of zero-valued motion vectors in a frame to produce a zero motion vector count (Zmv_c) (148), and counts the number of small motion vectors in the frame to produce a small motion vector count (Smv_c) (150). A small motion vector may be a motion vector with a nonzero value below a threshold. Even when zero motion vectors predominate, there may be a number of nonzero motion vectors, including nonzero motion vectors that are small in the sense of having a magnitude below a threshold (referred to herein as small motion vectors). Small motion vectors may be associated with one or more small moving objects, e.g., a ball, a bird, a car, or the like. Motion analyzer 64 compares the zero MV count (Zmv_c) and the small MV count (Smv_c) to respective thresholds Th1 and Th2 (152).
If the zero MV count (Zmv_c) is greater than or equal to threshold Th1 and the small MV count (Smv_c) is less than threshold Th2, motion analyzer 64 directs selection unit 44 to indicate that frame repetition, rather than frame substitution, should be applied (154). In this case, the zero MV count exceeding threshold Th1 indicates that the video scene presented by the frame is substantially static. At the same time, the small MV count being less than threshold Th2 indicates that there is no significant small-object motion in the video scene presented by the frame. Given the generally static content of the frame, frame repetition is appropriate.
If the zero MV count (Zmv_c) is less than threshold Th1, or the small MV count (Smv_c) is greater than or equal to threshold Th2, motion analyzer 64 indicates that frame substitution should be performed (156), subject to the selection of reference frames as part of the quality analysis performed by analysis unit 42. In this case, the zero MV count being less than threshold Th1 indicates that the video scene includes significant motion.
Even if the zero MV count is not less than threshold Th1, the frame may include one or more relatively small moving objects that would be better presented by interpolation. Hence, motion analyzer 64 can indicate that interpolation should be performed when the small MV count is greater than or equal to threshold Th2, indicating the presence of one or more small fast-moving objects. Such objects may be generally small and move more rapidly than other objects in the frame. When the frame content is not substantially static, or when a substantially static frame includes small moving objects, frame substitution (e.g., by interpolation or extrapolation of a frame) may be appropriate.
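The two-count decision of FIG. 11 reduces to a small rule. The function below is a hedged sketch; the names and return strings are illustrative:

```python
def fruc_decision(zmv_c, smv_c, th1, th2):
    # Repeat the nearest frame only when the scene is essentially static
    # (many zero MVs) AND no small fast-moving objects are indicated
    # (few small MVs); otherwise fall through to frame substitution,
    # subject to the reference-frame quality analysis.
    if zmv_c >= th1 and smv_c < th2:
        return "repeat"
    return "substitute"
```

Note that a frame can be mostly static (high Zmv_c) and still be routed to substitution because its small MV count reveals a fast-moving object.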
As in the example of FIG. 10, the thresholds Th1, Th2 in FIG. 11 may be fixed or may be adjusted based on available interpolation resources. In particular, one or both of thresholds Th1, Th2 may be adjusted based on a determined resource level. As shown in FIG. 11, in a resource-focused mode, analysis unit 42 optionally detects a resource level (143) and adjusts the thresholds based on the detected resource level (145). Again, analysis unit 42 may detect or estimate available power resources (e.g., the battery level in a mobile device), available computational resources, the number of instructions per second that can be used, and/or available memory resources. For example, if battery resources are low, analysis unit 42 may reduce threshold Th1 and increase threshold Th2, so that interpolation is enabled only when there is a large amount of motion.
In some cases, the threshold used to classify a motion vector as a small motion vector may be fixed, or it may be adjusted based on format size. As discussed above, a small motion vector may be a nonzero motion vector with a magnitude below a certain threshold. In some implementations, the threshold used to determine whether a nonzero motion vector is small may be adjusted based on the format size of the video units being decoded and interpolated or extrapolated. For example, small motion vectors may be classified by different thresholds for QCIF, CIF, QVGA, and VGA frames, which have progressively larger format sizes. The small motion vector threshold for a CIF frame may have a smaller value than the small motion vector threshold for a VGA frame. In particular, given the larger overall size of a larger-format frame, a motion vector magnitude that would be considered large in a smaller-format frame may be considered small in a larger-format frame. Hence, in some implementations, motion analyzer 64 can adjust the small motion vector threshold based on the format size of the video unit being interpolated or extrapolated. For example, the small motion vector threshold used for a smaller-format frame may be less than the small motion vector threshold used for a larger-format frame.
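One way to realize a format-dependent small-MV threshold is to scale it with frame width, as in this sketch. The patent states only the ordering of thresholds across formats; the proportional formula and the fraction value are assumptions:

```python
# Nominal luma widths, in pixels, of the formats named in the text.
FORMAT_WIDTH = {"QCIF": 176, "QVGA": 320, "CIF": 352, "VGA": 640}

def small_mv_threshold(fmt, fraction=0.01):
    # Hypothetical scaling rule: classify a vector as "small" relative to
    # the frame width, so larger formats get proportionally larger
    # thresholds (CIF threshold < VGA threshold, as the text requires).
    return FORMAT_WIDTH[fmt] * fraction
```

Under this rule, a 4-pixel motion vector would count as small in a VGA frame (threshold 6.4) but not in a CIF frame (threshold 3.52), matching the intuition that the same displacement is visually less significant in a larger frame.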
FIG. 12 is a block diagram illustrating an example of a video decoder 158 configured to selectively enable or disable the display of substitute frames (e.g., based on an analysis of one or more quality characteristics). For example, as described, video decoder 158 can be configured to analyze one or more characteristics associated with interpolated or extrapolated video frames produced by a frame rate up-conversion process in the video decoder, and to selectively enable and disable the presentation of the interpolated or extrapolated video units on a display based on the analysis.
The analysis may involve analysis of any of a wide variety of quality characteristics. The quality characteristics may include at least one of pixel domain characteristics, transform domain characteristics, and/or motion vector reliability. The one or more quality characteristics may be used to formulate a quality metric comprising a spatial quality metric, a temporal quality metric, or another metric. The quality metric may be useful in predicting the impact of an interpolated or extrapolated frame on the visual spatial and/or temporal quality of the video sequence when presented to a user via a display. If the quality level associated with a substitute frame does not satisfy a quality threshold, decoder 158 can be configured to selectively disable display of the substitute frame, even though the substitute frame has already been interpolated or extrapolated.
The quality level associated with the substitute frame may be based on the quality level of the substitute frame itself. In some implementations, the quality level may be based on the quality level of one or more reference frames used to interpolate or extrapolate the substitute frame. In other implementations, the quality level may be based on both the substitute frame and the quality level of one or more reference frames used to produce the substitute frame. In each case, the quality level can generally indicate the degree of enhancement in visual quality that can be achieved by displaying the substitute frame.
Even when a substitute frame has been interpolated or extrapolated, it may still be desirable to selectively disable the transfer and display of the substitute frame in the video decoder. For example, although interpolation or extrapolation has been performed, the visual quality produced by displaying the substitute frame may not be sufficient to justify the expenditure of additional resources for transferring and displaying it. Instead, it may be preferable to discard the substitute frame and repeat an adjacent frame (e.g., by holding a displayed frame for a longer period of time rather than transferring a new frame), thereby conserving the power or other resources that would be needed to display the substitute frame. In some implementations, in addition to selectively disabling the transfer and display of some frames, video decoder 158 may also disable one or more post-processing operations for the substitute frame, e.g., smoothing, sharpening, brightness control, and/or contrast enhancement, to conserve additional resources.
In the example of FIG. 12, video decoder 158 includes: a received frame buffer 160, which receives encoded frames 24, e.g., from encoder 12; a decoding unit 162, which decodes the received frames; an output frame buffer 164, which stores the decoded frames; a substitution unit 166, which performs frame substitution by interpolation or extrapolation to add substitute frames to output frame buffer 164, thereby supporting the FRUC process; and a selective display analysis unit 168. Analysis unit 168 can be configured to generate a signal or command. In response to the signal or command from analysis unit 168, control unit 172 selectively enables or disables the transfer of substitute frames from a video buffer (e.g., output frame buffer 164) to display 170 for visual presentation to a user. In some aspects, analysis unit 168 may be selectively activated when decoder 158 is in a resource-focused mode (e.g., as described with reference to FIG. 4, FIG. 5, and FIG. 7). Alternatively, in other implementations, selective display analysis unit 168 may operate regularly, with or without selective activation, to selectively enable or disable the display of substitute frames produced by substitution unit 166.
Substitution unit 166 performs frame substitution (e.g., frame interpolation or extrapolation) to add frames to the output frames stored in output frame buffer 164 and thereby support the FRUC process. Video decoder 158 may or may not apply reference frame selection techniques (as described elsewhere in this disclosure) to select particular reference frames for interpolation or extrapolation of a substitute frame by substitution unit 166. In some aspects, substitution unit 166 may select frames based on an analysis of one or more reference frames, or may simply use one or more adjacent frames, to produce the substitute frame. In either case, however, analysis unit 168 may be further configured to analyze a quality level associated with the substitute frame to determine whether the substitute frame should be displayed. The quality associated with the substitute frame may include one or more quality characteristics associated with the substitute frame itself, one or more quality characteristics associated with one or more of the reference frames used to interpolate or extrapolate the substitute frame, or a combination of both. To analyze quality, analysis unit 168 may apply objective quality metrics to the substitute (e.g., interpolated or extrapolated) frame and/or to the reference frames used to interpolate or extrapolate the substitute frame.
When a substitute frame is not to be sent to display 170, it can be discarded from output frame buffer 164, e.g., by overwriting the frame with a decoded, interpolated, or extrapolated frame. In that case, display 170 may simply repeat the previous frame sent to the display from output frame buffer 164 (e.g., by holding the previous frame for a longer period of time), or transfer and repeat (e.g., hold) the next frame following the substitute frame. In either case, display 170 can display the previous frame for an additional period of time rather than displaying the substitute frame during that time. By applying frame repetition, video decoder 158 can avoid the need to transfer and display the substitute frame, thereby conserving power and/or other resources. As mentioned above, selective display of substitute frames can be used alone or in conjunction with reference frame selection as described elsewhere in this disclosure.
Hence, the quality analysis can be performed by analyzing, in the pixel domain, the reconstructed substitute frame produced by substitution unit 166, the decoded reference frames produced by decoding unit 162, or a combination of both. In implementations that analyze reference frames, the reference frames may be the reference frames used to interpolate or extrapolate the substitute frame. When substantial blockiness, blurriness, and/or color bleeding is detected in a particular substitute frame interpolated or extrapolated by substitution unit 166 (or in one or more reference frames used to produce that frame), the quality score of the substitute frame may be low; when no substantial blockiness, blurriness, and/or color bleeding is present, the quality score may be high.
The quality scores of different substitute frames may vary between high and low according to the objective visual quality metric characteristics. Alternatively, the quality score may be expressed as high or low based on a comparison to a predetermined threshold. In either case, if the quality score of a substitute frame does not satisfy (e.g., is less than) a quality threshold, analysis unit 168 can direct control unit 172 to disable the transfer of the frame from output frame buffer 164 to display 170. Alternatively, if the quality score of a substitute frame satisfies (e.g., is greater than or equal to) the threshold, analysis unit 168 can direct control unit 172 to enable the transfer of the substitute frame from output frame buffer 164 to display 170 for presentation to the user.
As described above, analysis unit 168 can analyze quality characteristics associated with a substitute frame by analyzing the pixel values of the substitute frame, the pixel values of one or more reference frames used to produce the substitute frame by interpolation or extrapolation, or the pixel values of one or more other frames near the substitute frame. If reference frames are analyzed, substitution unit 166 can indicate to analysis unit 168 which reference frame or frames were used to interpolate or extrapolate a particular frame.
Using the pixel values, as described above, analysis unit 168 can analyze one or more spatial quality metrics associated with the substitute frame and/or with one or more reference frames used to produce the substitute frame, e.g., SSIM, blockiness, blurriness, and/or color bleeding metrics. In some implementations, alternatively or additionally, the pixel value analysis may be applied to other frames that are temporally close to the substitute frame (in addition to, or as an alternative to, the reference frames or the substitute frame). If a spatial quality metric of the substitute frame, a reference frame, or another nearby frame does not satisfy an applicable threshold, analysis unit 168 can direct control unit 172 to disable display of the substitute frame.
As a simple illustration, if the amount of blockiness exhibited by the pixel values associated with a substitute frame exceeds a threshold, analysis unit 168 can disable display of the substitute frame. In this way, decoder 158 can avoid displaying substitute frames that could adversely affect quality, or substitute frames that provide a quality enhancement insufficient to justify the expenditure of additional display resources.
As an alternative or additional operation, for quality analysis, analysis unit 168 can analyze one or more temporal quality metrics, e.g., the temporal fluctuation of spatial quality metrics. For example, analysis unit 168 can analyze the fluctuation of spatial quality (e.g., SSIM, blockiness, blurriness, and/or color bleeding) between the substitute frame and one or more reference frames used to interpolate or extrapolate the substitute frame. If the temporal quality fluctuation is greater than a fluctuation threshold, analysis unit 168 can direct control unit 172 to disable display of the substitute frame.
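A minimal sketch of such a temporal fluctuation check follows; measuring the swing as max minus min of the per-frame scores is an illustrative assumption:

```python
def fluctuation_acceptable(spatial_scores, fluct_th):
    # Sketch: spatial_scores holds a spatial metric (e.g. SSIM) for the
    # substitute frame and its reference frames. A large swing across them
    # suggests a visually jarring substitute frame whose display should be
    # disabled; a small swing passes the temporal-quality test.
    return max(spatial_scores) - min(spatial_scores) <= fluct_th
```

For example, SSIM scores of 0.95, 0.93, and 0.94 across the substitute and reference frames fluctuate little, whereas a drop from 0.95 to 0.70 would fail a 0.05 fluctuation threshold.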
As a further refinement, in some implementations, analysis unit 168 can be configured to analyze the location of artifacts in the substitute frame, the reference frames, or other nearby frames. An artifact may be an undesirable visual characteristic, such as may be produced by blockiness, blurriness, or color bleeding, in a localized region of the displayed frame. Analysis unit 168 may analyze the overall blockiness, blurriness, or color bleeding of a frame, or consider those characteristics in the context of localized regions of the frame. For example, analysis unit 168 may analyze the pixel value variance of a localized region of a frame (e.g., a 3-by-3 pixel region) to produce an indication of the texture in that region. Smooth regions, indicated by low variance, are generally more susceptible to visible artifacts caused by blockiness, blurriness, color bleeding, or other impairments. In regions with higher variance and more texture, however, such artifacts may be less visible.
Instead of, or in addition to, a spatial quality metric for the frame as a whole, analysis unit 168 can produce localized spatial quality metrics for multiple localized regions within the frame. If the localized spatial quality metric fails an applicable threshold in any of the smooth regions, where artifacts would be more visible, analysis unit 168 can disable display of the substitute frame to reduce or avoid the presentation of visible artifacts. However, if the localized spatial quality metric fails the quality threshold only in higher-variance regions with more texture, analysis unit 168 may permit display of the substitute frame, recognizing that the artifacts will be invisible or less visible to the user in those regions. As another alternative, analysis unit 168 can apply different thresholds to different localized regions. In smooth, low-variance regions, analysis unit 168 may apply a higher quality threshold, in effect requiring higher quality to reduce or avoid the introduction of visible artifacts. In higher-variance regions, analysis unit 168 may apply a lower quality threshold, thereby permitting display of the substitute frame when significant artifacts appear only, or substantially only, in highly textured regions.
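The localized variance test and the dual-threshold alternative can be sketched together. The region size, threshold values, and function names are assumptions for illustration:

```python
def local_variance(pixels):
    # Pixel-value variance of a localized region (e.g. a 3x3 block),
    # used as a crude texture indicator: low variance = smooth region.
    n = len(pixels)
    mean = sum(pixels) / n
    return sum((p - mean) ** 2 for p in pixels) / n

def region_ok(quality, variance, smooth_var_th, smooth_q_th, textured_q_th):
    # Hedged sketch of the dual-threshold alternative: demand higher
    # quality in smooth regions, where artifacts are easily seen, and
    # tolerate lower quality where texture masks them.
    threshold = smooth_q_th if variance < smooth_var_th else textured_q_th
    return quality >= threshold
```

A mediocre quality score of 0.6 would thus block display when it occurs in a flat region but be tolerated in a highly textured one.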
Analysis unit 168 may scan a frame to analyze the localized regions described above. If any of the localized regions, or a predetermined percentage of the localized regions, exhibit both low variance (below a variance threshold) and a low quality metric (below a quality threshold), indicating the likelihood of visible artifacts in the displayed frame, analysis unit 168 can direct control unit 172 to disable display of the substitute frame. The predetermined percentage may be a fixed or adaptive threshold. In other implementations, analysis unit 168 can be configured to analyze the size of the regions in which artifacts may be visible, e.g., by considering the adjacency of localized regions that have both low texture and low quality. Hence, analysis unit 168 may analyze the percentage of all localized regions with significant artifacts relative to a percentage threshold, and/or the size of any of the artifacts (e.g., as indicated by low-quality/low-texture regions) relative to a size threshold, to determine whether to enable or disable display of the substitute frame.
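Aggregating the scan into a display decision might look like the following sketch; representing each scanned region as a (quality, variance) pair and using a fractional percentage threshold are illustrative assumptions:

```python
def allow_display(regions, var_th, q_th, max_bad_fraction):
    # regions: list of (quality, variance) pairs from scanning the frame.
    # A region that is both smooth (low variance) and low quality likely
    # contains a visible artifact; disable display when the fraction of
    # such regions exceeds the (possibly adaptive) percentage threshold.
    bad = sum(1 for q, v in regions if v < var_th and q < q_th)
    return bad / len(regions) <= max_bad_fraction
```

A frame with one suspect region out of three could still be shown under a 40% threshold, while a frame whose regions are uniformly smooth and low quality would be suppressed.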
Additionally or alternatively, analysis unit 168 can analyze the quality associated with a substitute frame using other types of data (e.g., data indicating the reliability of the motion vectors associated with the reference frames used to interpolate or extrapolate the substitute frame). Other examples of data used by analysis unit 168 for quality analysis may include compressed domain information, e.g., information relating to the discrete cosine transform (DCT) or wavelet transform coefficients associated with one or more reference frames used for interpolation or extrapolation of the substitute frame. The compressed domain information may include, for example, QP and CBP values. As shown in FIG. 12, in some implementations, analysis unit 168 may receive QP and CBP values from decoding unit 162. The QP and/or CBP values may be associated with the reference frames used to interpolate or extrapolate the substitute frame. Using QP values, CBP values, and/or other compressed domain information, analysis unit 168 can analyze the quality of the substitute frame.
As one example, analysis unit 168 can evaluate motion vector reliability in a manner similar to that described above (e.g., with reference to FIG. 5 and FIG. 8). If the motion vector reliability of one or more reference frames is unsatisfactory, analysis unit 168 can direct control unit 172 to disable display of the substitute frame. Analysis unit 168 may, for example, apply any of a variety of techniques for determining motion vector reliability, in a manner similar to that described above with respect to MV reliability checker 60 of FIG. 5, e.g., motion-difference-based methods, frame-to-frame motion change detection methods, or motion-trajectory-based methods. Analysis unit 168 can consider motion vector reliability alone, or in conjunction with other quality metrics described herein (including, e.g., pixel domain quality metrics, temporal metrics, or localized metrics), to determine whether to enable or disable display of a substitute frame.
As described above, analysis unit 168 can analyze quality based on compressed domain information (e.g., QP and/or CBP values associated with the reference frames used to interpolate or extrapolate the substitute frame). The compressed domain information may be obtained, for example, by analyzing the bitstream associated with the received reference frames from received frame buffer 160. As one example, analysis unit 168 can be configured to analyze the QP and/or CBP values associated with a reference frame in a manner similar to that described for QP detector 54, CBP detector 56, and quality score calculator 58 of FIG. 5 and the process illustrated in the flowchart of FIG. 9. If the QP and CBP values indicate a relatively high quality for the reference frame, e.g., by comparing the QP to a QP threshold QP_th and determining whether the CBP value is nonzero (as shown in FIG. 9), analysis unit 168 can direct control unit 172 to enable display of the substitute frame interpolated or extrapolated using the relevant reference frame. However, if the QP and CBP values indicate low quality, analysis unit 168 can direct control unit 172 to disable display of the substitute frame.
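One plausible reading of the QP/CBP test can be sketched as follows. The exact combination used in FIG. 9 is not fully specified here, so the rule below (a coarsely quantized nonzero residual signals low quality) is a hedged assumption, as are the names and the threshold:

```python
def reference_quality_high(qp, cbp, qp_th):
    # Assumption: high quantization parameter (coarse quantization)
    # combined with a nonzero coded block pattern (a residual was actually
    # coded at that coarse step) flags the reference as low quality;
    # otherwise it is treated as acceptable for frame substitution.
    return not (qp > qp_th and cbp != 0)
```

With QP_th = 30, a reference coded at QP 40 with coded residual would be rejected, while one coded at QP 20, or one with no coded residual at all, would pass.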
The analysis of compressed domain information and MV reliability can be used by analysis unit 168, alone or in conjunction with other quality information (e.g., spatial quality information or temporal quality information), to determine whether to display a substitute frame. In some implementations, analysis unit 168 can be configured to analyze multiple types of quality information in combination. For example, analysis unit 168 may analyze objective spatial and/or temporal quality metrics in the pixel domain (e.g., SSIM, blockiness, blurriness, and color bleeding, and/or the associated temporal fluctuation), together with transform domain information (e.g., QP and CBP values) and possibly error concealment (EC) reliability information, to produce a quality score, and then apply motion vector (MV) reliability analysis to determine whether to display a substitute frame having an acceptable quality score.
For error concealment (EC) reliability, analysis unit 168 can apply an EC analysis similar to that described with reference to the EC detector of FIG. 5 and the process of FIG. 9. For example, analysis unit 168 can analyze one or more reference frames used to produce the substitute frame to determine whether slice loss exists. If so, analysis unit 168 can determine whether a suitable EC mechanism is available. If no suitable EC mechanism is available, analysis unit 168 can direct control unit 172 to disable the display of the substitute frame by display 170. If an acceptable EC mechanism is available, however, analysis unit 168 may permit display of the substitute frame. The analysis of slice loss and EC mechanisms may be performed individually, as a basis for selectively enabling or disabling display of the substitute frame, or in conjunction with the analysis of other quality characteristics (e.g., spatial quality, temporal quality, motion vector reliability, etc.).
In some implementations, analysis unit 168 can be configured to operate in a manner similar to analysis unit 42 of FIG. 5, except that analysis unit 168 determines whether to display a substitute frame rather than whether to select particular reference frames for interpolation or extrapolation of the substitute frame. Notably, because the substitute frame has already been produced by substitution unit 166, analysis unit 168 can analyze the objective quality characteristics of the substitute frame in the pixel domain, in addition to, or instead of, analyzing the quality characteristics of one or more reference frames. For example, analysis unit 168 may consider the quality of the substitute frame individually or in conjunction with the quality and/or motion vector reliability of one or more reference frames. Alternatively, analysis unit 168 may consider the quality and/or motion vector reliability of the reference frames without considering the quality of the substitute frame.
Analysis unit 168 can be effective in selectively providing substitute frames to display 170 when they are likely to have a favorable impact on visual and/or temporal quality. Even though interpolation or extrapolation has already been performed, discarding a frame can still be advantageous if its quality level does not justify the additional resources needed to transfer and display it. Transferring a substitute frame from output frame buffer 164 to display 170 may require a substantial amount of power.
In general, analysis unit 168 can be configured to analyze one or more quality and/or motion characteristics of the interpolated or extrapolated video frames produced by the FRUC process performed by substitution unit 166 in video decoder 158. Based on the analysis, analysis unit 168 can direct control unit 172 to selectively enable and disable the transfer of interpolated or extrapolated video frames for presentation on display device 170. Although objective visual quality metric characteristics such as SSIM, blockiness, blurriness, and/or color bleeding can be analyzed for quality analysis, other quality metrics may also be used.
In some aspects, decoder 158 can be configured to operate in different modes. In a first mode (e.g., a resource-focused mode), decoder 158 can, via control unit 172, selectively enable and disable the transfer of interpolated or extrapolated video frames for display on display device 170, e.g., based on the quality analysis. In a second operating mode, control unit 172 of decoder 158 can enable the transfer of interpolated or extrapolated frames for presentation on display device 170 without performing the quality analysis, or without considering the results of the quality analysis.
Alternatively, instead of enabling display outright, analysis unit 168 may reduce one or more quality thresholds applicable to the current replacement frame so that the replacement frame has a higher probability of being displayed. Hence, in either case, analysis unit 168 may determine whether to enable display of the current replacement frame based, at least in part, on the number of replacement frames among a predetermined number of previous frames that have not been displayed. As an illustration, if the number of previous replacement frames under consideration is N=10, and the threshold number M of non-displayed frames is 5, then analysis unit 168 may direct control unit 172 to permit display of the current replacement frame if the number of non-displayed frames among the last 10 frames is 5 or greater.
As a refinement, in some embodiments, instead of considering the absolute number of non-displayed replacement frames among the last N replacement frames, analysis unit 168 may consider the number of consecutive replacement frames that have not been displayed. For example, analysis unit 168 may apply a consecutive-count threshold M. If, for example, the previous M+1 replacement frames were consecutively not displayed due to their quality characteristics, the analysis unit may direct control unit 172 to enable display of the current replacement frame. Analysis unit 168 can broadly apply any of a wide variety of threshold schemes to determine whether to display the current replacement frame based on the non-display of previous replacement frames. Accordingly, this description is provided for purposes of illustration and not limitation.
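The two counting schemes above (a windowed count over the last N replacement frames and a consecutive-skip count) can be sketched as follows. This is an illustrative sketch only, not the patented implementation; the window size `n` and thresholds `m` are hypothetical values.

```python
def force_display_windowed(history, n=10, m=5):
    """Permit display of the current replacement frame when at least m
    of the last n replacement frames were not displayed.
    `history` is a list of booleans: True means the frame was displayed."""
    window = history[-n:]
    skipped = sum(1 for shown in window if not shown)
    return skipped >= m


def force_display_consecutive(history, m=3):
    """Permit display when the previous m replacement frames were
    consecutively skipped (not displayed)."""
    if len(history) < m:
        return False
    return all(not shown for shown in history[-m:])
```

Either predicate could be consulted by the analysis unit before applying (or relaxing) its normal quality thresholds.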
FIG. 13 is a flow diagram illustrating an example technique for selectively displaying replacement frames. Decoder 158 of FIG. 12 may be configured to perform the technique of FIG. 13. As shown in FIG. 13, decoder 158 may receive input video frames (174), decode the frames (176), and perform frame replacement (178) using some of the received frames as reference frames for interpolation or extrapolation to produce replacement frames. An analysis unit (e.g., analysis unit 168 provided in decoder 158) may analyze one or more quality characteristics of all or some of the replacement frames (180). In some embodiments, the analysis unit may be provided in video decoder 158, in a video post-processing unit, or in a video display processing unit (e.g., a mobile display processor (MDP)). For example, analysis unit 168 may analyze various objective quality characteristics of the interpolated or extrapolated frames produced by replacement unit 166, such as SSIM, blockiness, blurriness, or color bleeding.
If the quality of the replacement frame satisfies a quality threshold, e.g., is greater than or equal to the quality threshold (182), then analysis unit 168 may direct control unit 172 to enable display of the replacement frame (184), for example, by enabling transmission of the frame from output frame buffer 164 to display 170. If the quality does not satisfy the quality threshold (182), e.g., is less than the threshold, then analysis unit 168 may direct control unit 172 to disable display of the replacement frame (186). In some cases, the quality threshold may be adaptively adjusted, e.g., based on available resources. Hence, when the quality of the replacement frame is unsatisfactory (e.g., relative to a predefined threshold quality level), decoder 158 can avoid consuming additional resources, even though some resources have already been expended to produce the replacement frame.
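A minimal sketch of the threshold gate just described, including a resource-adaptive adjustment of the threshold; the base threshold, the adjustment range, and the `resource_level` scale are hypothetical, not values taken from the patent.

```python
def should_display(quality_score, base_threshold=0.7, resource_level=1.0):
    """Gate display of a replacement frame on a quality threshold that
    adapts to available resources: when resources are scarce
    (resource_level near 0.0), the effective threshold rises so fewer
    frames are transmitted to the display."""
    threshold = base_threshold + (1.0 - resource_level) * 0.2
    return quality_score >= threshold
```

With ample resources a score of 0.8 passes the base threshold of 0.7; with depleted resources the effective threshold rises toward 0.9, so the same frame would be dropped.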
In the example of FIG. 13, analysis unit 168 may be configured to analyze the quality of the replacement frame itself to determine whether to display the replacement frame. Hence, in some cases, there may be no need to analyze other frames. In other embodiments, however, analysis unit 168 may analyze the quality of one or more reference frames used to produce the replacement frame as a basis for determining whether to display the replacement frame. As indicated by block 188, analysis unit 168 may optionally analyze the quality of one or more reference frames, either alone or in conjunction with analysis of the quality of the replacement frame. Quality analysis of the replacement frame may generally involve analysis of pixel-domain values of the replacement frame. As described above, quality analysis of a reference frame may involve analysis of pixel-domain values, transform-domain values, MV reliability, EC reliability, and the like.
FIG. 14A is a block diagram illustrating an analysis unit 168 that may be used with video decoder 158 of FIG. 12. In the example of FIG. 14A, analysis unit 168 is configured to analyze the quality associated with a replacement frame using one or more objective quality metrics for the replacement frame. For example, objective metric detector 173 may analyze objective spatial quality metrics, such as SSIM, blockiness, blurriness, or color bleeding of the replacement frame. Such metrics may be obtained, for example, from pixel values of the replacement frame. In some embodiments, objective metric detector 173 may analyze frames on a whole-frame or localized-region basis. Quality score calculator 175 may generate a quality score based on the objective quality metrics. Comparator unit 177 may compare the quality score to a threshold to determine whether to enable or disable display of the replacement frame. If the quality score satisfies the quality threshold, comparator unit 177 may direct control unit 172 to permit display of the replacement frame via display 170. If the quality score does not satisfy the quality threshold, comparator unit 177 may direct control unit 172 to disable display of the replacement frame. In that case, a previous or future frame (rather than the replacement frame) may be repeated by display 170.
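One plausible way a quality score calculator could combine several objective metrics into a single score is a normalized weighted average; the metric names and weights here are illustrative assumptions, and each metric is assumed to have been normalized so that higher means better (distortion measures such as blockiness would be inverted before being passed in).

```python
def quality_score(metrics, weights=None):
    """Combine objective spatial-quality metrics into one score in [0, 1].
    `metrics` maps metric names to values normalized to [0, 1], where
    higher means better quality."""
    weights = weights or {name: 1.0 for name in metrics}
    total_w = sum(weights[name] for name in metrics)
    return sum(metrics[name] * weights[name] for name in metrics) / total_w
```

The resulting score would then feed the comparator unit's threshold test described above.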
FIG. 14B is a block diagram illustrating another analysis unit 168 that may be used with video decoder 158 of FIG. 12. In the example of FIG. 14A, analysis unit 168 is configured to directly analyze the quality of the replacement frame. In the example of FIG. 14B, analysis unit 168 may be configured to analyze the quality of the reference frames, the quality of the replacement frame, or a combination of both. In addition, in FIG. 14B, analysis unit 168 may be configured to analyze pixel values, transform-domain values, motion vectors, and/or other types of information. In general, analysis unit 168 may be configured in a manner somewhat similar to analysis unit 42 of FIG. 5. For example, analysis unit 168 may include one or more of objective metric detector 173, quality score calculator 175, comparator unit 177, EC detector 179, QP detector 181, CBP detector 183, and MV reliability checker 185.
Objective metric detector 173 may analyze the replacement frame and/or one or more reference frames used to produce the replacement frame to produce an indication of objective spatial and/or temporal quality, as described above. In some embodiments, objective metric detector 173 may analyze frames on a whole-frame or localized-region basis. EC detector 179 may detect slice losses and determine whether an acceptable EC mechanism is available. QP detector 181 and CBP detector 183 may produce indications of quality based on the QP and CBP values of the reference frames.
In some embodiments, quality score calculator 175 may calculate a total quality score, e.g., in a manner similar to that described with reference to FIG. 5 and FIG. 9, as a consideration in the selective display of replacement frames. The quality score may be reset for each replacement frame. However, if quality metrics analyzed for a previous replacement frame are useful for the current replacement frame, they may be retained and reused. As an example, if the same reference frame is used for another replacement frame, then the QP and CBP data, MV reliability data, or EC reliability data already determined for that particular reference frame for interpolation or extrapolation of a replacement frame may be reused.
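The retain-and-reuse behavior described above amounts to memoizing per-reference-frame quality data. A minimal sketch, under the assumption that each reference frame has a stable identifier and that its derived metrics do not change between replacement frames:

```python
class ReferenceMetricCache:
    """Cache per-reference-frame quality data (e.g., QP/CBP scores, MV
    reliability, EC reliability) so it can be reused when the same
    reference frame is used to produce another replacement frame."""

    def __init__(self):
        self._cache = {}
        self.misses = 0  # number of fresh computations performed

    def get(self, frame_id, compute):
        """Return cached metrics for frame_id, computing them once."""
        if frame_id not in self._cache:
            self.misses += 1
            self._cache[frame_id] = compute(frame_id)
        return self._cache[frame_id]
```

A second replacement frame that reuses the same reference frame would then hit the cache instead of re-running the QP/CBP, MV, and EC analysis.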
The quality characteristics used to compute the total quality score (e.g., objective metrics, EC characteristics, QP and CBP characteristics) can indicate whether the replacement frame is likely to produce an acceptable level of visual quality when presented by display 170. Comparator unit 177 may compare the total quality score to a quality threshold. If the total quality score is satisfactory (e.g., meets or exceeds the quality threshold), comparator unit 177 indicates that the replacement frame has an acceptable quality level for presentation by display 170. If the total quality score is less than the quality threshold, comparator unit 177 determines that the replacement frame is unacceptable for presentation by display 170. In either case, analysis unit 168 may then proceed to analyze the quality of the next replacement frame for selective display.
Even if the quality score indicates satisfactory quality, MV reliability checker 185 may be provided to determine whether the motion vectors associated with one or more reference frames used to produce the replacement frame are reliable. MV reliability checker 185 may analyze the reliability of the motion vectors in the one or more reference frames to ensure that, where frame replacement uses a motion-compensated prediction method, the selected reference frames for interpolation or extrapolation are likely to produce a quality frame replacement result. If the motion vectors are reliable, MV reliability checker 185 may signal control unit 172 to enable the replacement frame to be displayed by display 170. If the motion vectors are unreliable, however, MV reliability checker 185 may signal control unit 172 so that the replacement frame is not displayed by display 170. Instead, control unit 172 may disable display of the replacement frame so that display 170 can repeat a previous or future frame in place of the replacement frame.
FIG. 15 is a flow diagram illustrating an example technique for generating a quality score to support selective display of replacement frames. The process of FIG. 15 may be similar to the process of FIG. 9. However, as described with reference to FIGS. 12-14, the scoring process can be used for selective display of interpolated or extrapolated replacement frames, rather than for selection of the reference frames used to perform interpolation or extrapolation. As shown in FIG. 15, in considering whether to display a replacement frame, analysis unit 168 may retrieve, from received frame buffer 160, one or more reference frames used for interpolation or extrapolation to produce the replacement frame (190), and analyze the QP and CBP values of the reference frames. As in the example of FIG. 9, as shown in FIG. 15, analysis unit 168 may produce a combined QP-based score based on the QP and CBP values. If the QP value is less than an applicable QP threshold (QP_th) and the CBP value is not equal to zero (192), then analysis unit 168 sets the QP-based score for the frame to "high" (194). If the QP value is greater than or equal to the QP threshold (QP_th) or the CBP value is approximately equal to zero (192), then analysis unit 168 sets the QP-based score for the frame to "low" (196).
As further shown in FIG. 15, analysis unit 168 may also be configured to check the reference frames for slice losses (198). Referring to FIG. 14B, the slice loss check may be performed by EC detector 179. If there is a slice loss, analysis unit 168 may determine whether a sufficient error concealment (EC) mechanism is available to correct the slice loss (200). If not, analysis unit 168 sets the EC-based score for the reference frame to "low" (202). If a reliable and sufficient EC mechanism is available (200), analysis unit 168 sets the EC-based score to "high" (204). In this case, if the EC mechanism can be used to reliably reproduce the lost slice, the quality of the reference frame can indicate that the replacement frame is reliable. If the slice cannot be reproduced, however, the reference frame can indicate that the replacement frame may contain erroneous information that should not be displayed.
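The QP/CBP and EC scoring rules from FIG. 15 can be sketched as two small predicates. The numeric QP threshold of 30 is a hypothetical placeholder for the applicable QP_th; the patent does not fix a value.

```python
def qp_based_score(qp, cbp, qp_threshold=30):
    """Combined QP/CBP score per FIG. 15: a reference frame scores
    'high' when it was finely quantized (QP below the threshold) and
    carries coded residual data (nonzero CBP); otherwise 'low'."""
    return "high" if qp < qp_threshold and cbp != 0 else "low"


def ec_based_score(has_slice_loss, ec_available):
    """EC score per FIG. 15: 'low' only when slices were lost and no
    sufficient error concealment mechanism is available to reproduce
    them; otherwise 'high'."""
    if has_slice_loss and not ec_available:
        return "low"
    return "high"
```

These per-reference-frame scores would then feed the total quality score computed by quality score calculator 175.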
FIG. 16 is a flow diagram illustrating an example technique for motion and/or quality analysis of candidate reference video units in selecting reference video units for video unit replacement when the video decoder supports a delay-sensitive video application. The flow diagram of FIG. 16 generally corresponds to the flow diagram of FIG. 8, but further includes operations of determining whether the video application supported by decoder 14 is a delay-sensitive application (e.g., video telephony) (216) and, if so, determining whether an applicable time-distance criterion is met (218).
In the example of FIG. 16, upon retrieving a candidate reference frame (94) and detecting a delay-sensitive application (216), decoder 14 may determine whether the frame is a future frame relative to the frame to be added. If so, decoder 14 determines whether the distance of the future reference frame is less than a threshold distance. If the candidate reference frame is a previous frame, or a future frame less than the threshold distance away from the frame to be added, the distance criterion is met (218). In some aspects, the distance may be expressed as the number of frames separating the candidate reference frame from the frame to be added.
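The distance criterion just described can be sketched as a simple predicate over frame indices; the default threshold of 2 frames is a hypothetical value, not one specified by the patent.

```python
def meets_distance_criterion(candidate_index, target_index, max_future_distance=2):
    """Time-distance check for delay-sensitive applications: a previous
    frame always qualifies, while a future frame qualifies only if it
    lies fewer than `max_future_distance` frames ahead of the frame to
    be interpolated, since waiting on distant future frames adds
    latency."""
    if candidate_index <= target_index:  # previous (or co-located) frame
        return True
    return (candidate_index - target_index) < max_future_distance
```

A candidate that fails the check would simply be removed from consideration and the next candidate retrieved, as in operation (94).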
When the time-distance criterion is met, decoder 14 may proceed with quality analysis, for example, as described with reference to FIG. 8 and shown in operations (96) through (112). However, if the distance criterion is not met, decoder 14 may remove the candidate reference frame from consideration as a reference frame for interpolation, and proceed to retrieve the next candidate reference frame (94) for consideration. Hence, applying a distance criterion to delay-sensitive applications admits a variety of different implementations, for example, controlling selection unit 44 (FIG. 4) to exclude particular reference frames based on their time distance from the replacement frame, calculating a quality score based in part on distance via distance unit 63 (FIG. 5), or controlling analysis unit 42 to disable quality analysis when the distance criterion is not met (e.g., as shown in FIG. 16).
In general, techniques such as those described in this disclosure for selecting reference frames for interpolation can provide FRUC implementations that, in various aspects, reduce resource consumption by disabling frame replacement when it is unnecessary or not advantageous, and enhance the quality of interpolated frames by selecting good reference frames for use in frame replacement. By analyzing QP and CBP values and selecting high-quality reference frames based on that analysis, enhanced-quality advantages can be realized, which can be especially useful for video compressed by variable bit rate (VBR) rate control mechanisms.
In addition, enhanced quality can be realized by not selecting frames with slice or frame losses as reference frames when no error concealment mechanism exists or when the error concealment mechanism does not produce good quality frames. Avoiding frames with lost slices or frames and inadequate error concealment can be especially useful when transmission losses occur, for example, in videophone applications. A resource-focused mode can also be effective in substantially preserving objective and subjective video quality in low-motion video segments while reducing power consumption or the consumption of other resources.
The techniques described herein may be implemented in hardware, software, firmware, or any combination thereof. Any features described as modules, units, or components may be implemented together in an integrated logic device or separately as discrete but interoperable logic devices. In some cases, various features may be implemented as an integrated circuit device, such as an integrated circuit chip or chipset. If implemented in software, the techniques may be realized at least in part by a computer-readable medium comprising instructions that, when executed, cause a processor to perform one or more of the methods described above.
The computer-readable medium may form part of a computer program product, which may include packaging materials. The computer-readable medium may comprise computer data storage media such as random access memory (RAM), synchronous dynamic random access memory (SDRAM), read-only memory (ROM), non-volatile random access memory (NVRAM), electrically erasable programmable read-only memory (EEPROM), FLASH memory, magnetic or optical data storage media, and the like. Additionally or alternatively, the techniques may be realized at least in part by a computer-readable communication medium that carries or communicates code in the form of instructions or data structures and that can be accessed, read, and/or executed by a computer.
Code or instruction can be carried out by one or more processors, for example, and one or more DSP, general purpose microprocessor, ASIC, field programmable logic array (FPGA), or the integrated or discrete logic of other equivalence.Therefore, term " processor " can refer to aforementioned structure or be suitable for implementing in any other structure of technology described herein any one as used herein.In addition, in certain aspects, functional being provided in dedicated software modules or the hardware module of describing herein.Any one in the multiple integrated circuit (IC) apparatus that comprises the one or more circuit in the technology of implementing to describe among the present invention also contained in the present invention.But can be provided in described circuit in the single IC for both chip or in the integrated circuit (IC) chip of a plurality of co-operate in so-called chipset.Described integrated circuit (IC) apparatus can be used in the multiple application, and some of them can be included in the radio communication device (for example, mobile phone hand-held set) and use.
Various aspects of the disclosed techniques have been described. These and other aspects are within the scope of the following claims.
Claims (41)
1. A method performed by a video decoder, the method comprising:
determining that an additional video unit can be produced between two consecutive video units of a plurality of candidate reference video units, wherein the plurality of candidate reference video units comprises at least one candidate reference video unit and the two consecutive video units;
in response to determining that the additional video unit can be produced, analyzing, by the video decoder, at least one characteristic of each of the candidate reference video units; and
selecting, by the video decoder and based at least in part on the analysis, one or more of the candidate reference video units as reference video units for interpolation or extrapolation of the additional video unit, wherein at least one of the selected candidate reference video units is different from the two consecutive video units between which the additional video unit is produced.
2. The method of claim 1, wherein each of the candidate reference video units is a candidate reference video frame, and the additional video unit is an additional video frame.
3. The method of claim 1, wherein the at least one characteristic comprises a characteristic indicating a level of quality of interpolation or extrapolation performed using one or more of the candidate reference video units.
4. The method of claim 3, wherein analyzing the at least one characteristic comprises analyzing quantization parameter (QP) and coded block pattern (CBP) values of one or more of the candidate reference video units.
5. The method of claim 3, wherein analyzing the at least one characteristic comprises detecting video unit loss in one or more of the candidate reference video units, and wherein selecting one or more of the candidate reference video units as reference video units comprises selecting one or more of the candidate reference video units based on a quality of error concealment.
6. The method of claim 3, wherein analyzing the at least one characteristic comprises analyzing one or more objective visual quality metrics of one or more of the candidate reference video units.
7. The method of claim 3, wherein selecting one or more of the candidate reference video units as reference video units comprises selecting one or more candidate reference video units for which the level of quality satisfies at least one threshold.
8. The method of claim 7, further comprising ranking at least some of the candidate reference video units that satisfy the at least one threshold based on the level of quality, and selecting one or more of the highest-ranked candidate reference video units.
9. The method of claim 7, further comprising adjusting the at least one threshold based on a level of available power resources.
10. The method of claim 1, further comprising interpolating or extrapolating the additional video unit based on the analysis.
11. The method of claim 1, wherein the at least one characteristic comprises a characteristic indicating a temporal distance of one or more of the candidate reference video units from the additional video unit.
12. The method of claim 11, further comprising:
detecting a delay-sensitive video application; and
when the delay-sensitive video application is detected, selecting the one or more of the candidate reference video units based at least in part on the temporal distance of the one or more candidate reference video units from the additional video unit.
13. The method of claim 11, further comprising selecting the one or more of the candidate reference video units based at least in part on the temporal distance of the one or more candidate reference video units from the additional video unit.
14. A device comprising a video decoder, the video decoder comprising:
an analysis unit that determines that an additional video unit can be produced between two consecutive video units of a plurality of candidate reference video units, wherein the plurality of candidate reference video units comprises at least one candidate reference video unit and the two consecutive video units, and that, in response to determining that the additional video unit can be produced, analyzes at least one characteristic of each of the candidate reference video units; and
a selection unit that selects, based at least in part on the analysis, one or more of the candidate reference video units as reference video units for interpolation or extrapolation of the additional video unit, wherein at least one of the selected candidate reference video units is different from the two consecutive video units between which the additional video unit is produced.
15. The device of claim 14, wherein each of the candidate reference video units is a candidate reference video frame, and the additional video unit is an additional video frame.
16. The device of claim 14, wherein the at least one characteristic comprises a characteristic indicating a level of quality of interpolation or extrapolation performed using one or more of the candidate reference video units.
17. The device of claim 16, wherein the analysis unit analyzes quantization parameter (QP) and coded block pattern (CBP) values of one or more of the candidate reference video units.
18. The device of claim 16, wherein the analysis unit detects video unit loss in one or more of the candidate reference video units, and the selection unit selects one or more of the candidate reference video units based on a quality of error concealment.
19. The device of claim 16, wherein the analysis unit analyzes one or more objective visual quality metrics of one or more of the candidate reference video units.
20. The device of claim 16, wherein the selection unit selects one or more candidate reference video units for which the level of quality satisfies at least one threshold.
21. The device of claim 20, wherein the analysis unit ranks at least some of the candidate reference video units that satisfy the at least one threshold based on the level of quality, and the selection unit selects one or more of the highest-ranked candidate reference video units.
22. The device of claim 20, further comprising an adjustment unit that adjusts the at least one threshold based on a level of available power resources.
23. The device of claim 14, further comprising a replacement unit that interpolates or extrapolates the additional video unit based on the analysis.
24. The device of claim 14, wherein the at least one characteristic comprises a characteristic indicating a temporal distance of one or more of the candidate reference video units from the additional video unit.
25. The device of claim 24, further comprising a delay detection unit that detects a delay-sensitive video application, wherein, when the delay-sensitive video application is detected, the selection unit selects the one or more of the candidate reference video units based at least in part on the temporal distance of the one or more candidate reference video units from the additional video unit.
26. The device of claim 24, wherein the selection unit selects the one or more of the candidate reference video units based at least in part on the temporal distance of the one or more candidate reference video units from the additional video unit.
27. The device of claim 14, wherein the device comprises a wireless communication device handset.
28. The device of claim 14, wherein the device comprises an integrated circuit device.
29. A device comprising:
means for determining that an additional video unit can be produced between two consecutive video units of a plurality of candidate reference video units, wherein the plurality of candidate reference video units comprises at least one candidate reference video unit and the two consecutive video units;
means for analyzing, in response to the determination that the additional video unit can be produced, at least one characteristic of each of the candidate reference video units; and
means for selecting, based at least in part on the analysis, one or more of the candidate reference video units as reference video units for interpolation or extrapolation of the additional video unit, wherein at least one of the selected candidate reference video units is different from the two consecutive video units between which the additional video unit is produced.
30. The device of claim 29, wherein each of the candidate reference video units is a candidate reference video frame, and the additional video unit is an additional video frame.
31. The device of claim 29, wherein the at least one characteristic comprises a characteristic indicating a level of quality of interpolation or extrapolation performed using one or more of the candidate reference video units.
32. The device of claim 31, wherein the means for analyzing the at least one characteristic comprises means for analyzing quantization parameter (QP) and coded block pattern (CBP) values of one or more of the candidate reference video units.
33. The device of claim 31, wherein the means for analyzing the at least one characteristic comprises means for detecting video unit loss in one or more of the candidate reference video units, and the means for selecting comprises means for selecting one or more of the candidate reference video units based on a quality of error concealment.
34. The device of claim 31, wherein the means for analyzing the at least one characteristic comprises means for analyzing one or more objective visual quality metrics of one or more of the candidate reference video units.
35. The device of claim 31, wherein the means for selecting comprises means for selecting one or more candidate reference video units for which the level of quality satisfies at least one threshold.
36. The device of claim 35, further comprising means for ranking at least some of the candidate reference video units that satisfy the at least one threshold based on the level of quality, and means for selecting one or more of the highest-ranked candidate reference video units.
37. The device of claim 35, further comprising means for adjusting the at least one threshold based on a level of available power resources.
38. The device of claim 29, further comprising means for interpolating or extrapolating the additional video unit based on the analysis.
39. The device of claim 29, wherein the at least one characteristic comprises a characteristic indicating a temporal distance of one or more of the candidate reference video units from the additional video unit.
40. The device of claim 39, further comprising:
means for detecting a delay-sensitive video application; and
means for selecting, when the delay-sensitive video application is detected, the one or more of the candidate reference video units based at least in part on the temporal distance of the one or more candidate reference video units from the additional video unit.
41. The device of claim 39, further comprising means for selecting the one or more of the candidate reference video units based at least in part on the temporal distance of the one or more candidate reference video units from the additional video unit.
Applications Claiming Priority (5)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
US1270307P | 2007-12-10 | 2007-12-10 | |
US61/012,703 | 2007-12-10 | ||
US12/111,718 US9426414B2 (en) | 2007-12-10 | 2008-04-29 | Reference selection for video interpolation or extrapolation |
US12/111,718 | 2008-04-29 | ||
PCT/US2008/086277 WO2009076466A1 (en) | 2007-12-10 | 2008-12-10 | Reference selection for video interpolation or extrapolation |
Publications (2)
Publication Number | Publication Date |
---|---|
CN101919255A CN101919255A (en) | 2010-12-15 |
CN101919255B true CN101919255B (en) | 2013-02-27 |
Family
ID=40721647
Family Applications (3)
Application Number | Title | Priority Date | Filing Date |
---|---|---|---|
CN200880125045.4A Active CN101919249B (en) | 2007-12-10 | 2008-12-10 | Resource-adaptive video interpolation or extrapolation |
CN2008801250420A Active CN101919255B (en) | 2007-12-10 | 2008-12-10 | Reference selection for video interpolation or extrapolation |
CN2008801265229A Active CN101939992B (en) | 2007-12-10 | 2008-12-10 | Reference selection for video interpolation or extrapolation |
Family Applications Before (1)
Application Number | Title | Priority Date | Filing Date |
---|---|---|---|
CN200880125045.4A Active CN101919249B (en) | 2007-12-10 | 2008-12-10 | Resource-adaptive video interpolation or extrapolation |
Family Applications After (1)
Application Number | Title | Priority Date | Filing Date |
---|---|---|---|
CN2008801265229A Active CN101939992B (en) | 2007-12-10 | 2008-12-10 | Reference selection for video interpolation or extrapolation |
Country Status (7)
Country | Link |
---|---|
US (3) | US8660175B2 (en) |
EP (3) | EP2232878B1 (en) |
JP (3) | JP5502747B2 (en) |
KR (4) | KR101136293B1 (en) |
CN (3) | CN101919249B (en) |
TW (3) | TW200950528A (en) |
WO (3) | WO2009076468A2 (en) |
Families Citing this family (195)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
US7474327B2 (en) | 2002-02-12 | 2009-01-06 | Given Imaging Ltd. | System and method for displaying an image stream |
GB0222522D0 (en) | 2002-09-27 | 2002-11-06 | Controlled Therapeutics Sct | Water-swellable polymers |
GB0417401D0 (en) | 2004-08-05 | 2004-09-08 | Controlled Therapeutics Sct | Stabilised prostaglandin composition |
GB0613333D0 (en) | 2006-07-05 | 2006-08-16 | Controlled Therapeutics Sct | Hydrophilic polyurethane compositions |
GB0613638D0 (en) | 2006-07-08 | 2006-08-16 | Controlled Therapeutics Sct | Polyurethane elastomers |
GB0620685D0 (en) | 2006-10-18 | 2006-11-29 | Controlled Therapeutics Sct | Bioresorbable polymers |
GB2450121A (en) * | 2007-06-13 | 2008-12-17 | Sharp Kk | Frame rate conversion using either interpolation or frame repetition |
US8660175B2 (en) * | 2007-12-10 | 2014-02-25 | Qualcomm Incorporated | Selective display of interpolated or extrapolated video units |
JP2009253348A (en) * | 2008-04-01 | 2009-10-29 | Alps Electric Co Ltd | Data processing method and data processing apparatus |
US8571106B2 (en) * | 2008-05-22 | 2013-10-29 | Microsoft Corporation | Digital video compression acceleration based on motion vectors produced by cameras |
US9788018B2 (en) * | 2008-06-30 | 2017-10-10 | Microsoft Technology Licensing, Llc | Error concealment techniques in video decoding |
US8374240B1 (en) * | 2008-07-10 | 2013-02-12 | Marvell International Ltd. | Image frame management |
KR101258106B1 (en) * | 2008-09-07 | 2013-04-25 | 돌비 레버러토리즈 라이쎈싱 코오포레이션 | Conversion of interleaved data sets, including chroma correction and/or correction of checkerboard interleaved formatted 3d images |
US8385404B2 (en) * | 2008-09-11 | 2013-02-26 | Google Inc. | System and method for video encoding using constructed reference frame |
US20100178038A1 (en) * | 2009-01-12 | 2010-07-15 | Mediatek Inc. | Video player |
WO2010093430A1 (en) * | 2009-02-11 | 2010-08-19 | Packetvideo Corp. | System and method for frame interpolation for a compressed video bitstream |
EP2227012A1 (en) * | 2009-03-05 | 2010-09-08 | Sony Corporation | Method and system for providing reliable motion vectors |
US8619187B2 (en) * | 2009-04-01 | 2013-12-31 | Marvell World Trade Ltd | Cadence detection in progressive video |
KR20100119354A (en) * | 2009-04-30 | 2010-11-09 | 삼성전자주식회사 | Display device and driving method of the same |
US8724707B2 (en) * | 2009-05-07 | 2014-05-13 | Qualcomm Incorporated | Video decoding using temporally constrained spatial dependency |
US9113169B2 (en) * | 2009-05-07 | 2015-08-18 | Qualcomm Incorporated | Video encoding with temporally constrained spatial dependency for localized decoding |
US20100289944A1 (en) * | 2009-05-12 | 2010-11-18 | Shing-Chia Chen | Frame Rate Up-Conversion Based Dynamic Backlight Control System and Method |
WO2010144833A2 (en) | 2009-06-12 | 2010-12-16 | Cygnus Broadband | Systems and methods for intelligent discard in a communication network |
US8531961B2 (en) | 2009-06-12 | 2013-09-10 | Cygnus Broadband, Inc. | Systems and methods for prioritization of data for intelligent discard in a communication network |
US8745677B2 (en) * | 2009-06-12 | 2014-06-03 | Cygnus Broadband, Inc. | Systems and methods for prioritization of data for intelligent discard in a communication network |
US8627396B2 (en) | 2009-06-12 | 2014-01-07 | Cygnus Broadband, Inc. | Systems and methods for prioritization of data for intelligent discard in a communication network |
US8340510B2 (en) | 2009-07-17 | 2012-12-25 | Microsoft Corporation | Implementing channel start and file seek for decoder |
US8448016B2 (en) * | 2009-07-31 | 2013-05-21 | Cleversafe, Inc. | Computing core application access utilizing dispersed storage |
CN101990093A (en) * | 2009-08-06 | 2011-03-23 | 索尼株式会社 | Method and device for detecting replay section in video |
US8279259B2 (en) * | 2009-09-24 | 2012-10-02 | Microsoft Corporation | Mimicking human visual system in detecting blockiness artifacts in compressed video streams |
WO2011042898A1 (en) * | 2009-10-05 | 2011-04-14 | I.C.V.T Ltd. | Apparatus and methods for recompression of digital images |
JP4692913B2 (en) | 2009-10-08 | 2011-06-01 | 日本ビクター株式会社 | Frame rate conversion apparatus and method |
WO2011061746A1 (en) | 2009-11-20 | 2011-05-26 | Given Imaging Ltd. | System and method for controlling power consumption of an in vivo device |
US8903812B1 (en) | 2010-01-07 | 2014-12-02 | Google Inc. | Query independent quality signals |
KR101768207B1 (en) * | 2010-01-19 | 2017-08-16 | 삼성전자주식회사 | Method and apparatus for encoding/decoding motion vector based on reduced motion vector predictor candidates |
JP5306485B2 (en) * | 2010-02-09 | 2013-10-02 | 日本電信電話株式会社 | Motion vector predictive coding method, motion vector predictive decoding method, moving picture coding apparatus, moving picture decoding apparatus, and programs thereof |
US9838709B2 (en) * | 2010-02-09 | 2017-12-05 | Nippon Telegraph And Telephone Corporation | Motion vector predictive encoding method, motion vector predictive decoding method, moving picture encoding apparatus, moving picture decoding apparatus, and programs thereof |
US9497481B2 (en) * | 2010-02-09 | 2016-11-15 | Nippon Telegraph And Telephone Corporation | Motion vector predictive encoding method, motion vector predictive decoding method, moving picture encoding apparatus, moving picture decoding apparatus, and programs thereof |
BR112012019527A2 (en) * | 2010-02-09 | 2018-03-13 | Nippon Telegraph And Telephone Corporation | A motion vector predictive coding method, a motion vector prediction decoding method, video coding equipment, video decoding devices, and those programs |
JP5583992B2 (en) * | 2010-03-09 | 2014-09-03 | パナソニック株式会社 | Signal processing device |
US8682142B1 (en) * | 2010-03-18 | 2014-03-25 | Given Imaging Ltd. | System and method for editing an image stream captured in-vivo |
US9082278B2 (en) * | 2010-03-19 | 2015-07-14 | University-Industry Cooperation Group Of Kyung Hee University | Surveillance system |
US8588309B2 (en) | 2010-04-07 | 2013-11-19 | Apple Inc. | Skin tone and feature detection for video conferencing compression |
CN106923779A (en) | 2010-04-28 | 2017-07-07 | 基文影像公司 | For the system and method for image section in display body |
KR20110131897A (en) * | 2010-06-01 | 2011-12-07 | 삼성전자주식회사 | Method of processing data and display apparatus performing the method |
US8615160B2 (en) * | 2010-06-18 | 2013-12-24 | Adobe Systems Incorporated | Media player instance throttling |
US8819269B2 (en) * | 2010-06-30 | 2014-08-26 | Cable Television Laboratories, Inc. | Adaptive bit rate method and system using retransmission and replacement |
JP2012019329A (en) * | 2010-07-07 | 2012-01-26 | Sony Corp | Recording device, recording method, reproducing device, reproducing method, program, and recording and reproducing device |
US20120044992A1 (en) * | 2010-08-17 | 2012-02-23 | Qualcomm Incorporated | Low complexity adaptive filter |
US8922633B1 (en) | 2010-09-27 | 2014-12-30 | Given Imaging Ltd. | Detection of gastrointestinal sections and transition of an in-vivo device there between |
WO2012071680A1 (en) * | 2010-11-30 | 2012-06-07 | Technicolor (China) Technology Co., Ltd. | Method and apparatus for measuring quality of video based on frame loss pattern |
TW201228403A (en) * | 2010-12-28 | 2012-07-01 | Acer Inc | Video display device, multi-media video streaming device, and method thereof |
KR101736793B1 (en) | 2010-12-29 | 2017-05-30 | 삼성전자주식회사 | Video frame encoding device, encoding method thereof and operating method of video signal transmitting and receiving system including the same |
JP5812808B2 (en) * | 2011-01-05 | 2015-11-17 | キヤノン株式会社 | Image processing apparatus and image processing method |
US20120188460A1 (en) * | 2011-01-21 | 2012-07-26 | Ncomputing Inc. | System and method for dynamic video mode switching |
JP2012165071A (en) * | 2011-02-03 | 2012-08-30 | Sony Corp | Imaging apparatus, reception device, image transmission system, and image transmission method |
CN103703704B (en) * | 2011-02-24 | 2017-02-15 | 爱立信(中国)通信有限公司 | Reducing interference caused by atmospheric duct in mobile communication system |
US10171813B2 (en) * | 2011-02-24 | 2019-01-01 | Qualcomm Incorporated | Hierarchy of motion prediction video blocks |
GB2488816A (en) * | 2011-03-09 | 2012-09-12 | Canon Kk | Mapping motion vectors from a plurality of reference frames to a single reference frame |
JP5590427B2 (en) * | 2011-03-25 | 2014-09-17 | 日本電気株式会社 | Video processing system, video content monitoring method, video processing apparatus, control method thereof, and control program |
US8638854B1 (en) | 2011-04-07 | 2014-01-28 | Google Inc. | Apparatus and method for creating an alternate reference frame for video compression using maximal differences |
US8754908B2 (en) | 2011-06-07 | 2014-06-17 | Microsoft Corporation | Optimized on-screen video composition for mobile device |
CA2839345A1 (en) * | 2011-06-14 | 2012-12-20 | Zhou Wang | Method and system for structural similarity based rate-distortion optimization for perceptual video coding |
US20120328005A1 (en) * | 2011-06-22 | 2012-12-27 | General Instrument Corporation | Construction of combined list using temporal distance |
JP5848543B2 (en) * | 2011-08-04 | 2016-01-27 | キヤノン株式会社 | Image display device, image display method, and computer program |
WO2013028121A1 (en) * | 2011-08-25 | 2013-02-28 | Telefonaktiebolaget L M Ericsson (Publ) | Depth map encoding and decoding |
US8525883B2 (en) * | 2011-09-02 | 2013-09-03 | Sharp Laboratories Of America, Inc. | Methods, systems and apparatus for automatic video quality assessment |
CN104067317A (en) | 2011-09-08 | 2014-09-24 | 宝福特控股私人有限公司 | System and method for visualizing synthetic objects within real-world video clip |
CA2786200C (en) * | 2011-09-23 | 2015-04-21 | Cygnus Broadband, Inc. | Systems and methods for prioritization of data for intelligent discard in a communication network |
US9807386B2 (en) | 2011-09-29 | 2017-10-31 | Telefonaktiebolaget Lm Ericsson (Publ) | Reference picture list handling |
WO2013066045A1 (en) * | 2011-10-31 | 2013-05-10 | 엘지전자 주식회사 | Method and apparatus for initializing reference picture list |
KR101896026B1 (en) * | 2011-11-08 | 2018-09-07 | 삼성전자주식회사 | Apparatus and method for generating a motion blur in a portable terminal |
US10075710B2 (en) * | 2011-11-24 | 2018-09-11 | Thomson Licensing | Video quality measurement |
EP2786342B1 (en) | 2011-11-29 | 2017-09-13 | Thomson Licensing | Texture masking for video quality measurement |
US8751800B1 (en) | 2011-12-12 | 2014-06-10 | Google Inc. | DRM provider interoperability |
US9584832B2 (en) * | 2011-12-16 | 2017-02-28 | Apple Inc. | High quality seamless playback for video decoder clients |
US9432694B2 (en) | 2012-03-06 | 2016-08-30 | Apple Inc. | Signal shaping techniques for video data that is susceptible to banding artifacts |
WO2013141872A1 (en) * | 2012-03-23 | 2013-09-26 | Hewlett-Packard Development Company, L.P. | Method and system to process a video frame using prior processing decisions |
EP2831752A4 (en) * | 2012-03-30 | 2015-08-26 | Intel Corp | Techniques for media quality control |
US9609341B1 (en) | 2012-04-23 | 2017-03-28 | Google Inc. | Video data encoding and decoding using reference picture lists |
WO2013162980A2 (en) | 2012-04-23 | 2013-10-31 | Google Inc. | Managing multi-reference picture buffers for video data coding |
GB2501535A (en) | 2012-04-26 | 2013-10-30 | Sony Corp | Chrominance Processing in High Efficiency Video Codecs |
US8976254B2 (en) * | 2012-06-08 | 2015-03-10 | Apple Inc. | Temporal aliasing reduction and coding of upsampled video |
US8848061B2 (en) * | 2012-06-27 | 2014-09-30 | Apple Inc. | Image and video quality assessment |
KR20140006453A (en) * | 2012-07-05 | 2014-01-16 | 현대모비스 주식회사 | Video data's decoding method and apparatus |
US8953843B1 (en) * | 2012-07-17 | 2015-02-10 | Google Inc. | Selecting objects in a sequence of images |
US8977003B1 (en) * | 2012-07-17 | 2015-03-10 | Google Inc. | Detecting objects in a sequence of images |
EP2875640B1 (en) * | 2012-07-17 | 2017-11-08 | Thomson Licensing | Video quality assessment at a bitstream level |
US20140086310A1 (en) * | 2012-09-21 | 2014-03-27 | Jason D. Tanner | Power efficient encoder architecture during static frame or sub-frame detection |
TWI606418B (en) * | 2012-09-28 | 2017-11-21 | 輝達公司 | Computer system and method for gpu driver-generated interpolated frames |
EP2755381A1 (en) * | 2012-10-23 | 2014-07-16 | ST-Ericsson SA | Motion compensated frame interpolation with frame skipping handling |
US20140118222A1 (en) * | 2012-10-30 | 2014-05-01 | Cloudcar, Inc. | Projection of content to external display devices |
US9257092B2 (en) * | 2013-02-12 | 2016-02-09 | Vmware, Inc. | Method and system for enhancing user experience for remoting technologies |
US9661351B2 (en) * | 2013-03-15 | 2017-05-23 | Sony Interactive Entertainment America Llc | Client side frame prediction for video streams with skipped frames |
US20140373024A1 (en) * | 2013-06-14 | 2014-12-18 | Nvidia Corporation | Real time processor |
US9756331B1 (en) | 2013-06-17 | 2017-09-05 | Google Inc. | Advance coded reference prediction |
KR20160021222A (en) * | 2013-06-18 | 2016-02-24 | 브이아이디 스케일, 인크. | Inter-layer parameter set for hevc extensions |
US9674515B2 (en) * | 2013-07-11 | 2017-06-06 | Cisco Technology, Inc. | Endpoint information for network VQM |
US9324145B1 (en) | 2013-08-08 | 2016-04-26 | Given Imaging Ltd. | System and method for detection of transitions in an image stream of the gastrointestinal tract |
US20150181208A1 (en) * | 2013-12-20 | 2015-06-25 | Qualcomm Incorporated | Thermal and power management with video coding |
US9369724B2 (en) * | 2014-03-31 | 2016-06-14 | Microsoft Technology Licensing, Llc | Decoding and synthesizing frames for incomplete video data |
SG10201406217VA (en) * | 2014-05-30 | 2015-12-30 | Paofit Technology Pte Ltd | Systems and methods for motion-vector-aided video interpolation using real-time smooth video playback speed variation |
GB2527315B (en) * | 2014-06-17 | 2017-03-15 | Imagination Tech Ltd | Error detection in motion estimation |
KR102389312B1 (en) * | 2014-07-08 | 2022-04-22 | 삼성전자주식회사 | Method and apparatus for transmitting multimedia data |
CN112584140B (en) | 2014-11-27 | 2024-08-13 | 株式会社Kt | Method for decoding or encoding video signal |
CN107005696B (en) | 2014-11-27 | 2020-06-26 | 株式会社Kt | Video signal processing method and apparatus |
TWI511530B (en) * | 2014-12-09 | 2015-12-01 | Univ Nat Kaohsiung 1St Univ Sc | Distributed video coding system and decoder for distributed video coding system |
WO2016102365A1 (en) | 2014-12-22 | 2016-06-30 | Thomson Licensing | Method and apparatus for generating an extrapolated image based on object detection |
EP3300374B1 (en) * | 2015-05-22 | 2022-07-06 | Sony Group Corporation | Transmission device, transmission method, image processing device, image processing method, receiving device, and receiving method |
JP6693051B2 (en) * | 2015-05-28 | 2020-05-13 | セイコーエプソン株式会社 | Memory control device, image processing device, display device, and memory control method |
US9704298B2 (en) | 2015-06-23 | 2017-07-11 | Paofit Holdings Pte Ltd. | Systems and methods for generating 360 degree mixed reality environments |
WO2017030380A1 (en) * | 2015-08-20 | 2017-02-23 | Lg Electronics Inc. | Digital device and method of processing data therein |
US20170094288A1 (en) * | 2015-09-25 | 2017-03-30 | Nokia Technologies Oy | Apparatus, a method and a computer program for video coding and decoding |
US10805627B2 (en) | 2015-10-15 | 2020-10-13 | Cisco Technology, Inc. | Low-complexity method for generating synthetic reference frames in video coding |
US10347343B2 (en) * | 2015-10-30 | 2019-07-09 | Seagate Technology Llc | Adaptive read threshold voltage tracking with separate characterization on each side of voltage distribution about distribution mean |
JP6626319B2 (en) * | 2015-11-18 | 2019-12-25 | キヤノン株式会社 | Encoding device, imaging device, encoding method, and program |
US10523939B2 (en) | 2015-12-31 | 2019-12-31 | Facebook, Inc. | Dynamic codec adaption |
US11102516B2 (en) * | 2016-02-15 | 2021-08-24 | Nvidia Corporation | Quality aware error concealment method for video and game streaming and a viewing device employing the same |
US10404979B2 (en) | 2016-03-17 | 2019-09-03 | Mediatek Inc. | Video coding with interpolated reference pictures |
US10368074B2 (en) | 2016-03-18 | 2019-07-30 | Microsoft Technology Licensing, Llc | Opportunistic frame dropping for variable-frame-rate encoding |
US10136155B2 (en) | 2016-07-27 | 2018-11-20 | Cisco Technology, Inc. | Motion compensation using a patchwork motion field |
JP7094076B2 (en) * | 2016-08-19 | 2022-07-01 | 沖電気工業株式会社 | Video coding equipment, programs and methods, as well as video decoding equipment, programs and methods, and video transmission systems. |
US10354394B2 (en) | 2016-09-16 | 2019-07-16 | Dolby Laboratories Licensing Corporation | Dynamic adjustment of frame rate conversion settings |
DE102016221204A1 (en) * | 2016-10-27 | 2018-05-03 | Siemens Aktiengesellschaft | Determining at least one approximated intermediate data set for a real-time application |
CN107067080A (en) * | 2016-12-05 | 2017-08-18 | 哈尔滨理工大学 | Leakage gas-monitoring concentration data virtual expansion method based on core extreme learning machine |
JP6948787B2 (en) * | 2016-12-09 | 2021-10-13 | キヤノン株式会社 | Information processing equipment, methods and programs |
JP6866142B2 (en) * | 2016-12-09 | 2021-04-28 | キヤノン株式会社 | Programs, image processing equipment, and image processing methods |
US20180227502A1 (en) * | 2017-02-06 | 2018-08-09 | Qualcomm Incorporated | Systems and methods for reduced power consumption in imaging pipelines |
CN110383840A (en) * | 2017-03-10 | 2019-10-25 | 索尼公司 | Image processing apparatus and method |
US10779011B2 (en) * | 2017-07-31 | 2020-09-15 | Qualcomm Incorporated | Error concealment in virtual reality system |
GB2586941B (en) | 2017-08-01 | 2022-06-22 | Displaylink Uk Ltd | Reducing judder using motion vectors |
US10880573B2 (en) * | 2017-08-15 | 2020-12-29 | Google Llc | Dynamic motion vector referencing for video coding |
US10284869B2 (en) | 2017-09-28 | 2019-05-07 | Google Llc | Constrained motion field estimation for hardware efficiency |
US11917128B2 (en) * | 2017-08-22 | 2024-02-27 | Google Llc | Motion field estimation based on motion trajectory derivation |
US10659788B2 (en) | 2017-11-20 | 2020-05-19 | Google Llc | Block-based optical flow estimation for motion compensated prediction in video coding |
US10628958B2 (en) * | 2017-09-05 | 2020-04-21 | Htc Corporation | Frame rendering apparatus, method and non-transitory computer readable storage medium |
JP2019050451A (en) * | 2017-09-07 | 2019-03-28 | キヤノン株式会社 | Image processing apparatus and method for controlling the same, and program, and image processing system |
US10523947B2 (en) * | 2017-09-29 | 2019-12-31 | Ati Technologies Ulc | Server-based encoding of adjustable frame rate content |
US10594901B2 (en) | 2017-11-17 | 2020-03-17 | Ati Technologies Ulc | Game engine application direct to video encoder rendering |
US11290515B2 (en) | 2017-12-07 | 2022-03-29 | Advanced Micro Devices, Inc. | Real-time and low latency packetization protocol for live compressed video data |
US10977809B2 (en) | 2017-12-11 | 2021-04-13 | Dolby Laboratories Licensing Corporation | Detecting motion dragging artifacts for dynamic adjustment of frame rate conversion settings |
US10708597B2 (en) | 2018-02-01 | 2020-07-07 | Microsoft Technology Licensing, Llc | Techniques for extrapolating image frames |
EP3547684B1 (en) * | 2018-03-28 | 2020-02-26 | Axis AB | Method, device and system for encoding a sequence of frames in a video stream |
US10516812B2 (en) | 2018-04-02 | 2019-12-24 | Intel Corporation | Devices and methods for selective display frame fetch |
US10798335B2 (en) * | 2018-05-14 | 2020-10-06 | Adobe Inc. | Converting variable frame rate video to fixed frame rate video |
US11153619B2 (en) * | 2018-07-02 | 2021-10-19 | International Business Machines Corporation | Cognitively derived multimedia streaming preferences |
US11997275B2 (en) * | 2018-08-27 | 2024-05-28 | Ati Technologies Ulc | Benefit-based bitrate distribution for video encoding |
US12050461B2 (en) * | 2018-09-07 | 2024-07-30 | DoorDash, Inc. | Video system with frame synthesis |
US11100604B2 (en) | 2019-01-31 | 2021-08-24 | Advanced Micro Devices, Inc. | Multiple application cooperative frame-based GPU scheduling |
US11178415B2 (en) * | 2019-03-12 | 2021-11-16 | Tencent America LLC | Signaling of CU based interpolation filter selection |
EP3871421A4 (en) * | 2019-03-22 | 2022-01-26 | Tencent America Llc | Method and apparatus for interframe point cloud attribute coding |
US11418797B2 (en) | 2019-03-28 | 2022-08-16 | Advanced Micro Devices, Inc. | Multi-plane transmission |
US10998982B2 (en) | 2019-04-18 | 2021-05-04 | Microsoft Technology Licensing, Llc | Transmitter for throughput increases for optical communications |
US10951342B2 (en) | 2019-04-18 | 2021-03-16 | Microsoft Technology Licensing, Llc | Throughput increases for optical communications |
US10892847B2 (en) | 2019-04-18 | 2021-01-12 | Microsoft Technology Licensing, Llc | Blind detection model optimization |
US10742326B1 (en) * | 2019-04-18 | 2020-08-11 | Microsoft Technology Licensing, Llc | Power-based encoding of data to be transmitted over an optical communication path |
US10911152B2 (en) | 2019-04-18 | 2021-02-02 | Microsoft Technology Licensing, Llc | Power-based decoding of data received over an optical communication path |
US10897315B2 (en) | 2019-04-18 | 2021-01-19 | Microsoft Technology Licensing, Llc | Power-based decoding of data received over an optical communication path |
US10756817B1 (en) | 2019-04-18 | 2020-08-25 | Microsoft Technology Licensing, Llc | Power switching for systems implementing throughput improvements for optical communications |
US10873393B2 (en) | 2019-04-18 | 2020-12-22 | Microsoft Technology Licensing, Llc | Receiver training for throughput increases in optical communications |
US10862591B1 (en) | 2019-04-18 | 2020-12-08 | Microsoft Technology Licensing, Llc | Unequal decision regions for throughput increases for optical communications |
US10938485B2 (en) | 2019-04-18 | 2021-03-02 | Microsoft Technology Licensing, Llc | Error control coding with dynamic ranges |
US11018776B2 (en) | 2019-04-18 | 2021-05-25 | Microsoft Technology Licensing, Llc | Power-based decoding of data received over an optical communication path |
US10686530B1 (en) | 2019-04-18 | 2020-06-16 | Microsoft Technology Licensing, Llc | Power-based encoding of data to be transmitted over an optical communication path |
US10873392B2 (en) | 2019-04-18 | 2020-12-22 | Microsoft Technology Licensing, Llc | Throughput increases for optical communications |
US10742325B1 (en) | 2019-04-18 | 2020-08-11 | Microsoft Technology Licensing, Llc | Power-based encoding of data to be transmitted over an optical communication path |
US10911155B2 (en) | 2019-04-18 | 2021-02-02 | Microsoft Technology Licensing, Llc | System for throughput increases for optical communications |
US11031961B2 (en) | 2019-07-16 | 2021-06-08 | Microsoft Technology Licensing, Llc | Smart symbol changes for optimization of communications using error correction |
US10911284B1 (en) | 2019-07-16 | 2021-02-02 | Microsoft Technology Licensing, Llc | Intelligent optimization of communication systems utilizing error correction |
US11172455B2 (en) | 2019-07-16 | 2021-11-09 | Microsoft Technology Licensing, Llc | Peak to average power output reduction of RF systems utilizing error correction |
US11075656B2 (en) | 2019-07-16 | 2021-07-27 | Microsoft Technology Licensing, Llc | Bit error reduction of communication systems using error correction |
US11086719B2 (en) | 2019-07-16 | 2021-08-10 | Microsoft Technology Licensing, Llc | Use of error correction codes to prevent errors in neighboring storage |
US11044044B2 (en) | 2019-07-16 | 2021-06-22 | Microsoft Technology Licensing, Llc | Peak to average power ratio reduction of optical systems utilizing error correction |
US11063696B2 (en) | 2019-07-16 | 2021-07-13 | Microsoft Technology Licensing, Llc | Increasing average power levels to reduce peak-to-average power levels using error correction codes |
US11303847B2 (en) | 2019-07-17 | 2022-04-12 | Home Box Office, Inc. | Video frame pulldown based on frame analysis |
US10911141B1 (en) * | 2019-07-30 | 2021-02-02 | Microsoft Technology Licensing, Llc | Dynamically selecting a channel model for optical communications |
US10885343B1 (en) * | 2019-08-30 | 2021-01-05 | Amazon Technologies, Inc. | Repairing missing frames in recorded video with machine learning |
IL271774A (en) | 2019-12-31 | 2021-06-30 | Bottega Studios Ltd | System and method for dynamic images virtualisation |
CN111770332B (en) * | 2020-06-04 | 2022-08-09 | Oppo广东移动通信有限公司 | Frame insertion processing method, frame insertion processing device, storage medium and electronic equipment |
CN113949930B (en) * | 2020-07-17 | 2024-03-12 | 晶晨半导体(上海)股份有限公司 | Method for selecting reference frame, electronic device and storage medium |
US20220038654A1 (en) * | 2020-07-30 | 2022-02-03 | Nvidia Corporation | Techniques to generate interpolated video frames |
TWI740655B (en) * | 2020-09-21 | 2021-09-21 | 友達光電股份有限公司 | Driving method of display device |
US11568527B2 (en) * | 2020-09-24 | 2023-01-31 | Ati Technologies Ulc | Video quality assessment using aggregated quality values |
US11488328B2 (en) | 2020-09-25 | 2022-11-01 | Advanced Micro Devices, Inc. | Automatic data format detection |
EP3989530A1 (en) | 2020-10-23 | 2022-04-27 | Axis AB | Generating substitute image frames based on camera motion |
US11636682B2 (en) | 2020-11-05 | 2023-04-25 | International Business Machines Corporation | Embedding contextual information in an image to assist understanding |
US20230407239A1 (en) * | 2020-11-13 | 2023-12-21 | Teewinot Life Sciences Corporation | Tetrahydrocannabinolic acid (thca) synthase variants, and manufacture and use thereof |
US20220224924A1 (en) * | 2021-01-11 | 2022-07-14 | Tencent America LLC | Hierarchical structure for neural network based tools in video coding |
US11558621B2 (en) * | 2021-03-31 | 2023-01-17 | Qualcomm Incorporated | Selective motion-compensated frame interpolation |
US20220360814A1 (en) * | 2021-05-06 | 2022-11-10 | Apple Inc. | Enhanced motion vector prediction |
EP4304167A1 (en) | 2021-06-14 | 2024-01-10 | Samsung Electronics Co., Ltd. | Electronic device carrying out video call by using frc, and operation method for electronic device |
US11755272B2 (en) | 2021-12-10 | 2023-09-12 | Vmware, Inc. | Method and system for using enhancement techniques to improve remote display while reducing hardware consumption at a remote desktop |
WO2023174546A1 (en) * | 2022-03-17 | 2023-09-21 | Dream Chip Technologies Gmbh | Method and image processor unit for processing image data |
US12010450B2 (en) * | 2022-03-21 | 2024-06-11 | Novatek Microelectronics Corp. | On-screen display (OSD) image processing method |
US12008729B2 (en) * | 2022-03-21 | 2024-06-11 | Novatek Microelectronics Corp. | On-screen display (OSD) image processing method |
Citations (2)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
EP1422928A2 (en) * | 2002-11-22 | 2004-05-26 | Matsushita Electric Industrial Co., Ltd. | Motion compensated interpolation of digital video signals |
CN101023677A (en) * | 2004-07-20 | 2007-08-22 | 高通股份有限公司 | Method and apparatus for frame rate up conversion with multiple reference frames and variable block sizes |
Family Cites Families (53)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
JP2518239B2 (en) | 1986-12-26 | 1996-07-24 | キヤノン株式会社 | Digital image data processor |
US5642170A (en) | 1993-10-11 | 1997-06-24 | Thomson Consumer Electronics, S.A. | Method and apparatus for motion compensated interpolation of intermediate fields or frames |
GB2305569B (en) | 1995-09-21 | 1999-07-21 | Innovision Res Ltd | Motion compensated interpolation |
US6075918A (en) * | 1995-10-26 | 2000-06-13 | Advanced Micro Devices, Inc. | Generation of an intermediate video bitstream from a compressed video bitstream to enhance playback performance |
WO1997046020A2 (en) * | 1996-05-24 | 1997-12-04 | Philips Electronics N.V. | Motion vector processing |
FR2750558B1 (en) * | 1996-06-28 | 1998-08-28 | Thomson Multimedia Sa | FRAME INTERPOLATION METHOD FOR FILM MODE COMPATIBILITY |
JP3609571B2 (en) | 1997-03-12 | 2005-01-12 | 株式会社東芝 | Image playback device |
US6192079B1 (en) * | 1998-05-07 | 2001-02-20 | Intel Corporation | Method and apparatus for increasing video frame rate |
US6594313B1 (en) * | 1998-12-23 | 2003-07-15 | Intel Corporation | Increased video playback framerate in low bit-rate video applications |
US6760378B1 (en) | 1999-06-30 | 2004-07-06 | Realnetworks, Inc. | System and method for generating video frames and correcting motion |
JP2001352544A (en) | 2000-06-08 | 2001-12-21 | Matsushita Graphic Communication Systems Inc | Image coder and image coding method |
JP2004507957A (en) | 2000-08-29 | 2004-03-11 | コーニンクレッカ フィリップス エレクトロニクス エヌ ヴィ | Algorithm execution method and scalable programmable processing device |
FR2820927B1 (en) | 2001-02-15 | 2003-04-11 | Thomson Multimedia Sa | METHOD AND DEVICE FOR DETECTING THE RELIABILITY OF A FIELD OF MOVING VECTORS |
KR20030024839A (en) * | 2001-06-08 | 2003-03-26 | 코닌클리케 필립스 일렉트로닉스 엔.브이. | Method and system for displaying a video frame |
US7088774B1 (en) * | 2002-05-29 | 2006-08-08 | Microsoft Corporation | Media stream synchronization |
GB2394136B (en) | 2002-09-12 | 2006-02-15 | Snell & Wilcox Ltd | Improved video motion processing |
CN100438609C (en) | 2002-10-22 | 2008-11-26 | 皇家飞利浦电子股份有限公司 | Image processing unit with fall-back |
JP2005006275A (en) | 2002-11-22 | 2005-01-06 | Matsushita Electric Ind Co Ltd | Device, method, and program for generating interpolation frame |
WO2004064373A2 (en) | 2003-01-09 | 2004-07-29 | The Regents Of The University Of California | Video encoding methods and devices |
JP4220284B2 (en) | 2003-03-28 | 2009-02-04 | 株式会社東芝 | Frame interpolation method, apparatus, and image display system using the same |
US20040218669A1 (en) | 2003-04-30 | 2004-11-04 | Nokia Corporation | Picture coding method |
ATE349860T1 (en) * | 2003-05-02 | 2007-01-15 | Koninkl Philips Electronics Nv | ADVANCED MOTION VECTOR INTERPOLATION TO REDUCE VIDEO ARTIFACTS |
US7558320B2 (en) * | 2003-06-13 | 2009-07-07 | Microsoft Corporation | Quality control in frame interpolation with motion analysis |
US20050100235A1 (en) * | 2003-11-07 | 2005-05-12 | Hao-Song Kong | System and method for classifying and filtering pixels |
JP2005223454A (en) | 2004-02-03 | 2005-08-18 | Nec Access Technica Ltd | Mobile phone with TV function |
US20070195883A1 (en) | 2004-03-19 | 2007-08-23 | Koninklijke Philips Electronics, N.V. | Media signal processing method, corresponding system, and application thereof in a resource-scalable motion estimator |
WO2006007527A2 (en) | 2004-07-01 | 2006-01-19 | Qualcomm Incorporated | Method and apparatus for using frame rate up conversion techniques in scalable video coding |
CN1717056A (en) * | 2004-07-02 | 2006-01-04 | Mitsubishi Electric Corporation | Intra-frame prediction of high-pass temporal filter frames for wavelet video coding |
RU2377737C2 (en) * | 2004-07-20 | 2009-12-27 | Qualcomm Incorporated | Method and apparatus for encoder assisted frame rate up conversion (EA-FRUC) for video compression |
US20090317420A1 (en) | 2004-07-29 | 2009-12-24 | Chiron Corporation | Immunogenic compositions for gram positive bacteria such as Streptococcus agalactiae |
US8861601B2 (en) | 2004-08-18 | 2014-10-14 | Qualcomm Incorporated | Encoder-assisted adaptive video frame interpolation |
JP4515870B2 (en) | 2004-09-24 | 2010-08-04 | Panasonic Corporation | Signal processing apparatus and video system |
JP2006217569A (en) | 2005-01-07 | 2006-08-17 | Toshiba Corp | Apparatus, method, and program for image coded string conversion |
TWI274509B (en) | 2005-02-22 | 2007-02-21 | Sunplus Technology Co Ltd | Method and system for dynamically adjusting motion estimation |
US20060233253A1 (en) * | 2005-03-10 | 2006-10-19 | Qualcomm Incorporated | Interpolated frame deblocking operation for frame rate up conversion applications |
JP4398925B2 (en) * | 2005-03-31 | 2010-01-13 | Toshiba Corporation | Interpolation frame generation method, interpolation frame generation apparatus, and interpolation frame generation program |
US7876833B2 (en) | 2005-04-11 | 2011-01-25 | Sharp Laboratories Of America, Inc. | Method and apparatus for adaptive up-scaling for spatially scalable coding |
US9258519B2 (en) | 2005-09-27 | 2016-02-09 | Qualcomm Incorporated | Encoder assisted frame rate up conversion using various motion models |
US20070074251A1 (en) | 2005-09-27 | 2007-03-29 | Oguz Seyfullah H | Method and apparatus for using random field models to improve picture and video compression and frame rate up conversion |
TW200727705A (en) | 2005-09-27 | 2007-07-16 | Qualcomm Inc | Method and apparatus for using random field models to improve picture and video compression and frame rate up conversion |
JP4468884B2 (en) * | 2005-12-09 | 2010-05-26 | Lintec Corporation | Tape sticking device, mounting device, and mounting method |
JP4303748B2 (en) * | 2006-02-28 | 2009-07-29 | Sharp Corporation | Image display apparatus and method, image processing apparatus and method |
EP1843587A1 (en) * | 2006-04-05 | 2007-10-10 | STMicroelectronics S.r.l. | Method for the frame-rate conversion of a digital video signal and related apparatus |
US8582660B2 (en) | 2006-04-13 | 2013-11-12 | Qualcomm Incorporated | Selective video frame rate upconversion |
JP2007311843A (en) | 2006-05-16 | 2007-11-29 | Sony Corp | Noninterlace processing apparatus, display apparatus, and program |
JP4178480B2 (en) * | 2006-06-14 | 2008-11-12 | Sony Corporation | Image processing apparatus, image processing method, imaging apparatus, and imaging method |
US20080025390A1 (en) | 2006-07-25 | 2008-01-31 | Fang Shi | Adaptive video frame interpolation |
US20080055311A1 (en) | 2006-08-31 | 2008-03-06 | Ati Technologies Inc. | Portable device with run-time based rendering quality control and method thereof |
JP2008067205A (en) * | 2006-09-08 | 2008-03-21 | Toshiba Corp | Frame interpolation circuit, frame interpolation method, and display device |
US9883202B2 (en) | 2006-10-06 | 2018-01-30 | Nxp Usa, Inc. | Scaling video processing complexity based on power savings factor |
JP4687658B2 (en) | 2007-01-29 | 2011-05-25 | Denso Corporation | Image recognition device |
US8605786B2 (en) * | 2007-09-04 | 2013-12-10 | The Regents Of The University Of California | Hierarchical motion vector processing method, software and devices |
US8660175B2 (en) * | 2007-12-10 | 2014-02-25 | Qualcomm Incorporated | Selective display of interpolated or extrapolated video units |
- 2008-04-29 US US12/111,782 patent/US8660175B2/en not_active Expired - Fee Related
- 2008-04-29 US US12/111,718 patent/US9426414B2/en not_active Expired - Fee Related
- 2008-04-29 US US12/111,738 patent/US8953685B2/en active Active
- 2008-12-10 KR KR1020107015389A patent/KR101136293B1/en not_active IP Right Cessation
- 2008-12-10 KR KR1020107015390A patent/KR101149205B1/en not_active IP Right Cessation
- 2008-12-10 CN CN200880125045.4A patent/CN101919249B/en active Active
- 2008-12-10 TW TW097148041A patent/TW200950528A/en unknown
- 2008-12-10 EP EP08860276A patent/EP2232878B1/en not_active Not-in-force
- 2008-12-10 EP EP08859971.7A patent/EP2232871B1/en not_active Not-in-force
- 2008-12-10 WO PCT/US2008/086279 patent/WO2009076468A2/en active Application Filing
- 2008-12-10 TW TW097148055A patent/TW200943967A/en unknown
- 2008-12-10 JP JP2010538138A patent/JP5502747B2/en not_active Expired - Fee Related
- 2008-12-10 TW TW097148036A patent/TW200943974A/en unknown
- 2008-12-10 CN CN2008801250420A patent/CN101919255B/en active Active
- 2008-12-10 JP JP2010538139A patent/JP5437265B2/en not_active Expired - Fee Related
- 2008-12-10 CN CN2008801265229A patent/CN101939992B/en active Active
- 2008-12-10 WO PCT/US2008/086277 patent/WO2009076466A1/en active Application Filing
- 2008-12-10 KR KR1020107015387A patent/KR101178553B1/en active IP Right Grant
- 2008-12-10 JP JP2010538137A patent/JP5502746B2/en not_active Expired - Fee Related
- 2008-12-10 EP EP08859249.8A patent/EP2232870B1/en not_active Not-in-force
- 2008-12-10 KR KR1020127019456A patent/KR101268990B1/en not_active IP Right Cessation
- 2008-12-10 WO PCT/US2008/086282 patent/WO2009076471A1/en active Application Filing
Patent Citations (2)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
EP1422928A2 (en) * | 2002-11-22 | 2004-05-26 | Matsushita Electric Industrial Co., Ltd. | Motion compensated interpolation of digital video signals |
CN101023677A (en) * | 2004-07-20 | 2007-08-22 | 高通股份有限公司 | Method and apparatus for frame rate up conversion with multiple reference frames and variable block sizes |
Non-Patent Citations (2)
Title |
---|
A. Eden. No-Reference Estimation of the Coding PSNR for H.264-Coded Sequences. IEEE Transactions on Consumer Electronics, 2007, Vol. 53, No. 2, pp. 667-674. * |
Thomas Wiegand et al. Rate-Constrained Coder Control and Comparison of Video Coding Standards. IEEE Transactions on Circuits and Systems for Video Technology, 2003, Vol. 13, No. 7, pp. 688-703. * |
Also Published As
Similar Documents
Publication | Publication Date | Title |
---|---|---|
CN101919255B (en) | Reference selection for video interpolation or extrapolation | |
US8374246B2 (en) | Method and apparatus for encoder assisted-frame rate up conversion (EA-FRUC) for video compression | |
US9197912B2 (en) | Content classification for multimedia processing | |
CN101444093B (en) | Selective video frame rate upconversion |
US20110090960A1 (en) | Rate Control Model Adaptation Based on Slice Dependencies for Video Coding | |
CN103124353A (en) | Motion prediction method and video coding method | |
CN110996102A (en) | Video coding method and device for suppressing intra-frame block breathing artifacts in P/B frames |
US20240040127A1 (en) | Video encoding method and apparatus and electronic device | |
JP2007124580A (en) | Moving picture encoding program, program storage medium and encoder | |
EP1921866A2 (en) | Content classification for multimedia processing | |
Rezaei et al. | Bit allocation for variable bitrate video |
Legal Events
Date | Code | Title | Description |
---|---|---|---|
C06 | Publication | ||
PB01 | Publication | ||
C10 | Entry into substantive examination | ||
SE01 | Entry into force of request for substantive examination | ||
C14 | Grant of patent or utility model | ||
GR01 | Patent grant |