WO2005034517A1 - Adaptive reference picture generation - Google Patents
Adaptive reference picture generation
- Publication number
- WO2005034517A1 (PCT/US2004/028650, US2004028650W)
- Authority
- WO
- WIPO (PCT)
- Prior art keywords
- picture
- processing unit
- filter
- video encoder
- motion
- Prior art date
Links
Classifications
-
- H—ELECTRICITY
- H04—ELECTRIC COMMUNICATION TECHNIQUE
- H04N—PICTORIAL COMMUNICATION, e.g. TELEVISION
- H04N19/00—Methods or arrangements for coding, decoding, compressing or decompressing digital video signals
- H04N19/85—Methods or arrangements for coding, decoding, compressing or decompressing digital video signals using pre-processing or post-processing specially adapted for video compression
-
- H—ELECTRICITY
- H04—ELECTRIC COMMUNICATION TECHNIQUE
- H04N—PICTORIAL COMMUNICATION, e.g. TELEVISION
- H04N19/00—Methods or arrangements for coding, decoding, compressing or decompressing digital video signals
- H04N19/50—Methods or arrangements for coding, decoding, compressing or decompressing digital video signals using predictive coding
- H04N19/503—Methods or arrangements for coding, decoding, compressing or decompressing digital video signals using predictive coding involving temporal prediction
- H04N19/51—Motion estimation or motion compensation
-
- H—ELECTRICITY
- H04—ELECTRIC COMMUNICATION TECHNIQUE
- H04N—PICTORIAL COMMUNICATION, e.g. TELEVISION
- H04N19/00—Methods or arrangements for coding, decoding, compressing or decompressing digital video signals
- H04N19/10—Methods or arrangements for coding, decoding, compressing or decompressing digital video signals using adaptive coding
- H04N19/102—Methods or arrangements for coding, decoding, compressing or decompressing digital video signals using adaptive coding characterised by the element, parameter or selection affected or controlled by the adaptive coding
- H04N19/103—Selection of coding mode or of prediction mode
- H04N19/105—Selection of the reference unit for prediction within a chosen coding or prediction mode, e.g. adaptive choice of position and number of pixels used for prediction
-
- H—ELECTRICITY
- H04—ELECTRIC COMMUNICATION TECHNIQUE
- H04N—PICTORIAL COMMUNICATION, e.g. TELEVISION
- H04N19/00—Methods or arrangements for coding, decoding, compressing or decompressing digital video signals
- H04N19/46—Embedding additional information in the video signal during the compression process
-
- H—ELECTRICITY
- H04—ELECTRIC COMMUNICATION TECHNIQUE
- H04N—PICTORIAL COMMUNICATION, e.g. TELEVISION
- H04N19/00—Methods or arrangements for coding, decoding, compressing or decompressing digital video signals
- H04N19/60—Methods or arrangements for coding, decoding, compressing or decompressing digital video signals using transform coding
- H04N19/61—Methods or arrangements for coding, decoding, compressing or decompressing digital video signals using transform coding in combination with predictive coding
-
- H—ELECTRICITY
- H04—ELECTRIC COMMUNICATION TECHNIQUE
- H04N—PICTORIAL COMMUNICATION, e.g. TELEVISION
- H04N19/00—Methods or arrangements for coding, decoding, compressing or decompressing digital video signals
- H04N19/80—Details of filtering operations specially adapted for video compression, e.g. for pixel interpolation
- H04N19/82—Details of filtering operations specially adapted for video compression, e.g. for pixel interpolation involving filtering within a prediction loop
Definitions
- the present invention is directed towards video encoders and decoders (CODECs), and more particularly towards the use of reference pictures as predictors in CODECs.
- encoders and decoders generally rely on intra and inter coding in order to achieve compression.
- for intra coding, spatial prediction methods are used, while for inter coding, compression is achieved by exploiting the temporal correlation that may exist between pictures. More specifically, previously encoded/decoded pictures are used as references for future pictures, while motion estimation and compensation are employed in order to compensate for any motion activity between these pictures.
- More advanced CODECs such as H.264 also consider lighting variations, such as during a fade in/out, in order to generate a more accurate prediction, when necessary.
- deblocking methods may also be used in an effort to reduce blocking artifacts created through the prediction and quantization processes.
- the typical methods related to inter coding fail to consider some additional properties and features that may considerably affect the entire prediction process.
- a picture may contain several types of noise, such as film grain or speckle noise. This kind of noise tends to be completely uncorrelated from one picture to the next, which implies that any remaining noise within a reference picture would most likely need to be compensated during the encoding process. Although some of this noise may be removed through the quantization process, and possibly through non-normative thresholding introduced in the encoder, it can still have a considerable negative impact on coding efficiency.
- the invention is a video encoder, decoder, and corresponding methods for encoding (and, by corollary, decoding) an input picture or image block using a prediction from a reference-only picture.
- An exemplary encoder includes a picture buffer for storing a previously coded picture, and a reference processing unit in signal communication with the picture buffer for generating the reference-only picture from a previously coded picture.
- An exemplary encoding method includes receiving a substantially uncompressed image block, filtering a previously coded picture to create an adaptive reference, motion compensating the adaptive reference, subtracting the motion compensated adaptive reference from the substantially uncompressed image block, and encoding the difference between the substantially uncompressed image block and the motion compensated adaptive reference.
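- As an illustrative sketch only (Python/NumPy; the helper names and the simple averaging filter are hypothetical stand-ins, not the disclosed implementation), the exemplary encoding method might be expressed as follows, assuming integer-pel motion:

```python
import numpy as np

def simple_filter(picture):
    """Hypothetical 3x3 averaging filter standing in for the adaptive
    reference filter (a median or Wiener filter could be substituted)."""
    padded = np.pad(picture, 1, mode="edge").astype(np.float32)
    h, w = picture.shape
    out = np.zeros((h, w), dtype=np.float32)
    for dy in range(3):
        for dx in range(3):
            out += padded[dy:dy + h, dx:dx + w]
    return (out / 9.0).round().astype(picture.dtype)

def encode_block(input_block, prev_coded_picture, mv, top_left):
    """Filter the stored picture into an adaptive reference, motion
    compensate it (integer-pel only here), and return the residual that
    would then be transformed, quantized, and entropy coded."""
    adaptive_ref = simple_filter(prev_coded_picture)
    y, x = top_left
    dy, dx = mv
    h, w = input_block.shape
    prediction = adaptive_ref[y + dy:y + dy + h, x + dx:x + dx + w]
    residual = input_block.astype(np.int32) - prediction.astype(np.int32)
    return residual, prediction
```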
- Figure 1 shows a block diagram of a video encoder for adaptive reference regeneration according to an embodiment of the present invention
- Figure 2 shows a block diagram of a video decoder for adaptive reference regeneration according to an embodiment of the present invention
- Figure 3 shows a block diagram of a video encoder for adaptive reference regeneration according to another embodiment of the present invention
- Figure 4 shows a block diagram of a video decoder for adaptive reference regeneration according to another embodiment of the present invention
- Figure 5 shows a pictorial representation of reference generation by pixel projection according to an embodiment of the present invention
- Figure 6 shows a pictorial representation of reference generation by motion projection according to an embodiment of the present invention
- Figure 7 shows a flow diagram for encoding with adaptive reference generation in accordance with principles of the present invention
- Figure 8 shows a flow diagram for decoding with adaptive reference generation in accordance with principles of the present invention.
- a filter is applied to a previously coded picture before using it for reference when coding later pictures, thus allowing a further improvement in encoding efficiency. It can be used to retain noise information during the encoding process, when desired, to preserve the artistic content of display pictures.
- the instant description illustrates the principles of several embodiments of the invention. It will thus be appreciated that those skilled in the art will be able to devise various arrangements that, although not explicitly described or shown as embodiments of the invention herein, embody the principles of the invention and are included within its spirit and scope.
- any flow charts, flow diagrams, state transition diagrams, pseudocode, and the like represent various processes which may be substantially represented in computer readable media and so executed by a computer or processor, whether or not such computer or processor is explicitly shown.
- the functions of the various elements shown in the figures may be provided through the use of dedicated hardware as well as hardware capable of executing software in association with appropriate software.
- the functions may be provided by a single dedicated processor, by a single shared processor, or by a plurality of individual processors, some of which may be shared.
- the terms "processor" or "controller" should not be construed to refer exclusively to hardware capable of executing software, and may implicitly include, without limitation, digital signal processor ("DSP") hardware, read-only memory ("ROM") for storing software, random access memory ("RAM"), and non-volatile storage. Other hardware, conventional and/or custom, may also be included.
- any switches shown in the figures are conceptual only. Their function may be carried out through the operation of program logic, through dedicated logic, through the interaction of program control and dedicated logic, or even manually, the particular technique being selectable by the implementer as more specifically understood from the context.
- any element expressed as a means for performing a specified function is intended to encompass any way of performing that function including, for example, a) a combination of circuit elements that performs that function or b) software in any form, including, therefore, firmware, microcode or the like, combined with appropriate circuitry for executing that software to perform the function.
- the invention as defined by such claims resides in the fact that the functionalities provided by the various recited means are combined and brought together in the manner which the claims call for. Applicant thus regards any means that can provide those functionalities as equivalent to those shown herein.
- a video encoder is indicated generally by the reference numeral 100.
- An input to the encoder 100 is connected in signal communication with each of a spatial filter 102 and input picture stores 104.
- the output of the spatial filter 102 is switchably connected in signal communication with a first position of a switchable non-inverting input of a summing junction 110.
- the input picture stores 104 is connected in signal communication with a noise reducer 106, which is switchably connected in signal communication with a second position of the switchable non-inverting input of the summing junction 110.
- the output of the summing junction 110 is connected in signal communication with a block transformer 120.
- the transformer 120 is connected in signal communication with a quantizer 130.
- the output of the quantizer 130 is connected in signal communication with an entropy or variable length coder ("VLC") 140, where the output of the VLC 140 is an externally available output of the encoder 100.
- the output of the quantizer 130 is further connected in signal communication with an inverse quantizer 150.
- the inverse quantizer 150 is connected in signal communication with an inverse block transformer 160, which, in turn, is connected in signal communication with an in-loop filter 172.
- the in-loop filter 172 is connected in signal communication with reference picture stores 175.
- a first output of the reference picture stores 175 is connected in signal communication with a first input of a motion estimator and mode decision unit 180.
- the output of the spatial filter 102 is further connected in signal communication with a second input of the motion estimator 180.
- a first output of the motion estimator 180 is connected in signal communication with a first input of a motion compensator 190.
- a second output of the motion estimator 180 is connected in signal communication with a second input of the noise reducer 106.
- a second output of the reference picture stores 175 is connected in signal communication with a second input of the motion compensator 190.
- the output of the motion compensator 190 is connected in signal communication with an inverting input of the summing junction 110.
- a transmitter 201 is in signal communication with a receiver 204.
- the transmitter 201 includes a film grain removal unit 211 and a film grain modeling unit 212, each for receiving an input video signal.
- the film grain removal unit 211 is in signal communication with each of the film grain modeling unit 212 and a video encoder 213.
- the video encoder 213 is in signal communication with a video decoder 202 of the receiver 204 for transmitting a compressed bitstream.
- the film grain modeling unit 212 is in signal communication with a film grain simulation unit 278 of the receiver 204 for transmitting a Supplemental Enhancement Information ("SEI") message.
- the video decoder 202 is in signal communication with each of the film grain simulation unit 278 and a first non- inverting input of a summing unit 282.
- the film grain simulation unit 278, in turn, is in signal communication with a second non-inverting input of the summing unit 282.
- the summing unit 282 provides a signal for display.
- the receiver 204 includes a variable length decoder ("VLD") 210 for receiving a bitstream, connected in signal communication with an inverse quantizer 220.
- the inverse quantizer 220 is connected in signal communication with an inverse block transformer 230.
- the inverse block transformer 230 is connected in signal communication with a first input terminal of an adder or summing junction 240.
- the output of the summing junction 240 is connected in signal communication with a loop filter 272.
- the loop filter 272 is connected in signal communication with a frame buffer 274.
- the frame buffer 274 is connected in signal communication with each of a motion compensator 260, a film grain simulation unit 278, and a first non-inverting input of a summing junction 282.
- the VLD 210 output is also coupled as a second input to the motion compensator 260.
- the output of the motion compensator 260 is connected in signal communication with a second input terminal of the summing junction 240.
- the film grain simulation unit 278 has a second input for receiving noise parameters, such as an SEI message, for example.
- the output of the film grain simulation unit 278 is connected in signal communication with a second non-inverting input of the summing junction 282.
- the output of the summing junction 282 provides the output from the receiver 204.
- a video encoder is indicated generally by the reference numeral 300.
- An input to the encoder 300 is connected in signal communication with a non-inverting input of a summing junction 310.
- the output of the summing junction 310 is connected in signal communication with a block transformer 320.
- the transformer 320 is connected in signal communication with a quantizer 330.
- the output of the quantizer 330 is connected in signal communication with a variable length coder ("VLC") 340, where the output of the VLC 340 is an externally available bitstream output of the encoder 300.
- the output of the quantizer 330 is further connected in signal communication with an inverse quantizer 350.
- the inverse quantizer 350 is connected in signal communication with an inverse block transformer 360, which, in turn, is connected in signal communication with a first non-inverting input of a summing junction 370.
- the output of the summing junction 370 is connected in signal communication with a loop filter 372, and optionally connected in signal communication with a reference processing unit 376.
- the loop filter 372 is connected in signal communication with a frame buffer 374.
- the frame buffer 374 is connected in signal communication with the reference processing unit 376, and optionally connected in signal communication with a motion compensation unit 390.
- the reference processing unit 376 is connected in signal communication with each of the VLC 340, the frame buffer 374, a motion estimation unit 380, and the motion compensation unit 390.
- the input to the encoder 300 is further connected in signal communication with a second input of the motion estimator 380.
- Outputs of the motion estimator 380 are connected in signal communication with a third input of the motion compensator 390, and a third input of the VLC 340.
- the output of the motion compensator 390 is connected in signal communication with an inverting input of the summing junction 310.
- a video decoder is indicated generally by the reference numeral 400.
- the video decoder 400 includes a variable length decoder ("VLD") 410 for receiving a bitstream, connected in signal communication with an inverse quantizer 420.
- the inverse quantizer 420 is connected in signal communication with an inverse transformer 430.
- the inverse transformer 430 is connected in signal communication with a first input terminal of an adder or summing junction 440.
- the output of the summing junction 440 is connected in signal communication with a loop filter 472 (and optionally connected in signal communication with a reference processing unit 476 [not shown] instead of the loop filter 472, effectively bypassing the loop filter).
- the loop filter 472 is connected in signal communication with a frame buffer 474.
- a first output of the frame buffer 474 is connected in signal communication with a first input of the reference processing unit 476.
- the VLD 410 is connected in signal communication with a second input of the reference processing unit 476.
- a first output of the reference processing unit 476 is connected in signal communication with a motion compensator 460, which is connected in signal communication with a second non-inverting input terminal of the summing junction 440, and optionally connected in signal communication with an input of the reference processing unit 476 [not shown].
- An output of the video decoder 400 is switchably connected in signal communication with each of a second output of the frame buffer 474 and a second output of the reference processing unit 476.
- reference generation by pixel projection is indicated generally by the reference numeral 500, where each pixel is projected to a new position according to its previous motion vector.
- reference generation by motion projection is indicated generally by the reference numeral 600, where each block in the current reference is assumed to have the same MV as its co-located block.
- a process for encoding with adaptive reference generation in accordance with principles of the present invention is indicated generally by the reference numeral 700.
- the process includes a start block 710 that passes control to an input block 712.
- the input block 712 receives substantially uncompressed image block data, and passes control to a function block 714, which applies a filter to a stored previously coded picture to create an adaptive reference.
- the function block 714 passes control to a function block 718, which motion compensates the adaptive reference in correspondence with the motion vectors, and passes control to a function block 722.
- the function block 722 subtracts the motion compensated adaptive reference from the substantially uncompressed image block, and passes control to a function block 724.
- the function block 724 encodes a signal with the difference between the substantially uncompressed image block and the motion compensated adaptive reference, and passes control to a function block 726.
- the function block 726 adds a decoded difference to the motion compensated adaptive reference to form a decoded picture, and passes control to a function block 728.
- the function block 728 stores the decoded picture in a picture buffer, and passes control to an end block 730.
- a process for decoding with adaptive reference generation in accordance with principles of the present invention is indicated generally by the reference numeral 800.
- the process includes a start block 810 that passes control to an input block 812.
- the input block 812 receives coded picture data, and passes control to a function block 814.
- the function block 814 applies a filter to a stored previously coded picture to create an adaptive reference, and passes control to a function block 816.
- the function block 816 motion compensates the adaptive reference, and passes control to a function block 818.
- the function block 818 decodes the coded difference, and passes control to a function block 824.
- the function block 824 adds the motion compensated adaptive reference to the decoded difference to form a decoded picture, and passes control to a function block 826.
- the function block 826 stores and displays the decoded picture, and passes control to an end block 828.
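- A matching decoder-side sketch (again illustrative Python/NumPy, not the disclosed implementation) adds the decoded residual to the motion compensated adaptive reference; the filter must be identical to the encoder's to avoid drift:

```python
import numpy as np

def decode_block(residual, prev_coded_picture, mv, top_left, filter_fn):
    """Apply the same filter as the encoder to the stored picture, motion
    compensate the adaptive reference, and add the decoded residual."""
    adaptive_ref = filter_fn(prev_coded_picture)
    y, x = top_left
    dy, dx = mv
    h, w = residual.shape
    prediction = adaptive_ref[y + dy:y + dy + h, x + dx:x + dx + w]
    reconstructed = prediction.astype(np.int32) + residual
    return np.clip(reconstructed, 0, 255).astype(np.uint8)
```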
- pre-processing methods, such as spatial and/or temporal filtering, for example, may be applied in an attempt to remove noise from a video sequence. This process essentially improves the spatial and/or temporal relationships within the sequence, which leads to better encoding efficiency.
- retaining some types of noise may be desirable (e.g., film grain noise within film type HD content).
- the presently disclosed architecture introduces an additional step within the encoding and decoding process where, using known spatial and temporal information that is available on both the encoder and decoder, embodiments can, if necessary, analyze and process a decoded picture and generate a new picture that can be used as a reference for a future picture.
- the decision and the entire reference generation procedure may be based on additional information encoded within the bitstream, or may be adaptively decided based on context.
- this new picture may also be used for display purposes, based on a bitstream signal or decoder decision.
- an additional step within the encoding and decoding process is introduced for the generation of the inter prediction reference pictures that can lead to further improvement in coding efficiency compared to existing systems.
- a feature of the presently disclosed system is that for certain sequences such as noisy content, a previously decoded picture may not be the best possible reference since noise is usually not temporally correlated. Although noise can be removed in a pre-processing stage, it might be desirable that it be retained since it may itself be part of the actual content, as is sometimes the case for film-grain in film type content, for example.
- an additional optional filtering process may be applied to each reference picture using filters such as, for example, a median filter, Wiener filtering, geometric mean filtering, least-squares filtering, and the like, as well as combinations thereof. Filtering can additionally be used to handle and remove other types of noise that the sub-pixel interpolation does not remove. Linear filters may also be used, such as a simple averaging filter, for example, but without necessarily having to consider sub-pixel positions. It is also possible to consider temporal methods, such as temporal filtering, or even using the motion information of previous pictures to generate a new motion compensated reference, such as using global motion compensation, for example. For simplicity, these types of methods may be referred to as "filtering" or using a "filter" herein.
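- For illustration only, a 3x3 median and a 3x3 averaging filter of the kind listed above could be written as follows (Python/NumPy sketch; the actual filters and their parameters are selected by the encoder):

```python
import numpy as np

def median_3x3(picture):
    """3x3 median filter, one of the example reference filters."""
    p = np.pad(picture, 1, mode="edge")
    h, w = picture.shape
    # Stack the nine shifted neighborhoods and take the per-pixel median.
    stack = np.stack([p[dy:dy + h, dx:dx + w]
                      for dy in range(3) for dx in range(3)])
    return np.median(stack, axis=0).astype(picture.dtype)

def average_3x3(picture):
    """Simple 3x3 averaging filter, a linear alternative."""
    p = np.pad(picture, 1, mode="edge").astype(np.float32)
    h, w = picture.shape
    stack = np.stack([p[dy:dy + h, dx:dx + w]
                      for dy in range(3) for dx in range(3)])
    return stack.mean(axis=0).round().astype(picture.dtype)
```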
- a simple method would be to encode a signal for every picture, such as a 1-bit signal, for example, that specifies that if this picture is referenced by another picture, the filtered version will be used instead. In this case, it can be predefined at both the encoder and decoder that the sub-pixel positions will be generated either by using the original samples or the filtered ones. Alternatively, an additional signal can be transmitted that specifies this operation. Filter parameters and/or information may also be signaled within the stream for each picture.
- This method has the obvious benefit that no additional memory is required, but may also limit the encoder's flexibility since for some pictures a reference using a different filter, or none at all, would be more beneficial. Instead, a more flexible solution would be to specify for each picture whether or not its references are filtered. This solution allows the encoder to better adapt to the characteristics of each picture and achieve higher performance. It is again possible to use the same filter for all references, or even allow different filters for each reference that may be signaled within the bitstream. One may also allow the same picture to be reused as a reference without any filtering, or with different filtering options, considering that some areas of a picture might have different characteristics and may benefit from different filtering methods.
- the filter would essentially be selected through the reference index associated with each reference at the macroblock level, somewhat similar to what is currently done for the explicit weighted prediction mode within H.264. More specifically, the use and the parameters required by explicit weighted prediction are signaled within the picture and slice header during encoding.
- the picture header, for example, contains the parameters weighted_pred_flag and weighted_bipred_flag, which specify whether and which mode of weighted prediction is to be used. If weighted_pred_flag is 1, explicit weighted prediction is used for P pictures, while if weighted_bipred_flag is also 1, explicit weighted prediction is used for B pictures. If either of these parameters is set to 1 and the proper slice type is used, then the prediction weight table (pred_weight_table) elements are also transmitted within the slice header.
- the particular weight and offset to use for a particular reference picture is indicated by the reference picture index of a particular macroblock or macroblock partition.
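- A sketch of this reference-index-driven selection for explicit weighted prediction is shown below (Python/NumPy; it follows the standard single-list weighting form, with the weight and offset tables assumed to have been decoded from pred_weight_table):

```python
import numpy as np

def explicit_weighted_prediction(mc_block, ref_idx, weights, offsets, log_wd):
    """The weight/offset pair applied to a motion compensated block is chosen
    by the reference picture index of the macroblock (partition):
    pred = ((mc * w + 2**(log_wd - 1)) >> log_wd) + o, clipped to 8 bits."""
    w = weights[ref_idx]
    o = offsets[ref_idx]
    mc = mc_block.astype(np.int32)
    if log_wd >= 1:
        pred = ((mc * w + (1 << (log_wd - 1))) >> log_wd) + o
    else:
        pred = mc * w + o
    return np.clip(pred, 0, 255).astype(np.uint8)
```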
- additional elements may be added that will signal the new prediction method and its parameters in the same headers. For example, one can define a new picture-level syntax element named adaptive_ref_pred_flag. If this is equal to 1, an additional table (e.g., adaptive_ref_table) is transmitted within the slice header that will contain additional parameters for each reference picture in either list, which can include filtering methods and filter parameters.
- the filter parameters can be dynamic and depend on the filtering method to be used. For example, no additional parameters are necessary for a median filter, while for a separable filter one may need the number of taps and the coefficients.
- the reference picture index may be used to select a particular filter for a particular macroblock or macroblock partition. The above method would imply additional computation and storage on the encoder and decoder, although it might be possible to compute the filtered values on the fly and reduce the storage requirements.
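- A hypothetical parser for such a table might look like the following sketch (the syntax element layout and method codes are illustrative assumptions, not normative syntax):

```python
def parse_adaptive_ref_table(read_bits, read_ue, num_refs):
    """read_bits(n) and read_ue() stand in for fixed-length and
    Exp-Golomb bitstream reads of a hypothetical adaptive_ref_table."""
    table = []
    for ref_idx in range(num_refs):
        method = read_ue()            # e.g. 0 = none, 1 = median, 2 = separable
        entry = {"ref_idx": ref_idx, "method": method, "params": {}}
        if method == 2:               # separable filter: tap count + coefficients
            num_taps = read_ue() + 1
            coeffs = [read_bits(8) for _ in range(num_taps)]
            entry["params"] = {"taps": num_taps, "coeffs": coeffs}
        # A median filter (method 1) needs no extra parameters, as noted above.
        table.append(entry)
    return table
```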
- calculating the sub-pixel values for each one of these references is not always necessary, and only the sub-pixel values generated from the original might be calculated and stored, thereby further reducing the complexity of such an encoder.
- a unique element included in the encoder and decoder is the Reference Processing module. From Figure 3, the reference may be processed and generated from a picture either before or after the deblocking process, and filtering is not mandatory. Motion information from previously encoded pictures may also be used to generate a motion-projected reference. This reference may be generated by considering that, similar to the temporal direct mode used within H.264, motion remains relatively continuous from one adjacent picture to another. Using this observation, the reference may be generated by using pixel projection, where each pixel is projected to a new position according to its previous motion vector, as shown in Figure 5; or by using motion projection, where each macroblock in the current reference is assumed to have the same MV as its co-located block, as shown in Figure 6.
- Both methods may also be combined using a weighted average, while the projected pixels may also be filtered using other methods.
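- A minimal sketch of motion projection (Python/NumPy; integer-pel motion vectors and a block-aligned picture size are assumed) is:

```python
import numpy as np

def motion_projection(reference, mv_field, block=16):
    """Build a motion-projected reference: each block reuses the motion
    vector of its co-located block (MVs in integer pixels, clamped at
    picture borders)."""
    h, w = reference.shape
    out = np.zeros_like(reference)
    for by in range(0, h, block):
        for bx in range(0, w, block):
            dy, dx = mv_field[by // block, bx // block]
            sy = int(np.clip(by + dy, 0, h - block))
            sx = int(np.clip(bx + dx, 0, w - block))
            out[by:by + block, bx:bx + block] = reference[sy:sy + block,
                                                          sx:sx + block]
    return out
```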
- a method similar to that used in multispectral image enhancement may also be applied, where multiple pictures are combined, such as through motion/pixel projection or filtering, to generate a single reference, comparable to super-resolution imaging or salient stills.
- Exemplary filters include: a) a 1x3 median, b) a 3x1 median, c) the separable median (a 1x3 followed by a 3x1), d) a 3x3 median, f) their weighted averages with the original reference, i.e. ref = (a×Med_1x3 + b×Med_3x1 + c×Med_3x3 + d×original + (a+b+c+d)/2) / (a+b+c+d), g) Wiener filtering (due to its handling of Gaussian noise), h) a 3x3 averaging filter, i) a simple separable n-tap filter, and j) the thresholded average (apply filter h only if the result is considerably different from the original sample), as well as combinations of these filters.
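- The weighted-average combination in f) above can be sketched as follows (Python/NumPy; integer arithmetic with the rounding term as written):

```python
import numpy as np

def weighted_median_average(med_1x3, med_3x1, med_3x3, original,
                            a=1, b=1, c=1, d=1):
    """Weighted average of the median-filtered references with the original,
    per the formula above; the (a+b+c+d)/2 term rounds the integer division."""
    s = a + b + c + d
    num = (a * med_1x3.astype(np.int32) + b * med_3x1.astype(np.int32)
           + c * med_3x3.astype(np.int32) + d * original.astype(np.int32)
           + s // 2)
    return (num // s).astype(original.dtype)
```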
- Other filters may also be used and may be signaled in the bitstream or be known in both the encoder and decoder, and selected through parameters within the bitstream, such as through the reference index, for example.
- the current pixel and its 8 surrounding pixels within the original picture were selected and then sorted. If the current pixel was the same as the 3x3 median, no other operation was performed. If not, and if the current pixel was either the largest or smallest sample, then it was replaced with its closest sample, or, in a different implementation, the average of its closest sample and itself. Otherwise, this sample was replaced with the average of the value of the current pixel and its two closest samples.
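- A per-pixel sketch of this conditional replacement (Python/NumPy; illustrative only) is:

```python
import numpy as np

def conditional_replace(window9):
    """window9 holds the flattened 3x3 neighborhood from the original
    picture, with the current pixel at index 4."""
    cur = int(window9[4])
    vals = np.sort(np.asarray(window9, dtype=np.int32))
    if cur == int(vals[4]):
        return cur                                   # equals the 3x3 median: keep
    others = np.delete(vals, int(np.argmax(vals == cur)))  # drop one copy of cur
    order = np.argsort(np.abs(others - cur))
    if cur == int(vals[0]) or cur == int(vals[-1]):
        # Extreme value: replace with its closest sample (averaging that
        # sample with the current pixel is the alternative implementation).
        return int(others[order[0]])
    # Otherwise average the current pixel with its two closest samples.
    return int(round((cur + int(others[order[0]]) + int(others[order[1]])) / 3.0))
```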
- the decoder must know exactly which filter was used in order to generate an identical reference and avoid drift.
- This filter may be known, and identical on both encoder and decoder, while it is also possible to transmit this filter at the picture or slice level. Although transmitting the entire filter at the MB level would imply a considerable overhead, it might nevertheless be possible to transmit some additional parameters that may adjust part of the filter, such as an additional MV scaling parameter if the filter is using a global motion compensated reference, and allow extra flexibility.
- the encoder needs to be able to select the proper processing method with minimal complexity. For this purpose, pre-analysis methods of the reference and the current picture may be used, such as noise estimation, image correlation, and the like.
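- Illustrative examples of such pre-analysis measures, sketched in Python/NumPy (the specific metrics are assumptions for illustration, not the disclosed methods), are:

```python
import numpy as np

def pre_analysis(current, reference):
    """Crude metrics an encoder might use to pick a processing method: a
    noise estimate from high-frequency residual energy, plus the correlation
    between the current and reference pictures."""
    c = current.astype(np.float32)
    r = reference.astype(np.float32)
    # Laplacian-like residual as a rough indicator of noise/grain level.
    lap = (4 * c[1:-1, 1:-1] - c[:-2, 1:-1] - c[2:, 1:-1]
           - c[1:-1, :-2] - c[1:-1, 2:])
    noise_level = float(np.mean(np.abs(lap)))
    # Temporal correlation between the two pictures.
    corr = float(np.corrcoef(c.ravel(), r.ravel())[0, 1])
    return noise_level, corr
```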
- the machine is implemented on a computer platform having hardware such as one or more central processing units (“CPU"), a random access memory (“RAM”), and input/output (“I/O") interfaces.
- the computer platform may also include an operating system and microinstruction code.
- the various processes and functions described herein may be either part of the microinstruction code or part of the application program, or any combination thereof, which may be executed by a CPU.
- various other peripheral units may be connected to the computer platform such as an additional data storage unit and a printing unit. It is to be further understood that, because some of the constituent system components and methods depicted in the accompanying drawings are preferably implemented in software, the actual connections between the system components or the process function blocks may differ depending upon the manner in which the present invention is programmed.
Landscapes
- Engineering & Computer Science (AREA)
- Multimedia (AREA)
- Signal Processing (AREA)
- Compression Or Coding Systems Of Tv Signals (AREA)
- Compression, Expansion, Code Conversion, And Decoders (AREA)
Abstract
Description
Claims
Priority Applications (6)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
CN2004800256296A CN1846444B (en) | 2003-09-17 | 2004-09-02 | Adaptive reference picture generation |
JP2006526923A JP5330647B2 (en) | 2003-09-17 | 2004-09-02 | Adaptive reference image generation |
BRPI0414397-3A BRPI0414397A (en) | 2003-09-17 | 2004-09-02 | adaptive reference imaging |
KR1020067005095A KR101094323B1 (en) | 2003-09-17 | 2004-09-02 | Adaptive reference picture generation |
EP04783029A EP1665804A1 (en) | 2003-09-17 | 2004-09-02 | Adaptive reference picture generation |
US10/569,695 US8094711B2 (en) | 2003-09-17 | 2004-09-02 | Adaptive reference picture generation |
Applications Claiming Priority (2)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
US50457503P | 2003-09-17 | 2003-09-17 | |
US60/504,575 | 2003-09-17 |
Publications (1)
Publication Number | Publication Date |
---|---|
WO2005034517A1 true WO2005034517A1 (en) | 2005-04-14 |
Family
ID=34421519
Family Applications (1)
Application Number | Title | Priority Date | Filing Date |
---|---|---|---|
PCT/US2004/028650 WO2005034517A1 (en) | 2003-09-17 | 2004-09-02 | Adaptive reference picture generation |
Country Status (7)
Country | Link |
---|---|
US (1) | US8094711B2 (en) |
EP (1) | EP1665804A1 (en) |
JP (1) | JP5330647B2 (en) |
KR (1) | KR101094323B1 (en) |
CN (1) | CN1846444B (en) |
BR (1) | BRPI0414397A (en) |
WO (1) | WO2005034517A1 (en) |
Cited By (17)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
EP1841230A1 (en) * | 2006-03-27 | 2007-10-03 | Matsushita Electric Industrial Co., Ltd. | Adaptive wiener filter for video coding |
EP1845729A1 (en) * | 2006-04-12 | 2007-10-17 | Matsushita Electric Industrial Co., Ltd. | Transmission of post-filter hints |
EP1894418A2 (en) * | 2005-06-24 | 2008-03-05 | NTT DoCoMo Inc. | Method and apparatus for video encoding and decoding using adaptive interpolation |
WO2008075247A1 (en) | 2006-12-18 | 2008-06-26 | Koninklijke Philips Electronics N.V. | Image compression and decompression |
EP2109322A2 (en) * | 2008-04-09 | 2009-10-14 | Intel Corporation | In-loop adaptive Wiener filter for video coding and decoding |
WO2009133845A1 (en) * | 2008-04-30 | 2009-11-05 | 株式会社 東芝 | Video encoding/decoding device and method |
EP2249572A1 (en) * | 2008-03-07 | 2010-11-10 | Kabushiki Kaisha Toshiba | Dynamic image encoding/decoding method and device |
EP2252063A1 (en) * | 2008-03-07 | 2010-11-17 | Kabushiki Kaisha Toshiba | Dynamic image encoding/decoding device |
WO2010131537A1 (en) * | 2009-05-11 | 2010-11-18 | 株式会社エヌ・ティ・ティ・ドコモ | Moving image encoding device, method, and program, and moving image decoding device, method, and program |
US8295628B2 (en) | 2006-06-21 | 2012-10-23 | Thomson Licensing | Automatic film grain adjustment |
CN102986223A (en) * | 2010-07-16 | 2013-03-20 | 索尼公司 | Image processing device, image processing method, and program |
EP2041981B1 (en) * | 2006-07-18 | 2013-09-04 | Thomson Licensing | Methods and apparatus for adaptive reference filtering |
JP2013243759A (en) * | 2008-07-25 | 2013-12-05 | Sony Corp | Image processing device and method, program, and record medium |
US8964852B2 (en) | 2011-02-23 | 2015-02-24 | Qualcomm Incorporated | Multi-metric filtering |
EP2229780A4 (en) * | 2007-12-13 | 2015-08-05 | Mediatek Inc | In-loop fidelity enhancement for video compression |
US9143803B2 (en) | 2009-01-15 | 2015-09-22 | Qualcomm Incorporated | Filter prediction based on activity metrics in video coding |
US10123050B2 (en) | 2008-07-11 | 2018-11-06 | Qualcomm Incorporated | Filtering video data using a plurality of filters |
Families Citing this family (65)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
CN101448162B (en) * | 2001-12-17 | 2013-01-02 | 微软公司 | Method for processing video image |
US7212573B2 (en) * | 2003-06-13 | 2007-05-01 | Lsi Logic Corporation | Method and/or apparatus for determining minimum positive reference indices for a direct prediction mode |
US10554985B2 (en) | 2003-07-18 | 2020-02-04 | Microsoft Technology Licensing, Llc | DC coefficient signaling at small quantization step sizes |
EP1676445B1 (en) * | 2003-09-23 | 2019-09-04 | InterDigital VC Holdings, Inc. | Method for simulating film grain by mosaicing pre-computed samples |
JP2005100100A (en) * | 2003-09-25 | 2005-04-14 | Toyota Motor Corp | Wheel information processing device and method |
BRPI0415307A (en) * | 2003-10-14 | 2006-12-05 | Thomson Licensing | technique for bit precision film grain simulation |
US8150206B2 (en) * | 2004-03-30 | 2012-04-03 | Thomson Licensing | Method and apparatus for representing image granularity by one or more parameters |
US20060082649A1 (en) | 2004-10-18 | 2006-04-20 | Cristina Gomila | Film grain simulation method |
US8014558B2 (en) | 2004-10-18 | 2011-09-06 | Thomson Licensing | Methods, apparatus and system for film grain simulation |
CA2803789C (en) | 2004-11-12 | 2014-04-15 | Thomson Licensing | Film grain simulation for normal play and trick mode play for video playback systems |
US20060104353A1 (en) * | 2004-11-16 | 2006-05-18 | Johnson Andrew W | Video signal preprocessing to minimize prediction error |
JP4825808B2 (en) | 2004-11-16 | 2011-11-30 | トムソン ライセンシング | Film grain simulation method based on pre-calculated conversion factor |
US9117261B2 (en) | 2004-11-16 | 2015-08-25 | Thomson Licensing | Film grain SEI message insertion for bit-accurate simulation in a video system |
RU2372659C2 (en) * | 2004-11-17 | 2009-11-10 | Томсон Лайсенсинг | Imitation method of film grain accurate to bit on basis of pre-calculated converted coefficients |
WO2006057937A2 (en) | 2004-11-22 | 2006-06-01 | Thomson Licensing | Methods, apparatus and system for film grain cache splitting for film grain simulation |
MX2007006139A (en) * | 2004-11-24 | 2007-07-19 | Thomson Licensing | Film grain simulation technique for use in media playback devices. |
US8265151B1 (en) | 2005-12-14 | 2012-09-11 | Ambarella Taiwan Ltd. | Mode decision using approximate 1/2 pel interpolation |
EP1980109B1 (en) * | 2006-01-31 | 2018-07-11 | Thomson Licensing DTV | Methods and apparatus for edge-based spatio-temporal filtering |
US20100091845A1 (en) * | 2006-03-30 | 2010-04-15 | Byeong Moon Jeon | Method and apparatus for decoding/encoding a video signal |
BRPI0719536A2 (en) * | 2006-10-16 | 2014-01-14 | Thomson Licensing | METHOD FOR USING A GENERAL LAYER UNIT IN THE WORK NETWORK SIGNALING AN INSTANT DECODING RESET DURING A VIDEO OPERATION. |
US8213500B2 (en) * | 2006-12-21 | 2012-07-03 | Sharp Laboratories Of America, Inc. | Methods and systems for processing film grain noise |
JP2008219163A (en) * | 2007-02-28 | 2008-09-18 | Toshiba Corp | Information encoding method, information playback method, and information storage medium |
JP2010525658A (en) * | 2007-04-19 | 2010-07-22 | トムソン ライセンシング | Adaptive reference image data generation for intra prediction |
US10715834B2 (en) | 2007-05-10 | 2020-07-14 | Interdigital Vc Holdings, Inc. | Film grain simulation based on pre-computed transform coefficients |
CN100566427C (en) * | 2007-07-31 | 2009-12-02 | 北京大学 | The choosing method and the device that are used for the intraframe predictive coding optimal mode of video coding |
BRPI0815108A2 (en) * | 2007-08-15 | 2015-01-27 | Thomson Licensing | METHODS AND APPARATUS FOR ADVANCED MULTI-VISIT CODE MOVEMENT ADVANCE MODE WITH THE USE OF REGIONAL DISPARITY VECTORS |
KR101682516B1 (en) * | 2008-01-07 | 2016-12-05 | 톰슨 라이센싱 | Methods and apparatus for video encoding and decoding using parametric filtering |
KR20090098214A (en) * | 2008-03-13 | 2009-09-17 | 삼성전자주식회사 | Method and apparatus for video encoding and decoding |
US9967590B2 (en) * | 2008-04-10 | 2018-05-08 | Qualcomm Incorporated | Rate-distortion defined interpolation for video coding based on fixed filter or adaptive filter |
CA2722204C (en) * | 2008-04-25 | 2016-08-09 | Thomas Schierl | Flexible sub-stream referencing within a transport data stream |
US8548041B2 (en) * | 2008-09-25 | 2013-10-01 | Mediatek Inc. | Adaptive filter |
JPWO2010064675A1 (en) * | 2008-12-03 | 2012-05-10 | ソニー株式会社 | Image processing apparatus, image processing method, and program |
TWI463878B (en) | 2009-02-19 | 2014-12-01 | Sony Corp | Image processing apparatus and method |
TWI440363B (en) | 2009-02-19 | 2014-06-01 | Sony Corp | Image processing apparatus and method |
BRPI1009553A2 (en) * | 2009-03-13 | 2019-04-09 | Thomson Licensing | blur measurement on a block-based compressed image |
US8995526B2 (en) * | 2009-07-09 | 2015-03-31 | Qualcomm Incorporated | Different weights for uni-directional prediction and bi-directional prediction in video coding |
US9161057B2 (en) | 2009-07-09 | 2015-10-13 | Qualcomm Incorporated | Non-zero rounding and prediction mode selection techniques in video encoding |
JP2011049740A (en) | 2009-08-26 | 2011-03-10 | Sony Corp | Image processing apparatus and method |
WO2011033643A1 (en) * | 2009-09-17 | 2011-03-24 | 株式会社 東芝 | Dynamic image encoding method and dynamic image decoding method |
JP5340415B2 (en) * | 2009-12-07 | 2013-11-13 | 三菱電機株式会社 | Image encoding device, image decoding device, image encoding method, and image decoding method |
US20120243611A1 (en) * | 2009-12-22 | 2012-09-27 | Sony Corporation | Image processing apparatus and method as well as program |
JP5323211B2 (en) * | 2010-01-13 | 2013-10-23 | 株式会社東芝 | Video encoding apparatus and decoding apparatus |
WO2011086672A1 (en) * | 2010-01-13 | 2011-07-21 | 株式会社 東芝 | Moving image coding device and decoding device |
EP2375747B1 (en) * | 2010-04-12 | 2019-03-13 | Sun Patent Trust | Filter positioning and selection |
WO2011161823A1 (en) * | 2010-06-25 | 2011-12-29 | 株式会社 東芝 | Video encoding method and decoding method |
US20130194386A1 (en) * | 2010-10-12 | 2013-08-01 | Dolby Laboratories Licensing Corporation | Joint Layer Optimization for a Frame-Compatible Video Delivery |
US9036695B2 (en) | 2010-11-02 | 2015-05-19 | Sharp Laboratories Of America, Inc. | Motion-compensated temporal filtering based on variable filter parameters |
US8849053B2 (en) * | 2011-01-14 | 2014-09-30 | Sony Corporation | Parametric loop filter |
KR20120118782A (en) * | 2011-04-19 | 2012-10-29 | 삼성전자주식회사 | Method and apparatus for encoding/decoding video using adaptive filtering |
US8837582B2 (en) | 2011-06-22 | 2014-09-16 | Blackberry Limited | Compressing image data |
US8768082B2 (en) * | 2011-06-22 | 2014-07-01 | Blackberry Limited | Compressing image data |
US9451284B2 (en) | 2011-10-10 | 2016-09-20 | Qualcomm Incorporated | Efficient signaling of reference picture sets |
JP5698644B2 (en) * | 2011-10-18 | 2015-04-08 | 株式会社Nttドコモ | Video predictive encoding method, video predictive encoding device, video predictive encoding program, video predictive decoding method, video predictive decoding device, and video predictive decode program |
CN107566835B (en) | 2011-12-23 | 2020-02-28 | 韩国电子通信研究院 | Image decoding method, image encoding method, and recording medium |
US20130177084A1 (en) * | 2012-01-10 | 2013-07-11 | Qualcomm Incorporated | Motion vector scaling in video coding |
US9729870B2 (en) * | 2012-01-31 | 2017-08-08 | Apple Inc. | Video coding efficiency with camera metadata |
MY198312A (en) * | 2012-04-16 | 2023-08-23 | Samsung Electronics Co Ltd | Method And Apparatus For Determining Reference Picture Set Of Image |
CN104349169B (en) * | 2013-08-09 | 2018-11-09 | 联想(北京)有限公司 | A kind of image processing method and electronic equipment |
JP2015144423A (en) | 2013-12-25 | 2015-08-06 | 三星電子株式会社Samsung Electronics Co.,Ltd. | Image encoder, image decoder, method of image encoder and image decoder, program and image processing system |
JP6509523B2 (en) * | 2014-03-18 | 2019-05-08 | パナソニック株式会社 | Image coding device |
US10778993B2 (en) * | 2017-06-23 | 2020-09-15 | Mediatek Inc. | Methods and apparatus for deriving composite tracks with track grouping |
WO2021025080A1 (en) * | 2019-08-07 | 2021-02-11 | パナソニック インテレクチュアル プロパティ コーポレーション オブ アメリカ | Encoding device, decoding device, encoding method, and decoding method |
WO2021125703A1 (en) * | 2019-12-20 | 2021-06-24 | 엘지전자 주식회사 | Image/video coding method and device |
MX2022008445A (en) * | 2020-01-13 | 2022-10-18 | Lg Electronics Inc | Inter prediction method and apparatus in image/video coding system. |
US11470358B2 (en) | 2020-04-02 | 2022-10-11 | Sharp Kabushiki Kaisha | Systems and methods for signaling scaling window information in video coding |
Citations (5)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
EP0817497A2 (en) * | 1996-07-06 | 1998-01-07 | Samsung Electronics Co., Ltd. | Loop filtering method for reducing blocking effects and ringing noise of a motion-compensated image |
US6067125A (en) * | 1997-05-15 | 2000-05-23 | Minerva Systems | Structure and method for film grain noise reduction |
EP1333681A2 (en) * | 2002-01-31 | 2003-08-06 | Samsung Electronics Co., Ltd. | Filtering method and apparatus for reducing block artifacts or ringing noise |
EP1335609A2 (en) * | 2002-01-25 | 2003-08-13 | Microsoft Corporation | Improved video coding methods and apparatuses |
US20030152146A1 (en) * | 2001-12-17 | 2003-08-14 | Microsoft Corporation | Motion compensation loop with filtering |
Family Cites Families (26)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
JPH0497681A (en) | 1990-08-16 | 1992-03-30 | Nippon Telegr & Teleph Corp <Ntt> | Video encoding and decoding device |
US5317397A (en) * | 1991-05-31 | 1994-05-31 | Kabushiki Kaisha Toshiba | Predictive coding using spatial-temporal filtering and plural motion vectors |
JP2817497B2 (en) | 1992-01-31 | 1998-10-30 | 日本電気株式会社 | Dictionary editing device |
US5576765A (en) * | 1994-03-17 | 1996-11-19 | International Business Machines, Corporation | Video decoder |
JP3980659B2 (en) | 1994-08-31 | 2007-09-26 | ソニー株式会社 | Video encoding method and apparatus, video decoding method and apparatus. |
JP3353604B2 (en) | 1995-08-09 | 2002-12-03 | ソニー株式会社 | Moving image encoding method and apparatus, and signal recording medium |
FR2742900B1 (en) * | 1995-12-22 | 1998-02-13 | Thomson Multimedia Sa | METHOD FOR INTERPOLATING PROGRESSIVE FRAMES |
US6005626A (en) | 1997-01-09 | 1999-12-21 | Sun Microsystems, Inc. | Digital video signal encoder and encoding method |
US6487249B2 (en) * | 1998-10-09 | 2002-11-26 | Matsushita Electric Industrial Co., Ltd. | Efficient down conversion system for 2:1 decimation |
JP3466951B2 (en) * | 1999-03-30 | 2003-11-17 | 株式会社東芝 | Liquid crystal display |
US6987805B1 (en) * | 1999-09-24 | 2006-01-17 | Lsi Logic Corporation | Macroblock level intrarefresh technique for encoded video |
JP2001112000A (en) * | 1999-10-07 | 2001-04-20 | Matsushita Electric Ind Co Ltd | Video signal encoding device |
WO2001054418A1 (en) * | 2000-01-21 | 2001-07-26 | Nokia Corporation | A motion estimation method and a system for a video coder |
KR100796085B1 (en) * | 2000-04-14 | 2008-01-21 | 소니 가부시끼 가이샤 | Decoder, decoding method, and recorded medium |
JP2002016928A (en) * | 2000-06-30 | 2002-01-18 | Toshiba Corp | Moving image encoding method and decoding method and apparatus |
KR100370076B1 (en) * | 2000-07-27 | 2003-01-30 | 엘지전자 주식회사 | video decoder with down conversion function and method of decoding a video signal |
DE10046807C2 (en) * | 2000-09-21 | 2003-04-03 | Infineon Technologies Ag | Method and device for image compression |
US6748020B1 (en) * | 2000-10-25 | 2004-06-08 | General Instrument Corporation | Transcoder-multiplexer (transmux) software architecture |
TWI248073B (en) * | 2002-01-17 | 2006-01-21 | Media Tek Inc | Device and method for displaying static pictures |
US6980598B2 (en) * | 2002-02-22 | 2005-12-27 | International Business Machines Corporation | Programmable vertical filter for video encoding |
EP1383339A1 (en) * | 2002-07-15 | 2004-01-21 | Matsushita Electric Industrial Co., Ltd. | Memory management method for video sequence motion estimation and compensation |
US7903742B2 (en) * | 2002-07-15 | 2011-03-08 | Thomson Licensing | Adaptive weighting of reference pictures in video decoding |
US7801217B2 (en) * | 2002-10-01 | 2010-09-21 | Thomson Licensing | Implicit weighting of reference pictures in a video encoder |
US7724818B2 (en) * | 2003-04-30 | 2010-05-25 | Nokia Corporation | Method for coding sequences of pictures |
MXPA06000323A (en) * | 2003-07-09 | 2006-05-31 | Thomson Licensing | Video encoder with low complexity noise reduction. |
JP2010525658A (en) * | 2007-04-19 | 2010-07-22 | トムソン ライセンシング | Adaptive reference image data generation for intra prediction |
-
2004
- 2004-09-02 KR KR1020067005095A patent/KR101094323B1/en active IP Right Grant
- 2004-09-02 US US10/569,695 patent/US8094711B2/en active Active
- 2004-09-02 WO PCT/US2004/028650 patent/WO2005034517A1/en active Application Filing
- 2004-09-02 BR BRPI0414397-3A patent/BRPI0414397A/en not_active IP Right Cessation
- 2004-09-02 EP EP04783029A patent/EP1665804A1/en not_active Ceased
- 2004-09-02 CN CN2004800256296A patent/CN1846444B/en not_active Expired - Lifetime
- 2004-09-02 JP JP2006526923A patent/JP5330647B2/en not_active Expired - Lifetime
Patent Citations (5)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
EP0817497A2 (en) * | 1996-07-06 | 1998-01-07 | Samsung Electronics Co., Ltd. | Loop filtering method for reducing blocking effects and ringing noise of a motion-compensated image |
US6067125A (en) * | 1997-05-15 | 2000-05-23 | Minerva Systems | Structure and method for film grain noise reduction |
US20030152146A1 (en) * | 2001-12-17 | 2003-08-14 | Microsoft Corporation | Motion compensation loop with filtering |
EP1335609A2 (en) * | 2002-01-25 | 2003-08-13 | Microsoft Corporation | Improved video coding methods and apparatuses |
EP1333681A2 (en) * | 2002-01-31 | 2003-08-06 | Samsung Electronics Co., Ltd. | Filtering method and apparatus for reducing block artifacts or ringing noise |
Non-Patent Citations (4)
Title |
---|
CHRISTINA GOMILA, ALEXANDER KOBILANSKY: "SEI message for film grain encoding", JVT OF ISO IEC MPEG AND ITU-T VCEG JVT-H022, 23 May 2003 (2003-05-23), GENEVA, SWITZERLAND, pages 1 - 14, XP002308742 * |
KAUP A: "Reduction of ringing noise in transform image coding using simple adaptive filter", ELECTRONICS LETTERS, IEE STEVENAGE, GB, vol. 34, no. 22, 29 October 1998 (1998-10-29), pages 2110 - 2112, XP006010547, ISSN: 0013-5194 * |
LLACH J, BOYCE J: "H.264 encoder with low complexity noise pre-filtering", PROCEEDINGS OF SPIE, APPLICATIONS OF DIGITAL IMAGE PROCESSING XXVI, vol. 5203, 5 August 2003 (2003-08-05), SAN DIEGO, USA, pages 478 - 489, XP002311426 * |
YUEN M ET AL: "Performance of loop filters in MC/DPCM/DCT video coding", SIGNAL PROCESSING, 1996., 3RD INTERNATIONAL CONFERENCE ON BEIJING, CHINA 14-18 OCT. 1996, NEW YORK, NY, USA,IEEE, US, 14 October 1996 (1996-10-14), pages 1182 - 1186, XP010209397, ISBN: 0-7803-2912-0 * |
Cited By (34)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
EP1894418A2 (en) * | 2005-06-24 | 2008-03-05 | NTT DoCoMo Inc. | Method and apparatus for video encoding and decoding using adaptive interpolation |
EP1841230A1 (en) * | 2006-03-27 | 2007-10-03 | Matsushita Electric Industrial Co., Ltd. | Adaptive wiener filter for video coding |
EP1845729A1 (en) * | 2006-04-12 | 2007-10-17 | Matsushita Electric Industrial Co., Ltd. | Transmission of post-filter hints |
US8295628B2 (en) | 2006-06-21 | 2012-10-23 | Thomson Licensing | Automatic film grain adjustment |
EP2041981B1 (en) * | 2006-07-18 | 2013-09-04 | Thomson Licensing | Methods and apparatus for adaptive reference filtering |
JP2010514246A (en) * | 2006-12-18 | 2010-04-30 | コーニンクレッカ フィリップス エレクトロニクス エヌ ヴィ | Image compression and decompression |
RU2461977C2 (en) * | 2006-12-18 | 2012-09-20 | Конинклейке Филипс Электроникс Н.В. | Compression and decompression of images |
US9786066B2 (en) | 2006-12-18 | 2017-10-10 | Koninklijke Philips N.V. | Image compression and decompression |
US8582666B2 (en) | 2006-12-18 | 2013-11-12 | Koninklijke Philips N.V. | Image compression and decompression |
WO2008075247A1 (en) | 2006-12-18 | 2008-06-26 | Koninklijke Philips Electronics N.V. | Image compression and decompression |
EP3383048A1 (en) * | 2007-12-13 | 2018-10-03 | HFI Innovation Inc. | In-loop fidelity enhancement for video compression |
EP2229780A4 (en) * | 2007-12-13 | 2015-08-05 | Mediatek Inc | In-loop fidelity enhancement for video compression |
US10327010B2 (en) | 2007-12-13 | 2019-06-18 | Hfi Innovation Inc. | In-loop fidelity enhancement for video compression |
EP2252063A1 (en) * | 2008-03-07 | 2010-11-17 | Kabushiki Kaisha Toshiba | Dynamic image encoding/decoding device |
EP2249572A1 (en) * | 2008-03-07 | 2010-11-10 | Kabushiki Kaisha Toshiba | Dynamic image encoding/decoding method and device |
EP2249572A4 (en) * | 2008-03-07 | 2012-05-23 | Toshiba Kk | Dynamic image encoding/decoding method and device |
CN101926175A (en) * | 2008-03-07 | 2010-12-22 | 株式会社东芝 | Dynamic image encoding/decoding method and device |
EP2252063A4 (en) * | 2008-03-07 | 2012-09-12 | Toshiba Kk | Dynamic image encoding/decoding device |
EP2109322A2 (en) * | 2008-04-09 | 2009-10-14 | Intel Corporation | In-loop adaptive Wiener filter for video coding and decoding |
WO2009133845A1 (en) * | 2008-04-30 | 2009-11-05 | 株式会社 東芝 | Video encoding/decoding device and method |
US11711548B2 (en) | 2008-07-11 | 2023-07-25 | Qualcomm Incorporated | Filtering video data using a plurality of filters |
US10123050B2 (en) | 2008-07-11 | 2018-11-06 | Qualcomm Incorporated | Filtering video data using a plurality of filters |
JP2013243759A (en) * | 2008-07-25 | 2013-12-05 | Sony Corp | Image processing device and method, program, and record medium |
US9143803B2 (en) | 2009-01-15 | 2015-09-22 | Qualcomm Incorporated | Filter prediction based on activity metrics in video coding |
WO2010131537A1 (en) * | 2009-05-11 | 2010-11-18 | 株式会社エヌ・ティ・ティ・ドコモ | Moving image encoding device, method, and program, and moving image decoding device, method, and program |
US9241172B2 (en) | 2009-05-11 | 2016-01-19 | Ntt Docomo, Inc. | Moving image encoding and decoding device |
CN102986223A (en) * | 2010-07-16 | 2013-03-20 | 索尼公司 | Image processing device, image processing method, and program |
US8964852B2 (en) | 2011-02-23 | 2015-02-24 | Qualcomm Incorporated | Multi-metric filtering |
US9819936B2 (en) | 2011-02-23 | 2017-11-14 | Qualcomm Incorporated | Multi-metric filtering |
US9877023B2 (en) | 2011-02-23 | 2018-01-23 | Qualcomm Incorporated | Multi-metric filtering |
US9258563B2 (en) | 2011-02-23 | 2016-02-09 | Qualcomm Incorporated | Multi-metric filtering |
US8989261B2 (en) | 2011-02-23 | 2015-03-24 | Qualcomm Incorporated | Multi-metric filtering |
US8982960B2 (en) | 2011-02-23 | 2015-03-17 | Qualcomm Incorporated | Multi-metric filtering |
US8964853B2 (en) | 2011-02-23 | 2015-02-24 | Qualcomm Incorporated | Multi-metric filtering |
Also Published As
Publication number | Publication date |
---|---|
KR101094323B1 (en) | 2011-12-19 |
BRPI0414397A (en) | 2006-11-21 |
US20060291557A1 (en) | 2006-12-28 |
US8094711B2 (en) | 2012-01-10 |
EP1665804A1 (en) | 2006-06-07 |
KR20060083974A (en) | 2006-07-21 |
CN1846444A (en) | 2006-10-11 |
JP5330647B2 (en) | 2013-10-30 |
JP2007506361A (en) | 2007-03-15 |
CN1846444B (en) | 2011-01-26 |
Similar Documents
Publication | Publication Date | Title |
---|---|---|
US8094711B2 (en) | Adaptive reference picture generation | |
EP3114835B1 (en) | Encoding strategies for adaptive switching of color spaces | |
KR101792041B1 (en) | Methods and apparatus for adaptive coding of motion information | |
US20070189392A1 (en) | Reduced resolution update mode for advanced video coding | |
US20090141814A1 (en) | Method and Apparatus for Providing Reduced Resolution Update Mode for Multi-View Video Coding | |
US20120033728A1 (en) | Method and apparatus for encoding and decoding images by adaptively using an interpolation filter | |
US20140078394A1 (en) | Selective use of chroma interpolation filters in luma interpolation process | |
US20110103464A1 (en) | Methods and Apparatus for Locally Adaptive Filtering for Motion Compensation Interpolation and Reference Picture Filtering | |
EP2596636A1 (en) | Reference processing using advanced motion models for video coding | |
CN107646194B (en) | Apparatus and method for video motion compensation | |
US20240137516A1 (en) | Methods and devices for prediction dependent residual scaling for video coding | |
EP2489188A2 (en) | Methods and apparatus for efficient adaptive filtering for video encoders and decoders | |
WO2020257785A1 (en) | Methods and devices for prediction dependent residual scaling for video coding | |
JP7367237B2 (en) | Method and device for prediction-dependent residual scaling for video coding | |
EP4128770A1 (en) | Methods and devices for prediction dependent residual scaling for video coding | |
US20230067650A1 (en) | Methods and devices for prediction dependent residual scaling for video coding | |
KR20240089011A (en) | Video coding using optional neural network-based coding tools | |
MXPA06003034A (en) | Adaptive reference picture generation | |
JP2004180248A (en) | Coding distortion removal method, dynamic image encoding method, dynamic image decoding method, and apparatus and program for achieving the same |
Legal Events
Date | Code | Title | Description |
---|---|---|---|
WWE | Wipo information: entry into national phase |
Ref document number: 200480025629.6 Country of ref document: CN |
|
AK | Designated states |
Kind code of ref document: A1 Designated state(s): AE AG AL AM AT AU AZ BA BB BG BR BW BY BZ CA CH CN CO CR CU CZ DE DK DM DZ EC EE EG ES FI GB GD GE GH GM HR HU ID IL IN IS JP KE KG KP KR KZ LC LK LR LS LT LU LV MA MD MG MK MN MW MX MZ NA NI NO NZ OM PG PH PL PT RO RU SC SD SE SG SK SL SY TJ TM TN TR TT TZ UA UG US UZ VC VN YU ZA ZM ZW |
|
AL | Designated countries for regional patents |
Kind code of ref document: A1 Designated state(s): GM KE LS MW MZ NA SD SL SZ TZ UG ZM ZW AM AZ BY KG KZ MD RU TJ TM AT BE BG CH CY CZ DE DK EE ES FI FR GB GR HU IE IT LU MC NL PL PT RO SE SI SK TR BF BJ CF CG CI CM GA GN GQ GW ML MR NE SN TD TG |
|
121 | Ep: the epo has been informed by wipo that ep was designated in this application | ||
WWE | Wipo information: entry into national phase |
Ref document number: 2006291557 Country of ref document: US Ref document number: 10569695 Country of ref document: US |
|
WWE | Wipo information: entry into national phase |
Ref document number: 2004783029 Country of ref document: EP |
|
WWE | Wipo information: entry into national phase |
Ref document number: 1020067005095 Country of ref document: KR |
|
WWE | Wipo information: entry into national phase |
Ref document number: 2006526923 Country of ref document: JP |
|
WWE | Wipo information: entry into national phase |
Ref document number: PA/a/2006/003034 Country of ref document: MX |
|
WWE | Wipo information: entry into national phase |
Ref document number: 1565/DELNP/2006 Country of ref document: IN |
|
WWP | Wipo information: published in national office |
Ref document number: 2004783029 Country of ref document: EP |
|
WWP | Wipo information: published in national office |
Ref document number: 1020067005095 Country of ref document: KR |
|
ENP | Entry into the national phase |
Ref document number: PI0414397 Country of ref document: BR |
|
WWP | Wipo information: published in national office |
Ref document number: 10569695 Country of ref document: US |