WO2011064673A1 - Method of and apparatus for encoding video frames, method of and apparatus for decoding video frames - Google Patents


Info

Publication number
WO2011064673A1
Authority
WO
WIPO (PCT)
Prior art keywords
motion vector
motion
block
mode
motion compensation
Prior art date
Application number
PCT/IB2010/003424
Other languages
French (fr)
Inventor
Ronggang Wang
Yuan Dong
Original Assignee
France Telecom
Priority date
Filing date
Publication date
Application filed by France Telecom filed Critical France Telecom
Publication of WO2011064673A1 publication Critical patent/WO2011064673A1/en

Classifications

    • H04N19/00 Methods or arrangements for coding, decoding, compressing or decompressing digital video signals
    • H04N19/109 Selection of coding mode or of prediction mode among a plurality of temporal predictive coding modes
    • H04N19/105 Selection of the reference unit for prediction within a chosen coding or prediction mode, e.g. adaptive choice of position and number of pixels used for prediction
    • H04N19/147 Data rate or code amount at the encoder output according to rate distortion criteria
    • H04N19/176 Adaptive coding characterised by the coding unit, the unit being an image region, e.g. a block or a macroblock
    • H04N19/46 Embedding additional information in the video signal during the compression process
    • H04N19/51 Motion estimation or motion compensation
    • H04N19/513 Processing of motion vectors
    • H04N19/52 Processing of motion vectors by predictive encoding
    • H04N19/61 Transform coding in combination with predictive coding

Definitions

  • The present invention relates in general to methods and apparatus for encoding and/or decoding video frames.
  • Video data comprises a sequence of distinct images, or frames, that are shown one after the other in such rapid succession as to give the illusion of smooth movement.
  • Video compression algorithms attempt to reduce the amount of data required to transmit or store video data by exploiting the fact that there are very few changes from one frame to the next.
  • a compression algorithm compares a small area of the new frame to be transmitted with the same area in the preceding frame. If the selected area is similar to the corresponding area in the preceding frame then there is no need to transmit a second copy.
  • the compression algorithm merely notes that the two areas are the same.
  • the new frame is reconstructed by the decompression algorithm, it copies the area from the preceding frame into the new frame.
  • This reuse of data results in a significant data compression.
  • the compared areas in the new frame are constituted of non-overlapping squares of 16x16 pixels. These are known as macroblocks. Compression algorithms do not require that the macroblocks be identical, merely similar. Adjusting the degree of similarity required can affect the degree of compression and the quality of the resulting video.
  • a simple macroblock replacement scheme typically compresses only those portions of the screen that do not move.
  • When the blocks in a new frame and a previous frame are not similar, it is often the case that the information contained in the new frame's block is, in fact, available in the preceding frame - but at a different location.
  • Motion compensation techniques attempt to achieve greater compression by reusing portions from the previous frame to construct the new frame, even if the portions have moved location within the frame compared to the preceding frame.
  • motion compensation describes a frame in terms of where each section of that frame came from, in a previous frame. Subsequent frames in a video sequence are often very similar, thus containing a lot of redundancy. Removing this redundancy helps achieve the goal of better compression ratios.
  • a first approach to reducing the redundancy is to simply subtract a reference frame from a given frame.
  • the resulting difference is then referred to as the residual and usually contains less energy (or information) than the original frame.
  • the residual can be encoded at a lower bit-rate with the same quality.
  • the decoder can reconstruct the original frame by adding the reference frame again.
  • the frames of the video sequence are processed in groups.
  • One frame (usually the first) is encoded without motion compensation just as a normal image.
  • This frame is called an I-frame (intra-coded frame, in MPEG terminology) or I-picture.
  • The other frames are called P-frames or P-pictures and are predicted from the I-frame or P-frame that comes (temporally) immediately before them.
  • The prediction schemes are, for instance, described as IPPP..., meaning that a group consists of one I-frame followed by three P-frames.
  • Frames can also be predicted from future frames.
  • the future frames then need to be encoded before the predicted frames and thus, the encoding order does not necessarily match the real temporal frame order.
  • Such frames are usually predicted from two directions, i.e. from the I- or P-frames that immediately precede or follow the predicted frame.
  • These bi-directionally predicted frames are called B-frames.
  • a coding scheme could, for instance, be IBBPBBPBB....
  • In block motion compensation (BMC), the frames are partitioned into blocks of pixels (e.g. macroblocks of 16x16 pixels in MPEG). Each block is predicted from a block of equal size in the reference frame.
  • the blocks are not transformed in any way apart from being shifted to the position of the predicted block. This shift is represented by a motion vector.
  • Motion vectors are parameters of the BMC motion model and should be encoded into the bit-stream. As the motion vectors are not always independent (e.g. if two neighbouring blocks belong to the same moving object), they are usually encoded differentially to save bit-rate. This means that the difference between the motion vector and the neighbouring motion vector(s) encoded before is encoded. (The result of this differencing process is mathematically equivalent to a global motion compensation capable of panning.) An entropy codec can exploit the resulting statistical distribution of the motion vectors (around the zero vector).
  • block motion compensation introduces discontinuities at the block borders (blocking artifacts). These artifacts appear in the form of sharp horizontal and vertical edges which are easily spotted by the human eye and produce ringing effects (large coefficients in high frequency sub- bands) in the Fourier-related transform used for transform coding of the residual frames.
  • Another disadvantage of BMC is that there is at most one predictor for each block, so the motion compensation efficiency is constrained by motion vector accuracy.
  • The bi-directionally predicted block in B-frames is an example of using two predictors for one block to improve the motion compensation efficiency.

Summary of the Invention
  • the present invention has been devised with the foregoing in mind. To better address one or more of the foregoing concerns, in its most general form the invention provides a method of .
  • a first aspect of the invention provides a method of decoding a stream of video frames each video frame being partitioned into one or more blocks, the method comprising: receiving a plurality of successive video frames, wherein each frame has a motion vector indicator; selecting for a block in a current frame a neighbouring zone of the respective block; deriving a first motion vector by defining the relative movement between the position of the neighbouring zone in a reference frame and the position of the neighbouring zone in the current frame; deriving a second motion vector using the motion vector indicator and the first motion vector; and averaging a first block predicted by the first motion vector and a second block predicted by a second motion vector to provide a combined block for reconstruction of the current frame.
  • a second aspect of the invention provides a decoder for decoding a stream of video frames each video frame being partitioned into one or more blocks, the decoder comprising: a receiver for receiving a plurality of successive video frames, wherein each frame has a motion vector indicator; a selector for selecting for a block in a current frame a decoded neighbouring zone of the respective block; a processor configured to derive a first motion vector by defining the relative movement between the position of the decoded neighbouring zone in a reference frame and the position of the neighbouring zone in the current frame; derive a second motion vector using the motion vector indicator and the first motion vector; and average a first block predicted by the first motion vector and a second block predicted by a second motion vector for reconstruction of the current frame.
  • a third aspect of the invention provides a method of encoding a stream of video frames each frame being partitioned into one or more blocks, the method comprising: performing motion estimation on a block to determine a motion vector and a motion compensation mode for motion compensation; performing motion compensation on the block according to the determined motion vector and the determined motion compensation mode to provide predicted block values; determining residue data between the predicted block values and the raw block, quantising the residue data; and mapping the quantized residue data, a motion vector indicator indicating the motion vector and a motion compensation mode indicator indicating the corresponding motion compensation mode onto a bit stream.
  • a fourth aspect of the invention provides an encoder for encoding a stream of video frames each frame being partitioned into one or more blocks, the encoder comprising: a motion estimator for determining a motion vector and a motion compensation mode for motion compensation of a block; a motion compensator for performing motion compensation on the block according to the determined motion vector and the determined motion compensation mode to provide predicted block values; a subtractor for determining residue data between the predicted block values and the raw block, a quantizer for quantising the residue data; and a mapper for mapping the quantized residue data, a motion vector indicator indicating the motion vector and a motion compensation mode indicator indicating the corresponding motion compensation mode onto a bit stream.
  • each frame may comprise a motion compensation mode indicator, the mode indicator determining whether the second block is used as the predicted block or the combined block is used as the predicted block.
  • the motion vector of the motion compensation mode of the combined block may be coded as a motion vector difference between the motion vector and the motion vector predictor, the motion vector predictor being derived from a template matching process.
  • the second motion vector may correspond to a motion vector of a BMC mode
  • The first motion vector may be a template matching motion vector.
  • The motion vector indicator may be a motion vector residue decoded from the bitstream.
  • The step of performing motion estimation on a block, in the method of encoding, to determine a motion vector and a motion compensation mode for motion compensation may include:
  • The motion compensation mode may be selected from a block motion compensation mode (BMC) or a combined template matching motion compensation mode (TM_BMC).
  • For each motion vector of a partition mode, the motion cost of the motion vector in block motion compensation mode may be calculated and compared with the motion cost of the motion vector in a combined template matching motion compensation mode.
  • The motion vector of the TM_BMC may be coded as a motion vector difference between the motion vector and the motion vector predictor, the motion vector predictor being derived from a template matching process.
  • Each frame may comprise a motion compensation mode indicator, and the processor is configured to determine from the mode indicator whether the second block is used as the predicted block or the combined block is used as the predicted block.
  • The motion estimator of the encoder may include:
  • a block partitioner for partitioning the block into partitions according to a plurality of partition modes; the motion estimator being further configured to obtain a motion vector and motion compensation mode for each partition mode;
  • a mode selector for selecting a partition mode from the plurality of partition modes by applying a rate distortion optimisation technique wherein the reconstructed block distortion is weighed against the bit rate needed to transmit the coding mode and the residue data.
  • the motion estimator may be configured to select the motion compensation mode from a block motion compensation mode (BMC) or a combined template matching motion compensation mode (TM_BMC)
  • The motion estimator may be configured to compare, for each motion vector of a partition mode, the motion cost of the motion vector in block motion compensation mode with the motion cost of the motion vector in a combined template matching motion compensation mode.
  • the encoder may be configured to code the motion vector of the TM_BMC as a motion vector difference between the motion vector and the motion vector predictor, the motion vector predictor being derived from a template matching process.
  • the methods according to the invention may be computer implemented.
  • the methods may be implemented in software on a programmable apparatus. They may also be implemented solely in hardware or in software, or in a combination thereof.
  • a tangible carrier medium may comprise a storage medium such as a floppy disk, a CD-ROM, a hard disk drive, a magnetic tape device or a solid state memory device and the like.
  • a transient carrier medium may include a signal such as an electrical signal, an electronic signal, an optical signal, an acoustic signal, a magnetic signal or an electromagnetic signal, e.g. a microwave or RF signal.
  • Figure 1 schematically illustrates a generally used video coding hierarchical syntax
  • Figure 2 schematically illustrates a video codec architecture according to an embodiment of the invention
  • Figure 3 represents a flowchart of encoding a video sequence according to the embodiment of the invention
  • Figure 4 represents a flowchart of motion estimation on a macroblock partition according to the embodiment of the invention
  • Figure 5 schematically illustrates a process of template matching on a macroblock partition according to an embodiment of the invention
  • Figure 6 schematically illustrates a syntax structure on macroblock level of coded bit-stream according to the embodiment of the invention
  • Figure 7 represents a flowchart of a method for decoding a macroblock from the coded bit-stream according to the embodiment of the invention
  • Video coding generally comprises the process of compressing (encoding) and decompressing (decoding) a digital video signal.
  • a device that compresses data is referred to as an encoder
  • a device that decompresses data is referred to as a decoder
  • a device that acts as both encoder and decoder will be referred to as a codec.
  • Each frame P can be split into one or several slices SL, each slice SL being defined as a sequence of macroblocks MB.
  • A macroblock MB is defined as the basic unit for encoding and is generally a fixed-size frame partition covering a rectangular area of 16x16 pixels.
  • Each macroblock MB can be further segmented into one or more blocks B of variable block size. Further, hereinafter we will use the notion of macroblock partition to refer to a block of a macroblock for which motion- compensated prediction is applied.
  • An allowable set of macroblock partition modes, i.e. the ways in which a macroblock can be partitioned into one or more macroblock partitions MP1 to MP9, typically varies from one coding scheme to another; for example, a 16x16 macroblock MB may have a mix of 8x8, 4x4, 4x8 and 8x4 macroblock partitions within a single macroblock.
  • In Figure 2, a video codec architecture 2 according to an embodiment of the invention is illustrated.
  • A part of a frame of raw video data is input into a motion estimation module 200, which performs hybrid template matching motion estimation (TM_ME) and traditional block motion estimation (BME) to determine the optimal motion vector and motion compensation mode for motion compensation.
  • the motion compensation mode can be BMC (block motion compensation predictor) or TM_BMC (combining template matching predictor and block motion compensator predictor).
  • Motion compensation of the received frame part is performed in module 205 according to the determined motion vector and the determined motion compensation mode in order to obtain prediction values.
  • the residues between the prediction values and the raw frame values are transformed and quantized by a DCT&Q module 210.
  • the results are scanned to one dimension values and mapped into bit-stream by Entropy coding module 215.
  • The frame as it will be decoded in the decoder can also be reconstructed by the encoder.
  • Inverse quantization and inverse transform are performed on the output of the DCT&Q 210 module in the IQ&IDCT module 220 to recover the residue values. Since there is information loss in quantization and inverse quantization, the recovered residue values are not the same as the residues input to the DCT&Q module 210.
  • Deblocking module 225 performs de-blocking filtering on the border between neighbouring areas to improve the subjective quality of the reconstructed frame. If the filtered reconstructed frame can act as a reference frame for the following frames, it is stored in Reference buffer 230.
  • Figure 3 presents a flowchart outlining the steps of encoding a video sequence of frames according to the embodiment of the invention.
  • A frame can be divided into macroblocks, and the macroblock forms the basic processing unit for the encoding process.
  • Step TM_ME&BME 305 is firstly performed on one macroblock of a frame.
  • the macroblock is further divided into partitions (macroblock partitions) of varying sizes according to a set of macroblock partition modes.
  • Motion estimation is performed on the macroblock independently for each macroblock partition mode.
  • a separate set of motion vectors and a motion compensation mode (either TM_BMC mode or BMC mode) are obtained for each macroblock partition in this step.
  • a mode decision process is performed to select a macroblock coding mode (containing macroblock partition mode information) according to the results obtained in step 305.
  • Rate Distortion Optimization (RDO) is a typical mode decision scheme that may be used in the decision process.
  • RDO may use a Lagrangian formulation wherein the reconstructed macroblock distortion is weighed against the bit rate needed to transmit the coding mode and residue information of the macroblock.
  • the optimal macroblock coding mode for the macroblock corresponds to that minimising the Lagrange cost.
  • The Lagrange cost can be calculated, for example, as J = D + LAMBDA_mode × R, wherein D represents the reconstructed macroblock distortion, LAMBDA_mode is the Lagrange multiplier, and R represents the bit rate needed to transmit the coding mode and residue information of the macroblock.
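A Lagrangian mode decision of the kind described above can be sketched minimally as follows; the cost form J = D + λ·R follows the description, while the candidate tuples and function names are invented for the example:

```python
# Hedged sketch of an RDO mode decision: the coding mode with the
# lowest Lagrange cost J = D + lam * R is selected.

def lagrange_cost(distortion, rate_bits, lam):
    """Weigh distortion against the bits needed for mode and residue."""
    return distortion + lam * rate_bits

def best_mode(candidates, lam):
    """candidates: list of (mode_name, distortion, rate_bits) tuples."""
    return min(candidates, key=lambda c: lagrange_cost(c[1], c[2], lam))
```

For instance, with λ = 2.0 a mode costing 100 distortion and 10 bits (J = 120) beats a mode costing 90 distortion and 20 bits (J = 130).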
  • the macroblock coding mode contains motion compensation mode (either as TM_BMC or BMC) information in addition to reference frame and motion vectors information to indicate the motion compensation mode to be performed.
  • motion compensated prediction is performed on the macroblock in step 315 to get the prediction values.
  • the residues between the raw macroblock and the prediction values are transformed and quantized in step 320.
  • Steps 305-320 are repeated until a macroblock at the end of a frame is reached in step 325.
  • A macroblock of a new frame is then fed into step 305.
  • The above steps are then repeated until the end of the whole video sequence is reached in step 300.
  • a flowchart of motion estimation of a macroblock partition is disclosed.
  • a macroblock is divided into macroblock partitions of variable size by a macroblock partition mode.
  • a macroblock partition is the basic unit used for motion estimation.
  • a variable min_mcost representing the minimal motion cost is initialized as LARGEST in step 400.
  • the LARGEST corresponds to the largest value which can be represented by a target computing device.
  • the template matching process is then performed to get the template matching predictor in step 405.
  • a process of template matching will be described in more detail later with reference to Figure 5.
  • The motion compensation mode is set as BMC in step 410, and the motion cost of the current candidate motion vector with BMC mode is calculated in step 415. The result is stored as variable mcost. The motion cost of the current candidate motion vector applying the TM_BMC mode is then calculated in step 420, and the result is stored in variable tm_bmc_mcost. The tm_bmc_mcost and mcost variables are compared in step 425. If tm_bmc_mcost is lower than mcost, mcost is updated as tm_bmc_mcost and the motion compensation mode is set as TM_BMC mode in step 430; otherwise, mcost is unchanged and the motion compensation mode remains BMC mode.
  • The variable mcost is then compared with the variable min_mcost in step 435. If mcost is lower than min_mcost, min_mcost is updated as mcost, and the optimal motion vector and motion compensation mode are updated as the current candidate motion vector and MC mode in step 440. Steps 410-445 are repeated until the condition in step 445 is satisfied. The optimal motion vector and motion compensation mode for the current macroblock partition are thereby obtained.
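The candidate loop of Figure 4 can be sketched as follows, assuming a motion_cost oracle that returns the cost of a candidate vector under a given mode; all names are illustrative stand-ins for the steps in the flowchart:

```python
# Sketch of the Figure 4 loop: for each candidate vector, compare the
# BMC cost against the TM_BMC cost, keep the cheaper mode, and track
# the overall minimum across candidates.

def estimate_partition(candidates, motion_cost):
    min_mcost = float("inf")          # "LARGEST" in the text
    best_mv, best_mode = None, None
    for mv in candidates:
        mcost, mode = motion_cost(mv, "BMC"), "BMC"      # steps 410-415
        tm_cost = motion_cost(mv, "TM_BMC")              # step 420
        if tm_cost < mcost:                              # steps 425-430
            mcost, mode = tm_cost, "TM_BMC"
        if mcost < min_mcost:                            # steps 435-440
            min_mcost, best_mv, best_mode = mcost, mv, mode
    return best_mv, best_mode, min_mcost
```

The loop therefore switches between BMC and TM_BMC per candidate vector, which is the locally switched behaviour the embodiment relies on.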
  • The motion cost of a macroblock partition with motion vector MV and motion compensation mode MC_MODE can be calculated, for example, as mcost = Dm + LAMBDAm × (MVr + MC_MODEr), wherein:
  • Dm represents the difference between the motion compensated prediction values and the raw values of the current macroblock partition
  • LAMBDAm represents a trade-off between Dm and the bits for coding the motion vector and the motion compensation mode
  • MV r represents the bits for coding the motion vector
  • MC_MODE r represents the bits for coding the motion compensation mode.
  • a difference measurement D m corresponds to the overall degree of difference between motion compensated prediction values and raw values of current macroblock partition based, for example, on mean square error (MSE) or mean absolute error (MAE).
  • A difference measurement of MAE often has the following form:
  • MAE = (1/(x·y)) · Σ | Sxy − Sprexy |, the sum being taken over all pixels of the partition, wherein:
  • x is the number of rows in a macroblock partition
  • y is the number of columns in a macroblock partition
  • Sxy is a pixel in the xth row and yth column of the macroblock partition in current frame
  • Spre xy is a pixel in the xth row and yth column of the macroblock partition motion compensated prediction.
  • MSE = (1/(x·y)) · Σ ( Sxy − Sprexy )², the sum being taken over all pixels of the partition.
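The MAE and MSE measures above translate directly to Python; the partition and its prediction are assumed to be equal-sized 2-D lists of pixel values:

```python
# Direct transcription of the MAE/MSE difference measures: average
# absolute (or squared) pixel difference between the partition and its
# motion compensated prediction.

def mae(block, pred):
    n = sum(len(row) for row in block)
    return sum(abs(a - b)
               for brow, prow in zip(block, pred)
               for a, b in zip(brow, prow)) / n

def mse(block, pred):
    n = sum(len(row) for row in block)
    return sum((a - b) ** 2
               for brow, prow in zip(block, pred)
               for a, b in zip(brow, prow)) / n
```

MAE is cheaper to compute (no multiplications), which is why motion estimation loops often prefer it over MSE.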
  • the motion vector of TM_BMC mode is coded as a motion vector difference between the motion vector and the motion vector predictor.
  • the motion vector predictor is derived from template matching process.
  • the template matching process is shown in Figure 5.
  • the template of current block 505 is the grey block 500.
  • The motion vector 520 is derived by motion estimation of the grey block 500 in its reference picture: grey block 510 is the result of shifting grey block 500 by motion vector 520 in the reference picture.
  • Block 515 is then used as the template matching predictor of block 505.
  • The motion compensated prediction value Sprexy is calculated by combining the block motion compensation predictor and the template matching predictor, for example by averaging them: Sprexy = (Predtmxy + Predbmcxy) / 2.
  • Predtmxy is the template matching predictor obtained by the template matching process (the pixel value in block 515 in Figure 5), and Predbmcxy is the block motion compensation predictor.
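Assuming the combination is a pixel-wise average of the two predictors (as the abstract's "averaging" step describes), the TM_BMC prediction can be sketched as follows; rounding to integers is an added assumption for 8-bit pixel values:

```python
# Sketch of the TM_BMC combined predictor: pixel-wise average of the
# template matching predictor and the block motion compensation
# predictor, with round-half-up integer rounding (an assumption).

def combine_predictors(pred_tm, pred_bmc):
    return [[(t + b + 1) // 2 for t, b in zip(trow, brow)]
            for trow, brow in zip(pred_tm, pred_bmc)]
```

Averaging two imperfect predictors tends to cancel their independent errors, which is the source of the prediction-efficiency gain claimed for TM_BMC.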
  • The syntax MC mode 615 is added to the bit-stream; the MC mode is a 1 bit long variable (either BMC mode or TM_BMC mode) and can be coded by an entropy coding scheme such as CAVLC or CABAC.
  • the macroblock type information (containing macroblock partition mode information) is decoded in step 700.
  • The motion compensation mode of a macroblock partition is decoded in step 705.
  • If the motion compensation mode is BMC mode in step 710, then BMC motion compensation is performed on the current macroblock partition in order to obtain the motion compensated prediction values in step 725.
  • Otherwise, if the motion compensation mode is TM_BMC mode in step 710, then the template matching process (shown in Figure 5) is performed to obtain the template matching predictor in step 715.
  • the other information obtained from template matching process is the motion vector of the template.
  • the motion vector obtained is then used as the predictor of the motion vector of block motion compensation.
  • the motion vector of block motion compensation is calculated by adding the predictor to the motion vector residue decoded from the bit-stream.
  • TM_BMC motion compensation is performed on the current macroblock partition, by combining the template matching predictor and the block motion compensation predictor, to get the motion compensated prediction values in step 720.
  • The motion compensated prediction values obtained from step 720 or 725 are added to the reconstructed residues to get a reconstructed macroblock (and deblocking is performed on block edges when necessary) in step 730. Steps 705-730 are looped until all macroblock partitions in the current macroblock have been decoded.
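The motion-vector recovery inside steps 705-725 can be sketched as follows; the mode strings and tuple representation are illustrative stand-ins for the decoded bit-stream elements:

```python
# Sketch of decoder-side motion vector recovery: in TM_BMC mode the
# predictor comes from template matching and the coded residue is
# added; in BMC mode the coded vector is used directly (differential
# coding against neighbours is omitted here for brevity).

def decode_motion_vector(mode, mv_residue, template_mv):
    if mode == "BMC":
        return mv_residue
    # TM_BMC: predictor derived from template matching plus residue.
    return (template_mv[0] + mv_residue[0],
            template_mv[1] + mv_residue[1])
```

Because the template matching predictor is re-derived at the decoder from already-decoded pixels, only the small residue needs to be transmitted, which is the bit-rate saving the embodiments claim.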
  • Embodiments of the invention are based on the idea of encoding a video sequence by locally switching between block motion compensation (BMC) and combining template matching predictor and block motion compensation predictor (TM_BMC).
  • BMC block motion compensation
  • TM_BMC block motion compensation predictor
  • An advantage of the invention is that BMC and TM_BMC mode can be switched locally. Consequently, the motion compensated prediction efficiency of pixels near block edge can be improved by TM_BMC, the over-smoothing affect of TM_BMC can be eliminated by block motion compensation (BMC), the coding efficiency of video sequence can be improved.
  • a further advantage of the invention is that only the motion vector the BMC is coded when TM_BMC mode is selected, and the motion vector predictor of the BMC is derived from the template matching, so the bit-rate of motion vector is saved.
  • Embodiments of the present invention can find applications in the next generation video coding standard, for example AVS, H.265 and MPEG HVC etc.


Abstract

A method of decoding a stream of video frames each video frame being partitioned into one or more blocks, the method comprising: receiving a plurality of successive video frames, wherein each frame has a motion vector indicator; selecting for a block in a current frame a decoded neighbouring zone of the respective block; deriving a first motion vector by defining the relative movement between the position of the decoded neighbouring zone in a reference frame and the position of the neighbouring zone in the current frame; deriving a second motion vector using the motion vector indicator and the first motion vector; and averaging a first block predicted by the first motion vector and a second block predicted by a second motion vector for reconstruction of the current frame. The invention also relates to a decoder, and a corresponding method of encoding and an encoder.

Description

METHOD OF AND APPARATUS FOR ENCODING VIDEO FRAMES, METHOD OF AND APPARATUS FOR DECODING VIDEO FRAMES
Field of the Invention
The present invention relates in general to methods and apparatus for encoding and/or decoding video frames.
Background of the Invention
Video data comprises a sequence of distinct images, or frames, that are shown one after the other in such rapid succession as to give the illusion of smooth movement. Video compression algorithms attempt to reduce the amount of data required to transmit or store video data by exploiting the fact that there is very little change from one frame to the next. A compression algorithm compares a small area of the new frame to be transmitted with the same area in the preceding frame. If the selected area is similar to the corresponding area in the preceding frame then there is no need to transmit a second copy. The compression algorithm merely notes that the two areas are the same. When the new frame is reconstructed by the decompression algorithm, it copies the area from the preceding frame into the new frame. This reuse of data results in significant data compression. Typically the compared areas in the new frame are non-overlapping squares of 16x16 pixels, known as macroblocks. Compression algorithms do not require that the macroblocks be identical, merely similar. Adjusting the degree of similarity required affects both the degree of compression and the quality of the resulting video.
A simple macroblock replacement scheme typically compresses only those portions of the screen that do not move. However when the block in a new frame and a previous frame are not similar, it is often the case that the information contained in the new frame's block is, in fact, available in the preceding frame - but at a different location. Consider, for example, a car moving across the screen. Even though it has moved from one frame to the next, much of the data required for the latest frame is still available in the previous frame, just not in the same place. Motion compensation techniques attempt to achieve greater compression by reusing portions from the previous frame to construct the new frame, even if the portions have moved location within the frame compared to the preceding frame.
In video compression, motion compensation describes a frame in terms of where each section of that frame came from, in a previous frame. Subsequent frames in a video sequence are often very similar, thus containing a lot of redundancy. Removing this redundancy helps achieve the goal of better compression ratios.
A first approach to reducing the redundancy is to simply subtract a reference frame from a given frame. The resulting difference is then referred to as the residual and usually contains less energy (or information) than the original frame. The residual can be encoded at a lower bit-rate with the same quality. The decoder can reconstruct the original frame by adding the reference frame again.
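The subtract-and-add scheme described above can be sketched in a few lines. The following is a minimal illustration (using NumPy arrays as frames; the helper names are invented for illustration):

```python
import numpy as np

def encode_residual(frame, reference):
    # Subtract the reference frame; a signed type avoids uint8 wrap-around.
    return frame.astype(np.int16) - reference.astype(np.int16)

def decode_frame(residual, reference):
    # The decoder adds the reference frame back to recover the original.
    return (residual + reference.astype(np.int16)).astype(np.uint8)

reference = np.full((4, 4), 100, dtype=np.uint8)
frame = reference.copy()
frame[1, 1] = 120  # only one pixel changes between frames
residual = encode_residual(frame, reference)  # mostly zeros: little energy left
```

Because the residual is mostly zeros, it can be encoded at a much lower bit-rate than the frame itself.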
A more sophisticated approach to reducing data redundancy is to approximate the motion of the whole scene and the objects of a video sequence. The motion is described by parameters that are encoded in the bit-stream. The pixels of the predicted frame are approximated by appropriately translated pixels of the reference frame. This gives much better residuals than a simple subtraction. However, the bit-rate occupied by the parameters of the motion model should not become too high.
Usually, the frames of the video sequence are processed in groups. One frame (usually the first) is encoded without motion compensation just as a normal image. This frame is called an I-frame (intra-coded frame, in MPEG terminology) or I-picture. The other frames are called P-frames or P-pictures and are predicted from the I-frame or P-frame that comes (temporally) immediately before them. A prediction scheme is, for instance, described as IPPP..., meaning that a group consists of one I-frame followed by three P-frames.
Frames can also be predicted from future frames. The future frames then need to be encoded before the predicted frames and thus, the encoding order does not necessarily match the real temporal frame order. Such frames are usually predicted from two directions, i.e. from the I- or P-frames that immediately precede or follow the predicted frame. These bi-directionally predicted frames are called B-frames. A coding scheme could, for instance, be IBBPBBPBB.... In block motion compensation (BMC), the frames are partitioned in blocks of pixels (e.g. macroblocks of 16x16 pixels in MPEG). Each block is predicted from a block of equal size in the reference frame. The blocks are not transformed in any way apart from being shifted to the position of the predicted block. This shift is represented by a motion vector.
Motion vectors are parameters of the BMC motion model and should be encoded into the bit-stream. As the motion vectors are not always independent (e.g. if two neighbouring blocks belong to the same moving object), they are usually encoded differentially to save bit-rate. This means that the difference between the motion vector and the neighbouring motion vector(s) encoded before is encoded. (The result of this differencing process is mathematically equivalent to a global motion compensation capable of panning.) An entropy codec can exploit the resulting statistical distribution of the motion vectors (around the zero vector).
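A toy sketch of this differential coding follows (the helper names are illustrative, not taken from the patent):

```python
def encode_mvd(mv, predictor):
    # Code only the difference from the previously coded neighbouring vector.
    return (mv[0] - predictor[0], mv[1] - predictor[1])

def decode_mv(mvd, predictor):
    # The decoder reverses the differencing by adding the predictor back.
    return (mvd[0] + predictor[0], mvd[1] + predictor[1])

neighbour_mv = (5, -2)  # already coded for an adjacent block of the same object
current_mv = (6, -2)    # nearly identical motion
mvd = encode_mvd(current_mv, neighbour_mv)  # small values cluster near zero
```

The small differences cluster around the zero vector, which an entropy coder can exploit.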
It is possible to shift blocks by non-integer vectors, which is known as sub-pixel precision. This is done by interpolating the pixel values. Usually, the precision of the motion vectors is increased by one bit: half-pixel precision. The computational expense of sub-pixel precision is much higher, due to the interpolation required.
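Half-pixel positions are typically produced by averaging the surrounding integer-position pixels. A rough bilinear sketch, assuming no particular standard's interpolation filter:

```python
import numpy as np

def half_pel_sample(ref, y2, x2):
    # Sample at half-pel coordinates (y2/2, x2/2) by averaging the
    # (up to four) surrounding integer-position pixels.
    y, x = y2 // 2, x2 // 2
    fy, fx = y2 % 2, x2 % 2  # 1 when the coordinate falls between pixels
    patch = ref[y:y + 1 + fy, x:x + 1 + fx].astype(np.int32)
    return int(round(patch.mean()))

ref = np.array([[10, 20],
                [30, 40]], dtype=np.uint8)
```

Real codecs use longer filter taps, but the cost of the extra arithmetic per sample is the computational expense referred to above.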
The main disadvantage of block motion compensation is that it introduces discontinuities at the block borders (blocking artifacts). These artifacts appear in the form of sharp horizontal and vertical edges which are easily spotted by the human eye and produce ringing effects (large coefficients in high-frequency sub-bands) in the Fourier-related transform used for transform coding of the residual frames.
Another disadvantage of BMC is that there is at most one predictor for each block, so the motion compensation efficiency is constrained by motion vector accuracy. The bi-directionally predicted block in B-frames is an example of using two predictors for one block to improve the motion compensation efficiency.

Summary of the Invention
The present invention has been devised with the foregoing in mind. To better address one or more of the foregoing concerns, in its most general form the invention provides the methods and apparatus set out in the following aspects.
Accordingly, a first aspect of the invention provides a method of decoding a stream of video frames each video frame being partitioned into one or more blocks, the method comprising: receiving a plurality of successive video frames, wherein each frame has a motion vector indicator; selecting for a block in a current frame a neighbouring zone of the respective block; deriving a first motion vector by defining the relative movement between the position of the neighbouring zone in a reference frame and the position of the neighbouring zone in the current frame; deriving a second motion vector using the motion vector indicator and the first motion vector; and averaging a first block predicted by the first motion vector and a second block predicted by a second motion vector to provide a combined block for reconstruction of the current frame.
A second aspect of the invention provides a decoder for decoding a stream of video frames each video frame being partitioned into one or more blocks, the decoder comprising: a receiver for receiving a plurality of successive video frames, wherein each frame has a motion vector indicator; a selector for selecting for a block in a current frame a decoded neighbouring zone of the respective block; a processor configured to derive a first motion vector by defining the relative movement between the position of the decoded neighbouring zone in a reference frame and the position of the neighbouring zone in the current frame; derive a second motion vector using the motion vector indicator and the first motion vector; and average a first block predicted by the first motion vector and a second block predicted by a second motion vector for reconstruction of the current frame.
A third aspect of the invention provides a method of encoding a stream of video frames each frame being partitioned into one or more blocks, the method comprising: performing motion estimation on a block to determine a motion vector and a motion compensation mode for motion compensation; performing motion compensation on the block according to the determined motion vector and the determined motion compensation mode to provide predicted block values; determining residue data between the predicted block values and the raw block, quantising the residue data; and mapping the quantized residue data, a motion vector indicator indicating the motion vector and a motion compensation mode indicator indicating the corresponding motion compensation mode onto a bit stream.
A fourth aspect of the invention provides an encoder for encoding a stream of video frames each frame being partitioned into one or more blocks, the encoder comprising: a motion estimator for determining a motion vector and a motion compensation mode for motion compensation of a block; a motion compensator for performing motion compensation on the block according to the determined motion vector and the determined motion compensation mode to provide predicted block values; a subtractor for determining residue data between the predicted block values and the raw block, a quantizer for quantising the residue data; and a mapper for mapping the quantized residue data, a motion vector indicator indicating the motion vector and a motion compensation mode indicator indicating the corresponding motion compensation mode onto a bit stream.
In embodiments of the invention:
■ each frame may comprise a motion compensation mode indicator, the mode indicator determining whether the second block is used as the predicted block or the combined block is used as the predicted block.
■ the motion vector of the motion compensation mode of the combined block may be coded as a motion vector difference between the motion vector and the motion vector predictor, the motion vector predictor being derived from a template matching process.
■ in an embodiment of the invention, the second motion vector may correspond to a motion vector of a BMC mode, the first motion vector may be a template matching motion vector and the motion vector indicator may be a motion vector residue decoded from the bitstream.
■ the step of performing motion estimation on a block, in the method of encoding, to determine a motion vector and a motion compensation mode for motion compensation may include:
partitioning the block into partitions according to a plurality of partition modes;
obtaining a motion vector and motion compensation mode for each partition mode; and
selecting a partition mode from the plurality of partition modes by applying a rate distortion optimisation technique wherein the reconstructed block distortion is weighed against the bit rate needed to transmit the coding mode and the residue data.
■ the motion compensation mode may be selected from a block motion compensation mode (BMC) or a combined template matching motion compensation mode (TM_BMC).
■ for each motion vector of a partition mode, the motion cost of the motion vector in block motion compensation mode may be calculated and compared with the motion cost of the motion vector in a combined template matching motion compensation mode.
■ the motion vector of the TM_BMC may be coded as a motion vector difference between the motion vector and the motion vector predictor, the motion vector predictor being derived from a template matching process.
■ each frame may comprise a motion compensation mode indicator, and the processor is configured to determine from the mode indicator whether the second block is used as the predicted block or the combined block is used as the predicted block.
■ the motion estimator of the encoder may include:
a block partitioner for partitioning the block into partitions according to a plurality of partition modes; the motion estimator being further configured to obtain a motion vector and motion compensation mode for each partition mode; and
select a partition mode from the plurality of partition modes by applying a rate distortion optimisation technique wherein the reconstructed block distortion is weighed against the bit rate needed to transmit the coding mode and the residue data.
■ the motion estimator may be configured to select the motion compensation mode from a block motion compensation mode (BMC) or a combined template matching motion compensation mode (TM_BMC).
■ the motion estimator may be configured to compare, for each motion vector of a partition mode, the motion cost of the motion vector in block motion compensation mode with the motion cost of the motion vector in a combined template matching motion compensation mode.
■ the encoder may be configured to code the motion vector of the TM_BMC as a motion vector difference between the motion vector and the motion vector predictor, the motion vector predictor being derived from a template matching process.
The methods according to the invention may be computer implemented. The methods may be implemented in software on a programmable apparatus. They may also be implemented solely in hardware or in software, or in a combination thereof.
Since the present invention can be implemented in software, the present invention can be embodied as computer readable code for provision to a programmable apparatus on any suitable carrier medium. A tangible carrier medium may comprise a storage medium such as a floppy disk, a CD-ROM, a hard disk drive, a magnetic tape device or a solid state memory device and the like. A transient carrier medium may include a signal such as an electrical signal, an electronic signal, an optical signal, an acoustic signal, a magnetic signal or an electromagnetic signal, e.g. a microwave or RF signal.

Brief Description of the Drawings
Embodiments of the invention will now be described, by way of example only, and with reference to the following drawings in which:-
Figure 1 schematically illustrates a generally used video coding hierarchical syntax,
Figure 2 schematically illustrates video codec architecture according to an embodiment of the invention,
Figure 3 represents a flowchart of encoding a video sequence according to the embodiment of the invention,
Figure 4 represents a flowchart of motion estimation on a macroblock partition according to the embodiment of the invention,
Figure 5 schematically illustrates a process of template matching on a macroblock partition according to an embodiment of the invention,
Figure 6 schematically illustrates a syntax structure on macroblock level of coded bit-stream according to the embodiment of the invention,
Figure 7 represents a flowchart of a method for decoding a macroblock from the coded bit-stream according to the embodiment of the invention.
Detailed description
Video coding generally comprises the process of compressing (encoding) and decompressing (decoding) a digital video signal. In the description which follows, and in accordance with general usage in the field of the invention, a device that compresses data is referred to as an encoder, a device that decompresses data is referred to as a decoder, and a device that acts as both encoder and decoder will be referred to as a codec. It is common in the field of video coding to use a syntax according to the hierarchical structure illustrated in Figure 1, in which a video sequence VS consists of a plurality of successive pictures P, which hereinafter will be indistinctively referred to as frames. Each frame P can be split into one or several slices SL, each slice SL being defined as a sequence of macroblocks MB. A macroblock MB is defined as the basic unit for encoding and is generally a fixed-size frame partition covering a rectangular area of 16x16 pixels. Each macroblock MB can be further segmented into one or more blocks B of variable block size. Further, hereinafter we will use the notion of macroblock partition to refer to a block of a macroblock for which motion-compensated prediction is applied. An allowable set of macroblock partition modes, i.e. a number of specific ways a macroblock can be partitioned in one or more macroblock partitions MP1 to MP9, typically varies from one coding scheme to another and, for example, a 16x16 macroblock MB may have a mix of 8x8, 4x4, 4x8 and 8x4 macroblock partitions within a single macroblock.
Referring now to Figure 2, a video codec architecture 2 according to an embodiment of the invention is illustrated.
A part of a frame of raw video data is input into a motion estimation module 200, which performs hybrid template matching motion estimation (TM_ME) and traditional block motion estimation (BME) to determine the optimal motion vector and motion compensation mode for motion compensation. The motion compensation mode can be BMC (block motion compensation predictor) or TM_BMC (combining the template matching predictor and the block motion compensation predictor). Motion compensation of the received frame part is performed in module 205 according to the determined motion vector and the determined motion compensation mode in order to obtain prediction values. The residues between the prediction values and the raw frame values are transformed and quantized by a DCT&Q module 210. The results are scanned to one-dimensional values and mapped into the bit-stream by the Entropy coding module 215.
In order to obtain identical prediction values between encoder and decoder, the decoded frame in the decoder can be reconstructed by the encoder. Inverse quantization and inverse transform are performed on the output of the DCT&Q 210 module in the IQ&IDCT module 220 to recover the residue values. Since there is information loss in quantization and inverse quantization, the residue values are not the same as the output from DCT&Q module 210.
Similarly, the prediction values obtained from motion compensation module 205 are added to the residues to reconstruct the raw frame values. Since the quantization error on neighbouring areas is not necessarily identical, and blocking artifacts tend to be introduced between neighbouring areas, the Deblocking module 225 performs de-blocking filtering on the border between neighbouring areas to improve the subjective quality of the reconstructed frame. If the filtered reconstructed frame can act as a reference frame for the following frames, it is stored in the Reference buffer 230.
The functions of the modules in the dashed rectangle of Figure 2 are performed in decoding to reconstruct the raw video from the coded bit-stream.
Figure 3 presents a flowchart outlining the steps of encoding a video sequence of frames according to the embodiment of the invention. Generally, as described above, a frame can be divided into macroblocks, and the macroblock forms a basic processing unit for encoding process.
Step TM_ME&BME 305 is firstly performed on one macroblock of a frame. In this step, the macroblock is further divided into partitions (macroblock partitions) of varying sizes according to a set of macroblock partition modes. Motion estimation is performed on the macroblock independently for each macroblock partition mode. A separate set of motion vectors and a motion compensation mode (either TM_BMC mode or BMC mode) are obtained for each macroblock partition in this step.
In the Partition mode selection step 310 a mode decision process is performed to select a macroblock coding mode (containing macroblock partition mode information) according to the results obtained in step 305. Rate Distortion Optimization (RDO) is a typical mode decision scheme that may be used in the decision process. RDO may use a Lagrangian formulation wherein the reconstructed macroblock distortion is weighed against the bit rate needed to transmit the coding mode and residue information of the macroblock. The optimal macroblock coding mode for the macroblock corresponds to that minimising the Lagrange cost. The Lagrange cost can be calculated, for example, according to the following formula :
COST = D + LAMDA × (MODEr + RESr) (1)
Where D represents distortion between the raw macroblock and the reconstructed macroblock under the target coding mode, LAMDA represents a trade-off coefficient between distortion and the coded bits of the macroblock, MODEr represents the bits needed for coding the macroblock coding mode, and RESr represents the bits needed for coding the residues under the target coding mode. According to a particular embodiment of the invention, the macroblock coding mode contains motion compensation mode (either as TM_BMC or BMC) information in addition to reference frame and motion vectors information to indicate the motion compensation mode to be performed.
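Formula (1) can be illustrated as a small mode-decision sketch. The candidate modes, distortion values, bit counts and LAMDA below are invented for illustration:

```python
def lagrange_cost(distortion, mode_bits, residue_bits, lamda):
    # COST = D + LAMDA * (MODEr + RESr), formula (1)
    return distortion + lamda * (mode_bits + residue_bits)

LAMDA = 4.0  # trade-off coefficient between distortion and rate
candidates = [  # hypothetical macroblock coding modes with measured D and bits
    {"mode": "16x16_BMC", "D": 900.0, "MODEr": 6, "RESr": 120},
    {"mode": "8x8_TM_BMC", "D": 700.0, "MODEr": 14, "RESr": 150},
]
best = min(candidates,
           key=lambda c: lagrange_cost(c["D"], c["MODEr"], c["RESr"], LAMDA))
```

Raising LAMDA would bias the decision toward modes that cost fewer bits; lowering it would favour lower distortion.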
According to the motion compensation mode obtained in step 310, motion compensated prediction is performed on the macroblock in step 315 to get the prediction values. The residues between the raw macroblock and the prediction values are transformed and quantized in step 320.
Steps 305-320 are repeated until a macroblock at the end of a frame is reached in step 325. A macroblock of a new frame is then fed into step 305. The above steps are then repeated until the end of the whole video sequence is reached in step 300.
With reference to Figure 4, a flowchart of motion estimation of a macroblock partition according to the invention is disclosed. As described with reference to Figure 1, a macroblock is divided into macroblock partitions of variable size by a macroblock partition mode. A macroblock partition is the basic unit used for motion estimation. A variable min_mcost representing the minimal motion cost is initialized as LARGEST in step 400, where LARGEST corresponds to the largest value that can be represented by the target computing device. The template matching process is then performed to get the template matching predictor in step 405. The template matching process will be described in more detail later with reference to Figure 5.
For each motion vector candidate of the current macroblock partition, the motion compensation mode is set as BMC in step 410, and the motion cost of the current candidate motion vector with BMC mode is calculated in step 415. The result is stored in the variable mcost. The motion cost of the current candidate motion vector applying the TM_BMC mode is then calculated in step 420, and the result is stored in the variable tm_bmc_mcost. The tm_bmc_mcost and mcost variables are compared in step 425. If tm_bmc_mcost is lower than mcost, mcost is updated as tm_bmc_mcost and the motion compensation mode is set as TM_BMC mode in step 430; otherwise, mcost is unchanged and the motion compensation mode remains BMC mode. The variable mcost is then compared with the variable min_mcost in step 435; if mcost is lower than min_mcost, min_mcost is updated as mcost, and the optimal motion vector and motion compensation mode are updated as the current candidate motion vector and MC mode in step 440. Steps 410-445 are repeated until the condition in step 445 is satisfied. The optimal motion vector and motion compensation mode for the current macroblock partition are thereby obtained.
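The loop of Figure 4 can be sketched as follows; bmc_cost and tm_bmc_cost stand in for the step 415/420 cost calculations (hypothetical callables, not part of the patent):

```python
LARGEST = float("inf")  # step 400: min_mcost starts at the largest value

def best_motion(candidates, bmc_cost, tm_bmc_cost):
    min_mcost = LARGEST
    best_mv, best_mode = None, None
    for mv in candidates:
        mcost, mc_mode = bmc_cost(mv), "BMC"        # steps 410/415
        tm_bmc_mcost = tm_bmc_cost(mv)              # step 420
        if tm_bmc_mcost < mcost:                    # steps 425/430
            mcost, mc_mode = tm_bmc_mcost, "TM_BMC"
        if mcost < min_mcost:                       # steps 435/440
            min_mcost, best_mv, best_mode = mcost, mv, mc_mode
    return best_mv, best_mode, min_mcost

bmc = {(0, 0): 10, (1, 0): 8}   # invented per-candidate costs
tm = {(0, 0): 9, (1, 0): 12}
result = best_motion([(0, 0), (1, 0)], bmc.__getitem__, tm.__getitem__)
```

Note that the mode is chosen per candidate motion vector, so the vector that wins overall carries its own best compensation mode with it.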
The motion cost of a macroblock partition with motion vector MV and motion compensation mode MC_MODE can be calculated, for example, according to the following formula:

COSTm = Dm + LAMDAm × (MVr + MC_MODEr)

where Dm represents the difference between the motion compensated prediction values and the raw values of the current macroblock partition, LAMDAm represents a trade-off coefficient between Dm and the bits for coding the motion vector and the motion compensation mode, MVr represents the bits for coding the motion vector, and MC_MODEr represents the bits for coding the motion compensation mode.
A difference measurement Dm corresponds to the overall degree of difference between the motion compensated prediction values and the raw values of the current macroblock partition based, for example, on mean square error (MSE) or mean absolute error (MAE). As will be appreciated by those skilled in the art, various differencing methods may be used to determine the difference measurement described above. A difference measurement of MAE often has the following form:

MAE = ∑x ∑y | Sxy - Sprexy |
where:
x is the number of rows in a macroblock partition,
y is the number of columns in a macroblock partition,
Sxy is a pixel in the xth row and yth column of the macroblock partition in the current frame,
Sprexy is a pixel in the xth row and yth column of the macroblock partition's motion compensated prediction.
It will be appreciated that the lower the difference indicated by MAE, the more efficient the motion compensated prediction. The other difference measurement, MSE, often has the following form:

MSE = ∑x ∑y (Sxy - Sprexy)²
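Both measures, written in the formulas above as plain sums over the partition, might be computed as follows (a NumPy sketch, not from the patent):

```python
import numpy as np

def mae(raw, pred):
    # Sum of absolute differences |Sxy - Sprexy| over the partition.
    return int(np.abs(raw.astype(np.int32) - pred.astype(np.int32)).sum())

def mse(raw, pred):
    # Sum of squared differences (Sxy - Sprexy)^2 over the partition.
    d = raw.astype(np.int32) - pred.astype(np.int32)
    return int((d * d).sum())

raw = np.array([[10, 12], [14, 16]], dtype=np.uint8)
pred = np.array([[11, 12], [14, 13]], dtype=np.uint8)
```

MAE (often called SAD in this unnormalized form) is cheaper to evaluate, which is why it is the usual choice inside motion estimation loops.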
As an embodiment of this invention, the motion vector of the TM_BMC mode is coded as a motion vector difference between the motion vector and the motion vector predictor. The motion vector predictor is derived from the template matching process.
The template matching process is shown in Figure 5. For example, say the template of the current block 505 is the grey block 500. The motion vector 520 is derived by motion estimation of the grey block 500 in its reference picture: grey block 510 is the result of shifting grey block 500 by motion vector 520 in the reference picture. Block 515 can then be obtained as the template matching predictor of block 505.
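A minimal full-search sketch of the Figure 5 idea follows, assuming a SAD matching criterion and a small integer-pel search window (neither is specified in the text above):

```python
import numpy as np

def template_match(ref, template, tpl_pos, search=4):
    # Find the shift (motion vector 520) whose patch in the reference
    # picture best matches the template (grey block 500) by SAD.
    h, w = template.shape
    y0, x0 = tpl_pos
    best_mv, best_sad = (0, 0), float("inf")
    for dy in range(-search, search + 1):
        for dx in range(-search, search + 1):
            y, x = y0 + dy, x0 + dx
            if y < 0 or x < 0 or y + h > ref.shape[0] or x + w > ref.shape[1]:
                continue
            patch = ref[y:y + h, x:x + w].astype(np.int32)
            sad = np.abs(patch - template.astype(np.int32)).sum()
            if sad < best_sad:
                best_sad, best_mv = sad, (dy, dx)
    return best_mv  # apply the same shift to the current block to get block 515

ref = np.zeros((10, 10), dtype=np.uint8)
ref[3:5, 3:5] = np.array([[1, 2], [3, 4]])  # template content, shifted by (1, 1)
template = np.array([[1, 2], [3, 4]], dtype=np.uint8)
```

Because the template consists of already-decoded pixels, the decoder can repeat this search and derive the same motion vector without it being transmitted.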
As another embodiment of the invention, when TM_BMC mode is selected for motion compensation, the motion compensated prediction value Sprexy is calculated by combining the block motion compensation predictor and the template matching predictor according to the following formula:

Sprexy = (Predbmcxy + Predtmxy) >> 1

where Predbmcxy is the block motion compensation predictor obtained by shifting the block by the said block's motion vector in its reference picture, and Predtmxy is the template matching predictor obtained by the template matching process (the pixel value in block 515 in Figure 5).
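The combination is simply a per-pixel average implemented with a right shift; a brief sketch:

```python
import numpy as np

def tm_bmc_predict(pred_bmc, pred_tm):
    # Sprexy = (Predbmcxy + Predtmxy) >> 1: average of the two predictors.
    return (pred_bmc.astype(np.int32) + pred_tm.astype(np.int32)) >> 1

pred_bmc = np.array([[100, 102], [104, 106]])  # block motion compensation predictor
pred_tm = np.array([[102, 102], [100, 108]])   # template matching predictor
combined = tm_bmc_predict(pred_bmc, pred_tm)
```

The shift floors odd sums, so `(a + b) >> 1` matches integer division by two without any floating-point arithmetic.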
Now referring to Figure 6, a syntax structure at macroblock level of the coded bit-stream according to the invention is disclosed. Compared with the macroblock-level syntax structure of a traditional encoding process, the syntax element MC mode 615 is added to the bit-stream. The MC mode is a 1-bit variable (either BMC mode or TM_BMC mode) and can be coded by either the CAVLC or CABAC entropy coding scheme.
Now referring to Figure 7, a flowchart of a method for decoding a macroblock from the coded bit-stream according to an embodiment of the invention is disclosed. The macroblock type information (containing macroblock partition mode information) is decoded in step 700. The motion compensation mode of a macroblock partition is decoded in step 705. If the motion compensation mode is BMC mode in step 710, then BMC motion compensation is performed on the current macroblock partition in order to obtain the motion compensated prediction values in step 725. Otherwise, if the motion compensation mode is TM_BMC mode in step 710, then the template matching process (shown in Figure 5) is performed to obtain the template matching predictor in step 715. The other information obtained from the template matching process is the motion vector of the template. This motion vector is then used as the predictor of the motion vector of block motion compensation. The motion vector of block motion compensation is calculated by adding the predictor to the motion vector residue decoded from the bit-stream. TM_BMC motion compensation is then performed on the current macroblock partition, by combining the template matching predictor and the block motion compensation predictor, to get the motion compensated prediction values in step 720.
The motion compensated prediction values obtained from Step 720 or 725 are added to the reconstructed residues to get a reconstructed macroblock (and deblocking is performed on block edges when necessary) in step 730. Steps 705- 730 are looped until all macroblock partitions in the current macroblock have been decoded.
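Steps 705-725 for a single partition can be sketched as below; bmc_predict and tm_predict are hypothetical helpers returning a predictor block for a given motion vector:

```python
def decode_partition(mc_mode, decoded_mv, mvd, template_mv, bmc_predict, tm_predict):
    if mc_mode == "BMC":                      # step 710 -> step 725
        return bmc_predict(decoded_mv)
    # TM_BMC (steps 715/720): the template motion vector is the MV predictor;
    # the BMC motion vector is predictor + decoded residue.
    mv = (template_mv[0] + mvd[0], template_mv[1] + mvd[1])
    bmc, tm = bmc_predict(mv), tm_predict(template_mv)
    return [[(a + b) >> 1 for a, b in zip(r1, r2)]  # combine the two predictors
            for r1, r2 in zip(bmc, tm)]

bmc_predict = lambda mv: [[mv[0], mv[1]], [mv[0], mv[1]]]  # toy stand-ins
tm_predict = lambda mv: [[10, 10], [10, 10]]
pred = decode_partition("TM_BMC", None, (1, 0), (2, 3), bmc_predict, tm_predict)
```

The predicted block returned here would then be added to the reconstructed residues, as in step 730.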
Embodiments of the invention are based on the idea of encoding a video sequence by locally switching between block motion compensation (BMC) and combining template matching predictor and block motion compensation predictor (TM_BMC). When the mode of TM_BMC is selected, only the motion vector of BMC is coded, and the motion vector predictor of BMC can then be derived from template matching.
An advantage of the invention is that BMC and TM_BMC modes can be switched locally. Consequently, the motion compensated prediction efficiency of pixels near block edges can be improved by TM_BMC, the over-smoothing effect of TM_BMC can be eliminated by block motion compensation (BMC), and the coding efficiency of the video sequence can be improved.
A further advantage of the invention is that only the motion vector of the BMC is coded when TM_BMC mode is selected, and the motion vector predictor of the BMC is derived from the template matching, so motion vector bit-rate is saved. Embodiments of the present invention can find applications in next generation video coding standards, for example AVS, H.265 and MPEG HVC.
Although the present invention has been described hereinabove with reference to specific embodiments, the present invention is not limited to the specific embodiments, and modifications will be apparent to a skilled person in the art which lie within the scope of the present invention.
In the claims, the word "comprising" does not exclude other elements or steps, and the indefinite article "a" or "an" does not exclude a plurality. The mere fact that different features are recited in mutually different dependent claims does not indicate that a combination of these features cannot be advantageously used. Any reference signs in the claims should not be construed as limiting the scope of the invention.

Claims

1. A method of decoding a stream of video frames, each video frame being partitioned into one or more blocks, the method comprising:
receiving a plurality of successive video frames, wherein each frame has a motion vector indicator;
selecting for a block in a current frame a decoded neighbouring zone of the respective block;
deriving a first motion vector by defining the relative movement between the position of the decoded neighbouring zone in a reference frame and the position of the neighbouring zone in the current frame;
deriving a second motion vector using the motion vector indicator and the first motion vector; and
averaging a first block predicted by the first motion vector and a second block predicted by the second motion vector for reconstruction of the current frame.
2. A method according to claim 1, wherein each frame comprises a motion compensation mode indicator, the mode indicator determining whether the second block is used as the predicted block or the combined block is used as the predicted block.
3. A method according to claim 1 or 2, wherein the motion vector of the motion compensation mode of the combined block is coded as a motion vector difference between the motion vector and the motion vector predictor, the motion vector predictor being derived from a template matching process.
4. A method of encoding a stream of video frames, each frame being partitioned into one or more blocks, the method comprising:
performing motion estimation on a block to determine a motion vector and a motion compensation mode for motion compensation; performing motion compensation on the block according to the determined motion vector and the determined motion compensation mode to provide predicted block values;
determining residue data between the predicted block values and the raw block;
quantising the residue data; and
mapping the quantized residue data, a motion vector indicator indicating the motion vector and a motion compensation mode indicator indicating the corresponding motion compensation mode onto a bit stream.
5. A method according to claim 4, wherein the step of performing motion estimation on a block to determine a motion vector and a motion compensation mode for motion compensation includes
partitioning the block into partitions according to a plurality of partition modes;
obtaining a motion vector and motion compensation mode for each partition mode; and
selecting a partition mode from the plurality of partition modes by applying a rate distortion optimisation technique wherein the reconstructed block distortion is weighed against the bit rate needed to transmit the coding mode and the residue data.
6. A method according to claim 4 or 5, wherein the motion compensation mode is selected from a block motion compensation mode (BMC) or a combined template matching motion compensation mode (TM_BMC).
7. A method according to claim 6, wherein, for each motion vector of a partition mode, the motion cost of the motion vector in block motion compensation mode is calculated and compared with the motion cost of the motion vector in a combined template matching motion compensation mode.
8. A method according to claim 6 or 7, wherein the motion vector of the TM_BMC is coded as a motion vector difference between the motion vector and the motion vector predictor, the motion vector predictor being derived from a template matching process.
9. A decoder for decoding a stream of video frames, each video frame being partitioned into one or more blocks, the decoder comprising:
a receiver for receiving a plurality of successive video frames, wherein each frame has a motion vector indicator;
a selector for selecting for a block in a current frame a decoded neighbouring zone of the respective block;
a processor configured to derive a first motion vector by defining the relative movement between the position of the decoded neighbouring zone in a reference frame and the position of the neighbouring zone in the current frame;
derive a second motion vector using the motion vector indicator and the first motion vector; and
average a first block predicted by the first motion vector and a second block predicted by the second motion vector for reconstruction of the current frame.
10. A decoder according to claim 9, wherein each frame comprises a motion compensation mode indicator, and the processor is configured to determine from the mode indicator whether the second block is used as the predicted block or the combined block is used as the predicted block.
11. A decoder according to claim 9 or 10, wherein the motion vector of the motion compensation mode of the combined block is coded as a motion vector difference between the motion vector and the motion vector predictor, the motion vector predictor being derived from a template matching process.
12. An encoder for encoding a stream of video frames, each frame being partitioned into one or more blocks, the encoder comprising:
a motion estimator for determining a motion vector and a motion compensation mode for motion compensation of a block;
a motion compensator for performing motion compensation on the block according to the determined motion vector and the determined motion compensation mode to provide predicted block values;
a subtractor for determining residue data between the predicted block values and the raw block;
a quantizer for quantising the residue data; and
a mapper for mapping the quantized residue data, a motion vector indicator indicating the motion vector and a motion compensation mode indicator indicating the corresponding motion compensation mode onto a bit stream.
13. An encoder according to claim 12, wherein the motion estimator includes a block partitioner for partitioning the block into partitions according to a plurality of partition modes; the motion estimator being further configured to
obtain a motion vector and motion compensation mode for each partition mode; and
select a partition mode from the plurality of partition modes by applying a rate distortion optimisation technique wherein the reconstructed block distortion is weighed against the bit rate needed to transmit the coding mode and the residue data.
14. An encoder according to claim 12 or 13, wherein the motion estimator is configured to select the motion compensation mode from a block motion compensation mode (BMC) or a combined template matching motion compensation mode (TM_BMC).
15. An encoder according to claim 13 or 14, wherein the motion estimator is configured to compare, for each motion vector of a partition mode, the motion cost of the motion vector in block motion compensation mode with the motion cost of the motion vector in a combined template matching motion compensation mode.
16. An encoder according to claim configured to code the motion vector of the TM_BMC as a motion vector difference between the motion vector and the motion vector predictor, the motion vector predictor being derived from a template matching process.
17. A computer-readable medium having computer-executable instructions to enable a computer system to perform the method of any one of claims 1 to 8.
PCT/IB2010/003424 2009-11-30 2010-11-25 Method of and apparatus for encoding video frames, method of and apparatus for decoding video frames WO2011064673A1 (en)

Applications Claiming Priority (2)

Application Number Priority Date Filing Date Title
CN2009075209 2009-11-30
CNPCT/CN2009/075209 2009-11-30

Publications (1)

Publication Number Publication Date
WO2011064673A1 true WO2011064673A1 (en) 2011-06-03

Family

ID=43769072

Family Applications (1)

Application Number Title Priority Date Filing Date
PCT/IB2010/003424 WO2011064673A1 (en) 2009-11-30 2010-11-25 Method of and apparatus for encoding video frames, method of and apparatus for decoding video frames

Country Status (1)

Country Link
WO (1) WO2011064673A1 (en)

Citations (2)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US20080159401A1 (en) * 2007-01-03 2008-07-03 Samsung Electronics Co., Ltd. Method and apparatus for estimating motion vector using plurality of motion vector predictors, encoder, decoder, and decoding method
CA2722027A1 (en) * 2008-04-24 2009-10-29 Ntt Docomo, Inc. Image prediction encoding device, image prediction encoding method, image prediction encoding program, image prediction decoding device, image prediction decoding method, and image prediction decoding program

Non-Patent Citations (6)

* Cited by examiner, † Cited by third party
Title
KAMP S ET AL: "Multihypothesis prediction using decoder side-motion vector derivation in inter-frame video coding", VISUAL COMMUNICATIONS AND IMAGE PROCESSING; 20-1-2009 - 22-1-2009; SAN JOSE,, 20 January 2009 (2009-01-20), XP030081712 *
KWANGHYUN WON ET AL: "Motion vector coding using decoder-side estimation of motion vector", BROADBAND MULTIMEDIA SYSTEMS AND BROADCASTING, 2009. BMSB '09. IEEE INTERNATIONAL SYMPOSIUM ON, IEEE, PISCATAWAY, NJ, USA, 13 May 2009 (2009-05-13), pages 1 - 4, XP031480195, ISBN: 978-1-4244-2590-7 *
THOMAS WIEGAND ET AL: "Rate-Distortion Optimized Mode Selection for Very Low Bit Rate Video Coding and the Emerging H.263 Standard", IEEE TRANSACTIONS ON CIRCUITS AND SYSTEMS FOR VIDEO TECHNOLOGY, IEEE SERVICE CENTER, PISCATAWAY, NJ, US, vol. 6, no. 2, 1 April 1996 (1996-04-01), XP011014299, ISSN: 1051-8215 *
WEN YANG ET AL: "An efficient motion vector coding algorithm based on adaptive predictor selection", IEEE INTERNATIONAL SYMPOSIUM ON CIRCUITS AND SYSTEMS. ISCAS 2010 - 30 MAY-2 JUNE 2010 - PARIS, FRANCE, IEEE, US, 30 May 2010 (2010-05-30), pages 2175 - 2178, XP031724471, ISBN: 978-1-4244-5308-5 *
YOSHINORI SUZUKI ET AL: "Inter Frame Coding with Template Matching Averaging", IMAGE PROCESSING, 2007. ICIP 2007. IEEE INTERNATIONAL CONFERENCE ON, IEEE, PI, 1 September 2007 (2007-09-01), pages III - 409, XP031158091, ISBN: 978-1-4244-1436-9 *
Y-W CHEN ET AL: "MB Mode with joint application of template and block motion compensations", 2. JCT-VC MEETING; 21-7-2010 - 28-7-2010; GENEVA; (JOINT COLLABORATIVE TEAM ON VIDEO CODING OF ISO/IEC JTC1/SC29/WG11 AND ITU-T SG.16 ); URL: HTTP://WFTP3.ITU.INT/AV-ARCH/JCTVC-SITE/,, 23 July 2010 (2010-07-23), XP030007652 *

Cited By (65)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US11012705B2 (en) 2011-04-12 2021-05-18 Sun Patent Trust Moving picture coding method, moving picture coding apparatus, moving picture decoding method, moving picture decoding apparatus and moving picture coding and decoding apparatus
US10536712B2 (en) 2011-04-12 2020-01-14 Sun Patent Trust Moving picture coding method, moving picture coding apparatus, moving picture decoding method, moving picture decoding apparatus and moving picture coding and decoding apparatus
US10178404B2 (en) 2011-04-12 2019-01-08 Sun Patent Trust Moving picture coding method, moving picture coding apparatus, moving picture decoding method, moving picture decoding apparatus and moving picture coding and decoding apparatus
US11917186B2 (en) 2011-04-12 2024-02-27 Sun Patent Trust Moving picture coding method, moving picture coding apparatus, moving picture decoding method, moving picture decoding apparatus and moving picture coding and decoding apparatus
US11356694B2 (en) 2011-04-12 2022-06-07 Sun Patent Trust Moving picture coding method, moving picture coding apparatus, moving picture decoding method, moving picture decoding apparatus and moving picture coding and decoding apparatus
US9872036B2 (en) 2011-04-12 2018-01-16 Sun Patent Trust Moving picture coding method, moving picture coding apparatus, moving picture decoding method, moving picture decoding apparatus and moving picture coding and decoding apparatus
US9445120B2 (en) 2011-04-12 2016-09-13 Sun Patent Trust Moving picture coding method, moving picture coding apparatus, moving picture decoding method, moving picture decoding apparatus and moving picture coding and decoding apparatus
US10382774B2 (en) 2011-04-12 2019-08-13 Sun Patent Trust Moving picture coding method, moving picture coding apparatus, moving picture decoding method, moving picture decoding apparatus and moving picture coding and decoding apparatus
US10609406B2 (en) 2011-04-12 2020-03-31 Sun Patent Trust Moving picture coding method, moving picture coding apparatus, moving picture decoding method, moving picture decoding apparatus and moving picture coding and decoding apparatus
US10484708B2 (en) 2011-05-24 2019-11-19 Velos Media, Llc Decoding method and apparatuses with candidate motion vectors
US9826249B2 (en) 2011-05-24 2017-11-21 Velos Media, Llc Decoding method and apparatuses with candidate motion vectors
US10129564B2 (en) 2011-05-24 2018-11-13 Velos Media, LCC Decoding method and apparatuses with candidate motion vectors
US11228784B2 (en) 2011-05-24 2022-01-18 Velos Media, Llc Decoding method and apparatuses with candidate motion vectors
US9456217B2 (en) 2011-05-24 2016-09-27 Sun Patent Trust Coding method and apparatus with candidate motion vectors
US10034001B2 (en) 2011-05-27 2018-07-24 Sun Patent Trust Image coding method, image coding apparatus, image decoding method, image decoding apparatus, and image coding and decoding apparatus
US9723322B2 (en) 2011-05-27 2017-08-01 Sun Patent Trust Decoding method and apparatus with candidate motion vectors
US11979582B2 (en) 2011-05-27 2024-05-07 Sun Patent Trust Image coding method, image coding apparatus, image decoding method, image decoding apparatus, and image coding and decoding apparatus
US11076170B2 (en) 2011-05-27 2021-07-27 Sun Patent Trust Coding method and apparatus with candidate motion vectors
US11115664B2 (en) 2011-05-27 2021-09-07 Sun Patent Trust Image coding method, image coding apparatus, image decoding method, image decoding apparatus, and image coding and decoding apparatus
US9883199B2 (en) 2011-05-27 2018-01-30 Sun Patent Trust Coding method and apparatus with candidate motion vectors
US10200714B2 (en) 2011-05-27 2019-02-05 Sun Patent Trust Decoding method and apparatus with candidate motion vectors
US11895324B2 (en) 2011-05-27 2024-02-06 Sun Patent Trust Coding method and apparatus with candidate motion vectors
US10212450B2 (en) 2011-05-27 2019-02-19 Sun Patent Trust Coding method and apparatus with candidate motion vectors
US11575930B2 (en) 2011-05-27 2023-02-07 Sun Patent Trust Coding method and apparatus with candidate motion vectors
US9838695B2 (en) 2011-05-27 2017-12-05 Sun Patent Trust Image coding method, image coding apparatus, image decoding method, image decoding apparatus, and image coding and decoding apparatus
US9615107B2 (en) 2011-05-27 2017-04-04 Sun Patent Trust Image coding method, image coding apparatus, image decoding method, image decoding apparatus, and image coding and decoding apparatus
US11570444B2 (en) 2011-05-27 2023-01-31 Sun Patent Trust Image coding method, image coding apparatus, image decoding method, image decoding apparatus, and image coding and decoding apparatus
US10721474B2 (en) 2011-05-27 2020-07-21 Sun Patent Trust Image coding method, image coding apparatus, image decoding method, image decoding apparatus, and image coding and decoding apparatus
US10708598B2 (en) 2011-05-27 2020-07-07 Sun Patent Trust Image coding method, image coding apparatus, image decoding method, image decoding apparatus, and image coding and decoding apparatus
US10595023B2 (en) 2011-05-27 2020-03-17 Sun Patent Trust Image coding method, image coding apparatus, image decoding method, image decoding apparatus, and image coding and decoding apparatus
US11368710B2 (en) 2011-05-31 2022-06-21 Velos Media, Llc Image decoding method and image decoding apparatus using candidate motion vectors
US11509928B2 (en) 2011-05-31 2022-11-22 Sun Patent Trust Derivation method and apparatuses with candidate motion vectors
US10645413B2 (en) 2011-05-31 2020-05-05 Sun Patent Trust Derivation method and apparatuses with candidate motion vectors
US10652573B2 (en) 2011-05-31 2020-05-12 Sun Patent Trust Video encoding method, video encoding device, video decoding method, video decoding device, and video encoding/decoding device
US9900613B2 (en) 2011-05-31 2018-02-20 Sun Patent Trust Image coding and decoding system using candidate motion vectors
US9819961B2 (en) 2011-05-31 2017-11-14 Sun Patent Trust Decoding method and apparatuses with candidate motion vectors
US11057639B2 (en) 2011-05-31 2021-07-06 Sun Patent Trust Derivation method and apparatuses with candidate motion vectors
US9560373B2 (en) 2011-05-31 2017-01-31 Sun Patent Trust Image coding method and apparatus with candidate motion vectors
US9609356B2 (en) 2011-05-31 2017-03-28 Sun Patent Trust Moving picture coding method and apparatus with candidate motion vectors
US10951911B2 (en) 2011-05-31 2021-03-16 Velos Media, Llc Image decoding method and image decoding apparatus using candidate motion vectors
US10412404B2 (en) 2011-05-31 2019-09-10 Velos Media, Llc Image decoding method and image decoding apparatus using candidate motion vectors
US11917192B2 (en) 2011-05-31 2024-02-27 Sun Patent Trust Derivation method and apparatuses with candidate motion vectors
US11949903B2 (en) 2011-05-31 2024-04-02 Sun Patent Trust Image decoding method and image decoding apparatus using candidate motion vectors
US10531091B2 (en) 2011-06-30 2020-01-07 Velos Media, Llc Context initialization based on slice header flag and slice type
US11647197B2 (en) 2011-06-30 2023-05-09 Velos Media, Llc Context initialization based on slice header flag and slice type
RU2597523C2 (en) * 2011-06-30 2016-09-10 Шарп Кабусики Кайся Context initialisation based on decoder picture buffer
US11973950B2 (en) 2011-06-30 2024-04-30 Velos Media, Llc Context initialization based on slice header flag and slice type
US10931951B2 (en) 2011-06-30 2021-02-23 Velos Media, Llc Context initialization based on slice header flag and slice type
US10887585B2 (en) 2011-06-30 2021-01-05 Sun Patent Trust Image decoding method, image coding method, image decoding apparatus, image coding apparatus, and image coding and decoding apparatus
RU2645270C1 (en) * 2011-06-30 2018-02-19 Вилос Медиа Интернэшнл Лимитед Context initialization based on buffer of decoder pictures
US11412226B2 (en) 2011-06-30 2022-08-09 Velos Media, Llc Context initialization based on slice header flag and slice type
US10205948B2 (en) 2011-06-30 2019-02-12 Velos Media, Llc Context initialization based on slice header flag and slice type
US9456214B2 (en) 2011-08-03 2016-09-27 Sun Patent Trust Moving picture coding method, moving picture coding apparatus, moving picture decoding method, moving picture decoding apparatus, and moving picture coding and decoding apparatus
US10440387B2 (en) 2011-08-03 2019-10-08 Sun Patent Trust Video encoding method, video encoding apparatus, video decoding method, video decoding apparatus, and video encoding/decoding apparatus
US10284872B2 (en) 2011-08-03 2019-05-07 Sun Patent Trust Video encoding method, video encoding apparatus, video decoding method, video decoding apparatus, and video encoding/decoding apparatus
US11553202B2 (en) 2011-08-03 2023-01-10 Sun Patent Trust Video encoding method, video encoding apparatus, video decoding method, video decoding apparatus, and video encoding/decoding apparatus
US10129561B2 (en) 2011-08-03 2018-11-13 Sun Patent Trust Video encoding method, video encoding apparatus, video decoding method, video decoding apparatus, and video encoding/decoding apparatus
US11979598B2 (en) 2011-08-03 2024-05-07 Sun Patent Trust Video encoding method, video encoding apparatus, video decoding method, video decoding apparatus, and video encoding/decoding apparatus
US11647208B2 (en) 2011-10-19 2023-05-09 Sun Patent Trust Picture coding method, picture coding apparatus, picture decoding method, and picture decoding apparatus
US11218708B2 (en) 2011-10-19 2022-01-04 Sun Patent Trust Picture decoding method for decoding using a merging candidate selected from a first merging candidate derived using a first derivation process and a second merging candidate derived using a second derivation process
US11595653B2 (en) 2012-05-14 2023-02-28 V-Nova International Limited Processing of motion information in multidimensional signals through motion zones and auxiliary information through auxiliary zones
US9706206B2 (en) 2012-05-14 2017-07-11 V-Nova International Limited Estimation, encoding and decoding of motion information in multidimensional signals through motion zones, and auxiliary information through auxiliary zones
US10750178B2 (en) 2012-05-14 2020-08-18 V-Nova International Limited Processing of motion information in multidimensional signals through motion zones and auxiliary information through auxiliary zones
CN112997499A (en) * 2018-09-14 2021-06-18 皇家Kpn公司 Video coding based on globally motion compensated motion vector predictors
CN112970256A (en) * 2018-09-14 2021-06-15 皇家Kpn公司 Video coding based on globally motion compensated motion vectors

Similar Documents

Publication Publication Date Title
JP6863669B2 (en) Image coding device, image coding method, image decoding device and image decoding method
WO2011064673A1 (en) Method of and apparatus for encoding video frames, method of and apparatus for decoding video frames
US8023562B2 (en) Real-time video coding/decoding
US8681873B2 (en) Data compression for video
US9078009B2 (en) Data compression for video utilizing non-translational motion information
KR101565228B1 (en) Image encoding apparatus, image decoding apparatus, image encoding method, image decoding method, and image prediction device
US20080126278A1 (en) Parallel processing motion estimation for H.264 video codec
US20140241424A1 (en) Apparatus of decoding video data
KR20120042910A (en) Template matching for video coding
TW201444350A (en) Square block prediction
CN113727108B (en) Video decoding method, video encoding method and related equipment
WO2012081162A1 (en) Moving image encoding device, moving image decoding device, moving image encoding method and moving image decoding method
US20120218432A1 (en) Recursive adaptive intra smoothing for video coding
US20070133689A1 (en) Low-cost motion estimation apparatus and method thereof
US9438925B2 (en) Video encoder with block merging and methods for use therewith
WO2023048646A9 (en) Methods and systems for performing combined inter and intra prediction
JP2014090327A (en) Moving image encoder, moving image decoder, moving image encoding method and moving image decoding method
JP2014090326A (en) Moving image encoder, moving image decoder, moving image encoding method and moving image decoding method
US20130170565A1 (en) Motion Estimation Complexity Reduction
JP2013098715A (en) Moving image encoder, moving image decoder, moving image encoding method and moving image decoding method
KR20070061214A (en) Low cost motion estimation device and motion estimation method
Paul et al. Video coding using arbitrarily shaped block partitions in globally optimal perspective
Vermeirsch et al. Efficient adaptive-shape partitioning of video
KR101037070B1 (en) Fast Motion Estimation Method Using Full Search
Wu et al. A real-time H. 264 video streaming system on DSP/PC platform

Legal Events

Date Code Title Description
121 Ep: the epo has been informed by wipo that ep was designated in this application

Ref document number: 10814697

Country of ref document: EP

Kind code of ref document: A1

NENP Non-entry into the national phase

Ref country code: DE

122 Ep: pct application non-entry in european phase

Ref document number: 10814697

Country of ref document: EP

Kind code of ref document: A1