WO2009028780A1 - Method and apparatus for estimating and compensating spatiotemporal motion of image - Google Patents

Method and apparatus for estimating and compensating spatiotemporal motion of image

Info

Publication number
WO2009028780A1
Authority
WO
WIPO (PCT)
Prior art keywords
block
region
estimation
current
spatiotemporal
Prior art date
Application number
PCT/KR2008/002274
Other languages
French (fr)
Inventor
Il-Koo Kim
Woo-Jin Han
Original Assignee
Samsung Electronics Co., Ltd.
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Samsung Electronics Co., Ltd. filed Critical Samsung Electronics Co., Ltd.
Priority to CN2008801050937A priority Critical patent/CN101796844B/en
Priority to EP08741515A priority patent/EP2186341A4/en
Publication of WO2009028780A1 publication Critical patent/WO2009028780A1/en

Classifications

    • H ELECTRICITY
    • H04 ELECTRIC COMMUNICATION TECHNIQUE
    • H04N PICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N19/00 Methods or arrangements for coding, decoding, compressing or decompressing digital video signals
    • H04N19/10 Methods or arrangements for coding, decoding, compressing or decompressing digital video signals using adaptive coding
    • H04N19/102 Methods or arrangements for coding, decoding, compressing or decompressing digital video signals using adaptive coding characterised by the element, parameter or selection affected or controlled by the adaptive coding
    • H04N19/103 Selection of coding mode or of prediction mode
    • H04N19/105 Selection of the reference unit for prediction within a chosen coding or prediction mode, e.g. adaptive choice of position and number of pixels used for prediction
    • H ELECTRICITY
    • H04 ELECTRIC COMMUNICATION TECHNIQUE
    • H04N PICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N19/00 Methods or arrangements for coding, decoding, compressing or decompressing digital video signals
    • H04N19/50 Methods or arrangements for coding, decoding, compressing or decompressing digital video signals using predictive coding
    • H04N19/503 Methods or arrangements for coding, decoding, compressing or decompressing digital video signals using predictive coding involving temporal prediction
    • H04N19/51 Motion estimation or motion compensation
    • H04N19/55 Motion estimation with spatial constraints, e.g. at image or region borders
    • H ELECTRICITY
    • H04 ELECTRIC COMMUNICATION TECHNIQUE
    • H04N PICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N19/00 Methods or arrangements for coding, decoding, compressing or decompressing digital video signals
    • H04N19/50 Methods or arrangements for coding, decoding, compressing or decompressing digital video signals using predictive coding
    • H04N19/503 Methods or arrangements for coding, decoding, compressing or decompressing digital video signals using predictive coding involving temporal prediction
    • H04N19/51 Motion estimation or motion compensation
    • H ELECTRICITY
    • H04 ELECTRIC COMMUNICATION TECHNIQUE
    • H04N PICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N19/00 Methods or arrangements for coding, decoding, compressing or decompressing digital video signals
    • H04N19/50 Methods or arrangements for coding, decoding, compressing or decompressing digital video signals using predictive coding
    • H04N19/503 Methods or arrangements for coding, decoding, compressing or decompressing digital video signals using predictive coding involving temporal prediction
    • H04N19/51 Motion estimation or motion compensation
    • H04N19/563 Motion estimation with padding, i.e. with filling of non-object values in an arbitrarily shaped picture block or region for estimation purposes
    • H ELECTRICITY
    • H04 ELECTRIC COMMUNICATION TECHNIQUE
    • H04N PICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N19/00 Methods or arrangements for coding, decoding, compressing or decompressing digital video signals
    • H04N19/50 Methods or arrangements for coding, decoding, compressing or decompressing digital video signals using predictive coding
    • H04N19/593 Methods or arrangements for coding, decoding, compressing or decompressing digital video signals using predictive coding involving spatial prediction techniques

Definitions

  • the present invention relates to a method of estimating and compensating for motion in image encoding and decoding, and more particularly, to a method of estimating and compensating for motion when a current block refers to a region that is outside a reference frame by using a motion vector in image encoding and decoding.
  • Video codecs such as H.263, Moving Picture Experts Group 2 (MPEG 2), MPEG 4, H.264/Advanced Video Coding (AVC), etc. use correlation between frames, perform a method of estimating and compensating for motion, and thereby increase compression efficiency, even when motion of an object appears in an image.
  • the method of estimating and compensating for motion refers to a reference block of a reference frame to reconstruct a block of a current frame.
  • a part of the reference block determined by a motion vector may be outside the reference frame.
  • the conventional video codecs such as H.263, MPEG 2, MPEG 4, H.264/AVC, etc. extend an outside region of the reference frame by using boundary pixels of the reference frame, and thereby estimate the part of the reference block which is outside the reference frame.
  • FIG. 1 is a diagram of a conventional method of estimating and compensating for motion when a reference block is outside a reference frame.
  • a current block 120 of a current frame 110 refers to a reference block 160 of a reference frame 150 by a motion vector.
  • a region 170 included in the reference frame 150 is inserted into an equivalent region 130 of the current block 120.
  • an equivalent region 140 of the current block 120 referring to the part 180 that is outside the reference frame 150 is padded with pixel values of boundary pixels 190 of the reference frame 150. This conventional technique is known as a padding method.
  • the conventional padding method extends an outside region of a frame prior to performing a method of compensating for motion, and as such only limited information is used in the padding method. Thus, a region that is related to motion and is outside the frame is not accurately estimated.
  • the present invention provides a method and apparatus for accurately estimating and compensating for motion by using information of a region included in a neighboring block of a current block and a reference frame, when a reference block selected by a motion vector of the current block has a region that is outside the reference frame.
  • the present invention also provides a method and apparatus for estimating and compensating for motion, wherein the method and apparatus use correlation with a neighboring block and a reference frame, and can thereby greatly enhance compression efficiency of an active image sequence having panning, tilting, zooming in/out, fast camera motion, and fast object motion.
  • the method and apparatus for estimating and compensating for motion in image decoding according to the present invention use the information of the region included in the neighboring block and the reference frame. By doing so, the method and apparatus reflect the correlation with the reference frame and therefore achieve high compression efficiency. In particular, the compression efficiency of an active image sequence having panning, tilting, zooming in/out, fast camera motion, and fast object motion is greatly increased.
  • in terms of calculation time, the effect of the method of estimating and compensating for motion in image decoding is greater than the effect of the method of estimating and compensating for motion in image encoding.
  • the present invention can be applied to video codecs based on temporal motion estimation, or to all methods and apparatuses such as mobile phones, camcorders, digital cameras, Portable Multimedia Players (PMPs), next-generation Digital Video Discs (DVDs), software video codecs, and the like which are capable of using the video codecs.
  • FIG. 1 is a diagram of a conventional method of estimating and compensating for motion when a reference block is outside a reference frame;
  • FIG. 2 is a block diagram illustrating a motion estimation and compensation apparatus for image decoding according to an exemplary embodiment of the present invention
  • FIG. 3 is a block diagram illustrating a motion estimation and compensation apparatus for image encoding according to another exemplary embodiment of the present invention.
  • FIG. 4 is a diagram illustrating a method of generating an estimation block when a reference block is outside a reference frame, according to an exemplary embodiment of the present invention
  • FIG. 5 is a diagram illustrating an example of a type of a search pattern for estimating a region that is outside a reference frame
  • FIG. 6A provides diagrams illustrating one of various methods of estimating a region that is outside a reference frame
  • FIG. 6B provides diagrams illustrating another method from among various methods of estimating a region that is outside a reference frame
  • FIG. 6C provides a diagram illustrating another method from among various methods of estimating a region that is outside a reference frame
  • FIG. 7 is a flowchart of a method of estimating and compensating for motion in image decoding, according to another exemplary embodiment of the present invention.
  • FIG. 8 is a flowchart of a method of estimating and compensating for motion in image encoding, according to another exemplary embodiment of the present invention.
  • a method of estimating and compensating for motion in image decoding including the operations of determining a reference block of a reference frame indicated by a motion vector of a current block of a current frame being decoded; and generating a spatiotemporal estimation block of the current block by using the current frame and the reference frame, when some pixels of the reference block are outside the reference frame.
  • the operation of generating the spatiotemporal estimation block may include the operations of determining the first block region of the reference block to be an estimation region of the first block region of the spatiotemporal estimation block; and generating an estimation region of the second block region of the spatiotemporal estimation block by using at least one of pixel values of neighboring blocks of the current block, and pixel values of an estimation region of the first block region.
  • the operation of generating the spatiotemporal estimation block may include the operations of determining pixels of the neighboring blocks of the current block and pixels surrounding the second block region to be a search pattern region, wherein the neighboring blocks are from among regions reconstructed prior to the current block, and the pixels surrounding the second block region are part of the first block region of the spatiotemporal estimation block, when the spatiotemporal estimation block is allocated to the current block; searching for an outside estimation region surrounded by a similar pattern region having a minimum difference compared with the search pattern region in the regions reconstructed prior to the current block; and determining the outside estimation region to be an estimation region of the second block region of the spatiotemporal estimation block.
  • the operation of searching for the outside estimation region may include the operation of searching for the similar pattern region having a same form as the search pattern region in the regions reconstructed prior to the current block, and having a minimum Sum of Absolute Differences (SAD) between pixel values of the search pattern region and pixel values of the similar pattern region.
  • the operation of determining the estimation region of the second block region of the spatiotemporal estimation block may include the operations of selecting at least one of a pixel line of the neighboring blocks of the current block and a pixel line surrounding the second block region, wherein the pixel line surrounding the second block region is a part of the first block region of the spatiotemporal estimation block, when the spatiotemporal estimation block is allocated to the current block; and determining pixels of the selected pixel line to be estimation pixels related to pixels of the second block region located either vertically or diagonally to the selected pixel line.
  • the operation of determining the estimation region of the second block region of the spatiotemporal estimation block may include the operation of determining an average value between pixel values of the neighboring blocks of the current block and pixel values surrounding the second block region to be an estimation pixel value of the second block region, wherein the pixel values surrounding the second block region are a part of the first block region of the spatiotemporal estimation block, when the spatiotemporal estimation block is allocated to the current block.
  • a method of estimating and compensating for motion in image encoding including the operations of searching for a reference block having a minimum difference compared with a current block in a reference frame, calculating a motion vector, and thereby performing motion estimation; generating an estimation block for the current block by using the current frame and the reference frame, when some pixels of the reference block are outside the reference frame; and encoding an image by using the estimation block and the motion vector.
  • the operation of generating the estimation block may include the operations of determining the first block region of the reference block to be an estimation region of the first block region of the estimation block; and generating an estimation region of the second block region of the estimation block by using at least one of pixels of neighboring blocks of the current block and pixels of an estimation region of the first block region.
  • the operation of generating the estimation region of the second block region of the estimation block may include the operations of determining pixels of the neighboring blocks of the current block and pixels surrounding the second block region to be a search pattern region, wherein the neighboring blocks are from among regions reconstructed prior to the current block, and the pixels surrounding the second block region are part of the first block region of the estimation block, when the estimation block is allocated to the current block; searching for an outside estimation region surrounded by a similar pattern region having a minimum difference compared with the search pattern region in the regions reconstructed prior to the current block; and determining the outside estimation region to be an estimation region of the second block region of the estimation block.
  • the operation of searching for the outside estimation region may include the operation of searching for the similar pattern region having a same form as the search pattern region in the regions reconstructed prior to the current block, and having a minimum SAD between pixel values of the search pattern region and pixel values of the similar pattern region.
  • the operation of generating the estimation region of the second block region of the estimation block may include the operations of selecting at least one of a pixel line of the neighboring blocks of the current block and a pixel line surrounding the second block region, wherein the pixel line surrounding the second block region is a part of the first block region of the estimation block, when the estimation block is allocated to the current block; and determining pixels of the selected pixel line to be estimation pixels related to pixels of the second block region located either vertically or diagonally to the selected pixel line.
  • the operation of generating the estimation region of the second block region of the estimation block may include the operation of determining an average value between pixel values of the neighboring blocks of the current block and pixel values surrounding the second block region to be an estimation pixel value of the second block region, wherein the pixel values surrounding the second block region are a part of the first block region of the estimation block, when the estimation block is allocated to the current block.
  • a motion estimation and compensation apparatus in image decoding, the motion estimation and compensation apparatus including a reference block determining unit determining a reference block of a reference frame for a current block of a current frame being decoded; and a spatiotemporal estimation block generation unit generating a spatiotemporal estimation block of the current block by using the current frame and the reference frame, when some pixels of the reference block are outside the reference frame.
  • a motion estimation and compensation apparatus in image encoding, the motion estimation and compensation apparatus including a motion estimation performing unit searching for a reference block having a minimum difference compared with a current block in a reference frame, calculating a motion vector, and thereby performing motion estimation; an estimation block generation unit generating an estimation block for the current block by using the current frame and the reference frame, when some pixels of the reference block are outside the reference frame; and an encoding unit encoding an image by using the estimation block and the motion vector.
  • a computer readable recording medium having recorded thereon a program for executing the method of estimating and compensating for motion in image decoding.
  • a computer readable recording medium having recorded thereon a program for executing the method of estimating and compensating for motion in image encoding.
  • FIG. 2 is a block diagram illustrating a motion estimation and compensation apparatus 200 for image decoding according to an embodiment of the present invention.
  • the motion estimation and compensation apparatus 200 for image decoding includes a reference block determination unit 210, and a spatiotemporal estimation block generation unit 220.
  • the spatiotemporal estimation block generation unit 220 includes a first block region determination unit 230, and a second block region determination unit 240.
  • the reference block determination unit 210 determines a reference block of a reference frame for a current block of a current frame being decoded.
  • the spatiotemporal estimation block generation unit 220 When some pixels of the reference block are outside the reference frame, the spatiotemporal estimation block generation unit 220 generates a spatiotemporal estimation block of the current block by using the current frame and the reference frame.
  • a block region in the same position as a region included in the reference frame is defined to be a first block region, wherein the block region is a part of the reference block, and a block region in the same position as a region excluded from the reference frame is defined to be a second block region, wherein the block region is a part of the reference block.
  • the first block region determination unit 230 determines the first block region of the reference block to be an estimation region of a first block region of the spatiotemporal estimation block.
  • the second block region determination unit 240 generates an estimation region of a second block region of the spatiotemporal estimation block by using at least one of pixel values of neighboring blocks of the current block and pixel values of the estimation region of the first block region.
  • Embodiments related to the second block region determination unit 240 will be described later in detail with reference to FIGS. 4 through 6C.
  • According to an aspect of the present invention, the block estimated from the reference block, and in particular the spatiotemporal estimation block generated by the spatiotemporal estimation block generation unit 220 when the reference block is outside the reference frame, is determined to be the current block, and the image is thereby reconstructed during image decoding.
  • FIG. 3 is a block diagram illustrating a motion estimation and compensation apparatus 300 for image encoding according to another embodiment of the present invention.
  • the motion estimation and compensation apparatus 300 for image encoding includes a motion estimation performing unit 310, an estimation block generation unit 320, and an encoding unit 330.
  • the motion estimation performing unit 310 searches for a reference block in a reference frame, wherein the reference block has a minimum difference compared with a current block, calculates a motion vector, and thereby performs motion estimation.
  • the motion estimation performing unit 310 may determine a block to be the reference block, wherein the block has a minimum Sum of Absolute Differences (SAD) between each pixel value of a block in the reference frame and each pixel value of the current block.
  • the motion vector may indicate a positional distance between the current block and the reference block.
  • the estimation block generation unit 320 When some pixels of the reference block are outside the reference frame, the estimation block generation unit 320 generates an estimation block for the current block by using the current frame and the reference frame.
  • the estimation block generation unit 320 includes a first block region determination unit and a second block region determination unit which are respectively for a first block region and a second block region of the estimation block.
  • the first and second block region determination units of the estimation block generation unit 320 respectively have the same operating principle as the first and second block region determination units of the spatiotemporal estimation block generation unit 220.
  • the operating principle of the first and second block region determination units of the estimation block generation unit 320 will be described later with reference to FIGS. 4 through 6C.
  • the encoding unit 330 encodes an image by using the estimation block and the motion vector.
  • the encoding unit 330 may determine the estimation block to be the current block, and encode the image by using a difference value between the current block and the reference block.
  • FIG. 4 is a diagram illustrating a method of generating an estimation block when a reference block is outside a reference frame, according to an embodiment of the present invention.
  • a frame 110, a frame 150, and a block 160 respectively indicate a current frame, a reference frame, and a reference block.
  • a region 170 of the reference block 160 indicates a region included in the reference frame 150.
  • a region 180 of the reference block 160 indicates a region excluded from the reference frame 150.
  • a block 420 indicates a spatiotemporal estimation block. Since the spatiotemporal estimation block generated by the present invention is determined to be a current block, the block 420 eventually becomes a current block 120.
  • a region 430 indicates a first block region of the spatiotemporal estimation block
  • a region 440 indicates a second block region of the spatiotemporal estimation block
  • a region 450 indicates pixels surrounding the current block 120, wherein the pixels are from among neighboring blocks of the current block 120.
  • an estimation block related to the current block 120 can be determined by a motion estimation/compensation method that is well-known to one of ordinary skill in the art.
  • a case in which the reference block 160 is outside the reference frame 150 will now be described in detail.
  • a block region in the same position as the region 170 included in the reference frame 150 is defined to be a first block region, wherein the block region is a part of the reference block 160
  • a block region in the same position as the region 180 excluded from the reference frame 150 is defined to be a second block region, wherein the block region is a part of the reference block 160.
  • the region 430 and the region 440 respectively become the first block region and the second block region, which are of the spatiotemporal estimation block 420 that is to be inserted into the current block 120.
  • the first block region determination unit 230 may determine the first block region of the reference block 160 (corresponding to the region 170 included in the reference frame 150) to be the estimation region of the first block region 430 of the spatiotemporal estimation block 420.
  • the second block region determination unit 240 may generate an estimation region of the second block region 440 of the spatiotemporal estimation block 420 by using at least one of pixel values of neighboring blocks of the current block 120, and pixel values of the estimation region of the first block region 430 of the spatiotemporal estimation block 420.
  • in order to use the neighboring blocks of the current block 120 and the estimation region of the first block region 430 of the spatiotemporal estimation block 420, the second block region determination unit 240 may use a method based on a search pattern including the pixels 450, or a method that uses values of some of the pixels 450 without change, where the pixels 450 surrounding the current block 120 are from among the pixels of the neighboring blocks and of the first block region 430.
  • the method of using the search pattern will be described later with reference to FIG. 5, and the method of using the pixels 450 will be described later with reference to FIGS. 6A through 6C.
  • the first and second block region determination units of the estimation block generation unit 320 for image encoding respectively have the same operating principle as the first and second block region determination units 230 and 240 of the spatiotemporal estimation block generation unit 220 for image decoding.
  • FIG. 5 is a diagram illustrating an example of a type of a search pattern for estimating a region that is outside a reference frame.
  • a region 510 indicates a search pattern region for determining the second block region 440.
  • a region 520 indicates a similar pattern region searched for by the search pattern region 510.
  • a region 530 indicates an outside estimation region.
  • the second block region determination unit 240 may determine pixels of the neighboring blocks of the current block 120 and pixels surrounding the second block region 440 to be the search pattern region 510, wherein the neighboring blocks are from among regions reconstructed prior to the current block 120, and the pixels surrounding the second block region 440 are part of the first block region 430 of the spatiotemporal estimation block (estimation block) 420.
  • the second block region determination unit 240 may search for the similar pattern region 520 having a minimum difference compared with the search pattern region 510 in the regions reconstructed prior to the current block 120.
  • the similar pattern region 520 has the same form as the search pattern region 510, and a region having a minimum SAD (that is, the sum of the absolute values of the differences between pixel values of the search pattern region 510 and pixel values of the similar pattern region 520) in the regions reconstructed prior to the current block 120 is determined to be the similar pattern region 520.
  • the outside estimation region 530 corresponds to a region surrounded by the similar pattern region 520 determined by the search pattern region 510.
  • the second block region determination unit 240 determines the outside estimation region 530 to be an estimation region of the second block region 440, and thereby completes the spatiotemporal estimation block (estimation block) 420.
  • FIGS. 6A through 6C are diagrams illustrating various methods of estimating a region that is outside a reference frame, according to another embodiment of the present invention.
  • the method is related to a method of determining the second block region 440 by using the aforementioned pixels 450.
  • FIG. 6A provides diagrams illustrating one of the various methods of estimating the region that is outside the reference frame.
  • a block 610 indicates the spatiotemporal estimation block (estimation block) 420 in which each pixel value of the first block region 430 is inserted vertically to a boundary line and to each pixel of the second block region 440.
  • a block 620 indicates the spatiotemporal estimation block (estimation block) 420 in which each pixel value of a previously reconstructed neighboring block is inserted vertically to a boundary line and to each pixel of the second block region 440.
  • a block 630 indicates the spatiotemporal estimation block (estimation block) 420 in which each pixel value of the previously reconstructed neighboring block is inserted horizontally to a boundary line and to each pixel of the second block region 440.
  • the pixel values of the previously reconstructed neighboring block that is the nearest block to the second block region 440 of the current block 120 are inserted vertically and horizontally to the boundary line and to each pixel of the second block region 440 from the boundary lines between the current block 120 and the previously reconstructed neighboring block, such that the estimation blocks 620 and 630 are generated.
  • the pixel values of the first block region 430, which is the nearest block region to the second block region 440, are inserted vertically to each pixel of the second block region 440 from the boundary line between the first block region 430 and the second block region 440, such that the estimation block 610 is generated.
  • FIG. 6B provides diagrams illustrating another method from among the various methods of estimating the region that is outside the reference frame.
  • a block 640 indicates the spatiotemporal estimation block (estimation block) 420 in which the pixel values of the previously reconstructed neighboring block are inserted diagonally left to each pixel of the second block region 440.
  • a block 650 indicates the spatiotemporal estimation block (estimation block) 420 in which the pixel values of the previously reconstructed neighboring block are inserted diagonally right to each pixel of the second block region 440.
  • a block 660 indicates the spatiotemporal estimation block (estimation block) 420 in which each pixel value of the first block region 430 is inserted diagonally right to each pixel of the second block region 440.
  • a block 670 indicates the spatiotemporal estimation block (estimation block) 420 in which each pixel value of the first block region 430 is inserted diagonally left to each pixel of the second block region 440.
  • the pixel values of the first block region 430 that is the nearest block region to the second block region 440 are inserted diagonally to each pixel of the second block region 440 from the boundary line between the first block region 430 and the second block region 440, such that the estimation blocks 660 and 670 are generated.
  • FIG. 6C provides a diagram illustrating another method from among the various methods of estimating the region that is outside the reference frame.
  • a block 680 indicates the spatiotemporal estimation block (estimation block) 420 in which an average value of the pixel values included in the region 510 is determined to be the estimation pixel value of the second block region 440.
  • for the estimation block 680, an average value is computed over the pixel values of the previously reconstructed neighboring blocks from among the neighboring blocks of the current block 120 and over a region, identical in form to the search pattern region 510, that comprises the pixels of the first block region 430 nearest to the second block region 440, and this average value is determined to be the estimation region.
  • FIG. 7 is a flowchart of a method of estimating and compensating for motion in image decoding, according to another embodiment of the present invention.
  • a reference block of a reference frame for a current block of a current frame is determined by using a received frame and a motion vector.
  • a spatiotemporal estimation block for the current block is generated by using the current frame and the reference frame.
  • the spatiotemporal estimation block is divided into a first block region corresponding to a region included in the reference frame, wherein the region is a part of the reference block, and a second block region corresponding to a region excluded from the reference frame, wherein the region is a part of the reference block, and as such the spatiotemporal estimation block corresponding to the respective regions is generated. That is, temporal estimation using pixel values of the reference block is performed on the first block region of the spatiotemporal estimation block, and spatial estimation using pixel values of a neighboring block of the current block is performed on the second block region of the spatiotemporal estimation block.
  • FIG. 8 is a flowchart of a method of estimating and compensating for motion in image encoding, according to another embodiment of the present invention.
  • a reference block having a minimum difference compared with a current block is searched for in a reference frame, a motion vector is calculated, and thereby motion estimation is performed.
  • an estimation block for the current block is generated by using a current frame and the reference frame.
  • the spatiotemporal estimation block is divided into a first block region corresponding to a region included in the reference frame, wherein the region is a part of the reference block, and a second block region corresponding to a region excluded from the reference frame, wherein the region is a part of the reference block, and as such the spatiotemporal estimation block corresponding to the respective regions is generated. That is, temporal estimation using pixel values of the reference block is performed on the first block region of the spatiotemporal estimation block, and spatial estimation using pixel values of a neighboring block of the current block is performed on the second block region of the spatiotemporal estimation block.
  • an image is encoded by using the estimation block and the motion vector. In this manner, motion compensation is performed during image encoding and then the image to be encoded is reconstructed. Thus, it is possible to use the image in which information regarding motion is more accurately reflected.
  • the method and apparatus for estimating and compensating for the motion in image decoding according to the present invention use information of the region included in the neighboring block of the current block and the reference frame. Accordingly, the method and apparatus can accurately estimate and compensate for the motion in comparison with the conventional technology.
  • the method and apparatus for estimating and compensating for motion in image decoding according to the present invention use the information of the region included in the neighboring block and the reference frame. By doing so, the method and apparatus reflect the correlation with the reference frame and therefore achieve high compression efficiency. In particular, the compression efficiency of an active image sequence having panning, tilting, zooming in/out, fast camera motion, and fast object motion is greatly increased.
  • in terms of calculation time, the effect of the method of estimating and compensating for motion in image decoding is greater than the effect of the method of estimating and compensating for motion in image encoding.
  • the present invention can be applied to video codecs based on temporal motion estimation, or to all methods and apparatuses such as mobile phones, camcorders, digital cameras, Portable Multimedia Players (PMPs), next-generation Digital Video Discs (DVDs), software video codecs, and the like which are capable of using the video codecs.
  • Exemplary embodiments of the present invention can be written as computer programs and can be implemented in general-use digital computers that execute the programs using a computer readable recording medium and other media.
  • a data structure used in the embodiments of the present invention can be written in a computer readable recording medium through various means.
  • Examples of the computer readable recording medium include magnetic storage media (e.g., ROM, floppy disks, hard disks, etc.) and optical recording media (e.g., CD-ROMs or DVDs). An example of other media is carrier waves (e.g., transmission through the Internet).

Landscapes

  • Engineering & Computer Science (AREA)
  • Multimedia (AREA)
  • Signal Processing (AREA)
  • Compression Or Coding Systems Of Tv Signals (AREA)

Abstract

A method of estimating and compensating for motion in image decoding is provided. The method involves determining a reference block of a reference frame indicated by a motion vector of a current block of a current frame being decoded, and generating a spatiotemporal estimation block of the current block by using the current frame and the reference frame, when some pixels of the reference block are outside the reference frame.

Description

Description
METHOD AND APPARATUS FOR ESTIMATING AND COMPENSATING SPATIOTEMPORAL MOTION OF IMAGE
Technical Field
[1] The present invention relates to a method of estimating and compensating for motion in image encoding and decoding, and more particularly, to a method of estimating and compensating for motion when a current block refers to a region that is outside a reference frame by using a motion vector in image encoding and decoding. Background Art
[2] Video codecs such as H.263, Moving Picture Experts Group 2 (MPEG 2), MPEG 4, H.264/Advanced Video Coding (AVC), etc. use correlation between frames, perform a method of estimating and compensating for motion, and thereby increase compression efficiency, even when motion of an object appears in an image.
[3] The method of estimating and compensating for motion refers to a reference block of a reference frame to reconstruct a block of a current frame. However, a part of the reference block determined by a motion vector may be outside the reference frame. The conventional video codecs such as H.263, MPEG 2, MPEG 4, H.264/AVC, etc. extend an outside region of the reference frame by using boundary pixels of the reference frame, and thereby estimate the part of the reference block which is outside the reference frame.
[4] FIG. 1 is a diagram of a conventional method of estimating and compensating for motion when a reference block is outside a reference frame.
[5] A current block 120 of a current frame 110 refers to a reference block 160 of a reference frame 150 by a motion vector. A region 170 included in the reference frame 150 is inserted into an equivalent region 130 of the current block 120. However, when a part 180 of the reference block 160 is outside the reference frame 150, an equivalent region 140 of the current block 120 referring to the part 180 that is outside the reference frame 150 is padded with pixel values of boundary pixels 190 of the reference frame 150. This conventional technique is known as a padding method.
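For illustration only, this conventional boundary padding can be sketched as follows, assuming a grayscale reference frame held in a NumPy array and a hypothetical helper name; coordinates that fall outside the reference frame are clamped to the nearest boundary pixel, which fills a region such as the region 140 with the values of the boundary pixels 190.

```python
import numpy as np

def fetch_reference_block(ref_frame, mv_y, mv_x, y, x, block=8):
    """Conventional padding: sample the reference block at (y + mv_y, x + mv_x);
    coordinates outside the reference frame are clamped to the boundary,
    i.e. the outside part is filled with boundary pixel values."""
    h, w = ref_frame.shape
    ys = np.clip(np.arange(y + mv_y, y + mv_y + block), 0, h - 1)
    xs = np.clip(np.arange(x + mv_x, x + mv_x + block), 0, w - 1)
    return ref_frame[np.ix_(ys, xs)]
```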
[6] When motion occurs in an outside region of a frame, compression efficiency of a block which refers to the outside of the frame by using the conventional simple padding method is decreased. In particular, compression efficiency of an active image sequence having panning, tilting, zooming in/out, fast camera motion, and fast object motion is greatly decreased.
[7] The conventional padding method extends an outside region of a frame prior to performing a method of compensating for motion, and as such only limited information is used in the padding method. Thus, a region that is related to motion and is outside the frame is not accurately estimated. Disclosure of Invention Technical Solution
[8] The present invention provides a method and apparatus for accurately estimating and compensating for motion by using information of a region included in a neighboring block of a current block and a reference frame, when a reference block selected by a motion vector of the current block has a region that is outside the reference frame.
[9] The present invention also provides a method and apparatus for estimating and compensating for motion, wherein the method and apparatus use correlation with a neighboring block and a reference frame, and can thereby greatly enhance compression efficiency of an active image sequence having panning, tilting, zooming in/out, fast camera motion, and fast object motion. Advantageous Effects
[10] While the conventional technology uses a padding method which depends on limited information corresponding to boundary pixels of a reference frame, the method and apparatus for estimating and compensating for motion in image decoding according to the present invention use the information of the region included in the neighboring block and the reference frame. By doing so, the method and apparatus reflect the correlation with the reference frame and therefore achieve high compression efficiency. In particular, the compression efficiency of an active image sequence having panning, tilting, zooming in/out, fast camera motion, and fast object motion is greatly increased.
[11] Among exemplary embodiments of the present invention, in terms of calculation time, the effect of the method of estimating and compensating for motion in image decoding is greater than the effect of the method of estimating and compensating for motion in image encoding.
[12] The present invention can be applied to video codecs based on temporal motion estimation, or to all methods and apparatuses such as mobile phones, camcorders, digital cameras, Portable Multimedia Players (PMPs), next-generation Digital Video Discs (DVDs), software video codecs, and the like which are capable of using the video codecs. Description of Drawings
[13] The above and other features and advantages of the present invention will become more apparent by describing in detail exemplary embodiments thereof with reference to the attached drawings in which:
[14] FIG. 1 is a diagram of a conventional method of estimating and compensating for motion when a reference block is outside a reference frame;
[15] FIG. 2 is a block diagram illustrating a motion estimation and compensation apparatus for image decoding according to an exemplary embodiment of the present invention;
[16] FIG. 3 is a block diagram illustrating a motion estimation and compensation apparatus for image encoding according to another exemplary embodiment of the present invention;
[17] FIG. 4 is a diagram illustrating a method of generating an estimation block when a reference block is outside a reference frame, according to an exemplary embodiment of the present invention;
[18] FIG. 5 is a diagram illustrating an example of a type of a search pattern for estimating a region that is outside a reference frame;
[19] FIG. 6A provides diagrams illustrating one of various methods of estimating a region that is outside a reference frame;
[20] FIG. 6B provides diagrams illustrating another method from among various methods of estimating a region that is outside a reference frame;
[21] FIG. 6C provides a diagram illustrating another method from among various methods of estimating a region that is outside a reference frame;
[22] FIG. 7 is a flowchart of a method of estimating and compensating for motion in image decoding, according to another exemplary embodiment of the present invention; and
[23] FIG. 8 is a flowchart of a method of estimating and compensating for motion in image encoding, according to another exemplary embodiment of the present invention. Best Mode
[24] According to an aspect of the present invention, there is provided a method of estimating and compensating for motion in image decoding, the method including the operations of determining a reference block of a reference frame indicated by a motion vector of a current block of a current frame being decoded; and generating a spatiotemporal estimation block of the current block by using the current frame and the reference frame, when some pixels of the reference block are outside the reference frame.
[25] When a block region in a same position as a region included in the reference frame is defined to be a first block region, wherein the block region is a part of the reference block, and a block region in a same position as a region excluded from the reference frame is defined to be a second block region, wherein the block region is a part of the reference block, the operation of generating the spatiotemporal estimation block may include the operations of determining the first block region of the reference block to be an estimation region of the first block region of the spatiotemporal estimation block; and generating an estimation region of the second block region of the spatiotemporal estimation block by using at least one of pixel values of neighboring blocks of the current block, and pixel values of an estimation region of the first block region.
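As a decoder-side illustration of these two operations, the following minimal sketch (with hypothetical names) assumes a grayscale NumPy reference frame, a block of size `block` x `block`, and a motion vector that can push the reference block past the right edge of the reference frame only; the second block region is filled here with the simplest spatial method, replication of the nearest first-region column, while the later paragraphs describe richer alternatives.

```python
import numpy as np

def build_spatiotemporal_block(ref_frame, mv_y, mv_x, y, x, block=8):
    """First block region: copied from the part of the reference block that lies
    inside the reference frame (temporal estimation). Second block region: the
    columns outside the frame, estimated here by replicating the nearest
    first-region column (spatial estimation). Assumes the reference block stays
    inside the frame vertically and at least its leftmost column is inside."""
    h, w = ref_frame.shape
    ry, rx = y + mv_y, x + mv_x                 # top-left of the reference block
    inside = min(block, w - rx)                 # columns that fall inside the frame
    est = np.empty((block, block), dtype=ref_frame.dtype)
    est[:, :inside] = ref_frame[ry:ry + block, rx:rx + inside]
    if inside < block:                          # a second block region exists
        est[:, inside:] = est[:, inside - 1][:, None]
    return est
```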
[26] When a block region in a same position as a region included in the reference frame is defined to be a first block region, wherein the block region is a part of the reference block, and a block region in a same position as a region excluded from the reference frame is defined to be a second block region, wherein the block region is a part of the reference block, the operation of generating the spatiotemporal estimation block may include the operations of determining pixels of the neighboring blocks of the current block and pixels surrounding the second block region to be a search pattern region, wherein the neighboring blocks are from among regions reconstructed prior to the current block, and the pixels surrounding the second block region are part of the first block region of the spatiotemporal estimation block, when the spatiotemporal estimation block is allocated to the current block; searching for an outside estimation region surrounded by a similar pattern region having a minimum difference compared with the search pattern region in the regions reconstructed prior to the current block; and determining the outside estimation region to be an estimation region of the second block region of the spatiotemporal estimation block.
[27] The operation of searching for the outside estimation region may include the operation of searching for the similar pattern region having a same form as the search pattern region in the regions reconstructed prior to the current block, and having a minimum Sum of Absolute Differences (SAD) between pixel values of the search pattern region and pixel values of the similar pattern region.
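A minimal sketch of this pattern search follows, under simplifying assumptions not stated in the patent: the search pattern region and the outside estimation region are packed into one small rectangular patch, a boolean mask marks which patch pixels belong to the search pattern, and the previously reconstructed area is scanned exhaustively; all names are hypothetical.

```python
import numpy as np

def find_outside_estimation_region(recon, template, pattern_mask, region_slice):
    """Slide a patch over the reconstructed area; the SAD is computed only over
    the search-pattern pixels (pattern_mask == True). The pixels selected by
    region_slice inside the best-matching patch are returned as the outside
    estimation region used to fill the second block region."""
    th, tw = template.shape
    h, w = recon.shape
    tmpl = template.astype(np.int32)
    best_sad, best_pos = None, (0, 0)
    for ty in range(h - th + 1):
        for tx in range(w - tw + 1):
            cand = recon[ty:ty + th, tx:tx + tw].astype(np.int32)
            sad = int(np.abs(cand - tmpl)[pattern_mask].sum())
            if best_sad is None or sad < best_sad:
                best_sad, best_pos = sad, (ty, tx)
    by, bx = best_pos
    return recon[by:by + th, bx:bx + tw][region_slice]
```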
[28] The operation of determining the estimation region of the second block region of the spatiotemporal estimation block may include the operations of selecting at least one of a pixel line of the neighboring blocks of the current block and a pixel line surrounding the second block region, wherein the pixel line surrounding the second block region is a part of the first block region of the spatiotemporal estimation block, when the spatiotemporal estimation block is allocated to the current block; and determining pixels of the selected pixel line to be estimation pixels related to pixels of the second block region located either vertically or diagonally to the selected pixel line.
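The pixel-line estimation can be sketched as follows, assuming for illustration that the second block region is the rightmost `width_out` columns of an N x N NumPy estimation block whose first block region is already filled, and that the selected pixel line is either the boundary column of the first block region or the reconstructed row just above the current block; the function names are hypothetical.

```python
def fill_perpendicular(est_block, width_out, top_row=None):
    """Replicate the selected pixel line perpendicular to the boundary: either
    the last column of the first block region (horizontally) or the neighboring
    row above the current block (vertically, when top_row is given)."""
    if top_row is None:
        est_block[:, -width_out:] = est_block[:, -width_out - 1][:, None]
    else:
        est_block[:, -width_out:] = top_row[-width_out:][None, :]
    return est_block

def fill_diagonal(est_block, width_out, up=True):
    """Propagate the boundary column of the first block region diagonally
    (up-left or down-left) into the second block region."""
    n, m = est_block.shape
    boundary = m - width_out - 1                   # last column of the first block region
    for j in range(boundary + 1, m):
        steps = j - boundary                       # diagonal distance back to the boundary
        for i in range(n):
            src = min(max(i - steps if up else i + steps, 0), n - 1)
            est_block[i, j] = est_block[src, boundary]
    return est_block
```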
[29] The operation of determining the estimation region of the second block region of the spatiotemporal estimation block may include the operation of determining an average value between pixel values of the neighboring blocks of the current block and pixel values surrounding the second block region to be an estimation pixel value of the second block region, wherein the pixel values surrounding the second block region are a part of the first block region of the spatiotemporal estimation block, when the spatiotemporal estimation block is allocated to the current block.
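Under the same layout assumption, the averaging operation can be sketched as follows; `neighbor_pixels` stands for the previously reconstructed neighboring pixel values that take part in the average, and the names are hypothetical.

```python
import numpy as np

def fill_with_average(est_block, width_out, neighbor_pixels):
    """One average, taken over the reconstructed neighboring pixels and the
    first-block-region pixels nearest to the second block region, fills every
    pixel of the second block region."""
    nearest_col = est_block[:, -width_out - 1]     # nearest first-region pixels
    avg = np.concatenate([np.ravel(neighbor_pixels), nearest_col]).mean()
    est_block[:, -width_out:] = int(round(avg))
    return est_block
```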
[30] According to another aspect of the present invention, there is provided a method of estimating and compensating for motion in image encoding, the method including the operations of searching for a reference block having a minimum difference compared with a current block in a reference frame, calculating a motion vector, and thereby performing motion estimation; generating an estimation block for the current block by using the current frame and the reference frame, when some pixels of the reference block are outside the reference frame; and encoding an image by using the estimation block and the motion vector.
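To illustrate the final encoding operation, a short sketch follows: once the estimation block stands in for the prediction of the current block, only the motion vector and the residual below need to be coded, and the decoder mirrors the step by adding the decoded residual back; transform, quantization, and entropy coding are omitted, and the names are hypothetical.

```python
import numpy as np

def encode_residual(cur_block, est_block):
    """Encoder side: the residual between the current block and its
    (spatiotemporal) estimation block, to be transformed and entropy-coded."""
    return cur_block.astype(np.int32) - est_block.astype(np.int32)

def reconstruct_block(est_block, residual):
    """Decoder side: the reconstructed current block is the estimation block
    plus the decoded residual, clipped back to the 8-bit pixel range."""
    return np.clip(est_block.astype(np.int32) + residual, 0, 255).astype(np.uint8)
```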
[31] When a block region in a same position as a region included in the reference frame is defined to be a first block region, wherein the block region is a part of the reference block, and a block region in a same position as a region excluded from the reference frame is defined to be a second block region, wherein the block region is a part of the reference block, the operation of generating the estimation block may include the operations of determining the first block region of the reference block to be an estimation region of the first block region of the estimation block; and generating an estimation region of the second block region of the estimation block by using at least one of pixels of neighboring blocks of the current block and pixels of an estimation region of the first block region.
[32] The operation of generating the estimation region of the second block region of the estimation block may include the operations of determining pixels of the neighboring blocks of the current block and pixels surrounding the second block region to be a search pattern region, wherein the neighboring blocks are from among regions reconstructed prior to the current block, and the pixels surrounding the second block region are part of the first block region of the estimation block, when the estimation block is allocated to the current block; searching for an outside estimation region surrounded by a similar pattern region having a minimum difference compared with the search pattern region in the regions reconstructed prior to the current block; and determining the outside estimation region to be an estimation region of the second block region of the estimation block.
[33] The operation of searching for the outside estimation region may include the operation of searching for the similar pattern region having a same form as the search pattern region in the regions reconstructed prior to the current block, and having a minimum SAD between pixel values of the search pattern region and pixel values of the similar pattern region.
[34] The operation of generating the estimation region of the second block region of the estimation block may include the operations of selecting at least one of a pixel line of the neighboring blocks of the current block and a pixel line surrounding the second block region, wherein the pixel line surrounding the second block region is a part of the first block region of the estimation block, when the estimation block is allocated to the current block; and determining pixels of the selected pixel line to be estimation pixels related to pixels of the second block region located either vertically or diagonally to the selected pixel line.
[35] The operation of generating the estimation region of the second block region of the estimation block may include the operation of determining an average value between pixel values of the neighboring blocks of the current block and pixel values surrounding the second block region to be an estimation pixel value of the second block region, wherein the pixel values surrounding the second block region are a part of the first block region of the estimation block, when the estimation block is allocated to the current block.
[36] According to another aspect of the present invention, there is provided a motion estimation and compensation apparatus in image decoding, the motion estimation and compensation apparatus including a reference block determining unit determining a reference block of a reference frame for a current block of a current frame being decoded; and a spatiotemporal estimation block generation unit generating a spatiotemporal estimation block of the current block by using the current frame and the reference frame, when some pixels of the reference block are outside the reference frame.
[37] According to another aspect of the present invention, there is provided a motion estimation and compensation apparatus in image encoding, the motion estimation and compensation apparatus including a motion estimation performing unit searching for a reference block having a minimum difference compared with a current block in a reference frame, calculating a motion vector, and thereby performing motion estimation; an estimation block generation unit generating an estimation block for the current block by using the current frame and the reference frame, when some pixels of the reference block are outside the reference frame; and an encoding unit encoding an image by using the estimation block and the motion vector.
[38] According to another aspect of the present invention, there is provided a computer readable recording medium having recorded thereon a program for executing the method of estimating and compensating for motion in image decoding.
[39] According to another aspect of the present invention, there is provided a computer readable recording medium having recorded thereon a program for executing the method of estimating and compensating for motion in image encoding. Mode for Invention
[40] This application claims priority from Korean Patent Application No. 10-2007-0086549, filed on August 28, 2007, in the Korean Intellectual Property Office, the disclosure of which is incorporated herein in its entirety by reference.
[41] Hereinafter, a method and apparatus for estimating and compensating for motion in image decoding, and a method and apparatus for encoding an image, according to exemplary embodiments of the present invention will now be described more fully with reference to the accompanying drawings, in which exemplary embodiments of the invention are shown.
[42] FIG. 2 is a block diagram illustrating a motion estimation and compensation apparatus 200 for image decoding according to an embodiment of the present invention.
[43] The motion estimation and compensation apparatus 200 for image decoding according to the current embodiment of the present invention includes a reference block determination unit 210, and a spatiotemporal estimation block generation unit 220. The spatiotemporal estimation block generation unit 220 includes a first block region determination unit 230, and a second block region determination unit 240.
[44] The reference block determination unit 210 determines a reference block of a reference frame for a current block of a current frame being decoded.
[45] When some pixels of the reference block are outside the reference frame, the spatiotemporal estimation block generation unit 220 generates a spatiotemporal estimation block of the current block by using the current frame and the reference frame.
[46] For convenience of description, it is assumed that a block region in the same position as a region included in the reference frame is defined to be a first block region, wherein the block region is a part of the reference block, and a block region in the same position as a region excluded from the reference frame is defined to be a second block region, wherein the block region is a part of the reference block.
[47] The first block region determination unit 230 determines the first block region of the reference block to be an estimation region of a first block region of the spatiotemporal estimation block.
[48] The second block region determination unit 240 generates an estimation region of a second block region of the spatiotemporal estimation block by using at least one of pixel values of neighboring blocks of the current block and pixel values of the estimation region of the first block region.
[49] Embodiments related to the second block region determination unit 240 will be described later in detail with reference to FIGS. 4 through 6C.
[50] According to an aspect of the present invention, a block estimated from the reference block, and in particular the spatiotemporal estimation block generated by the spatiotemporal estimation block generation unit 220 when the reference block is partially outside the reference frame, is determined to be the current block, and the image is thereby reconstructed during image decoding.
[51] FIG. 3 is a block diagram illustrating a motion estimation and compensation apparatus 300 for image encoding according to another embodiment of the present invention.
[52] The motion estimation and compensation apparatus 300 for image encoding according to the current embodiment of the present invention includes a motion estimation performing unit 310, an estimation block generation unit 320, and an encoding unit 330.
[53] The motion estimation performing unit 310 searches for a reference block in a reference frame, wherein the reference block has a minimum difference compared with a current block, calculates a motion vector, and thereby performs motion estimation.
[54] The motion estimation performing unit 310 may determine a block to be the reference block, wherein the block has a minimum Sum of Absolute Differences (SAD) between each pixel value of a block in the reference frame and each pixel value of the current block. The motion vector may indicate a positional distance between the current block and the reference block.
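The following is a minimal Python/NumPy sketch of the SAD-based full search described above. The array names, the 16x16 block size, and the search range are illustrative assumptions rather than part of the disclosed apparatus; for brevity, candidate positions are restricted to blocks fully inside the reference frame, whereas the present embodiment also admits reference blocks that are partially outside it (handled in the sketches that follow).

import numpy as np

def sad(block_a, block_b):
    # Sum of Absolute Differences between two equally sized pixel blocks.
    return np.abs(block_a.astype(np.int32) - block_b.astype(np.int32)).sum()

def full_search(cur_frame, ref_frame, x, y, block=16, search=16):
    # Find the motion vector (dx, dy) minimizing SAD for the block at (x, y).
    h, w = ref_frame.shape
    cur_block = cur_frame[y:y + block, x:x + block]
    best_mv, best_cost = (0, 0), np.inf
    for dy in range(-search, search + 1):
        for dx in range(-search, search + 1):
            rx, ry = x + dx, y + dy
            if rx < 0 or ry < 0 or rx + block > w or ry + block > h:
                continue  # skip candidates not fully inside the reference frame
            cost = sad(cur_block, ref_frame[ry:ry + block, rx:rx + block])
            if cost < best_cost:
                best_cost, best_mv = cost, (dx, dy)
    return best_mv, best_cost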
[55] When some pixels of the reference block are outside the reference frame, the estimation block generation unit 320 generates an estimation block for the current block by using the current frame and the reference frame.
[56] An embodiment of the estimation block generation unit 320 is not separately illustrated.
However, similar to the motion estimation and compensation apparatus 200 for image decoding, the estimation block generation unit 320 includes a first block region determination unit and a second block region determination unit which are respectively for a first block region and a second block region of the estimation block. The first and second block region determination units of the estimation block generation unit 320 respectively have the same operating principle as the first and second block region determination units of the spatiotemporal estimation block generation unit 220. Thus, the operating principle of the first and second block region determination units of the estimation block generation unit 320 will be described later with reference to FIGS. 4 through 6C.
[57] The encoding unit 330 encodes an image by using the estimation block and the motion vector.
[58] The encoding unit 330 may determine the estimation block to be the current block, and encode the image by using a difference value between the current block and the estimation block.
[59] FIG. 4 is a diagram illustrating a method of generating an estimation block when a reference block is outside a reference frame, according to an embodiment of the present invention.
[60] Referring to FIG. 4, a method of operating the spatiotemporal estimation block generation unit 220 of the motion estimation and compensation apparatus 200 for image decoding, and the estimation block generation unit 320 of the motion estimation and compensation apparatus 300 for image encoding will now be described in detail. In particular, a method of operating the first and second block region determination units of the spatiotemporal estimation block generation unit 220 and the estimation block generation unit 320 will now be described in detail.
[61] A frame 110, a frame 150, and a block 160 respectively indicate a current frame, a reference frame, and a reference block.
[62] A region 170 of the reference block 160 indicates a region included in the reference frame 150. A region 180 of the reference block 160 indicates a region excluded from the reference frame 150.
[63] A block 420 indicates a spatiotemporal estimation block. Since the spatiotemporal estimation block generated by the present invention is determined to be the current block, the block 420 eventually becomes the current block 120.
[64] A region 430 indicates a first block region of the spatiotemporal estimation block 420.
[65] A region 440 indicates a second block region of the spatiotemporal estimation block 420.
[66] A region 450 indicates pixels surrounding the current block 120, wherein the pixels are from among neighboring blocks of the current block 120.
[67] When the reference block 160 related to the current block 120 is inside the reference frame 150, an estimation block related to the current block 120 can be determined by a motion estimation/compensation method that is well-known to one of ordinary skill in the art. Hereinafter, a case in which the reference block 160 is outside the reference frame 150 will now be described in detail.
[68] As described above, it is assumed that a block region in the same position as the region 170 included in the reference frame 150 is defined to be a first block region, wherein the block region is a part of the reference block 160, and a block region in the same position as the region 180 excluded from the reference frame 150 is defined to be a second block region, wherein the block region is a part of the reference block 160. Thus, the region 430 and the region 440 respectively become the first block region and the second block region, which are of the spatiotemporal estimation block 420 that is to be inserted into the current block 120.
[69] The first block region determination unit 230 may determine the first block region 170 of the reference block 160 to be an estimation region of the first block region 430 of the spatiotemporal estimation block 420.
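As a concrete illustration of the first block region determination, the following minimal Python/NumPy sketch clips a reference block that lies partially outside the reference frame and copies the in-frame part (the first block region) into the spatiotemporal estimation block, leaving a mask that marks the second block region still to be estimated spatially. The helper name start_spatiotemporal_block, the 2-D luma array layout, and the 16x16 block size are assumptions made for illustration only.

import numpy as np

def start_spatiotemporal_block(ref_frame, rx, ry, block=16):
    # (rx, ry) is the top-left corner of the reference block, which may lie
    # partially outside the reference frame.  Returns the partially filled
    # estimation block and a boolean mask marking the second block region
    # (pixels still to be estimated spatially).
    h, w = ref_frame.shape
    est = np.zeros((block, block), dtype=ref_frame.dtype)
    outside = np.ones((block, block), dtype=bool)

    # Intersection of the reference block with the reference frame.
    x0, x1 = max(rx, 0), min(rx + block, w)
    y0, y1 = max(ry, 0), min(ry + block, h)
    if x0 < x1 and y0 < y1:
        est[y0 - ry:y1 - ry, x0 - rx:x1 - rx] = ref_frame[y0:y1, x0:x1]
        outside[y0 - ry:y1 - ry, x0 - rx:x1 - rx] = False
    return est, outside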
[70] The second block region determination unit 240 may generate an estimation region of the second block region 440 of the spatiotemporal estimation block 420 by using at least one of pixel values of neighboring blocks of the current block 120, and pixel values of the estimation region of the first block region 430 of the spatiotemporal estimation block 420.
[71] To use the neighboring blocks of the current block 120 and the estimation region of the first block region 430 of the spatiotemporal estimation block 420, the second block region determination unit 240 may use either a method of using a search pattern that includes the pixels 450, or a method of using values of some of the pixels 450 without change, wherein the pixels 450 surrounding the current block 120 are from among the pixels of the neighboring blocks and of the first block region 430. The method of using the search pattern will be described later with reference to FIG. 5, and the method of using the pixels 450 will be described later with reference to FIGS. 6A through 6C.
[72] The first and second block region determination units of the estimation block generation unit 320 for image encoding respectively have the same operating principle as the first and second block region determination units 230 and 240 of the spatiotemporal estimation block generation unit 220 for image decoding.
[73] FIG. 5 is a diagram illustrating an example of a type of a search pattern for estimating a region that is outside a reference frame.
[74] A method of operating an example of the second block region determination unit 240 of the spatiotemporal estimation block generation unit 220 for image decoding, and an example of the second block region determination unit of the estimation block generation unit 320 for image encoding will now be described in detail with reference to FIG. 5.
[75] A region 510 indicates a search pattern region for determining the second block region 440.
[76] A region 520 indicates a similar pattern region searched for by the search pattern region 510.
[77] A region 530 indicates an outside estimation region.
[78] The second block region determination unit 240 may determine pixels of the neighboring blocks of the current block 120 and pixels surrounding the second block region 440 to be the search pattern region 510, wherein the neighboring blocks are from among regions reconstructed prior to the current block 120, and the pixels surrounding the second block region 440 are part of the first block region 430 of the spatiotemporal estimation block (estimation block) 420.
[79] Also, the second block region determination unit 240 may search for the similar pattern region 520 having a minimum difference compared with the search pattern region 510 in the regions reconstructed prior to the current block 120. The similar pattern region 520 has the same form as the search pattern region 510, and a region having a minimum SAD (that is, the sum of the absolute values of the differences between the pixel values of the search pattern region 510 and the pixel values of the similar pattern region 520) in the regions reconstructed prior to the current block 120 is determined to be the similar pattern region 520.
[80] The outside estimation region 530 corresponds to a region surrounded by the similar pattern region 520 determined by the search pattern region 510. The second block region determination unit 240 determines the outside estimation region 530 to be an estimation region of the second block region 440, and thereby completes the spatiotemporal estimation block (estimation block) 420.
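The search-pattern matching of FIG. 5 can be sketched, under simplifying assumptions, as template matching with an SAD criterion over the previously reconstructed area. In the sketch below the search pattern is represented as a dictionary of pixel values keyed by their offsets relative to the top-left corner of the second block region, and the candidate positions are supplied by the caller; these representations, and the helper name match_outside_region, are illustrative assumptions rather than the disclosed implementation.

import numpy as np

def match_outside_region(recon, template, region_shape, offsets):
    # recon        : previously reconstructed pixels of the current frame (2-D array)
    # template     : dict mapping (dy, dx) offsets, taken relative to the
    #                top-left corner of the missing region, to pixel values
    #                of the search pattern (region 510)
    # region_shape : (height, width) of the second block region to estimate
    # offsets      : candidate top-left positions (y, x) inside recon
    rh, rw = region_shape
    best_pos, best_cost = None, np.inf
    for (y, x) in offsets:
        if y + rh > recon.shape[0] or x + rw > recon.shape[1]:
            continue
        cost, valid = 0, True
        for (dy, dx), value in template.items():
            yy, xx = y + dy, x + dx
            if not (0 <= yy < recon.shape[0] and 0 <= xx < recon.shape[1]):
                valid = False
                break
            cost += abs(int(recon[yy, xx]) - int(value))  # SAD over the pattern
        if valid and cost < best_cost:
            best_cost, best_pos = cost, (y, x)
    if best_pos is None:
        return None
    y, x = best_pos
    # The patch surrounded by the best-matching similar pattern (region 520)
    # is taken as the outside estimation region (region 530).
    return recon[y:y + rh, x:x + rw].copy()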
[81] FIGS. 6A through 6C are diagrams illustrating various methods of estimating a region that is outside a reference frame, according to another embodiment of the present invention.
[82] A method of operating an example of the second block region determination unit 240 of the spatiotemporal estimation block generation unit 220 for image decoding, and an example of the second block region determination unit of the estimation block generation unit 320 for image encoding will now be described in detail with reference to FIGS. 6A through 6C. In particular, the method is related to a method of determining the second block region 440 by using the aforementioned pixels 450.
[83] FIG. 6A provides diagrams illustrating one of the various methods of estimating the region that is outside the reference frame.
[84] A block 610 indicates the spatiotemporal estimation block (estimation block) 420 in which each pixel value of the first block region 430 is inserted vertically to a boundary line and to each pixel of the second block region 440.
[85] A block 620 indicates the spatiotemporal estimation block (estimation block) 420 in which each pixel value of a previously reconstructed neighboring block is inserted vertically to a boundary line and to each pixel of the second block region 440.
[86] A block 630 indicates the spatiotemporal estimation block (estimation block) 420 in which each pixel value of the previously reconstructed neighboring block is inserted horizontally to a boundary line and to each pixel of the second block region 440.
[87] The pixel values of the previously reconstructed neighboring block that is the nearest block to the second block region 440 of the current block 120 are inserted vertically and horizontally to the boundary line and to each pixel of the second block region 440 from the boundary lines between the current block 120 and the previously reconstructed neighboring block, such that the estimation blocks 620 and 630 are generated.
[88] The pixel values of the first block region 430 that is the nearest block region to the second block region 440 are inserted vertically to each pixel of the second block region 440 from the boundary line between the first block region 430 and the second block region 440, such that the estimation block 610 is generated.
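A small sketch of the boundary-extension variants of FIG. 6A follows, assuming for concreteness that the second block region occupies the leftmost miss_cols columns of the estimation block (the reference block sticks out past the left frame boundary). Only one orientation is shown; the vertical, horizontal, and diagonal variants of FIGS. 6A and 6B differ only in the direction along which the boundary pixel line is replicated. The helper names are illustrative.

import numpy as np

def extend_from_first_region(est, miss_cols):
    # Block 610 style: replicate the nearest pixel line of the first block
    # region across the second block region, perpendicular to the boundary.
    out = est.copy()
    out[:, :miss_cols] = est[:, miss_cols:miss_cols + 1]
    return out

def extend_from_neighbor(est, neighbor_column, miss_cols):
    # Blocks 620/630 style: replicate the nearest pixel line of a previously
    # reconstructed neighboring block into the second block region.
    out = est.copy()
    out[:, :miss_cols] = np.asarray(neighbor_column).reshape(-1, 1)
    return out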
[89] FIG. 6B provides diagrams illustrating another method from among the various methods of estimating the region that is outside the reference frame.
[90] A block 640 indicates the spatiotemporal estimation block (estimation block) 420 in which the pixel values of the previously reconstructed neighboring block are inserted diagonally left to each pixel of the second block region 440.
[91] A block 650 indicates the spatiotemporal estimation block (estimation block) 420 in which the pixel values of the previously reconstructed neighboring block are inserted diagonally right to each pixel of the second block region 440.
[92] A block 660 indicates the spatiotemporal estimation block (estimation block) 420 in which each pixel value of the first block region 430 is inserted diagonally right to each pixel of the second block region 440.
[93] A block 670 indicates the spatiotemporal estimation block (estimation block) 420 in which each pixel value of the first block region 430 is inserted diagonally left to each pixel of the second block region 440.
[94] The pixel values of the previously reconstructed neighboring block that is the nearest block to the second block region 440 of the current block 120 are inserted diagonally to the boundary line and to each pixel of the second block region 440 from the boundary line between the current block 120 and the previously reconstructed neighboring block, such that the estimation blocks 640 and 650 are generated.
[95] The pixel values of the first block region 430 that is the nearest block region to the second block region 440 are inserted diagonally to each pixel of the second block region 440 from the boundary line between the first block region 430 and the second block region 440, such that the estimation blocks 660 and 670 are generated.
[96] FIG. 6C provides a diagram illustrating another method from among the various methods of estimating the region that is outside the reference frame.
[97] A block 680 indicates the spatiotemporal estimation block (estimation block) 420 in which an average value of the pixel values included in the region 510 is determined to be the estimation pixel values of the second block region 440.
[98] For the estimation block 680, an average value of the pixel values of the previously reconstructed neighboring blocks from among the neighboring blocks of the current block 120 and of the pixels of the first block region 430 nearest to the second block region 440, which together form a region identical to the search pattern region 510, is determined to be the estimation region.
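Under the same assumptions as the earlier sketches, the averaging variant of FIG. 6C reduces to filling every pixel of the second block region with the mean of the surrounding causal pixels; gathering those pixels into the 1-D array surrounding_pixels is left to the caller, and the helper name is an illustrative assumption.

import numpy as np

def fill_with_average(est, outside_mask, surrounding_pixels):
    # FIG. 6C style fill: each pixel of the second block region takes the
    # average of the surrounding causal pixels (pixels of previously
    # reconstructed neighboring blocks plus the nearest pixels of the
    # first block region).
    out = est.copy()
    out[outside_mask] = int(np.mean(surrounding_pixels))
    return out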
[99] FIG. 7 is a flowchart of a method of estimating and compensating for motion in image decoding, according to another embodiment of the present invention.
[100] In operation 710, a reference block of a reference frame for a current block of a current frame is determined by using a received frame and a motion vector.
[101] In operation 720, when some pixels of the reference block are outside the reference frame, a spatiotemporal estimation block for the current block is generated by using the current frame and the reference frame.
[102] In the current embodiment, the spatiotemporal estimation block is divided into a first block region corresponding to a region included in the reference frame, wherein the region is a part of the reference block, and a second block region corresponding to a region excluded from the reference frame, wherein the region is a part of the reference block, and as such the spatiotemporal estimation block corresponding to the respective regions is generated. That is, temporal estimation using pixel values of the reference block is performed on the first block region of the spatiotemporal estimation block, and spatial estimation using pixel values of a neighboring block of the current block is performed on the second block region of the spatiotemporal estimation block.
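The decoder-side flow of operations 710 and 720 can be outlined as follows, reusing the start_spatiotemporal_block helper sketched earlier. The callable fill_second_region stands for any of the spatial methods of FIGS. 5 through 6C, and the residual addition at the end is shown only to place the spatiotemporal estimation block in a complete reconstruction loop; the function signatures are assumptions made for illustration.

import numpy as np

def decode_block(recon_frame, ref_frame, mv, residual, x, y, fill_second_region, block=16):
    # Operation 710: locate the reference block indicated by the motion vector.
    rx, ry = x + mv[0], y + mv[1]
    # Temporal estimation: copy the first block region from the reference frame.
    est, outside = start_spatiotemporal_block(ref_frame, rx, ry, block)
    # Operation 720: spatial estimation of the second block region, if any.
    if outside.any():
        est = fill_second_region(est, outside, recon_frame, x, y)
    # The spatiotemporal estimation block plus the decoded residual (if any)
    # becomes the reconstructed current block.
    block_rec = est.astype(np.int32) + residual
    recon_frame[y:y + block, x:x + block] = np.clip(block_rec, 0, 255)
    return recon_frame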
[103] FIG. 8 is a flowchart of a method of estimating and compensating for motion in image encoding, according to another embodiment of the present invention.
[104] In operation 810, a reference block having a minimum difference compared with a current block is searched for in a reference frame, a motion vector is calculated, and thereby motion estimation is performed.
[105] In operation 820, when some pixels of the reference block are outside the reference frame, an estimation block for the current block is generated by using a current frame and the reference frame.
[106] In the current embodiment, the spatiotemporal estimation block is divided into a first block region corresponding to a region included in the reference frame, wherein the region is a part of the reference block, and a second block region corresponding to a region excluded from the reference frame, wherein the region is a part of the reference block, and as such the spatiotemporal estimation block corresponding to the respective regions is generated. That is, temporal estimation using pixel values of the reference block is performed on the first block region of the spatiotemporal estimation block, and spatial estimation using pixel values of a neighboring block of the current block is performed on the second block region of the spatiotemporal estimation block.
[107] In operation 830, an image is encoded by using the estimation block and the motion vector. In this manner, motion compensation is performed during image encoding and then the image to be encoded is reconstructed. Thus, it is possible to use the image in which information regarding motion is more accurately reflected.
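One possible way to integrate operations 810 through 830 on the encoder side is sketched below: every candidate position in the search range is evaluated, and candidates that fall partially outside the reference frame are completed spatially before the SAD comparison, so the selected motion vector may legitimately point outside the frame. The disclosure does not mandate this particular integration; it is an assumption of the sketch, as are the helper names and the use of the current frame as a stand-in for the previously reconstructed pixels. Transform, quantization, and entropy coding are outside the sketch.

import numpy as np

def encode_block(cur_frame, ref_frame, x, y, fill_second_region, block=16, search=16):
    cur_block = cur_frame[y:y + block, x:x + block].astype(np.int32)
    best = None
    for dy in range(-search, search + 1):
        for dx in range(-search, search + 1):
            rx, ry = x + dx, y + dy
            # First block region taken from the reference frame (may be empty).
            est, outside = start_spatiotemporal_block(ref_frame, rx, ry, block)
            # Operation 820, applied per candidate: complete the block spatially.
            if outside.any():
                est = fill_second_region(est, outside, cur_frame, x, y)
            cost = np.abs(cur_block - est.astype(np.int32)).sum()
            if best is None or cost < best[0]:
                best = (cost, (dx, dy), est)
    _, mv, est = best                              # operation 810: best-matching motion vector
    residual = cur_block - est.astype(np.int32)    # operation 830: residual passed to the encoder
    return mv, residual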
[108] When the reference block for the current block, which is selected by the motion vector, has a region that is outside the reference frame, the method and apparatus for estimating and compensating for motion in image decoding according to the present invention use information of the neighboring blocks of the current block together with the region included in the reference frame. Accordingly, the method and apparatus can estimate and compensate for the motion more accurately than the conventional technology.
[109] While the conventional technology uses a padding method which depends on limited information corresponding to boundary pixels of a reference frame, the method and apparatus for estimating and compensating for motion in image decoding according to the present invention use the information of the neighboring blocks and of the region included in the reference frame. By doing so, the method and apparatus reflect correlation with the reference frame and thereby achieve high compression efficiency. In particular, the compression efficiency for an active image sequence having panning, tilting, zooming in/out, fast camera motion, or fast object motion is greatly increased.
[110] Among the exemplary embodiments of the present invention, the effect of the method of estimating and compensating for motion in image decoding is greater than the effect of the method of estimating and compensating for motion in image encoding, in terms of calculation time.
[111] The present invention can be applied to video codecs based on temporal motion estimation, or to all methods and apparatuses such as mobile phones, camcorders, digital cameras, Portable Multimedia Players (PMPs), next-generation Digital Video Discs (DVDs), software video codecs, and the like which are capable of using the video codecs.
[112] Exemplary embodiments of the present invention can be written as computer programs and can be implemented in general-use digital computers that execute the programs using a computer readable recording medium and other media. In addition, a data structure used in the embodiments of the present invention can be written in a computer readable recording medium through various means. Examples of the computer readable recording medium include magnetic storage media (e.g., ROM, floppy disks, hard disks, etc.) and optical recording media (e.g., CD-ROMs or DVDs). An example of other media is carrier waves (e.g., transmission through the Internet).
[113] While this invention has been particularly shown and described with reference to exemplary embodiments thereof, it will be understood by those of ordinary skill in the art that various changes in form and details may be made therein without departing from the spirit and scope of the invention as defined by the appended claims. The exemplary embodiments should be considered in a descriptive sense only and not for purposes of limitation. Therefore, the scope of the invention is defined not by the detailed description of the invention but by the appended claims, and all differences within the scope will be construed as being included in the present invention.

Claims
[1] 1. A method of estimating and compensating for motion in image decoding, the method comprising: determining a reference block of a reference frame indicated by a motion vector of a current block of a current frame being decoded; and generating a spatiotemporal estimation block of the current block by using the current frame and the reference frame, when some pixels of the reference block are outside the reference frame.
[2] 2. The method of claim 1, wherein, when a block region in a same position as a region included in the reference frame is defined to be a first block region, wherein the block region is a part of the reference block, and a block region in a same position as a region excluded from the reference frame is defined to be a second block region, wherein the block region is a part of the reference block, the generating of the spatiotemporal estimation block comprises: determining the first block region of the reference block to be an estimation region of the first block region of the spatiotemporal estimation block; and generating an estimation region of the second block region of the spatiotemporal estimation block by using at least one of pixel values of neighboring blocks of the current block, and pixel values of an estimation region of the first block region.
[3] 3. The method of claim 1, wherein, when a block region in a same position as a region included in the reference frame is defined to be a first block region, wherein the block region is a part of the reference block, and a block region in a same position as a region excluded from the reference frame is defined to be a second block region, wherein the block region is a part of the reference block, the generating of the spatiotemporal estimation block comprises: determining pixels of the neighboring blocks of the current block and pixels surrounding the second block region to be a search pattern region, wherein the neighboring blocks are from among regions reconstructed prior to the current block, and the pixels surrounding the second block region are part of the first block region of the spatiotemporal estimation block, when the spatiotemporal estimation block is allocated to the current block; searching for an outside estimation region surrounded by a similar pattern region having a minimum difference compared with the search pattern region in the regions reconstructed prior to the current block; and determining the outside estimation region to be an estimation region of the second block region of the spatiotemporal estimation block.
[4] 4. The method of claim 3, wherein the searching for the outside estimation region comprises searching for the similar pattern region having a same form as the search pattern region in the regions reconstructed prior to the current block, and having a minimum SAD (Sum of Absolute Differences) between pixel values of the search pattern region and pixel values of the similar pattern region.
[5] 5. The method of claim 3, wherein the determining of the estimation region of the second block region of the spatiotemporal estimation block comprises: selecting at least one of a pixel line of the neighboring blocks of the current block and a pixel line surrounding the second block region, wherein the pixel line surrounding the second block region is a part of the first block region of the spatiotemporal estimation block, when the spatiotemporal estimation block is allocated to the current block; and determining pixels of the selected pixel line to be estimation pixels related to pixels of the second block region located either vertically or diagonally to the selected pixel line.
[6] 6. The method of claim 3, wherein the determining of the estimation region of the second block region of the spatiotemporal estimation block comprises determining an average value between pixel values of the neighboring blocks of the current block and pixel values surrounding the second block region to be an estimation pixel value of the second block region, wherein the pixel values surrounding the second block region are a part of the first block region of the spatiotemporal estimation block, when the spatiotemporal estimation block is allocated to the current block.
[7] 7. A method of estimating and compensating for motion in image encoding, the method comprising: searching for a reference block having a minimum difference compared with a current block in a reference frame, calculating a motion vector, and thereby performing motion estimation; generating an estimation block for the current block by using the current frame and the reference frame, when some pixels of the reference block are outside the reference frame; and encoding an image by using the estimation block and the motion vector.
[8] 8. The method of claim 7, wherein, when a block region in a same position as a region included in the reference frame is defined to be a first block region, wherein the block region is a part of the reference block, and a block region in a same position as a region excluded from the reference frame is defined to be a second block region, wherein the block region is a part of the reference block, the generating of the estimation block comprises: determining the first block region of the reference block to be an estimation region of the first block region of the estimation block; and generating an estimation region of the second block region of the estimation block by using at least one of pixels of neighboring blocks of the current block and pixels of an estimation region of the first block region.
[9] 9. The method of claim 8, wherein the generating of the estimation region of the second block region of the estimation block comprises: determining pixels of the neighboring blocks of the current block and pixels surrounding the second block region to be a search pattern region, wherein the neighboring blocks are from among regions reconstructed prior to the current block, and the pixels surrounding the second block region are part of the first block region of the estimation block, when the estimation block is allocated to the current block; searching for an outside estimation region surrounded by a similar pattern region having a minimum difference compared with the search pattern region in the regions reconstructed prior to the current block; and determining the outside estimation region to be an estimation region of the second block region of the estimation block.
[10] 10. The method of claim 9, wherein the searching for the outside estimation region comprises searching for the similar pattern region having a same form as the search pattern region in the regions reconstructed prior to the current block, and having a minimum SAD between pixel values of the search pattern region and pixel values of the similar pattern region.
[11] 11. The method of claim 8, wherein the generating of the estimation region of the second block region of the estimation block comprises: selecting at least one of a pixel line of the neighboring blocks of the current block and a pixel line surrounding the second block region, wherein the pixel line surrounding the second block region is a part of the first block region of the estimation block, when the estimation block is allocated to the current block; and determining pixels of the selected pixel line to be estimation pixels related to pixels of the second block region located either vertically or diagonally to the selected pixel line.
[12] 12. The method of claim 8, wherein the generating of the estimation region of the second block region of the estimation block comprises determining an average value between pixel values of the neighboring blocks of the current block and pixel values surrounding the second block region to be an estimation pixel value of the second block region, wherein the pixel values surrounding the second block region are a part of the first block region of the estimation block, when the estimation block is allocated to the current block.
[13] 13. A motion estimation and compensation apparatus in image decoding, the motion estimation and compensation apparatus comprising: a reference block determining unit which determines a reference block of a reference frame for a current block of a current frame being decoded; and a spatiotemporal estimation block generation unit which generates a spatiotemporal estimation block of the current block by using the current frame and the reference frame, when some pixels of the reference block are outside the reference frame.
[14] 14. The motion estimation and compensation apparatus of claim 13, wherein, when a block region in a same position as a region included in the reference frame is defined to be a first block region, wherein the block region is a part of the reference block, and a block region in a same position as a region excluded from the reference frame is defined to be a second block region, wherein the block region is a part of the reference block, the spatiotemporal estimation block generation unit comprises: a first block region determination unit which determines the first block region of the reference block to be an estimation region of the first block region of the spatiotemporal estimation block; and a second block region determination unit which determines pixels of neighboring blocks of the current block and pixels surrounding the second block region to be a search pattern region, wherein the pixels surrounding the second block region are part of the first block region of the spatiotemporal estimation block, and determines a region corresponding to the second block region of the spatiotemporal estimation block in the current frame by using the determined search pattern region, when the spatiotemporal estimation block is allocated to the current block.
[15] 15. The motion estimation and compensation apparatus of claim 13, wherein, when a block region in a same position as a region included in the reference frame is defined to be a first block region, wherein the block region is a part of the reference block, and a block region in a same position as a region excluded from the reference frame is defined to be a second block region, wherein the block region is a part of the reference block, the spatiotemporal estimation block generation unit comprises: a first block region determination unit which determines the first block region of the reference block to be an estimation region of the first block region of the spatiotemporal estimation block; and a second block region determination unit which selects at least one of a pixel line of the neighboring blocks of the current block and a pixel line surrounding the second block region, wherein the pixel line surrounding the second block region is a part of the first block region of the spatiotemporal estimation block, and determines an estimation region of the second block region of the spatiotemporal estimation block in the current frame by using the selected pixel line, when the spatiotemporal estimation block is allocated to the current block.
[16] 16. A motion estimation and compensation apparatus in image encoding, the motion estimation and compensation apparatus comprising: a motion estimation performing unit which searches for a reference block having a minimum difference compared with a current block in a reference frame, calculates a motion vector, and thereby performs motion estimation; an estimation block generation unit which generates an estimation block for the current block by using the current frame and the reference frame, when some pixels of the reference block are outside the reference frame; and an encoding unit which encodes an image by using the estimation block and the motion vector.
[17] 17. The motion estimation and compensation apparatus of claim 16, wherein, when a block region in a same position as a region included in the reference frame is defined to be a first block region, wherein the block region is a part of the reference block, and a block region in a same position as a region excluded from the reference frame is defined to be a second block region, wherein the block region is a part of the reference block, the estimation block generation unit comprises: a first block region determination unit which determines the first block region of the reference block to be an estimation region of the first block region of the estimation block; and a second block region determination unit which determines pixels of neighboring blocks of the current block and pixels surrounding the second block region to be a search pattern region, wherein the pixels surrounding the second block region are part of the first block region of the estimation block, and determines a region corresponding to the second block region of the estimation block in the current frame by using the determined search pattern region, when the estimation block is allocated to the current block.
[18] 18. The motion estimation and compensation apparatus of claim 16, wherein, when a block region in a same position as a region included in the reference frame is defined to be a first block region, wherein the block region is a part of the reference block, and a block region in a same position as a region excluded from the reference frame is defined to be a second block region, wherein the block region is a part of the reference block, the estimation block generation unit comprises: a first block region determination unit which determines the first block region of the reference block to be an estimation region of the first block region of the estimation block; and a second block region determination unit which selects at least one of a pixel line of the neighboring blocks of the current block and a pixel line surrounding the second block region, wherein the pixel line surrounding the second block region is a part of the first block region of the estimation block, and determines an estimation region of the second block region of the estimation block in the current frame by using the selected pixel line, when the estimation block is allocated to the current block.
[19] 19. A computer readable recording medium having recorded thereon a program for executing the method of claim 1.
[20] 20. A computer readable recording medium having recorded thereon a program for executing the method of claim 7.
PCT/KR2008/002274 2007-08-28 2008-04-23 Method and apparatus for estimating and compensating spatiotemporal motion of image WO2009028780A1 (en)

Priority Applications (2)

Application Number Priority Date Filing Date Title
CN2008801050937A CN101796844B (en) 2007-08-28 2008-04-23 Method and apparatus for estimating and compensating spatiotemporal motion of image
EP08741515A EP2186341A4 (en) 2007-08-28 2008-04-23 Method and apparatus for estimating and compensating spatiotemporal motion of image

Applications Claiming Priority (2)

Application Number Priority Date Filing Date Title
KR10-2007-0086549 2007-08-28
KR1020070086549A KR101396365B1 (en) 2007-08-28 2007-08-28 Method and apparatus for spatiotemporal motion estimation and motion compensation of video

Publications (1)

Publication Number Publication Date
WO2009028780A1 true WO2009028780A1 (en) 2009-03-05

Family

ID=40387466

Family Applications (1)

Application Number Title Priority Date Filing Date
PCT/KR2008/002274 WO2009028780A1 (en) 2007-08-28 2008-04-23 Method and apparatus for estimating and compensating spatiotemporal motion of image

Country Status (5)

Country Link
US (2) US8229233B2 (en)
EP (1) EP2186341A4 (en)
KR (1) KR101396365B1 (en)
CN (1) CN101796844B (en)
WO (1) WO2009028780A1 (en)

Families Citing this family (10)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US6765964B1 (en) 2000-12-06 2004-07-20 Realnetworks, Inc. System and method for intracoding video data
US8582882B2 (en) * 2002-02-06 2013-11-12 Koninklijke Philipse N.V. Unit for and method of segmentation using average homogeneity
US8462852B2 (en) * 2009-10-20 2013-06-11 Intel Corporation Methods and apparatus for adaptively choosing a search range for motion estimation
US9654792B2 (en) * 2009-07-03 2017-05-16 Intel Corporation Methods and systems for motion vector derivation at a video decoder
US8917769B2 (en) * 2009-07-03 2014-12-23 Intel Corporation Methods and systems to estimate motion based on reconstructed reference frames at a video decoder
US20110002387A1 (en) * 2009-07-03 2011-01-06 Yi-Jen Chiu Techniques for motion estimation
EP2656610A4 (en) 2010-12-21 2015-05-20 Intel Corp System and method for enhanced dmvd processing
WO2013009104A2 (en) 2011-07-12 2013-01-17 한국전자통신연구원 Inter prediction method and apparatus for same
US9325581B2 (en) * 2013-04-02 2016-04-26 International Business Machines Corporation Context-aware management of applications at the edge of a network
US10341682B2 (en) * 2016-01-19 2019-07-02 Peking University Shenzhen Graduate School Methods and devices for panoramic video coding and decoding based on multi-mode boundary fill

Family Cites Families (11)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
KR100249223B1 (en) * 1997-09-12 2000-03-15 구자홍 Method for motion vector coding of mpeg-4
EP0986262A3 (en) 1998-08-31 2004-04-07 Texas Instruments Incorporated Unrestricted motion compensation
US6950469B2 (en) 2001-09-17 2005-09-27 Nokia Corporation Method for sub-pixel value interpolation
US20040001546A1 (en) 2002-06-03 2004-01-01 Alexandros Tourapis Spatiotemporal prediction for bidirectionally predictive (B) pictures and motion vector prediction for multi-picture reference motion compensation
US7822123B2 (en) 2004-10-06 2010-10-26 Microsoft Corporation Efficient repeat padding for hybrid video sequence with arbitrary video resolution
KR100624304B1 (en) 2004-06-04 2006-09-18 주식회사 대우일렉트로닉스 Apparatus and method for de-interlacing adaptively field image by using motion
US7623682B2 (en) 2004-08-13 2009-11-24 Samsung Electronics Co., Ltd. Method and device for motion estimation and compensation for panorama image
KR100677142B1 (en) * 2004-08-13 2007-02-02 경희대학교 산학협력단 Motion estimation and compensation for panorama image
KR100716992B1 (en) * 2005-02-04 2007-05-10 삼성전자주식회사 Method for encoding and decoding of stereo video, and apparatus thereof
FR2881898A1 (en) 2005-02-10 2006-08-11 Thomson Licensing Sa METHOD AND DEVICE FOR CODING A VIDEO IMAGE IN INTER OR INTRA MODE
KR20060123939A (en) * 2005-05-30 2006-12-05 삼성전자주식회사 Method and apparatus for encoding and decoding video

Patent Citations (4)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US6026195A (en) * 1997-03-07 2000-02-15 General Instrument Corporation Motion estimation and compensation of video object planes for interlaced digital video
US20050013362A1 (en) * 2003-07-15 2005-01-20 Lsi Logic Corporation Supporting motion vectors outside picture boundaries in motion estimation process
EP1585326A1 (en) * 2004-03-30 2005-10-12 Matsushita Electric Industrial Co., Ltd. Motion vector estimation at image borders for frame rate conversion
KR20070086549A (en) 2004-11-22 2007-08-27 미드 테크놀로지 코오포레이션 Fluid-activated shaft seal

Non-Patent Citations (1)

* Cited by examiner, † Cited by third party
Title
See also references of EP2186341A4 *

Cited By (4)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
EP2497273A4 (en) * 2009-11-05 2016-07-27 Ericsson Telefon Ab L M Prediction of pixels in image coding
US9781446B2 (en) 2009-12-10 2017-10-03 Thomson Licensing Dtv Method for coding and method for decoding a block of an image and corresponding coding and decoding devices
US20200092576A1 (en) * 2018-09-14 2020-03-19 Google Llc Motion prediction coding with coframe motion vectors
US11665365B2 (en) * 2018-09-14 2023-05-30 Google Llc Motion prediction coding with coframe motion vectors

Also Published As

Publication number Publication date
EP2186341A1 (en) 2010-05-19
KR20090021758A (en) 2009-03-04
CN101796844A (en) 2010-08-04
EP2186341A4 (en) 2012-04-11
CN101796844B (en) 2012-08-22
US20090060359A1 (en) 2009-03-05
US20120263239A1 (en) 2012-10-18
US8229233B2 (en) 2012-07-24
KR101396365B1 (en) 2014-05-30

Similar Documents

Publication Publication Date Title
US8229233B2 (en) Method and apparatus for estimating and compensating spatiotemporal motion of image
US10958933B2 (en) Image encoding/decoding apparatus and method
JP7328337B2 (en) Video processing method and apparatus
JP5580453B2 (en) Direct mode encoding and decoding apparatus
US8503532B2 (en) Method and apparatus for inter prediction encoding/decoding an image using sub-pixel motion estimation
US9113110B2 (en) Method and apparatus for estimating motion vector using plurality of motion vector predictors, encoder, decoder, and decoding method
US7580456B2 (en) Prediction-based directional fractional pixel motion estimation for video coding
JP5406222B2 (en) Video coding and decoding method and apparatus using continuous motion estimation
JP5976197B2 (en) Method for processing one or more videos of a 3D scene
US20120121020A1 (en) Motion image encoding apparatus, motion image decoding apparatus, motion image encoding method, motion image decoding method, motion image encoding program, and motion image decoding program
US20090220005A1 (en) Method and apparatus for encoding and decoding image by using multiple reference-based motion prediction
JP2008167449A (en) Method and apparatus for encoding/decoding image
EP2380354A1 (en) Video processing method and apparatus with residue prediction
JP4898415B2 (en) Moving picture coding apparatus and moving picture coding method
US10075691B2 (en) Multiview video coding method using non-referenced view video group
CN112449180A (en) Encoding and decoding method, device and equipment
Kim et al. Multilevel Residual Motion Compensation for High Efficiency Video Coding
KR101567990B1 (en) Method and Apparatus for Encoding and Decoding Motion Vector
KR20140026579A (en) Method and apparatus for encoding and decoding motion vector
JP2006136011A (en) Image encoding method and apparatus, and decoding method and apparatus
US20160366434A1 (en) Motion estimation apparatus and method
JP2008283302A (en) Motion image processor and motion image processing method

Legal Events

Date Code Title Description
WWE Wipo information: entry into national phase

Ref document number: 200880105093.7

Country of ref document: CN

121 Ep: the epo has been informed by wipo that ep was designated in this application

Ref document number: 08741515

Country of ref document: EP

Kind code of ref document: A1

WWE Wipo information: entry into national phase

Ref document number: 2008741515

Country of ref document: EP

NENP Non-entry into the national phase

Ref country code: DE