EP2232869A1 - Motion encoding without the transmission of motion information and motion decoding - Google Patents
Motion encoding without the transmission of motion information and motion decoding
- Publication number
- EP2232869A1 (application EP08858284A)
- Authority
- EP
- European Patent Office
- Prior art keywords
- motion
- information
- candidate
- image portion
- current image
- Prior art date
- Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
- Ceased
Classifications
-
- H—ELECTRICITY
- H04—ELECTRIC COMMUNICATION TECHNIQUE
- H04N—PICTORIAL COMMUNICATION, e.g. TELEVISION
- H04N19/00—Methods or arrangements for coding, decoding, compressing or decompressing digital video signals
-
- H—ELECTRICITY
- H04—ELECTRIC COMMUNICATION TECHNIQUE
- H04N—PICTORIAL COMMUNICATION, e.g. TELEVISION
- H04N19/00—Methods or arrangements for coding, decoding, compressing or decompressing digital video signals
- H04N19/50—Methods or arrangements for coding, decoding, compressing or decompressing digital video signals using predictive coding
- H04N19/503—Methods or arrangements for coding, decoding, compressing or decompressing digital video signals using predictive coding involving temporal prediction
- H04N19/51—Motion estimation or motion compensation
-
- H—ELECTRICITY
- H04—ELECTRIC COMMUNICATION TECHNIQUE
- H04N—PICTORIAL COMMUNICATION, e.g. TELEVISION
- H04N19/00—Methods or arrangements for coding, decoding, compressing or decompressing digital video signals
- H04N19/44—Decoders specially adapted therefor, e.g. video decoders which are asymmetric with respect to the encoder
Definitions
- the present invention relates to image coding and decoding techniques and more particularly to the encoding and decoding of motion.
- the current image portion is represented by luminance and chrominance information and by motion information between that current image portion and a reference image portion. This makes it possible to exploit the temporal correlations between images.
- the motion information includes one or more components such as a translational motion vector, a rotation, a pixel component and a sub-pixel component.
- the encoding generally includes determining information for the reconstruction of the motion components. For example, a motion vector is predicted, and the reconstruction information includes the identifier of the prediction function used as well as a residue corresponding to the difference between the prediction and the original motion vector.
- the information for the reconstruction also includes other information of various types that makes it possible to obtain the motion components at the decoder.
- this related information, called "side information", includes size parameters, identifiers of the functions to use, and the like.
- This information for the reconstruction is transmitted to the decoder.
- the decoder reconstructs the different components of the estimated motion, i.e. it applies the inverse prediction and combines the result with the residue and the related information.
- the decoder then applies the motion components to the reference image portion and combines the result with the luminance and chrominance information to obtain the decoded picture.
- the motion components are information whose size increases with the resolution of the images and the information for the reconstruction of these components occupies a significant part of the bandwidth.
- An object of the present invention is to improve this situation by making it possible to transmit motion information using less bandwidth.
- an object of the present invention is a method of encoding at least a portion of a current image, comprising: a motion estimation between the current image portion and a plurality of candidate image portions to form corresponding motion components; an evaluation of a performance criterion for each candidate image portion; a selection of a reference image portion from the candidate image portions using said performance criteria; and a coding of information in an output stream to a decoder; characterized in that, for at least one motion component, the motion estimation, the evaluation of the performance criteria and the selection of the reference image portion use only information considered as available at said decoder, and no motion information is inserted into said output stream.
- another object of the invention is a method of decoding at least a portion of a current image, characterized in that it comprises: a reception of a stream of coding information for the current image portion; a motion estimation for the current image portion from a plurality of candidate reference image portions and already decoded image portions, to form corresponding motion components; an evaluation of a performance criterion for each candidate image portion; a selection from said candidate image portions using said criteria to obtain said reference image portion; and a decoding of the current image portion from said reference portion and the corresponding motion components.
- the encoder uses only information available at the decoder, and the decoder reproduces the calculations made at the encoder. Consequently, for at least one motion component, no reconstruction information for that component needs to be transmitted, which saves bandwidth.
- said estimation delivers several motion components and the method further comprises a determination of information for the reconstruction of at least one of said motion components and the insertion of this reconstruction information into the output stream.
- the decoding method comprises receiving the reconstruction information and corresponding reconstruction of a motion component.
- the motion estimation and the evaluation of the performance criteria use correlations between fragments of the candidate image portions and corresponding fragments that are neighbors of, and distinct from, the current image portion.
- in a variant, the motion estimation and the evaluation of the performance criteria use correlations between neighboring and distinct fragments of the candidate image portions and the corresponding fragments that are neighboring and distinct from the current image portion.
- the invention also relates to computer programs and devices corresponding to these coding and decoding methods.
- FIG. 1 is a diagram showing two stations in communication respectively provided with an encoder and a video decoder
- FIG. 2 is a flowchart of the coding method according to one embodiment of the invention
- FIGS. 3A, 3B and 3C show various embodiments of a particular step of the coding method
- FIG. 4 is a flowchart of the decoding method according to one embodiment of the invention.
- the invention is applicable to any system for coding and decoding images, such as, for example, the system shown with reference to FIG. 1.
- This system allows the coding of a video sequence of a digital television stream F and its transmission from a transmitter 1 containing a video encoder 2 comprising a controller or computer 3 associated with a memory 4.
- the transmission is provided to a receiving station 5 containing a decoder 6 also comprising a controller or computer 7 and a memory 8 .
- the transmitter 1 comprises an antenna transmitting on a digital television radio channel a data stream ⁇ in a format such as the so-called DVB format and the station 5 is a personal computer.
- the method of the invention will now be described at the coding level.
- the encoder 2 receives the stream F of image data of a video sequence. This stream is processed to encode each image or image portion.
- image portion generally refers to an element of the video sequence. Depending on the standards used, the term image may be replaced by the term frame and the term portion by the term block or macroblock.
- the coding method begins with a motion estimation step 10 between the current image portion and candidate image portions.
- the candidate image portions are, for example, the portions corresponding to or close to the portion of the current image in a previous image.
- Motion components such as motion vectors are obtained through estimation 10 for each candidate image portion.
- motion estimation implements only information considered as available to the decoder.
- information considered available at the decoder means information which the encoder estimates or knows to have been decoded. This is information previously transmitted for which no error message has been received or for which a positive acknowledgment of receipt has been received, or information stored in both the encoder and the decoder.
- the motion estimation is based on a technique of pairing or correlation of blocks or lines ("block/line matching").
- a motion vector Vmvt is estimated by considering blocks or lines L of the candidate image portion that are found in portions distinct from, but neighboring, the portion to be coded CODE in the current picture I(N).
- These distinct and adjacent portions are image portions already transmitted and which the encoder believes are available at the decoder.
- the motion estimation is based on correlations ("pattern matching") between contours of image fragments belonging to the candidate image portions CAND of a previous image and the same contours on fragments of the current image distinct from and adjacent to the image portion to be coded CODE.
- motion estimation algorithms for obtaining motion vectors from these data are known and will not be described in detail here.
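As a rough illustration of the line-matching idea of FIGS. 3A to 3C, the following Python sketch estimates a motion vector by correlating the already decoded lines just above the block with shifted lines of the previous image. The image layout, search range and SAD cost are assumptions chosen for the example, not the patent's exact algorithm:

```python
# Sketch, under assumptions, of decoder-reproducible line matching:
# the vector for the block to be coded is estimated from the decoded
# lines directly above it, so both encoder and decoder can run this
# without any motion information in the stream. The search window is
# assumed to stay inside the image bounds.

def sad(a, b):
    """Sum of absolute differences between two equal-length lines."""
    return sum(abs(x - y) for x, y in zip(a, b))

def estimate_vector(prev, cur, bx, by, bw, n_lines=2, search=2):
    """Find the (dx, dy) shift whose lines in `prev` best match the
    already-decoded lines just above the current block at (bx, by)."""
    template = [cur[by - k][bx:bx + bw] for k in range(1, n_lines + 1)]
    best, best_cost = (0, 0), float("inf")
    for dy in range(-search, search + 1):
        for dx in range(-search, search + 1):
            cand = [prev[by - k + dy][bx + dx:bx + dx + bw]
                    for k in range(1, n_lines + 1)]
            cost = sum(sad(t, c) for t, c in zip(template, cand))
            if cost < best_cost:
                best, best_cost = (dx, dy), cost
    return best, best_cost

# Toy images: `cur` is `prev` shifted right by one pixel, so the
# template lines above the block match `prev` at a shift of (-1, 0).
prev = [[(x * 13 + y * 29) % 101 for x in range(10)] for y in range(10)]
cur = [[prev[y][x - 1] if x >= 1 else 0 for x in range(10)] for y in range(10)]
assert estimate_vector(prev, cur, 4, 4, 3) == ((-1, 0), 0)
```

Because the template consists only of lines already transmitted, the decoder can rerun exactly this search and obtain the same vector.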
- the motion components comprise a translation vector and a rotation. At the end of the motion estimation step 10, motion components are obtained for each candidate image portion.
- the motion estimation is followed by the determination of information for the reconstruction of each motion component.
- this is, for example, the determination of the most suitable prediction functions, of a residue representing the difference between the image portion to be encoded and a candidate image portion transformed by the prediction function, or of so-called "side information".
- This information for reconstruction is calculated in a conventional manner per se.
- the method then comprises a step 14 of evaluating a performance criterion for each candidate image portion. This evaluation implements a criterion using only information considered as available at the decoder.
- the performance criterion cannot be based on the distortion of the current image portion because that portion is not available at the decoder.
- a usable performance criterion is the sum of absolute differences between the lines L on the previous image and the corresponding lines on the current image.
- another usable performance criterion is the same sum of absolute differences between the lines, weighted by contour information.
- a performance criterion is calculated for each portion of candidate image from information available at the decoder.
- the information for the reconstruction is also available and can be used for calculation of the performance criterion.
- the reconstruction information used must itself be available at the decoder. For example, the residue cannot be used.
- the method then comprises a selection 16 for retaining a reference image portion from among the candidate image portions. This selection is based on the performance criteria previously obtained. Like the motion estimation 10 and the performance evaluation 14, the selection 16 uses only information considered as available at the decoder. In particular, the selection does not use the residue information.
- the selection consists in retaining the portion of candidate image with the best performance evaluation.
- the selection is based on the performance evaluation weighted by the available bandwidth. This makes it possible to select the best compromise between image quality and bit rate.
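The evaluation step 14 and selection step 16 can be sketched as follows. The SAD scores, the rate estimates and the linear rate weighting are illustrative stand-ins for whatever decoder-reproducible criterion the system actually uses:

```python
# Sketch, under assumptions, of candidate selection: each candidate
# image portion carries a performance score computed only from decoded
# data (e.g. the SAD of the matching step), optionally weighted by an
# estimated rate, and the best-scoring candidate becomes the reference
# portion. The lambda-style weighting is a classic rate-distortion
# compromise, shown here for illustration.

def select_reference(candidates, rate_weight=0.0):
    """candidates: list of (portion_id, sad_cost, estimated_rate_bits).
    Returns the id minimising sad_cost + rate_weight * rate."""
    def score(cand):
        _, sad_cost, rate = cand
        return sad_cost + rate_weight * rate
    return min(candidates, key=score)[0]

candidates = [("cand_a", 120, 40), ("cand_b", 90, 200), ("cand_c", 100, 50)]
# With no rate weighting, the lowest SAD wins ...
assert select_reference(candidates) == "cand_b"
# ... but weighting by available bandwidth can change the compromise.
assert select_reference(candidates, rate_weight=0.5) == "cand_c"
```

Note that every input to `score` must be computable at the decoder; in particular the residue of the current portion, which only the encoder knows, cannot appear in it.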
- a reference image portion is thus available as well as the associated motion components and information for the reconstruction of these components.
- the coding method then comprises a step 18 of encoding non-motion information such as chrominance and luminance.
- this step 18 uses the motion components or reconstruction information to encode information such as luminance.
- the motion vector associated with the reference image portion is used. This step 18 is implemented in a conventional manner and will not be described in more detail.
- finally, the method comprises a step 20 of emitting the stream ⁇ to the decoder.
- none of the information for the reconstruction of the motion components is inserted into the output stream.
- neither the prediction residues of the motion vectors nor the identifiers of the reference image portion are inserted into the output stream, so that a significant fraction of the bandwidth is freed.
- the information relating to the coding of the other parameters of the image is inserted in this output stream in a conventional manner during this step 20.
- the fact that the motion estimation, the evaluation of the performance criteria and the selection of a reference image portion use only information considered as available at the decoder makes it possible not to transmit the information for the reconstruction of the motion components. As a result, the transmission of motion information requires less bandwidth.
- the stream ⁇ is received by the decoder.
- the stream contains no information for the reconstruction of the motion components.
- the decoding method begins with a motion estimation step 24 for the current image portion. This estimation is carried out identically to that carried out during the coding described with reference to step 10 of FIG. 2.
- the motion estimation uses the information already decoded to evaluate a motion vector for each candidate image portion and the current image portion.
- the line correlation and edge correlation techniques described above with reference to FIGS. 3A to 3C are directly applicable.
- the motion estimation method used at the decoder in step 24 must be the same as that used at the encoder in step 10.
- the motion estimation 24 delivers, for each portion of candidate reference image, motion components such as motion vectors.
- the decoding method then comprises a step 26 of evaluating a performance criterion for each candidate image portion. This evaluation is carried out according to the same methods as the evaluation 14 implemented during the coding. Accordingly, a performance criterion is associated with each candidate image portion.
- the motion estimation 24, the evaluation of the performance criteria 26 and the selection 28 are therefore performed identically at the encoder and at the decoder. This is made possible by the fact that these operations use only information considered as available at the decoder.
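The symmetry between steps 10/14/16 at the encoder and steps 24/26/28 at the decoder can be made concrete with a toy sketch: because the shared routine reads only decoded data and is deterministic, both sides reach the same reference portion and vector without any motion information in the stream. All functions and values below are illustrative stand-ins:

```python
# Sketch of the encoder/decoder symmetry, under assumptions: the same
# deterministic estimate/evaluate/select routine is run on the same
# decoded data at both ends, so the results necessarily agree.

def derive_motion(decoded_data, candidates, estimate, evaluate):
    """Shared routine: score every candidate from decoded data only and
    keep the best one, together with its motion vector."""
    scored = []
    for cand in candidates:
        vector = estimate(decoded_data, cand)
        scored.append((evaluate(decoded_data, cand, vector), cand, vector))
    _, best_cand, best_vector = min(scored)
    return best_cand, best_vector

def estimate(decoded_data, cand):
    # Toy stand-in for the template-matching estimation (step 10 / 24).
    return (cand % 3, cand % 2)

def evaluate(decoded_data, cand, vector):
    # Toy stand-in for the performance criterion (step 14 / 26).
    return abs(decoded_data - cand)

decoded_state = 7                  # identical decoded state on both sides
candidates = [3, 6, 9]
encoder_choice = derive_motion(decoded_state, candidates, estimate, evaluate)
decoder_choice = derive_motion(decoded_state, candidates, estimate, evaluate)
assert encoder_choice == decoder_choice   # nothing to transmit
```

The design constraint this illustrates is the one stated above: any quantity fed to `estimate` or `evaluate` that exists only at the encoder (such as the residue of the current portion) would break the symmetry.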
- the decoder thus has a reference image portion and associated motion components without it being necessary to transmit any of the information required for the reconstruction of the motion components.
- the decoding method finally comprises a step 30 of decoding the current image portion.
- the reference image portion is transformed by the associated motion components. This therefore makes it possible to obtain the motion information of the current image portion.
- the other parameters of the current image portion such as the luminance and chrominance information, are obtained in a conventional manner. Obtaining this information requires, in some embodiments, the use of the motion vector and the reference image portion as identified in steps 24 to 28.
- the use of the same motion estimation, the same evaluation of the performance criteria and the same selection at the encoder and at the decoder makes it unnecessary to transmit motion information between them.
- other embodiments can be envisaged.
- other techniques of motion estimation, evaluation of performance criteria and selection are usable.
- techniques known as "error concealment" or "inpainting" can also be implemented. In all cases, these techniques must use only data considered available at the decoder.
- the step 12 of determining the information necessary for the reconstruction is not carried out during the coding.
- Such an embodiment makes it possible to reduce the calculations at the encoder.
- the embodiment presented with reference to FIG. 2 keeps the conventional order of operations and limits the changes to be made to existing equipment. In addition, it allows a performance evaluation that uses the reconstruction information.
- the method is applied to only a part of the motion components.
- part of the motion components is encoded in a conventional manner, possibly using information available only at the encoder level.
- the reconstruction information is transmitted.
- for other components, it is useless to transmit the reconstruction information. For example, the information for the reconstruction of the rotation component is transmitted while that for the translation component is not.
- similarly, the information for the reconstruction of a pixel component is transmitted while that relating to a sub-pixel component is not.
- an identifier of the matching method used during the motion estimation is transmitted to the decoder.
- the decoder uses only information that is available at its level but it receives an identifier of the method to be used.
- the candidate image portions are in the same image as the image portion to be encoded and decoded.
Applications Claiming Priority (2)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
FR0708314 | 2007-11-28 | ||
PCT/FR2008/052087 WO2009071803A1 (en) | 2007-11-28 | 2008-11-19 | Motion encoding without the transmission of motion information and motion decoding |
Publications (1)
Publication Number | Publication Date |
---|---|
EP2232869A1 true EP2232869A1 (en) | 2010-09-29 |
Family
ID=39712715
Family Applications (1)
Application Number | Title | Priority Date | Filing Date |
---|---|---|---|
EP08858284A Ceased EP2232869A1 (en) | 2007-11-28 | 2008-11-19 | Motion encoding without the transmission of motion information and motion decoding |
Country Status (5)
Country | Link |
---|---|
US (1) | US8731045B2 (en) |
EP (1) | EP2232869A1 (en) |
JP (1) | JP5324594B2 (en) |
CN (1) | CN101919251B (en) |
WO (1) | WO2009071803A1 (en) |
Families Citing this family (3)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
JP5625512B2 (en) * | 2010-06-09 | 2014-11-19 | ソニー株式会社 | Encoding device, encoding method, program, and recording medium |
FR2980942A1 (en) | 2011-09-30 | 2013-04-05 | France Telecom | IMAGE ENCODING AND DECODING METHOD, IMAGE ENCODING AND DECODING DEVICE AND CORRESPONDING COMPUTER PROGRAMS |
FR3022724A1 (en) * | 2014-06-19 | 2015-12-25 | Orange | IMAGE ENCODING AND DECODING METHOD, IMAGE ENCODING AND DECODING DEVICE AND CORRESPONDING COMPUTER PROGRAMS |
Family Cites Families (9)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
TW283289B (en) * | 1994-04-11 | 1996-08-11 | Gen Instrument Corp | |
DE19541457C1 (en) * | 1995-11-07 | 1997-07-03 | Siemens Ag | Method for coding a video data stream of a video sequence consisting of picture blocks |
CN1748427A (en) * | 2003-02-04 | 2006-03-15 | 皇家飞利浦电子股份有限公司 | Predictive encoding of motion vectors including a flag notifying the presence of coded residual motion vector data |
US20040258147A1 (en) * | 2003-06-23 | 2004-12-23 | Tsu-Chang Lee | Memory and array processor structure for multiple-dimensional signal processing |
US7499495B2 (en) * | 2003-07-18 | 2009-03-03 | Microsoft Corporation | Extended range motion vectors |
US7898951B2 (en) * | 2003-08-13 | 2011-03-01 | Jones Farm Technology 2, Llc | Encoding and transmitting variable bit streams with utilization of a constrained bit-rate channel |
KR100586882B1 (en) * | 2004-04-13 | 2006-06-08 | 삼성전자주식회사 | Method and Apparatus for supporting motion scalability |
JP2007043651A (en) * | 2005-07-05 | 2007-02-15 | Ntt Docomo Inc | Dynamic image encoding device, dynamic image encoding method, dynamic image encoding program, dynamic image decoding device, dynamic image decoding method, and dynamic image decoding program |
US8908765B2 (en) * | 2007-11-15 | 2014-12-09 | General Instrument Corporation | Method and apparatus for performing motion estimation |
- 2008
- 2008-11-19 WO PCT/FR2008/052087 patent/WO2009071803A1/en active Application Filing
- 2008-11-19 US US12/744,544 patent/US8731045B2/en active Active
- 2008-11-19 EP EP08858284A patent/EP2232869A1/en not_active Ceased
- 2008-11-19 CN CN2008801252214A patent/CN101919251B/en active Active
- 2008-11-19 JP JP2010535429A patent/JP5324594B2/en active Active
Non-Patent Citations (1)
Title |
---|
STEFFEN KAMP ET AL: "Decoder Side Motion Vector Derivation", 82. MPEG MEETING; 22-10-2007 - 26-10-2007; SHENZHEN; (MOTION PICTURE EXPERT GROUP OR ISO/IEC JTC1/SC29/WG11),, no. M14917, 16 October 2007 (2007-10-16), XP030043523 * |
Also Published As
Publication number | Publication date |
---|---|
US8731045B2 (en) | 2014-05-20 |
JP2011505095A (en) | 2011-02-17 |
WO2009071803A1 (en) | 2009-06-11 |
CN101919251B (en) | 2013-07-10 |
JP5324594B2 (en) | 2013-10-23 |
US20100266046A1 (en) | 2010-10-21 |
CN101919251A (en) | 2010-12-15 |
Similar Documents
Publication | Publication Date | Title |
---|---|---|
EP0707428B1 (en) | Motion vector differential coding method with median prediction | |
EP0675652B1 (en) | Method and circuit for estimating motion between images of two interlaced fields, and digital signal coding devices comprising such a circuit | |
EP1834488A1 (en) | Video encoding method and device | |
EP3852371B1 (en) | Prediction of a movement vector of a current image partition pointing to a reference area that covers several reference image partitions, encoding and decoding using such a prediction | |
EP2377323B1 (en) | Image prediction by subdivision of causal regions of reference and coding using such prediction | |
EP1591962B1 (en) | Method and device for generating candidate vectors for image interpolation systems using motion estimation and compensation | |
EP2232869A1 (en) | Motion encoding without the transmission of motion information and motion decoding | |
WO2010146314A1 (en) | Encoding motion vectors using competition between predictors | |
EP2810436B1 (en) | Encoding and decoding by means of selective inheritance | |
EP1603341A1 (en) | Method and device for image interpolation systems using motion estimation and compensation | |
EP2410749A1 (en) | Method for adaptive encoding of a digital video stream, particularly for broadcasting over xDSL line | |
EP2761871B1 (en) | Decoder side motion estimation based on template matching | |
EP2736261A1 (en) | Method For Assessing The Quality Of A Video Stream | |
EP0858227B1 (en) | Process and device for coding by luminance estimation for digital video signals | |
FR2769784A1 (en) | FASHION CODING METHOD IN BINARY SHAPE CODING | |
EP3050298B1 (en) | Video encoding and decoding by inheritance of a field of motion vectors | |
WO2003053065A2 (en) | Method and device for compressing video-packet coded video data | |
WO2007003836A2 (en) | Video coding method and device | |
EP3350931B1 (en) | Optimised transmission of video data over a wireless network | |
FR3042368A1 (en) | MULTI-VIEW ENCODING AND DECODING METHOD, MULTI-VIEW ENCODING AND DECODING DEVICE AND CORRESPONDING COMPUTER PROGRAMS | |
Hofer et al. | H. 264 Compress-Then-Analyze Transmission in Edge-Assisted Visual SLAM | |
FR2905785A1 (en) | Image e.g. predictive type image, coding method for video compression application, involves calculating reference image for current image, by compensating movement of preceding image, for providing movement compensated reference image | |
FR3027481A1 (en) | DECODER, METHOD AND SYSTEM FOR DECODING MULTIMEDIA STREAMS | |
Ahmed et al. | Evaluating the Efficiency of Video Transmission Using a New Circular Search Algorithm Based on the Motion Estimation for a Single User | |
Lie et al. | Prescription-based error concealment technique for video transmission on error-prone channels |
Legal Events
Date | Code | Title | Description |
---|---|---|---|
 | PUAI | Public reference made under article 153(3) EPC to a published international application that has entered the European phase | Free format text: ORIGINAL CODE: 0009012 |
 | 17P | Request for examination filed | Effective date: 20100615 |
 | AK | Designated contracting states | Kind code of ref document: A1; Designated state(s): AT BE BG CH CY CZ DE DK EE ES FI FR GB GR HR HU IE IS IT LI LT LU LV MC MT NL NO PL PT RO SE SI SK TR |
 | AX | Request for extension of the European patent | Extension state: AL BA MK RS |
 | 17Q | First examination report despatched | Effective date: 20101027 |
 | DAX | Request for extension of the European patent (deleted) | |
 | APBK | Appeal reference recorded | Free format text: ORIGINAL CODE: EPIDOSNREFNE |
 | APBN | Date of receipt of notice of appeal recorded | Free format text: ORIGINAL CODE: EPIDOSNNOA2E |
 | APBR | Date of receipt of statement of grounds of appeal recorded | Free format text: ORIGINAL CODE: EPIDOSNNOA3E |
 | APAF | Appeal reference modified | Free format text: ORIGINAL CODE: EPIDOSCREFNE |
 | RAP1 | Party data changed (applicant data changed or rights of an application transferred) | Owner name: ORANGE |
 | REG | Reference to a national code | Ref country code: DE; Ref legal event code: R003 |
 | APBT | Appeal procedure closed | Free format text: ORIGINAL CODE: EPIDOSNNOA9E |
 | STAA | Information on the status of an EP patent application or granted EP patent | Free format text: STATUS: THE APPLICATION HAS BEEN REFUSED |
 | 18R | Application refused | Effective date: 20180222 |