EP2380350A1 - Systeme et procede de codage video - Google Patents

Systeme et procede de codage video (System and method for video coding)

Info

Publication number
EP2380350A1
Authority
EP
European Patent Office
Prior art keywords
module
current image
macroblocks
coding
zone
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Withdrawn
Application number
EP09768199A
Other languages
German (de)
English (en)
French (fr)
Inventor
Jean-Pierre Morard
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Sagemcom Broadband SAS
Original Assignee
Sagemcom Broadband SAS
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Sagemcom Broadband SAS filed Critical Sagemcom Broadband SAS
Publication of EP2380350A1
Legal status: Withdrawn

Classifications

    • HELECTRICITY
    • H04ELECTRIC COMMUNICATION TECHNIQUE
    • H04NPICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N21/00Selective content distribution, e.g. interactive television or video on demand [VOD]
    • H04N21/20Servers specifically adapted for the distribution of content, e.g. VOD servers; Operations thereof
    • H04N21/23Processing of content or additional data; Elementary server operations; Server middleware
    • H04N21/236Assembling of a multiplex stream, e.g. transport stream, by combining a video stream with other content or additional data, e.g. inserting a URL [Uniform Resource Locator] into a video stream, multiplexing software data into a video stream; Remultiplexing of multiplex streams; Insertion of stuffing bits into the multiplex stream, e.g. to obtain a constant bit-rate; Assembling of a packetised elementary stream
    • H04N21/2365Multiplexing of several video streams
    • HELECTRICITY
    • H04ELECTRIC COMMUNICATION TECHNIQUE
    • H04NPICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N19/00Methods or arrangements for coding, decoding, compressing or decompressing digital video signals
    • H04N19/50Methods or arrangements for coding, decoding, compressing or decompressing digital video signals using predictive coding
    • H04N19/503Methods or arrangements for coding, decoding, compressing or decompressing digital video signals using predictive coding involving temporal prediction
    • H04N19/51Motion estimation or motion compensation
    • HELECTRICITY
    • H04ELECTRIC COMMUNICATION TECHNIQUE
    • H04NPICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N19/00Methods or arrangements for coding, decoding, compressing or decompressing digital video signals
    • H04N19/40Methods or arrangements for coding, decoding, compressing or decompressing digital video signals using video transcoding, i.e. partial or full decoding of a coded input stream followed by re-encoding of the decoded output stream
    • HELECTRICITY
    • H04ELECTRIC COMMUNICATION TECHNIQUE
    • H04NPICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N19/00Methods or arrangements for coding, decoding, compressing or decompressing digital video signals
    • H04N19/60Methods or arrangements for coding, decoding, compressing or decompressing digital video signals using transform coding
    • H04N19/61Methods or arrangements for coding, decoding, compressing or decompressing digital video signals using transform coding in combination with predictive coding

Definitions

  • the encoding performed by the server is an encoding of the spatiotemporal type such as H264.
  • the H264 standard is a video coding standard developed jointly by the VCEG (Video Coding Experts Group) and MPEG (Moving Pictures Experts Group). This standard makes it possible to encode video streams with a bit rate about two times lower than that obtained with the MPEG2 standard for the same quality.
  • a spatio-temporal encoding fully encodes only part of the images to be transmitted in order to reconstitute a video.
  • the H264 standard contains the types of images known and defined in the MPEG2 standard, namely I (intra), P (predicted) and B (bidirectional) images.
  • a motion compensation module receiving motion vectors and providing at least one predicted zone, said coding system being characterized in that said data receiving module further receives a real motion vector of at least one displaced zone of said current image, said coding system comprising: means for allocating said real motion vector to the macroblocks belonging to said displaced zone;
  • macroblock denotes a rectangular elementary region of the image having a size of between 4x4 and 16x16 pixels (passing through 8x16, 8x8, etc.).
  • Each macroblock itself consists of luminance blocks and chrominance blocks.
  • Motion estimation in the context of spatiotemporal coding is an operation which requires very significant computing power.
  • the system according to the invention makes it possible to dispense with part of this estimation by advantageously making use of an already existing motion vector. Thanks to the invention, the provision of the motion vector relating to a zone (typically a rectangle within a frame) that has been displaced makes it possible not to calculate the motion vectors of the macroblocks located in such a displaced zone.
  • the real motion vector is injected directly at the input of the compensation module.
  • the system comprises means for transmitting only the macroblocks not belonging to said displaced zone to said motion vector estimation module; a subtracter for computing the difference between the pixels of the current image and the predicted zone and providing a residual error corresponding to this difference; a frequency transform module applying a frequency transform to each macroblock processed by said estimation module as well as to said residual error; a module for quantizing the data from said frequency transform module;
  • an entropic coder for coding data from said quantization module.
  • said video coding is a spatio-temporal H264 coding; the screen of said server is displayed by said client terminal according to an RFB ("Remote Frame Buffer") protocol such as the VNC ("Virtual Network Computing") protocol; said real motion vector of said displaced zone is determined in the following cases: horizontal or vertical scrolling of said displaced zone within a browser-type application; moving of a graphical window of the operating system of said server; transition from one slide to another in the case of a slide show; Flash-type animation.
  • said client terminal is a video decoder;
  • said current image as well as said real motion vector are initially coded in an RGB format and then undergo a transformation into a YUV format.
  • a motion estimation module 105 also hereinafter referred to as a motion vector estimation module
  • the reception module 101 receives as input a predictive image Fn.
  • Fn is the current image of the entire server screen.
  • the invention relates only to the coding of the predictive images, the intra-predictive coding of the images I continuing to be done according to known techniques. Thus, to make the diagram clearer, the means necessary for intra-predictive coding have been deliberately omitted.
  • the reception module 101 also receives as input information on the zones having undergone a displacement (also called displaced zones in the remainder of the description) in the image Fn.
  • the displaced zone is a rectangular zone generally represented by a quadruplet (x, y, l, h): x and y respectively represent the abscissa and the ordinate of the top-left point of the zone, l represents the width of the rectangle and h its height.
  • this real vector can be obtained by the server via the programming interfaces, also called APIs (Application Programming Interface), of its graphical environment, i.e. the graphical user interface (GUI, "Graphical User Interface") of the software application running on the server and used by the client terminal, or the operating system of the server (Windows™ for example).
  • API Application Programming Interface
  • GUI graphical user interface
  • This real motion vector is known to the software application since the latter initiates the displacement of the zone following an event (typically an event generated by a click, a mouse movement or typing) of the end user via the client terminal.
  • the size of the rectangle can also be obtained by functions of the "Windows.innerHeight" and "Windows.innerWidth" type.
  • the server can thus obtain values characterizing the real motion vector of the zone displaced by the user via the client terminal (an illustrative sketch of this derivation is given after this list).
  • the motion vector m(mx, my), coded in RGB format, is also transformed into the YUV12 format (see the RGB-to-YUV sketch after this list).
  • the input data processing module 102 comprises:
  • each current image Fn to be encoded is divided by the means 103 into macroblocks corresponding to a rectangular elementary region of the image having a variable size of between 4x4 and 16x16 pixels (passing through 8x16, 8x8, etc.).
  • the means 104, knowing the displaced zones of the image Fn as well as their real motion vectors, make it possible to attribute the same real motion vector to the macroblocks belonging to a displaced zone. Therefore, the means 119 will route only the macroblocks not affected by a displaced zone to the motion estimation module 105, the real motion vectors of the other macroblocks being transmitted directly to the motion compensation module 106 via the means 118 (see the routing sketch after this list).
  • the function of the motion estimation module 105 is to find a macroblock of the current image Fn in at least one previous image Fn-1 of the server screen in its entirety (it could also be a later image in the case of a B image, and even a plurality of earlier and/or later images).
  • a motion vector which corresponds to the difference between the position of the selected region and that of the macroblock.
  • the motion vectors that have been retained by the estimation module (in addition to the real motion vectors transmitted by the means 118) are transmitted to the motion compensation module 106. This gives a prediction error due to the fact that the region retained in the past image is not exactly equal to the macroblock analyzed.
  • a predicted picture P is obtained.
  • Subtractor 109 then calculates a residual error Dn between the pixels of Fn and the predicted picture P.
  • a frequency transform (of the DCT "Discrete Cosine Transform" type or a Hadamard transform) is applied via the frequency transform module 112 to each macroblock that has undergone motion estimation, as well as to the residual error Dn.
  • This transform makes it possible to have a frequency representation of the modified zones.
  • the data from the frequency transform module 112 are then quantized (i.e. coded on a limited number of bits) by the quantization module 113 to provide transformed and quantized parameters X.
  • the function of the quantization module 113 is to define different quantization steps depending on whether certain components are judged visually significant or not; these quantization steps are defined in a quantization step table.
  • the inverse quantization module 114 retrieves the transformed and quantized parameters X, which then pass through the inverse frequency transform module 115, which applies an inverse frequency transform to recover a quantized version D'n of the residual error Dn; this quantized version D'n is then added to the macroblocks of the predicted zone P by the adder 110; the image at the output of the adder 110 is then processed by the deblocking filter to provide a reconstructed image F'n corresponding to a set of reconstructed zones having the same position, the same width and the same height as the modified zones (a minimal sketch of this transform/quantization round trip is given after this list).
  • F'n is used internally by the coder 100 to estimate the quality of the encoding.
  • the quantized results X from the quantization module 113 are then reordered by the reordering module 108 to group together the non-zero coefficients so as to allow an efficient representation of the other coefficients having a zero value.
  • the data then undergoes a final phase of entropy coding compression via the entropy coder 120.
  • the function of the entropy encoder is to re-encode the data differently in order to reduce the number of bits necessary for their encoding, approaching as closely as possible the theoretical minimum number of bits (which is fixed by the entropy).
  • the entropy encoder 120 constructs an output stream in a Network Abstraction Layer (NAL) format defined to allow the use of the same video syntax in many network environments.
  • NAL Network Abstraction Layer
  • the invention is not limited to the embodiment just described.
  • the invention has been described more particularly in the context of H264 coding, but it applies to any type of spatiotemporal coding: this is for example the case of MPEG2 coding or of VC1 coding (the video compression standard of the SMPTE, "Society of Motion Picture and Television Engineers").
  • the motion vector has been described as a two-dimensional vector, but it is also possible to use a three-dimensional motion vector, for example in the case of a graphical interface such as Aero™, the graphical interface of Windows Vista™ for displaying 3D effects.
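
Illustrative sketch (not part of the patent text): how the real motion vector of a displaced zone could be derived once the application or windowing API has reported the zone's position before and after a user event. The function name and coordinate convention are assumptions made for the example; the actual GUI calls, which depend on the operating system, are not shown. Python is used for all sketches below.

    def real_motion_vector(before_xy, after_xy):
        """Real motion vector m(mx, my) of a displaced zone, derived from the
        top-left point of the zone before and after a user event (window drag,
        scroll in a browser, slide change...)."""
        (x0, y0), (x1, y1) = before_xy, after_xy
        return (x1 - x0, y1 - y0)

    # Example: a zone whose top-left point moved from (64, 48) to (64, 80),
    # e.g. after a vertical scroll of 32 pixels.
    mx, my = real_motion_vector((64, 48), (64, 80))   # -> (0, 32)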
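
Illustrative sketch (not part of the patent text) of the routing described for the means 104, 118 and 119: macroblocks covered by a displaced zone all receive the zone's real motion vector and bypass motion estimation, while the remaining macroblocks are handed to the motion estimation module. The Rect class, the fixed 16x16 macroblock grid and the function names are assumptions made for the example.

    from dataclasses import dataclass

    MB = 16  # assumed macroblock size (pixels) for this sketch

    @dataclass
    class Rect:
        x: int  # abscissa of the top-left point of the zone
        y: int  # ordinate of the top-left point of the zone
        l: int  # width of the zone
        h: int  # height of the zone

        def contains_mb(self, mbx, mby):
            """True if the macroblock at grid position (mbx, mby) lies entirely inside the zone."""
            px, py = mbx * MB, mby * MB
            return (self.x <= px and px + MB <= self.x + self.l and
                    self.y <= py and py + MB <= self.y + self.h)

    def route_macroblocks(width, height, zone, real_mv):
        """Allocate the real motion vector to the macroblocks of the displaced zone
        (sent directly to motion compensation) and return the other macroblocks,
        which still require motion estimation."""
        known = {}        # (mbx, mby) -> real motion vector
        to_estimate = []  # macroblocks handed to the motion estimation module
        for mby in range(height // MB):
            for mbx in range(width // MB):
                if zone.contains_mb(mbx, mby):
                    known[(mbx, mby)] = real_mv
                else:
                    to_estimate.append((mbx, mby))
        return known, to_estimate

    # Example: 1280x720 screen, a 320x240 zone at (64, 48) scrolled down by 32 pixels.
    known, todo = route_macroblocks(1280, 720, Rect(64, 48, 320, 240), (0, 32))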
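
Illustrative sketch (not part of the patent text) of the RGB-to-YUV conversion step. The patent only states that the current image and the real motion vector, initially in an RGB format, are transformed into a YUV12 (4:2:0) format; the BT.601 coefficients below are one common convention, assumed here for the example.

    def rgb_to_yuv(r, g, b):
        """One common (BT.601, full-range) RGB -> YUV conversion; other conventions exist."""
        y = 0.299 * r + 0.587 * g + 0.114 * b
        u = -0.14713 * r - 0.28886 * g + 0.436 * b
        v = 0.615 * r - 0.51499 * g - 0.10001 * b
        return y, u, v

    # In a YUV12 (4:2:0) layout, Y is kept at full resolution while U and V are
    # subsampled by 2 horizontally and vertically, i.e. 12 bits per pixel on average.
    y, u, v = rgb_to_yuv(255, 0, 0)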
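
Illustrative sketch (not part of the patent text) of the transform / quantization / inverse round trip performed by modules 112 to 115, using the 4x4 Hadamard transform mentioned as an option in the text; the quantization step and the block values are arbitrary, and a real H264 encoder uses integer arithmetic and the standard's scaling tables rather than this simplified scheme.

    H = [[1, 1, 1, 1],
         [1, 1, -1, -1],
         [1, -1, -1, 1],
         [1, -1, 1, -1]]  # 4x4 Hadamard matrix: H * H^T = 4 * I

    def matmul(a, b):
        return [[sum(a[i][k] * b[k][j] for k in range(4)) for j in range(4)] for i in range(4)]

    def transpose(a):
        return [list(row) for row in zip(*a)]

    def forward(block):
        """Frequency representation of a 4x4 block: Y = H * X * H^T."""
        return matmul(matmul(H, block), transpose(H))

    def quantize(coeffs, qstep):
        return [[round(c / qstep) for c in row] for row in coeffs]

    def dequantize(levels, qstep):
        return [[lv * qstep for lv in row] for row in levels]

    def inverse(coeffs):
        """Inverse transform: X = (H^T * Y * H) / 16, since H * H^T = 4 * I."""
        y = matmul(matmul(transpose(H), coeffs), H)
        return [[v / 16 for v in row] for row in y]

    # Round trip on a small residual error block Dn:
    residual = [[5, 3, 0, 0], [2, 1, 0, 0], [0, 0, 0, 0], [0, 0, 0, 0]]
    X = quantize(forward(residual), qstep=8)     # transformed and quantized parameters X
    D_prime = inverse(dequantize(X, qstep=8))    # quantized version D'n of the residual error
    # D'n is added back to the predicted zone P to obtain the reconstructed image F'n.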

Landscapes

  • Engineering & Computer Science (AREA)
  • Multimedia (AREA)
  • Signal Processing (AREA)
  • Compression Or Coding Systems Of Tv Signals (AREA)
EP09768199A 2008-12-30 2009-11-16 Systeme et procede de codage video Withdrawn EP2380350A1 (fr)

Applications Claiming Priority (2)

Application Number Priority Date Filing Date Title
FR0859113A FR2940736B1 (fr) 2008-12-30 2008-12-30 Systeme et procede de codage video
PCT/FR2009/052193 WO2010076439A1 (fr) 2008-12-30 2009-11-16 Systeme et procede de codage video

Publications (1)

Publication Number Publication Date
EP2380350A1 true EP2380350A1 (fr) 2011-10-26

Family

ID=40688557

Family Applications (1)

Application Number Title Priority Date Filing Date
EP09768199A Withdrawn EP2380350A1 (fr) 2008-12-30 2009-11-16 Systeme et procede de codage video

Country Status (6)

Country Link
US (1) US8731060B2 (zh)
EP (1) EP2380350A1 (zh)
CN (1) CN102318344B (zh)
BR (1) BRPI0923824A2 (zh)
FR (1) FR2940736B1 (zh)
WO (1) WO2010076439A1 (zh)

Families Citing this family (8)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US9060010B1 (en) * 2012-04-29 2015-06-16 Rockwell Collins, Inc. Incorporating virtual network computing into a cockpit display system for controlling a non-aircraft system
US9602822B2 (en) * 2013-04-17 2017-03-21 Qualcomm Incorporated Indication of cross-layer picture type alignment in multi-layer video coding
CN104144349B (zh) * 2014-07-09 2017-08-29 中电科华云信息技术有限公司 基于h264的spice视频编解码扩展方法及系统
EP3050302B1 (en) 2014-09-25 2018-01-03 Huawei Technologies Co. Ltd. A server for providing a graphical user interface to a client and a client
CN104469400B (zh) * 2014-12-17 2018-02-23 浪潮软件集团有限公司 一种基于rfb协议的图像数据压缩方法
CN113628240A (zh) 2017-04-21 2021-11-09 泽尼马克斯媒体公司 通过预期运动矢量的玩家输入运动补偿
CN113542752A (zh) * 2019-02-19 2021-10-22 西安万像电子科技有限公司 视频数据处理方法及装置
CN113365083B (zh) * 2021-07-08 2022-10-11 广州市保伦电子有限公司 一种基于h.265实现yuv444图像编解码方法

Family Cites Families (30)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US5809201A (en) * 1994-06-24 1998-09-15 Mitsubishi Denki Kabushiki Kaisha Specially formatted optical disk and method of playback
US5864681A (en) * 1996-08-09 1999-01-26 U.S. Robotics Access Corp. Video encoder/decoder system
JP3351705B2 (ja) * 1997-04-25 2002-12-03 日本ビクター株式会社 動き補償符号化装置、動き補償符号化方法、及び記録媒体への記録方法
EP0936813A1 (en) * 1998-02-16 1999-08-18 CANAL+ Société Anonyme Processing of digital picture data in a decoder
US6563953B2 (en) * 1998-11-30 2003-05-13 Microsoft Corporation Predictive image compression using a single variable length code for both the luminance and chrominance blocks for each macroblock
JP4447197B2 (ja) * 2002-01-07 2010-04-07 三菱電機株式会社 動画像符号化装置および動画像復号装置
US20030160814A1 (en) * 2002-02-27 2003-08-28 Brown David K. Slide show presentation and method for viewing same
US7321626B2 (en) * 2002-03-08 2008-01-22 Sharp Laboratories Of America, Inc. System and method for predictive motion estimation using a global motion predictor
US7738551B2 (en) * 2002-03-18 2010-06-15 International Business Machines Corporation System and method for processing a high definition television (HDTV) image
EP1377040A1 (en) * 2002-06-19 2004-01-02 STMicroelectronics S.r.l. Method of stabilizing an image sequence
US6925123B2 (en) * 2002-08-06 2005-08-02 Motorola, Inc. Method and apparatus for performing high quality fast predictive motion search
EP1574067B1 (en) * 2002-12-17 2017-06-07 Zoran (France) Processing or compressing n-dimensional signals with warped wavelet packets and bandlets
JP4536325B2 (ja) * 2003-02-04 2010-09-01 ソニー株式会社 画像処理装置および方法、記録媒体、並びにプログラム
JP4575370B2 (ja) * 2003-05-02 2010-11-04 エヌエックスピー ビー ヴィ ビデオアーチファクトを低減するためのバイアスされた動きベクトル補間
NO318973B1 (no) * 2003-07-01 2005-05-30 Tandberg Telecom As Fremgangsmate for stoyreduksjon
JP2005039340A (ja) * 2003-07-15 2005-02-10 Hitachi Ltd 再生装置
US7724827B2 (en) * 2003-09-07 2010-05-25 Microsoft Corporation Multi-layer run level encoding and decoding
JP4687994B2 (ja) * 2004-04-09 2011-05-25 ソニー株式会社 画像処理装置および方法、記録媒体、並びにプログラム
US8503530B2 (en) * 2004-05-27 2013-08-06 Zhourong Miao Temporal classified filtering for video compression
US7881546B2 (en) * 2004-09-08 2011-02-01 Inlet Technologies, Inc. Slab-based processing engine for motion video
US7516255B1 (en) * 2005-03-30 2009-04-07 Teradici Corporation Method and apparatus for providing a low-latency connection between a data processor and a remote graphical user interface over a network
FR2886044B1 (fr) * 2005-05-23 2007-06-22 Canon Kk Procede et dispositif d'affichage d'images d'une sequence video
US8159543B2 (en) * 2006-05-09 2012-04-17 Nxp B.V. Processing device with jitter extraction and equipment comprising such a device
US20070291839A1 (en) * 2006-06-15 2007-12-20 Faraday Technology Corp. Method and device for multimedia processing
KR100803611B1 (ko) * 2006-11-28 2008-02-15 삼성전자주식회사 영상의 부호화, 복호화 방법 및 장치
EP2102805A1 (en) * 2006-12-11 2009-09-23 Cinnafilm, Inc. Real-time film effects processing for digital video
US8825739B2 (en) * 2007-07-04 2014-09-02 International Business Machines Corporation Method and apparatus for controlling multiple systems in a low bandwidth environment
EP2175607A1 (en) * 2008-10-08 2010-04-14 NEC Corporation Method for establishing a thin client session
US8219759B2 (en) * 2009-03-16 2012-07-10 Novell, Inc. Adaptive display caching
US20100332613A1 (en) * 2009-06-30 2010-12-30 Jorg Brakensiek Method and apparatus for providing content and context analysis of remote device content

Non-Patent Citations (2)

* Cited by examiner, † Cited by third party
Title
None *
See also references of WO2010076439A1 *

Also Published As

Publication number Publication date
US20110268191A1 (en) 2011-11-03
BRPI0923824A2 (pt) 2015-07-14
FR2940736B1 (fr) 2011-04-08
WO2010076439A1 (fr) 2010-07-08
US8731060B2 (en) 2014-05-20
CN102318344B (zh) 2014-10-08
FR2940736A1 (fr) 2010-07-02
CN102318344A (zh) 2012-01-11

Similar Documents

Publication Publication Date Title
EP2380350A1 (fr) Systeme et procede de codage video
JP7047119B2 (ja) 変換領域における残差符号予測のための方法および装置
US10013746B2 (en) High dynamic range video tone mapping
RU2316909C2 (ru) Способ и устройство для масштабируемого по цветовому пространству видеокодирования и декодирования
US20210377542A1 (en) Video encoding and decoding method, device, and system, and storage medium
TW201711473A (zh) 針對視訊寫碼處理高動態範圍及廣色域視訊資料
TW201931853A (zh) 具有聯合像素/變換為基礎之量化之視頻寫碼之量化參數控制
TWI713354B (zh) 用於顯示器調適之色彩重映射資訊sei信息發信號
JP7251882B2 (ja) Cbfフラグの効率的なシグナリング方法
WO2014026097A1 (en) Two-step quantization and coding method and apparatus
EP2380352A2 (fr) Procede d'encodage par segmentation d'une image
JP7397878B2 (ja) イントラ・サブ・パーティション・コーディング・モードのための方法及び装置
US20120170663A1 (en) Video processing
JP7247349B2 (ja) イントラ予測のための成分間線形モデリングの方法、装置、デコーダ、エンコーダ、およびプログラム
WO2021164014A1 (zh) 视频编码方法及装置
EP4300958A1 (en) Video image encoding method, video image decoding method and related devices
EP4277274A1 (en) Layered encoding and decoding methods and apparatuses
EP2015584A2 (fr) Système et procédé de codage vidéo
RU2786086C1 (ru) Способ и устройство кросс-компонентного линейного моделирования для внутреннего предсказания
TWI821013B (zh) 視頻編解碼方法及裝置
JP7508621B2 (ja) イントラ予測のためのクロスコンポーネント線形モデリングの方法、装置、デコーダ、エンコーダおよびプログラム
Azadegan et al. Improving video quality by predicting inter-frame residuals based on an additive 3D-CNN model
RU2772313C2 (ru) Способ и устройство для фильтрации изображений с адаптивными коэффициентами множителя
竹内健 Video Coding Methods for Bit-depth, Color-gamut and Perceptual Quality Scalabilities
Ginzburg et al. DCT-Domain Coder for Digital Video Applications

Legal Events

Date Code Title Description
PUAI Public reference made under article 153(3) epc to a published international application that has entered the european phase

Free format text: ORIGINAL CODE: 0009012

17P Request for examination filed

Effective date: 20110722

AK Designated contracting states

Kind code of ref document: A1

Designated state(s): AT BE BG CH CY CZ DE DK EE ES FI FR GB GR HR HU IE IS IT LI LT LU LV MC MK MT NL NO PL PT RO SE SI SK SM TR

DAX Request for extension of the european patent (deleted)
17Q First examination report despatched

Effective date: 20161222

RAP1 Party data changed (applicant data changed or rights of an application transferred)

Owner name: SAGEMCOM BROADBAND SAS

STAA Information on the status of an ep patent application or granted ep patent

Free format text: STATUS: THE APPLICATION IS DEEMED TO BE WITHDRAWN

18D Application deemed to be withdrawn

Effective date: 20190601