WO2017137311A1 - Method and device for encoding/decoding an image unit comprising image data represented by a luminance channel and at least one chrominance channel - Google Patents

Method and device for encoding/decoding an image unit comprising image data represented by a luminance channel and at least one chrominance channel

Info

Publication number
WO2017137311A1
Authority
WO
WIPO (PCT)
Prior art keywords
tree
coding
chroma
unit
image
Prior art date
Application number
PCT/EP2017/052316
Other languages
French (fr)
Inventor
Fabrice Urban
Franck Galpin
Tangi POIRIER
Fabrice Leleannec
Original Assignee
Thomson Licensing
Priority date
Filing date
Publication date
Priority to BR112018015558-6A priority Critical patent/BR112018015558A2/en
Priority to CN201780010687.9A priority patent/CN108605134B/en
Priority to CN202210747508.2A priority patent/CN115052153A/en
Priority to EP17702390.0A priority patent/EP3414903B1/en
Priority to MX2018009737A priority patent/MX2018009737A/en
Priority to RU2018130816A priority patent/RU2739251C2/en
Priority to EP22153784.8A priority patent/EP4040789B1/en
Priority to CA3014332A priority patent/CA3014332A1/en
Application filed by Thomson Licensing filed Critical Thomson Licensing
Priority to MYPI2018702371A priority patent/MY189780A/en
Priority to US16/076,170 priority patent/US11109045B2/en
Priority to JP2018539051A priority patent/JP7100582B2/en
Priority to KR1020187022905A priority patent/KR20180111839A/en
Priority to CN202210747507.8A priority patent/CN115052152A/en
Priority to EP23196190.5A priority patent/EP4266684A3/en
Publication of WO2017137311A1 publication Critical patent/WO2017137311A1/en
Priority to US17/408,317 priority patent/US20210385466A1/en
Priority to JP2022106895A priority patent/JP2022137130A/en

Classifications

    • H: ELECTRICITY
    • H04: ELECTRIC COMMUNICATION TECHNIQUE
    • H04N: PICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N19/00: Methods or arrangements for coding, decoding, compressing or decompressing digital video signals
    • H04N19/10: Methods or arrangements for coding, decoding, compressing or decompressing digital video signals using adaptive coding
    • H04N19/169: Methods or arrangements for coding, decoding, compressing or decompressing digital video signals using adaptive coding characterised by the coding unit, i.e. the structural portion or semantic portion of the video signal being the object or the subject of the adaptive coding
    • H04N19/186: Methods or arrangements for coding, decoding, compressing or decompressing digital video signals using adaptive coding characterised by the coding unit, the unit being a colour or a chrominance component
    • H04N19/102: Methods or arrangements for coding, decoding, compressing or decompressing digital video signals using adaptive coding characterised by the element, parameter or selection affected or controlled by the adaptive coding
    • H04N19/119: Adaptive subdivision aspects, e.g. subdivision of a picture into rectangular or non-rectangular coding blocks
    • H04N19/134: Methods or arrangements for coding, decoding, compressing or decompressing digital video signals using adaptive coding characterised by the element, parameter or criterion affecting or controlling the adaptive coding
    • H04N19/146: Data rate or code amount at the encoder output
    • H04N19/147: Data rate or code amount at the encoder output according to rate distortion criteria
    • H04N19/17: Methods or arrangements for coding, decoding, compressing or decompressing digital video signals using adaptive coding characterised by the coding unit, the unit being an image region, e.g. an object
    • H04N19/176: Methods or arrangements for coding, decoding, compressing or decompressing digital video signals using adaptive coding characterised by the coding unit, the region being a block, e.g. a macroblock
    • H04N19/70: Methods or arrangements for coding, decoding, compressing or decompressing digital video signals characterised by syntax aspects related to video coding, e.g. related to compression standards
    • H04N19/90: Methods or arrangements for coding, decoding, compressing or decompressing digital video signals using coding techniques not provided for in groups H04N19/10-H04N19/85, e.g. fractals
    • H04N19/96: Tree coding, e.g. quad-tree coding

Definitions

  • the signal S is sent to a destination.
  • the signal S is stored in a local or remote memory, e.g. a video memory (124) or a RAM (124), a hard disk (123).
  • the signal S is sent to a storage interface (125), e.g. an interface with a mass storage, a flash memory, ROM, an optical disc or a magnetic support and/or transmitted over a communication interface (125), e.g. an interface to a point to point link, a communication bus, a point to multipoint link or a broadcast network.
  • the signal S is obtained from a source.
  • the signal S is read from a local memory, e.g. a video memory (124), a RAM (124), a ROM (123), a flash memory (123) or a hard disk (123).
  • the bitstream is received from a storage interface (125), e.g. an interface with a mass storage, a RAM, a ROM, a flash memory, an optical disc or a magnetic support and/or received from a communication interface (125), e.g. an interface to a point to point link, a bus, a point to multipoint link or a broadcast network.
  • device 120 being configured to implement an encoding method described in relation with Figs. 1-6, belongs to a set comprising: - a mobile device;
  • a video server e.g. a broadcast server, a video-on-demand server or a web server.
  • device 120 being configured to implement a decoding method described above, belongs to a set comprising:
  • the device A comprises a processor in relation with memory RAM and ROM which are configured to implement a method for encoding an image comprising at least one image unit, or an image unit, as described above, and the device B comprises a processor in relation with memory RAM and ROM which are configured to implement a method for decoding as described above.
  • the network is a broadcast network, adapted to broadcast still images or video images from device A to decoding devices including the device B.
  • the signal S is intended to be transmitted by the device A and received by the device B.
  • Fig. 9 shows an example of the syntax of such a signal when the data are transmitted over a packet-based transmission protocol.
  • Each transmitted packet P comprises a header H and a payload PAYLOAD.
  • a bit of the header H for example, is dedicated to represent the information data carried by the signal S.
  • multiple flags may be used to represent the information data INFO as above described and carried by the signal S.
  • Implementations of the various processes and features described herein may be embodied in a variety of different equipment or applications.
  • Examples of such equipment include an encoder, a decoder, a post-processor processing output from a decoder, a pre-processor providing input to an encoder, a video coder, a video decoder, a video codec, a web server, a set-top box, a laptop, a personal computer, a cell phone, a PDA, and any other device for processing an image or a video or other communication devices.
  • the equipment may be mobile and even installed in a mobile vehicle.
  • a computer readable storage medium can take the form of a computer readable program product embodied in one or more computer readable medium(s) and having computer readable program code embodied thereon that is executable by a computer.
  • a computer readable storage medium as used herein is considered a non-transitory storage medium given the inherent capability to store the information therein as well as the inherent capability to provide retrieval of the information therefrom.
  • a computer readable storage medium can be, for example, but is not limited to, an electronic, magnetic, optical, electromagnetic, infrared, or semiconductor system, apparatus, or device, or any suitable combination of the foregoing. It is to be appreciated that the following, while providing more specific examples of computer readable storage mediums to which the present principles can be applied, is merely an illustrative and not exhaustive listing as is readily appreciated by one of ordinary skill in the art: a portable computer diskette; a hard disk; a read-only memory (ROM); an erasable programmable read-only memory (EPROM or Flash memory); a portable compact disc read-only memory (CD-ROM); an optical storage device; a magnetic storage device; or any suitable combination of the foregoing.
  • the instructions may form an application program tangibly embodied on a processor-readable medium.
  • Instructions may be, for example, in hardware, firmware, software, or a combination. Instructions may be found in, for example, an operating system, a separate application, or a combination of the two.
  • a processor may be characterized, therefore, as, for example, both a device configured to carry out a process and a device that includes a processor-readable medium (such as a storage device) having instructions for carrying out a process. Further, a processor-readable medium may store, in addition to or in lieu of instructions, data values produced by an implementation.
  • implementations may produce a variety of signals formatted to carry information that may be, for example, stored or transmitted.
  • the information may include, for example, instructions for performing a method, or data produced by one of the described implementations.
  • a signal may be formatted to carry as data the rules for writing or reading the syntax of a described example of the present principles, or to carry as data the actual syntax-values written by a described example of the present principles.
  • Such a signal may be formatted, for example, as an electromagnetic wave (for example, using a radio frequency portion of spectrum) or as a baseband signal.
  • the formatting may include, for example, encoding a data stream and modulating a carrier with the encoded data stream.
  • the information that the signal carries may be, for example, analog or digital information.
  • the signal may be transmitted over a variety of different wired or wireless links, as is known.
  • the signal may be stored on a processor-readable medium.

Landscapes

  • Engineering & Computer Science (AREA)
  • Multimedia (AREA)
  • Signal Processing (AREA)
  • Compression Or Coding Systems Of Tv Signals (AREA)
  • Color Television Systems (AREA)

Abstract

The present principles relate to a method for encoding an image unit comprising image data represented by a luminance channel and at least one chrominance channel, the method comprising obtaining a luma coding-tree by splitting a luminance unit representative of the luminance channel of said image unit and obtaining a chroma coding-tree by splitting a chrominance unit representative of at least one chrominance channel of said image unit. The method is characterized in that obtaining said chroma coding-tree comprises: - determining whether said chroma coding-tree and said luma coding-tree are identical; and - signaling an information data indicating whether said chroma coding-tree and said luma coding-tree are identical.

Description

Method and device for encoding/decoding an image unit comprising image data represented by a luminance channel and at least one chrominance channel.
1. Field.
The present principles generally relate to image/video encoding and decoding.
2. Background.
The present section is intended to introduce the reader to various aspects of art, which may be related to various aspects of the present principles that are described and/or claimed below. This discussion is believed to be helpful in providing the reader with background information to facilitate a better understanding of the various aspects of the present principles. Accordingly, it should be understood that these statements are to be read in this light, and not as admissions of prior art.
In the following, image data contains one or several arrays of samples (pixel data) in a specific image/video format which specifies all information relative to the pixel values of an image (or a video) and all information which may be used by a display and/or any other device to visualize and/or decode an image (or video) for example.
Image data comprises at least one component, in the shape of a first array of samples, usually a luma (or luminance) component, and, possibly, at least one other component, in the shape of at least one other array of samples, usually a color component. Or, equivalently, the same image data may also be represented by a set of arrays of color samples, such as the traditional trichromatic RGB representation.
Pixel data, relative to a pixel, is represented by a vector of C values, where C is the number of components. Each value of a vector is represented with a number of bits which defines a maximal dynamic range of the pixel values.
An image unit comprises image data that are represented by a luminance channel and at least one chrominance channel. Typically, image data may be represented in the well-known YCbCr, YUV, RGB color spaces but the present principles are not limited to a specific color space. Thus, an image unit comprises a luma unit that represents the luminance channel of the image unit and at least one chroma unit that represents the chrominance channel of the image unit.
A non-limitative example of an image unit is a Coding Unit or a
Transform unit as defined in HEVC or a block or macroblock as defined in most of MPEG standards. An image unit may be any square or rectangular part of an image.
In some video compression standards like H.265/HEVC (High Efficiency Video Coding (HEVC), Recommendation ITU-T H.265 | International Standard ISO/IEC 23008-2, 10/2014), each image of a sequence of images (video) is divided into so-called Coding-tree Units (CTU), whose size is typically 64x64, 128x128, or 256x256 pixels.
Each CTU is represented by a Coding-tree in the compressed domain as shown in Fig. 1. As illustrated, this may be a quad-tree partitioning (division, splitting) of the CTU, where each leaf is called a Coding Unit (CU). A Coding Unit (CU) contains the main information for coding an image unit and may be further split into Prediction Units (PU) and Transform Units (TU). The Prediction Unit (PU) contains the information for predicting the pixel values inside an image unit, and the Transform Unit (TU) represents the pixels of an image unit on which the transform, and thus the remainder of the encoding process, is applied.
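To make the recursive structure concrete, here is a minimal Python sketch of a CTU quad-tree; the helper names and the split predicate are hypothetical illustrations, not HEVC syntax and not text taken from the patent.

```python
# Minimal sketch of a CTU quad-tree (hypothetical helpers, not HEVC syntax).
# A unit is split into four equally sized children while a caller-supplied
# predicate asks for a split and the minimum CU size has not been reached.

def build_coding_tree(x, y, size, should_split, min_size=8):
    """Return a nested dict describing the quad-tree rooted at (x, y, size)."""
    node = {"x": x, "y": y, "size": size, "children": []}
    if size > min_size and should_split(x, y, size):
        half = size // 2
        for dy in (0, half):
            for dx in (0, half):
                node["children"].append(
                    build_coding_tree(x + dx, y + dy, half, should_split, min_size))
    return node

def leaves(node):
    """Yield the leaf Coding Units of a coding-tree."""
    if not node["children"]:
        yield node
    else:
        for child in node["children"]:
            yield from leaves(child)

if __name__ == "__main__":
    # Toy decision: split every unit larger than 32x32.
    ctu = build_coding_tree(0, 0, 64, lambda x, y, s: s > 32)
    print([(leaf["x"], leaf["y"], leaf["size"]) for leaf in leaves(ctu)])
```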
In the HEVC standard, two geometries then coexist, prediction partitioning and transform partitioning, and two main cases arise in intra prediction:
1) A current Transform Unit (TU) and the Prediction Unit (PU) are of the same size;
2) The Prediction Unit (PU) is composed of 4 Transform Units (TUs), and each Transform Unit (TU) can be split.
In the second case, the Luma and Chroma channels (in case of YUV video) follow the same partitioning (quad-tree) (except for small blocks in 4:2:0 or 4:2:2 sampling where the chroma TUs cannot be split).
The prediction relies on previously decoded pixels from the same or another image; the residual is then transformed following a Transform Unit (TU) quad-tree. The PU may contain several smaller TUs that can be further split into smaller TUs in a quad-tree fashion. In this case, chroma TUs follow the Luma TU quad-tree. For small blocks, when not in 4:4:4 sampling, the chroma TUs cannot be split.
Fig. 2 shows an example of a segmentation into TUs using a quad-tree (Residual Quad-Tree, RQT). The partitioning into TUs is signaled, and maximum and minimum transform sizes are signaled in the slice header. For quad-tree nodes between these bounds, subdivision flags are coded. The same RQT is used for both the luma and chroma components of each CU.
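The RQT signalling just described can be pictured with the following hedged sketch; it is a simplification for illustration, not the normative HEVC process, and the function names are invented for the example.

```python
# Simplified sketch of RQT subdivision signalling (not the normative HEVC
# process): a split flag is coded only for nodes whose size lies strictly
# between the minimum and maximum transform sizes signalled in the slice
# header; larger nodes are forcibly split, smaller ones cannot be split.

def code_rqt(size, min_tu, max_tu, wants_split, bits):
    """Recursively append subdivision flags for a residual quad-tree node."""
    if size > max_tu:
        split = True                # forced split, no flag coded
    elif size <= min_tu:
        split = False               # cannot be split, no flag coded
    else:
        split = wants_split(size)   # encoder decision, one flag coded
        bits.append(1 if split else 0)
    if split:
        for _ in range(4):
            code_rqt(size // 2, min_tu, max_tu, wants_split, bits)

bits = []
code_rqt(32, min_tu=4, max_tu=16, wants_split=lambda s: s > 8, bits=bits)
print(bits)   # flags are coded only for the 16x16 and 8x8 nodes
```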
In H.265/HEVC, only one RQT is transmitted for both the luma and chroma components of each CU. When the TU coding-tree is deep, this generates signaling cost for chroma units that could advantageously be replaced by coded coefficients.
Conversely, separating the partitioning of the chroma and luma components has been proposed in MediaTek Inc., "Block partitioning structure for next generation video coding", ITU-T SG16, COM 16 - C 966 R3 - E, Geneva, October 2015, where, for each image unit, two distinct coding-trees are defined: one for Luma and one for Chroma. According to non-limitative examples, a coding-tree may be a quad-tree, a binary-tree or a triple-tree used for coding an image unit.
This solution results in fully separate Luma and Chroma coding-trees, respectively obtained by splitting a luma unit and a chroma unit relative to an image unit, but results in additional signaling cost. The problem solved by the present principles is to improve the coding efficiency of an image unit when separate coding-trees are used for coding the luma and chroma units relative to said image unit.
More generally, the problem to be solved is how to efficiently compress an image unit comprising image data represented by multiple channels.
3. Summary.
The following presents a simplified summary of the present principles in order to provide a basic understanding of some aspects of the present principles. This summary is not an extensive overview of the present principles. It is not intended to identify key or critical elements of the present principles. The following summary merely presents some aspects of the present principles in a simplified form as a prelude to the more detailed description provided below.
The present principles set out to remedy at least one of the drawbacks of the prior art with a method for encoding an image unit comprising image data represented by a luminance channel and at least one chrominance channel. The method comprises obtaining a luma coding-tree by splitting a luminance unit representative of the luminance channel of said image unit and obtaining a chroma coding-tree by splitting a chrominance unit representative of at least one chrominance channel of said image unit. According to the present principles, obtaining said chroma coding-tree comprises:
- determining whether said chroma coding-tree and said luma coding-tree are identical; and
- signaling an information data indicating whether said chroma coding-tree and said luma coding-tree are identical.
According to another of their aspects, the present principles relate to a method for decoding an image unit comprising image data represented by a luminance channel and at least one chrominance channel. The method comprises obtaining a luma coding-tree by splitting a luminance unit representative of the luminance channel of said image unit and obtaining a chroma coding-tree by splitting a chrominance unit representative of at least one chrominance channel of said image unit. According to the present principles, obtaining said chroma coding-tree comprises:
- determining whether said chroma coding-tree and said luma coding-tree are identical; and
- signaling an information data indicating whether said chroma coding-tree and said luma coding-tree are identical.
According to other of their aspects, the present principles relate to a device comprising a processor configured to implement the above methods, a signal having syntax element related to an image unit comprising picture data, a computer program product comprising program code instructions to execute the steps of the above method when this program is executed on a computer, and a non-transitory storage medium carrying instructions of program code for executing steps of the above method when said program is executed on a computing device.
The specific nature of the present principles as well as other objects, advantages, features and uses of the present principles will become evident from the following description of examples taken in conjunction with the accompanying drawings.
4. Brief Description of Drawings.
In the drawings, examples of the present principles are illustrated. They show:
- Fig. 1 illustrates an overall video structure of the prediction and transform partitioning;
- Fig. 2 shows an example of a segmentation into TUs using a quadtree;
- Fig. 3 shows an example of Luma and Chroma coding-tree splitting depending on independantChromaTuFlag syntax element in accordance with an example of present principles;
- Fig. 4 shows an example of the syntax of the transform tree in accordance with an example of present principles;
- Fig. 5 shows an example of syntax of a transform unit in accordance with an example of present principles;
- Fig. 6 shows another example of Luma and Chroma coding-tree splitting depending on independantChromaTuFlag syntax element in accordance with an example of present principles;
- Fig. 7 shows an example of an architecture of a device in accordance with an example of present principles;
- Fig. 8 shows two remote devices communicating over a communication network in accordance with an example of present principles; and
- Fig. 9 shows the syntax of a signal in accordance with an example of present principles.
Similar or same elements are referenced with the same reference numbers.
6. Description of Example of the present principles.
The present principles will be described more fully hereinafter with reference to the accompanying figures, in which examples of the present principles are shown. The present principles may, however, be embodied in many alternate forms and should not be construed as limited to the examples set forth herein. Accordingly, while the present principles are susceptible to various modifications and alternative forms, specific examples thereof are shown by way of examples in the drawings and will herein be described in detail. It should be understood, however, that there is no intent to limit the present principles to the particular forms disclosed, but on the contrary, the disclosure is to cover all modifications, equivalents, and alternatives falling within the spirit and scope of the present principles as defined by the claims. The terminology used herein is for the purpose of describing particular examples only and is not intended to be limiting of the present principles. As used herein, the singular forms "a", "an" and "the" are intended to include the plural forms as well, unless the context clearly indicates otherwise. It will be further understood that the terms "comprises", "comprising," "includes" and/or "including" when used in this specification, specify the presence of stated features, integers, steps, operations, elements, and/or components but do not preclude the presence or addition of one or more other features, integers, steps, operations, elements, components, and/or groups thereof. Moreover, when an element is referred to as being "responsive" or "connected" to another element, it can be directly responsive or connected to the other element, or intervening elements may be present. In contrast, when an element is referred to as being "directly responsive" or "directly connected" to other element, there are no intervening elements present. As used herein the term "and/or" includes any and all combinations of one or more of the associated listed items and may be abbreviated as"/".
It will be understood that, although the terms first, second, etc. may be used herein to describe various elements, these elements should not be limited by these terms. These terms are only used to distinguish one element from another. For example, a first element could be termed a second element, and, similarly, a second element could be termed a first element without departing from the teachings of the present principles.
Although some of the diagrams include arrows on communication paths to show a primary direction of communication, it is to be understood that communication may occur in the opposite direction to the depicted arrows.
Some examples are described with regard to block diagrams and operational flowcharts in which each block represents a circuit element, module, or portion of code which comprises one or more executable instructions for implementing the specified logical function(s). It should also be noted that in other implementations, the function(s) noted in the blocks may occur out of the order noted. For example, two blocks shown in succession may, in fact, be executed substantially concurrently or the blocks may sometimes be executed in the reverse order, depending on the functionality involved.
Reference herein to "in accordance with an example" or "in an example" means that a particular feature, structure, or characteristic described in connection with the example can be included in at least one implementation of the present principles. The appearances of the phrase "in accordance with an example" or "in an example" in various places in the specification are not necessarily all referring to the same example, nor are separate or alternative examples necessarily mutually exclusive of other examples.
Reference numerals appearing in the claims are by way of illustration only and shall have no limiting effect on the scope of the claims.
While not explicitly described, the present examples and variants may be employed in any combination or sub-combination.
The present principles are described for encoding/decoding an image unit of an image but extend to the encoding/decoding of the image units of a sequence of images (video), because each image unit of each image of the sequence is sequentially encoded/decoded as described below.
The present principles relate to a method for encoding an image unit comprising image data represented by a luminance channel and at least one chrominance channel.
The method obtains a luma coding-tree LUMAQ by splitting a luminance unit representative of the luminance channel of said image unit and obtains a chroma coding-tree CHROQ by splitting a chrominance unit representative of at least one chrominance channel of said image unit.
Obtaining said chroma coding-tree CHROQ comprises determining whether said chroma coding-tree CHROQ and said luma coding-tree LUMAQ are identical, and signaling, in a signal S, an information data INFO indicating whether said chroma coding-tree CHROQ and said luma coding-tree LUMAQ are identical.
This improves the coding efficiency compared to the prior art because the additional syntax used to encode the chroma and luma coding-trees is limited compared to a separate coding of these two coding-trees.
According to an embodiment, the information data INFO is a flag equal to a first value when the chroma coding-tree CHROQ and the luma coding-tree LUMAQ are identical and to a second value otherwise.
According to an embodiment, when the information data INFO is equal to said second value, the information data INFO further indicates that the chrominance unit is not split.
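As a rough illustration of the encoding-side steps above, the sketch below compares the two split structures and writes the flag; the nested-list tree representation and the write_flag callback are assumptions made for the example, and the flag values follow the convention given for independantChromaTuFlag in the text.

```python
# Hedged sketch of the encoder-side signalling of the information data INFO.
# Assumed representation: a coding-tree node is either None (leaf) or a list
# of four child nodes.

def trees_identical(luma, chroma):
    """True when both coding-trees have exactly the same split structure."""
    if luma is None or chroma is None:
        return luma is None and chroma is None
    return all(trees_identical(l, c) for l, c in zip(luma, chroma))

def signal_chroma_tree(luma_tree, chroma_tree, write_flag):
    """Write the INFO flag (independantChromaTuFlag in the example of Fig. 3)."""
    if trees_identical(luma_tree, chroma_tree):
        write_flag(0)   # first value: chroma tree identical to the luma tree
    else:
        write_flag(1)   # second value: the chrominance unit is not split

# Toy usage: the luma unit is split once, the chroma unit is kept unsplit.
signal_chroma_tree([None, None, None, None], None, lambda b: print("flag:", b))
```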
Fig. 3 shows an example of Luma and Chroma coding-trees when the image unit is a residual transform Unit (TU) as defined in HEVC and the information data INFO is a flag denoted independantChromaTuFlag.
When independantChromaTuFlag=0, the luma and chroma coding-trees are identical (bottom part in Fig. 3) and, when independantChromaTuFlag=1, the luma and chroma coding-trees are not identical (top part in Fig. 3).
According to a variant, the flag independantChromaTuFlag is coded as an additional syntax element contained in the "transform tree" syntax element of the HEVC specification as shown in Fig. 4. (HEVC, section 7.3.8.8 Transform tree syntax).
Fig. 5 shows an example of syntax of a transform unit.
According to an embodiment, illustrated in Fig. 3, when said chroma and luma coding-trees are determined not to be identical, if the size of at least one leaf L of said chroma coding-tree CHROQ is larger than a maximum size MS (for example, MS equal to the size of the image unit to be encoded), then said at least one leaf L is recursively split until the sizes of the leaves of said chroma coding-tree CHROQ reach said maximum size MS.
In Fig. 3, the leaves of the chroma coding-tree CHROQ (top in Fig. 3) are not split because the size of the current leaves equals the maximum TU size (MS).
The result is that the sizes of the leaves of the chroma coding-tree CHROQ are as large as possible (within the limit of the given maximum size MS).
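A small sketch of this constraint, under the assumption that a leaf can be described by its size alone, is given below; the helper name is hypothetical.

```python
# Sketch of the recursive splitting described above: when the chroma tree does
# not follow the luma tree, any chroma leaf larger than the maximum size MS is
# quad-split until every resulting leaf is at most MS. Sizes are in samples.

def split_to_max_size(leaf_size, max_size):
    """Return the list of leaf sizes after recursively splitting down to MS."""
    if leaf_size <= max_size:
        return [leaf_size]
    sizes = []
    for _ in range(4):              # quad-tree split into four sub-units
        sizes.extend(split_to_max_size(leaf_size // 2, max_size))
    return sizes

print(split_to_max_size(64, 32))    # a 64x64 leaf becomes four 32x32 leaves
```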
A splitting strategy can be to decide, by rate/distortion optimization, whether to keep the size of the current leaves of the chroma coding-tree CHROQ as large as possible or to split them. As an example, for each image unit to be encoded, the coding is done both for independantChromaTuFlag=0 and for independantChromaTuFlag=1, i.e. splitting or not splitting the chrominance channel according to the luma coding-tree LUMAQ; the distortion and the bit-rate are computed for both situations, and the best rate/distortion compromise is kept, i.e. the flag value leading to the lowest rate/distortion cost J = D + lambda * rateCost, where D is an L2 norm between the source (original image unit) and the reconstructed block (decoded image unit), rateCost is the bit count of the coded piece of bitstream, and lambda is a coding parameter. This technique is well known and is used in the Joint Model of MPEG/ITU H.264/AVC, in the reference software of H.265/HEVC, and in the Joint Exploration Model encoding methods ("High Efficiency Video Coding (HEVC) Test Model 16 (HM 16) Encoder Description", JCTVC-R1002, Sapporo, Japan, 30 June - 7 July 2014).
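The rate/distortion choice can be made concrete with the following toy sketch; the distortion and rate numbers are invented purely to show the J = D + lambda * rateCost comparison and are not measurements from any encoder.

```python
# Worked sketch of the rate/distortion decision between the two flag values.
# Distortion (D) and rate (rateCost) figures below are assumed, illustrative
# numbers, not real encoder measurements.

def rd_cost(distortion, rate_cost, lam):
    """Lagrangian cost J = D + lambda * rateCost."""
    return distortion + lam * rate_cost

def choose_flag(candidates, lam):
    """candidates: {flag_value: (D, rateCost)}; return the flag with lowest J."""
    return min(candidates, key=lambda flag: rd_cost(*candidates[flag], lam))

candidates = {
    0: (1200.0, 95),   # chroma follows the luma tree: lower D, more bits
    1: (1260.0, 40),   # chroma unit not split: higher D, fewer bits
}
best = choose_flag(candidates, lam=2.0)
print("independantChromaTuFlag =", best)   # 1 here (J: 1340 < 1390)
```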
According to an embodiment, the chroma and luma coding-trees are identical until a given decomposition level, and the splitting of the leaves of the chroma coding-tree CHROQ stops for higher decomposition levels.
According to an embodiment, the information data INFO is signalled for at least one decomposition level of the chroma coding-tree CHROQ when said at least one decomposition level is split; said information data INFO indicates whether said at least one decomposition level of the chroma coding-tree follows the splitting of the same level of the luma coding-tree.
According to a variant, said information data INFO further indicates when to stop splitting a leaf of the Chroma coding-tree CHROQ.
Fig. 6 shows an example of separate CU coding-tree per channel type or per component in accordance with an example of present principles.
Here, for example, optimal luma and chroma coding-trees are obtained when the image unit is a residual Transform Unit (TU) as defined in HEVC and the information data INFO is a flag denoted independantChromaTuFlag.
Following said optimal splitting, at a first decomposition level of a CU of the CTU, the chrominance unit, relative to said CU, is split into 4 sub-units 1-4. At a second decomposition level, the chroma sub-units 1-3 are not further split and independantChromaTuFlag=1 is signalled for said chroma sub-unit 1. No independantChromaTuFlag is transmitted for sub-units 2 and 3 (because the corresponding Luma sub-units are not split). The chroma sub-unit 4 is further split into 4 other sub-units 41-44 and independantChromaTuFlag=0 is signalled for said sub-unit 4. Finally, the sub-units 41-44 are not further split and independantChromaTuFlag=1 is signalled for said chroma sub-unit 41 (no independantChromaTuFlag is transmitted for sub-units 42-44).
This leads to more syntax to be transmitted than in the above embodiment, but allows determining the optimal size for the leaves of the chroma coding-tree, between the maximum size MS and the size of the leaves of the luma coding-tree LUMAQ at the same decomposition level.
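The per-level signalling illustrated by Fig. 6 above can be sketched as follows; the nested-list representation, the traversal order and the rule that a flag is coded only where the co-located luma node is itself split are inferences drawn from the walkthrough, not text taken from the patent.

```python
# Hedged sketch of per-decomposition-level signalling (Fig. 6 style): for each
# chroma node whose co-located luma node is split, a flag says whether the
# chroma tree follows that split (0) or stops there (1); no flag is coded when
# the co-located luma node is not split.

def signal_per_level(luma_node, chroma_node, write_flag):
    """Nodes are either None (leaf) or a list of four child nodes."""
    if luma_node is None or chroma_node is None:
        return
    for l_child, c_child in zip(luma_node, chroma_node):
        if l_child is None:
            continue                 # luma sub-unit not split: no flag coded
        if c_child is None:
            write_flag(1)            # chroma stops following the luma tree here
        else:
            write_flag(0)            # chroma follows this luma split
            signal_per_level(l_child, c_child, write_flag)

# Toy trees loosely mimicking the Fig. 6 walkthrough: flags 1, 0, 1 are coded
# for chroma sub-units 1, 4 and 41 respectively.
luma = [[None] * 4, None, None, [[None] * 4, None, None, None]]
chroma = [None, None, None, [None, None, None, None]]
signal_per_level(luma, chroma, lambda b: print("flag:", b))
```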
Note that each of the embodiments and variants can be performed on a per-chroma-channel basis, i.e. a chroma coding-tree is computed for each chrominance channel. The information data INFO is then signaled for each chrominance channel.
The present principles further relate to a method for encoding an image, said image comprising at least one image unit encoded according to an encoding method as above described in accordance with the present principles.
The present principles further relate to a method for decoding an image unit comprising image data represented by a luminance channel and at least one chrominance channel. The method obtains a luma coding-tree by splitting a luminance unit representative of the luminance channel of said image unit and obtains a chroma coding-tree by splitting a chrominance unit representative of at least one chrominance channel of said image unit.
Obtaining said chroma coding-tree comprises determining whether said chroma coding-tree and said luma coding-tree are identical. The method further signals an information data indicating whether said chroma coding-tree and said luma coding-tree are identical. Various embodiments and variants of the decoding method may be easily deduced from the above description of the encoding method, in particular from the description of Fig. 1-6.
For example, the decoding method comprises obtaining, from a signal or a memory, an information data INFO indicating whether a chroma coding-tree CHROQ and a luma coding-tree LUMAQ are identical. Said information data INFO may be a flag equal to a first value when the chroma coding-tree and the luma coding-tree are identical and to a second value otherwise. Said information data INFO, when equal to said second value (i.e. when the chroma coding-tree and the luma coding-tree are not identical), may also indicate that the chrominance unit is not split.
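A matching decoder-side sketch, with the same assumed tree representation as in the encoder sketch above, could look as follows; read_flag stands in for whatever entropy-decoding routine actually parses the flag.

```python
# Hedged decoder-side sketch: the INFO flag is parsed from the signal; when it
# indicates identical trees the chroma coding-tree is copied from the luma
# coding-tree, otherwise the chrominance unit is left unsplit.

import copy

def decode_chroma_tree(read_flag, luma_tree):
    """Derive the chroma coding-tree from the parsed information data INFO."""
    if read_flag() == 0:
        return copy.deepcopy(luma_tree)   # chroma tree identical to luma tree
    return None                           # chrominance unit is not split

# Toy usage: a parsed flag value of 1 yields an unsplit chroma unit.
luma_tree = [[None] * 4, None, None, None]
print(decode_chroma_tree(lambda: 1, luma_tree))   # None
```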
The present principles further relate to a method for decoding an image, said image comprising at least one image unit encoded according to an encoding method as above described in accordance with the present principles.
In Figs. 1-6, the modules are functional units, which may or may not be related to distinguishable physical units. For example, these modules or some of them may be brought together in a unique component or circuit, or contribute to the functionalities of a piece of software. Conversely, some modules may potentially be composed of separate physical entities. The apparatus compatible with the present principles are implemented either using pure hardware, for example dedicated hardware such as an ASIC, an FPGA or VLSI (respectively "Application Specific Integrated Circuit", "Field-Programmable Gate Array" and "Very Large Scale Integration"), or from several integrated electronic components embedded in a device, or from a blend of hardware and software components.
Fig. 7 represents an exemplary architecture of a device 120 which may be configured to implement a method described in relation with Figs. 1-6.
Device 120 comprises the following elements, which are linked together by a data and address bus 121:
- a microprocessor 122 (or CPU), which is, for example, a DSP (or Digital Signal Processor);
- a ROM (or Read Only Memory) 123;
- a RAM (or Random Access Memory) 124;
- an I/O interface 125 for reception of data to transmit, from an application; and
- a battery 126.
In accordance with an example, the battery 126 is external to the device. In each of the mentioned memories, the word "register" used in the specification can correspond to an area of small capacity (a few bits) or to a very large area (e.g. a whole program or a large amount of received or decoded data). The ROM 123 comprises at least a program and parameters. The ROM 123 may store algorithms and instructions to perform techniques in accordance with the present principles. When switched on, the CPU 122 uploads the program into the RAM and executes the corresponding instructions.
RAM 124 comprises, in a register, the program executed by the CPU 122 and uploaded after switch-on of the device 120, input data in a register, intermediate data in different states of the method in a register, and other variables used for the execution of the method in a register.
The implementations described herein may be implemented in, for example, a method or a process, an apparatus, a software program, a data stream, or a signal. Even if only discussed in the context of a single form of implementation (for example, discussed only as a method or a device), the implementation of features discussed may also be implemented in other forms (for example a program). An apparatus may be implemented in, for example, appropriate hardware, software, and firmware. The methods may be implemented in, for example, an apparatus such as a processor, which refers to processing devices in general, including, for example, a computer, a microprocessor, an integrated circuit, or a programmable logic device. Processors also include communication devices, such as, for example, computers, cell phones, portable/personal digital assistants ("PDAs"), and other devices that facilitate communication of information between end-users.
In accordance with an example of encoding or an encoder, the image comprising an image unit or the image unit to be encoded is obtained from a source. For example, the source belongs to a set comprising:
- a local memory (123 or 124), e.g. a video memory or a RAM (or Random Access Memory), a flash memory, a ROM (or Read Only Memory), a hard disk;
- a storage interface (125), e.g. an interface with a mass storage, a RAM, a flash memory, a ROM, an optical disc or a magnetic support;
- a communication interface (125), e.g. a wireline interface (for example a bus interface, a wide area network interface, a local area network interface) or a wireless interface (such as an IEEE 802.11 interface or a Bluetooth® interface); and
- an image capturing circuit (e.g. a sensor such as, for example, a CCD (or Charge-Coupled Device) or CMOS (or Complementary Metal-Oxide-Semiconductor)).
In accordance with an example of the decoding or a decoder, the decoded image unit or the decoded image comprising a decoded image unit is sent to a destination; specifically, the destination belongs to a set comprising:
- a local memory (123 or 124), e.g. a video memory or a RAM, a flash memory, a hard disk;
- a storage interface (125), e.g. an interface with a mass storage, a RAM, a flash memory, a ROM, an optical disc or a magnetic support;
- a communication interface (125), e.g. a wireline interface (for example a bus interface (e.g. USB (or Universal Serial Bus)), a wide area network interface, a local area network interface, an HDMI (High Definition Multimedia Interface) interface) or a wireless interface (such as an IEEE 802.11 interface, a WiFi® or a Bluetooth® interface); and
- a display.
In accordance with examples of encoding or an encoder, a signal S is generated. The signal S has a syntax element related to an image unit comprising picture data represented by a luminance channel and at least one chrominance channel. Said syntax element defines a luma coding-tree obtained by splitting a luminance unit representative of the luminance channel of said image unit and a chroma coding-tree obtained by splitting a chrominance unit representative of at least one chrominance channel of said image unit. The signal S is formatted to comprise an information data INFO indicating whether said chroma coding-tree and said luma coding-tree are identical, and said information data INFO further indicates that the chrominance unit is not split when said chroma and luma coding-trees are not identical.
According to a variant, said information data further indicates, for at least one decomposition level of the chroma coding-tree, whether or not that decomposition level follows the splitting of the same level of the luma coding-tree.
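A minimal, self-contained C++ sketch of this per-level variant is given below; the Node structure, the flag vector and the recursion are assumptions made for illustration and are not a normative syntax. It emits one flag per decomposition level at which the luma coding-tree is split, indicating whether the chroma coding-tree follows that split; once a flag indicates that the split is not followed, the chroma splitting stops at that level.

    #include <cstddef>
    #include <vector>

    // A decomposition level is split when the node has children.
    struct Node {
        std::vector<Node> children;
        bool isSplit() const { return !children.empty(); }
    };

    // Appends one flag per split luma level. When chroma follows the split,
    // both nodes are assumed to have the same number of children.
    void writeChromaFollowFlags(const Node& luma, const Node& chroma,
                                std::vector<bool>& flags) {
        if (!luma.isSplit())
            return;                      // nothing signaled at unsplit levels
        const bool follows = chroma.isSplit();
        flags.push_back(follows);        // information data for this level
        if (!follows)
            return;                      // chroma coding-tree stops here
        for (std::size_t i = 0; i < luma.children.size(); ++i)
            writeChromaFollowFlags(luma.children[i], chroma.children[i], flags);
    }

A decoder-side counterpart would read the flags in the same traversal order to rebuild the chroma coding-tree from the luma coding-tree.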
The signal S is sent to a destination. As an example, the signal S is stored in a local or remote memory, e.g. a video memory (124), a RAM (124) or a hard disk (123). In a variant, the signal S is sent to a storage interface (125), e.g. an interface with a mass storage, a flash memory, a ROM, an optical disc or a magnetic support, and/or transmitted over a communication interface (125), e.g. an interface to a point-to-point link, a communication bus, a point-to-multipoint link or a broadcast network.
In accordance with examples of decoding or a decoder, the signal S is obtained from a source. Exemplarily, the signal S is read from a local memory, e.g. a video memory (124), a RAM (124), a ROM (123), a flash memory (123) or a hard disk (123). In a variant, the signal S is received from a storage interface (125), e.g. an interface with a mass storage, a RAM, a ROM, a flash memory, an optical disc or a magnetic support, and/or received from a communication interface (125), e.g. an interface to a point-to-point link, a bus, a point-to-multipoint link or a broadcast network.
In accordance with examples, the device 120, configured to implement an encoding method described in relation with Figs. 1-6, belongs to a set comprising:
- a mobile device;
- a communication device;
- a game device;
- a tablet (or tablet computer);
- a laptop;
- a still image camera;
- a video camera;
- an encoding chip;
- a still image server; and
- a video server (e.g. a broadcast server, a video-on-demand server or a web server).
In accordance with examples, the device 120, configured to implement a decoding method described above, belongs to a set comprising:
- a mobile device;
- a communication device;
- a game device;
- a set-top box;
- a TV set;
- a tablet (or tablet computer);
- a laptop;
- a display; and
- a decoding chip.
According to an example of the present principles, illustrated in Fig. 8, in a transmission context between two remote devices A and B over a communication network NET, the device A comprises a processor in relation with memory RAM and ROM which are configured to implement a method for encoding an image comprising at least one image unit, or an image unit, as described above, and the device B comprises a processor in relation with memory RAM and ROM which are configured to implement a method for decoding as described above. In accordance with an example, the network is a broadcast network, adapted to broadcast still images or video images from device A to decoding devices including the device B.
The signal S is intended to be transmitted by the device A and received by the device B.
Fig. 9 shows an example of the syntax of such a signal when the data are transmitted over a packet-based transmission protocol. Each transmitted packet P comprises a header H and a payload PAYLOAD. For example, a bit of the header H is dedicated to representing the information data carried by the signal S. In a variant, multiple flags may be used to represent the information data INFO as described above and carried by the signal S.
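The following self-contained C++ sketch illustrates the single-bit case; the one-byte header, the choice of the least significant bit and the function name are assumptions made for the example and do not correspond to any particular transport format.

    #include <cstdint>
    #include <vector>

    // Builds one packet P = header H followed by the payload PAYLOAD, with one
    // header bit carrying the information data (coding-trees identical or not).
    std::vector<std::uint8_t> buildPacket(bool treesIdentical,
                                          const std::vector<std::uint8_t>& payload) {
        std::uint8_t header = 0;
        if (treesIdentical)
            header |= 0x01;              // dedicated bit representing INFO
        std::vector<std::uint8_t> packet;
        packet.reserve(1 + payload.size());
        packet.push_back(header);        // header H
        packet.insert(packet.end(), payload.begin(), payload.end());  // PAYLOAD
        return packet;
    }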
Implementations of the various processes and features described herein may be embodied in a variety of different equipment or applications. Examples of such equipment include an encoder, a decoder, a post-processor processing output from a decoder, a pre-processor providing input to an encoder, a video coder, a video decoder, a video codec, a web server, a set-top box, a laptop, a personal computer, a cell phone, a PDA, and any other device for processing an image or a video or other communication devices. As should be clear, the equipment may be mobile and even installed in a mobile vehicle.
Additionally, the methods may be implemented by instructions being performed by a processor, and such instructions (and/or data values produced by an implementation) may be stored on a computer readable storage medium. A computer readable storage medium can take the form of a computer readable program product embodied in one or more computer readable medium(s) and having computer readable program code embodied thereon that is executable by a computer. A computer readable storage medium as used herein is considered a non-transitory storage medium given the inherent capability to store the information therein as well as the inherent capability to provide retrieval of the information therefrom. A computer readable storage medium can be, for example, but is not limited to, an electronic, magnetic, optical, electromagnetic, infrared, or semiconductor system, apparatus, or device, or any suitable combination of the foregoing. It is to be appreciated that the following, while providing more specific examples of computer readable storage mediums to which the present principles can be applied, is merely an illustrative and not exhaustive listing as is readily appreciated by one of ordinary skill in the art: a portable computer diskette; a hard disk; a read-only memory (ROM); an erasable programmable read-only memory (EPROM or Flash memory); a portable compact disc read-only memory (CD-ROM); an optical storage device; a magnetic storage device; or any suitable combination of the foregoing.
The instructions may form an application program tangibly embodied on a processor-readable medium.
Instructions may be, for example, in hardware, firmware, software, or a combination. Instructions may be found in, for example, an operating system, a separate application, or a combination of the two. A processor may be characterized, therefore, as, for example, both a device configured to carry out a process and a device that includes a processor-readable medium (such as a storage device) having instructions for carrying out a process. Further, a processor-readable medium may store, in addition to or in lieu of instructions, data values produced by an implementation.
As will be evident to one of skill in the art, implementations may produce a variety of signals formatted to carry information that may be, for example, stored or transmitted. The information may include, for example, instructions for performing a method, or data produced by one of the described implementations. For example, a signal may be formatted to carry as data the rules for writing or reading the syntax of a described example of the present principles, or to carry as data the actual syntax-values written by a described example of the present principles. Such a signal may be formatted, for example, as an electromagnetic wave (for example, using a radio frequency portion of spectrum) or as a baseband signal. The formatting may include, for example, encoding a data stream and modulating a carrier with the encoded data stream. The information that the signal carries may be, for example, analog or digital information. The signal may be transmitted over a variety of different wired or wireless links, as is known. The signal may be stored on a processor-readable medium.
A number of implementations have been described. Nevertheless, it will be understood that various modifications may be made. For example, elements of different implementations may be combined, supplemented, modified, or removed to produce other implementations. Additionally, one of ordinary skill will understand that other structures and processes may be substituted for those disclosed and the resulting implementations will perform at least substantially the same function(s), in at least substantially the same way(s), to achieve at least substantially the same result(s) as the implementations disclosed. Accordingly, these and other implementations are contemplated by this application.

Claims

1. A method for encoding an image unit comprising image data represented by a luminance channel and at least one chrominance channel, the method comprising obtaining a luma coding-tree by splitting a luminance unit representative of the luminance channel of said image unit and obtaining a chroma coding-tree by splitting a chrominance unit representative of at least one chrominance channel of said image unit, characterized in that obtaining said chroma coding-tree comprises:
- determining whether said chroma coding-tree and said luma coding-tree are identical; and
- signaling an information data indicating whether said chroma coding-tree and said luma coding-tree are identical.
2. A method for decoding an image unit comprising image data represented by a luminance channel and at least one chrominance channel, the method comprising obtaining a luma coding-tree by splitting a luminance unit representative of the luminance channel of said image unit and obtaining a chroma coding-tree by splitting a chrominance unit representative of at least one chrominance channel of said image unit, characterized in that obtaining said chroma coding-tree comprises:
- determining whether said chroma coding-tree and said luma coding-tree are identical; and
- signaling an information data indicating whether said chroma coding-tree and said luma coding-tree are identical.
3. A device for encoding an image unit comprising image data represented by a luminance channel and at least one chrominance channel, the device comprising means for obtaining a luma coding-tree by splitting a luminance unit representative of the luminance channel of said image unit and means for obtaining a chroma coding-tree by splitting a chrominance unit representative of at least one chrominance channel of said image unit, characterized in that the means for obtaining said chroma coding-tree are further configured to:
- determine whether said chroma coding-tree and said luma coding-tree are identical; and
- signal an information data indicating whether said chroma coding-tree and said luma coding-tree are identical.
4. A device for decoding an image unit comprising image data represented by a luminance channel and at least one chrominance channel, the device comprising means for obtaining a luma coding-tree by splitting a luminance unit representative of the luminance channel of said image unit and means for obtaining a chroma coding-tree by splitting a chrominance unit representative of at least one chrominance channel of said image unit, characterized in that the means for obtaining said chroma coding-tree are further configured to:
- determine whether said chroma coding-tree and said luma coding-tree are identical; and
- signal an information data indicating whether said chroma coding-tree and said luma coding-tree are identical.
5. The method of claims 1-2 or a device of claims 3-4, wherein the information data is a flag equal to a first value when the chroma coding-tree and the luma coding-tree are identical and to a second value otherwise.
6. The method or a device of claim 5, wherein, when the information data is equal to said second value, the information data further indicates that the chrominance unit is not split.
7. The method of claims 1-2, 5-6 or a device of claims 3-6, wherein, when said chroma and luma coding-trees are determined as being not identical, if the size of at least one leaf of said chroma coding-tree is larger than a maximum size, then said at least one leaf is recursively split until the sizes of the leaves of said chroma coding-tree reach said maximum size.
8. The method of claims 1-2, 5-7 or a device of claims 3-7, wherein the chroma and luma coding-trees are identical until a given decomposition level and the method stops the splitting of the leaves of the chroma coding-tree CHROQ for higher decomposition levels.
9. The method of claims 1-2, 5-8 or a device of claims 3-8, wherein the information data is signaled for at least one decomposition level of the chroma coding-tree when said at least one decomposition level is split, said information data indicating whether said at least one decomposition level of the chroma coding-tree follows the splitting of the same level of the luma coding-tree.
10. A method for encoding an image comprising at least one image unit, characterized in that said at least one image unit is coded according to a method of one of the claims 1, 5-9.
11. A method for decoding an image comprising at least one image unit, characterized in that said at least one image unit is decoded according to a method of one of the claims 2, 5-9.
12. A signal (S) having a syntax element related to an image unit comprising picture data represented by a luminance channel and at least one chrominance channel, said syntax element defining a luma coding-tree obtained by splitting a luminance unit representative of the luminance channel of said image unit and a chroma coding-tree obtained by splitting a chrominance unit representative of at least one chrominance channel of said image unit, characterized in that the signal is formatted to comprise an information data indicating whether said chroma coding-tree and said luma coding-tree are identical, and said information data further indicating that the chrominance unit is not split when said chroma and luma coding-trees are not identical.
13. The signal of claim 12, wherein said information data further indicates, for at least one decomposition level of the chroma coding-tree, whether that decomposition level follows the splitting of the same level of the luma coding-tree.
14. A computer program product comprising program code instructions to execute the steps of the method of one of the claims 1-2, 5-9 when this program is executed on a computer.
15. Non-transitory storage medium carrying instructions of program code for executing steps of the method of one of the claims 1-2, 5-9 when said program is executed on a computing device.
PCT/EP2017/052316 2016-02-11 2017-02-03 Method and device for encoding/decoding an image unit comprising image data represented by a luminance channel and at least one chrominance channel WO2017137311A1 (en)

Priority Applications (16)

Application Number Priority Date Filing Date Title
MYPI2018702371A MY189780A (en) 2016-02-11 2017-02-03 Method and device for encoding/decoding an image unit comprising image data represented by a luminance channel and at least one chrominance channel
CN201780010687.9A CN108605134B (en) 2016-02-11 2017-02-03 Method and apparatus for encoding/decoding image unit
US16/076,170 US11109045B2 (en) 2016-02-11 2017-02-03 Method and device for encoding/decoding an image unit comprising image data represented by a luminance channel and at least one chrominance channel
MX2018009737A MX2018009737A (en) 2016-02-11 2017-02-03 Method and device for encoding/decoding an image unit comprising image data represented by a luminance channel and at least one chrominance channel.
RU2018130816A RU2739251C2 (en) 2016-02-11 2017-02-03 Method and apparatus for encoding/decoding an image element comprising image data represented by a luminance component channel and at least one chroma component channel
EP22153784.8A EP4040789B1 (en) 2016-02-11 2017-02-03 Method and device for encoding/decoding an image unit comprising image data represented by a luminance channel and at least one chrominance channel
CA3014332A CA3014332A1 (en) 2016-02-11 2017-02-03 Method and device for encoding/decoding an image unit comprising image data represented by a luminance channel and at least one chrominance channel
BR112018015558-6A BR112018015558A2 (en) 2016-02-11 2017-02-03 A method and device for encoding / decoding an image unit comprising image data represented by a luminance channel and at least one chrominance channel.
CN202210747508.2A CN115052153A (en) 2016-02-11 2017-02-03 Method and apparatus for encoding/decoding image unit
EP17702390.0A EP3414903B1 (en) 2016-02-11 2017-02-03 Method and device for encoding/decoding an image unit comprising image data represented by a luminance channel and at least one chrominance channel
JP2018539051A JP7100582B2 (en) 2016-02-11 2017-02-03 A method and apparatus for encoding / decoding an image unit containing image data represented by a luminance channel and at least one chrominance channel.
KR1020187022905A KR20180111839A (en) 2016-02-11 2017-02-03 Method and device for encoding / decoding an image unit comprising image data represented by a luminance channel and at least one chrominance channel
CN202210747507.8A CN115052152A (en) 2016-02-11 2017-02-03 Method and apparatus for encoding/decoding image unit
EP23196190.5A EP4266684A3 (en) 2016-02-11 2017-02-03 Method and device for encoding/decoding an image unit comprising image data represented by a luminance channel and at least one chrominance channel
US17/408,317 US20210385466A1 (en) 2016-02-11 2021-08-20 Method and device for encoding/decoding an image unit comprising image data represented by a luminance channel and at least one chrominance channel
JP2022106895A JP2022137130A (en) 2016-02-11 2022-07-01 Method and device for encoding/decoding image unit comprising image data represented by luminance channel and at least one chrominance channel

Applications Claiming Priority (2)

Application Number Priority Date Filing Date Title
EP16305153.5 2016-02-11
EP16305153 2016-02-11

Related Child Applications (2)

Application Number Title Priority Date Filing Date
US16/076,170 A-371-Of-International US11109045B2 (en) 2016-02-11 2017-02-03 Method and device for encoding/decoding an image unit comprising image data represented by a luminance channel and at least one chrominance channel
US17/408,317 Continuation US20210385466A1 (en) 2016-02-11 2021-08-20 Method and device for encoding/decoding an image unit comprising image data represented by a luminance channel and at least one chrominance channel

Publications (1)

Publication Number Publication Date
WO2017137311A1 true WO2017137311A1 (en) 2017-08-17

Family

ID=55404663

Family Applications (1)

Application Number Title Priority Date Filing Date
PCT/EP2017/052316 WO2017137311A1 (en) 2016-02-11 2017-02-03 Method and device for encoding/decoding an image unit comprising image data represented by a luminance channel and at least one chrominance channel

Country Status (12)

Country Link
US (2) US11109045B2 (en)
EP (3) EP4266684A3 (en)
JP (2) JP7100582B2 (en)
KR (1) KR20180111839A (en)
CN (3) CN115052153A (en)
BR (1) BR112018015558A2 (en)
CA (1) CA3014332A1 (en)
MX (1) MX2018009737A (en)
MY (1) MY189780A (en)
RU (2) RU2739251C2 (en)
TW (1) TWI795352B (en)
WO (1) WO2017137311A1 (en)

Cited By (12)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
WO2019135629A1 (en) * 2018-01-05 2019-07-11 에스케이텔레콤 주식회사 Method for reconstructing chroma block, and image decoding apparatus using same
US20190246122A1 (en) * 2018-02-08 2019-08-08 Qualcomm Incorporated Palette coding for video coding
WO2019203940A1 (en) * 2018-04-19 2019-10-24 Futurewei Technologies, Inc. Luma and chroma block partitioning
WO2019230670A1 (en) * 2018-05-31 2019-12-05 Sharp Kabushiki Kaisha Systems and methods for partitioning video blocks in an inter prediction slice of video data
WO2020055546A1 (en) * 2018-09-14 2020-03-19 Tencent America LLC Method and device for decoding with palette mode
CN110913215A (en) * 2019-12-03 2020-03-24 北京数码视讯软件技术发展有限公司 Method and device for selecting prediction mode and readable storage medium
CN112166607A (en) * 2018-05-29 2021-01-01 交互数字Vc控股公司 Method and apparatus for video encoding and decoding using partially shared luma and chroma coding trees
WO2021056211A1 (en) * 2019-09-24 2021-04-01 富士通株式会社 Video coding and decoding methods and apparatuses, and electronic device
EP3854091A4 (en) * 2018-09-21 2022-08-31 Canon Kabushiki Kaisha Method, apparatus and system for encoding and decoding a tree of blocks of video samples
US11503319B2 (en) 2017-10-06 2022-11-15 Sharp Kabushiki Kaisha Image coding apparatus and image decoding apparatus
JP7478253B2 (en) 2021-03-05 2024-05-02 テンセント・アメリカ・エルエルシー Decoupling transformation partitioning
CN111684797B (en) * 2018-02-08 2024-05-31 高通股份有限公司 Palette coding for video coding

Families Citing this family (34)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
EP3662664A4 (en) * 2017-08-03 2020-11-25 Sharp Kabushiki Kaisha Systems and methods for partitioning video blocks in an inter prediction slice of video data
US10861196B2 (en) 2017-09-14 2020-12-08 Apple Inc. Point cloud compression
US10897269B2 (en) 2017-09-14 2021-01-19 Apple Inc. Hierarchical point cloud compression
US11818401B2 (en) 2017-09-14 2023-11-14 Apple Inc. Point cloud geometry compression using octrees and binary arithmetic encoding with adaptive look-up tables
WO2019054838A1 (en) * 2017-09-18 2019-03-21 인텔렉추얼디스커버리 주식회사 Method and apparatus for coding video using merging candidate list according to block division
US10909725B2 (en) 2017-09-18 2021-02-02 Apple Inc. Point cloud compression
US11113845B2 (en) 2017-09-18 2021-09-07 Apple Inc. Point cloud compression using non-cubic projections and masks
US10699444B2 (en) 2017-11-22 2020-06-30 Apple Inc Point cloud occupancy map compression
US10607373B2 (en) 2017-11-22 2020-03-31 Apple Inc. Point cloud compression with closed-loop color conversion
US10939129B2 (en) 2018-04-10 2021-03-02 Apple Inc. Point cloud compression
US10909726B2 (en) 2018-04-10 2021-02-02 Apple Inc. Point cloud compression
US10909727B2 (en) 2018-04-10 2021-02-02 Apple Inc. Hierarchical point cloud compression with smoothing
US11010928B2 (en) 2018-04-10 2021-05-18 Apple Inc. Adaptive distance based point cloud compression
KR20200141450A (en) * 2018-04-11 2020-12-18 인터디지털 브이씨 홀딩스 인코포레이티드 Method for encoding depth values of a set of 3D points orthogonally projected into at least one image area of a projection plane
US11017566B1 (en) 2018-07-02 2021-05-25 Apple Inc. Point cloud compression with adaptive filtering
US11202098B2 (en) 2018-07-05 2021-12-14 Apple Inc. Point cloud compression with multi-resolution video encoding
US11012713B2 (en) 2018-07-12 2021-05-18 Apple Inc. Bit stream structure for compressed point cloud data
US11386524B2 (en) 2018-09-28 2022-07-12 Apple Inc. Point cloud compression image padding
US11367224B2 (en) 2018-10-02 2022-06-21 Apple Inc. Occupancy map block-to-patch information compression
US11430155B2 (en) 2018-10-05 2022-08-30 Apple Inc. Quantized depths for projection point cloud compression
WO2020098786A1 (en) * 2018-11-16 2020-05-22 Mediatek Inc. Method and apparatus of luma-chroma separated coding tree coding with constraints
KR102619997B1 (en) 2019-01-02 2024-01-02 애플 인크. Method for encodign/decodign video signal and apparatus therefor
AU2019201649A1 (en) 2019-03-11 2020-10-01 Canon Kabushiki Kaisha Method, apparatus and system for encoding and decoding a tree of blocks of video samples
US11057564B2 (en) 2019-03-28 2021-07-06 Apple Inc. Multiple layer flexure for supporting a moving image sensor
US11606563B2 (en) * 2019-09-24 2023-03-14 Tencent America LLC CTU size signaling
US11562507B2 (en) 2019-09-27 2023-01-24 Apple Inc. Point cloud compression using video encoding with time consistent patches
US11627314B2 (en) 2019-09-27 2023-04-11 Apple Inc. Video-based point cloud compression with non-normative smoothing
US11538196B2 (en) 2019-10-02 2022-12-27 Apple Inc. Predictive coding for point cloud compression
US11895307B2 (en) 2019-10-04 2024-02-06 Apple Inc. Block-based predictive coding for point cloud compression
US11798196B2 (en) 2020-01-08 2023-10-24 Apple Inc. Video-based point cloud compression with predicted patches
US11475605B2 (en) 2020-01-09 2022-10-18 Apple Inc. Geometry encoding of duplicate points
US11615557B2 (en) 2020-06-24 2023-03-28 Apple Inc. Point cloud compression using octrees with slicing
US11620768B2 (en) 2020-06-24 2023-04-04 Apple Inc. Point cloud geometry compression using octrees with multiple scan orders
US11948338B1 (en) 2021-03-29 2024-04-02 Apple Inc. 3D volumetric content encoding using 2D videos and simplified 3D meshes

Citations (1)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US20120230421A1 (en) * 2011-03-10 2012-09-13 Qualcomm Incorporated Transforms in video coding

Family Cites Families (15)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN105120284B (en) * 2009-08-12 2019-06-25 汤姆森特许公司 For chroma coder in improved frame and decoded method and device
US20120230418A1 (en) * 2011-03-08 2012-09-13 Qualcomm Incorporated Coding of transform coefficients for video coding
US9807401B2 (en) * 2011-11-01 2017-10-31 Qualcomm Incorporated Transform unit partitioning for chroma components in video coding
US20130128971A1 (en) * 2011-11-23 2013-05-23 Qualcomm Incorporated Transforms in video coding
WO2013102299A1 (en) * 2012-01-04 2013-07-11 Mediatek Singapore Pte. Ltd. Residue quad tree depth for chroma components
WO2013107027A1 (en) * 2012-01-19 2013-07-25 Mediatek Singapore Pte. Ltd. Methods and apparatuses of cbf coding in hevc
US9462275B2 (en) * 2012-01-30 2016-10-04 Qualcomm Incorporated Residual quad tree (RQT) coding for video coding
US9467701B2 (en) * 2012-04-05 2016-10-11 Qualcomm Incorporated Coded block flag coding
US10009612B2 (en) * 2012-04-12 2018-06-26 Hfi Innovation Inc. Method and apparatus for block partition of chroma subsampling formats
AU2012261713A1 (en) * 2012-12-07 2014-06-26 Canon Kabushiki Kaisha Method, apparatus and system for encoding and decoding the transform units of a residual quad tree
US9743091B2 (en) * 2012-12-17 2017-08-22 Lg Electronics Inc. Method for encoding/decoding image, and device using same
US9648330B2 (en) * 2013-07-15 2017-05-09 Qualcomm Incorporated Inter-color component residual prediction
CA2946779C (en) 2014-05-05 2019-10-01 Mediatek Singapore Pte. Ltd. Method and apparatus for determining residue transform tree representation
US20150373327A1 (en) * 2014-06-20 2015-12-24 Qualcomm Incorporated Block adaptive color-space conversion coding
US20190356915A1 (en) * 2017-01-03 2019-11-21 Lg Electronics Inc. Method and apparatus for encoding/decoding video signal using secondary transform

Patent Citations (1)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US20120230421A1 (en) * 2011-03-10 2012-09-13 Qualcomm Incorporated Transforms in video coding

Non-Patent Citations (2)

* Cited by examiner, † Cited by third party
Title
"High Efficiency Video Coding (HEVC) Test Model 16 (HM 16) Encoder Description", JCTVC-R1002, 30 June 2014 (2014-06-30)
KIM J ET AL: "AHG5: Independent chroma transform depth from luma transform depth for non-4:2:0 format", 104. MPEG MEETING; 22-4-2013 - 26-4-2013; INCHEON; (MOTION PICTURE EXPERT GROUP OR ISO/IEC JTC1/SC29/WG11),, no. m28536, 17 April 2013 (2013-04-17), XP030057070 *

Cited By (22)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US11936892B2 (en) 2017-10-06 2024-03-19 Sharp Kabushiki Kaisha Image decoding apparatus
US11503319B2 (en) 2017-10-06 2022-11-15 Sharp Kabushiki Kaisha Image coding apparatus and image decoding apparatus
WO2019135629A1 (en) * 2018-01-05 2019-07-11 에스케이텔레콤 주식회사 Method for reconstructing chroma block, and image decoding apparatus using same
CN111684797A (en) * 2018-02-08 2020-09-18 高通股份有限公司 Palette coding for video coding
CN111684797B (en) * 2018-02-08 2024-05-31 高通股份有限公司 Palette coding for video coding
US20190246122A1 (en) * 2018-02-08 2019-08-08 Qualcomm Incorporated Palette coding for video coding
EP4072146A1 (en) * 2018-04-19 2022-10-12 Huawei Technologies Co., Ltd. Luma and chroma block partitioning
WO2019203940A1 (en) * 2018-04-19 2019-10-24 Futurewei Technologies, Inc. Luma and chroma block partitioning
US11109026B2 (en) 2018-04-19 2021-08-31 Huawei Technologies Co., Ltd. Luma and chroma block partitioning
CN112166607A (en) * 2018-05-29 2021-01-01 交互数字Vc控股公司 Method and apparatus for video encoding and decoding using partially shared luma and chroma coding trees
JP2021525468A (en) * 2018-05-29 2021-09-24 インターデジタル ヴイシー ホールディングス, インコーポレイテッド Video coding and decoding methods and equipment using partially shared brightness and saturation coding trees
US11483559B2 (en) 2018-05-29 2022-10-25 Interdigital Vc Holdings, Inc. Method and apparatus for video encoding and decoding with partially shared luma and chroma coding trees
WO2019230670A1 (en) * 2018-05-31 2019-12-05 Sharp Kabushiki Kaisha Systems and methods for partitioning video blocks in an inter prediction slice of video data
CN112204967A (en) * 2018-05-31 2021-01-08 夏普株式会社 System and method for partitioning video blocks in inter-predicted segments of video data
US11025905B2 (en) 2018-09-14 2021-06-01 Tencent America LLC Method and device for decoding with palette mode
WO2020055546A1 (en) * 2018-09-14 2020-03-19 Tencent America LLC Method and device for decoding with palette mode
EP3854091A4 (en) * 2018-09-21 2022-08-31 Canon Kabushiki Kaisha Method, apparatus and system for encoding and decoding a tree of blocks of video samples
US11595699B2 (en) 2018-09-21 2023-02-28 Canon Kabushiki Kaisha Method, apparatus and system for encoding and decoding a tree of blocks of video samples
WO2021056211A1 (en) * 2019-09-24 2021-04-01 富士通株式会社 Video coding and decoding methods and apparatuses, and electronic device
CN110913215B (en) * 2019-12-03 2022-04-12 北京数码视讯软件技术发展有限公司 Method and device for selecting prediction mode and readable storage medium
CN110913215A (en) * 2019-12-03 2020-03-24 北京数码视讯软件技术发展有限公司 Method and device for selecting prediction mode and readable storage medium
JP7478253B2 (en) 2021-03-05 2024-05-02 テンセント・アメリカ・エルエルシー Decoupling transformation partitioning

Also Published As

Publication number Publication date
US11109045B2 (en) 2021-08-31
RU2766561C2 (en) 2022-03-15
RU2020141190A (en) 2021-04-01
RU2020141190A3 (en) 2021-12-07
CN108605134B (en) 2022-07-15
KR20180111839A (en) 2018-10-11
MY189780A (en) 2022-03-07
EP4040789A1 (en) 2022-08-10
RU2018130816A (en) 2020-03-11
US20210385466A1 (en) 2021-12-09
MX2018009737A (en) 2018-11-29
EP3414903A1 (en) 2018-12-19
EP4266684A3 (en) 2024-01-24
TWI795352B (en) 2023-03-11
BR112018015558A2 (en) 2018-12-26
RU2739251C2 (en) 2020-12-22
EP4266684A2 (en) 2023-10-25
JP7100582B2 (en) 2022-07-13
TW201729596A (en) 2017-08-16
US20210006805A1 (en) 2021-01-07
CN108605134A (en) 2018-09-28
CN115052153A (en) 2022-09-13
EP3414903B1 (en) 2022-04-06
JP2022137130A (en) 2022-09-21
JP2019509662A (en) 2019-04-04
CA3014332A1 (en) 2017-08-17
RU2018130816A3 (en) 2020-06-01
CN115052152A (en) 2022-09-13
EP4040789B1 (en) 2023-10-18

Similar Documents

Publication Publication Date Title
US20210385466A1 (en) Method and device for encoding/decoding an image unit comprising image data represented by a luminance channel and at least one chrominance channel
US20230412810A1 (en) Method and device for intra-predictive encoding/decoding a coding unit comprising picture data, said intra-predictive encoding depending on a prediction tree and a transform tree
US11665347B2 (en) Method and apparatus for selecting a coding mode used for encoding/decoding a residual block
CN113016180A (en) Virtual pipeline for video encoding and decoding
TW202107897A (en) Secondary transform for video encoding and decoding
EP3987785A1 (en) Lossless mode for versatile video coding
CN112385212A (en) Syntax element for video encoding or decoding
US11785236B2 (en) Parameter grouping among plural coding units for video encoding and decoding
US20230328284A1 (en) Hybrid texture particle coding mode improvements
US20230262268A1 (en) Chroma format dependent quantization matrices for video encoding and decoding

Legal Events

Date Code Title Description
121 Ep: the epo has been informed by wipo that ep was designated in this application (Ref document number: 17702390; Country of ref document: EP; Kind code of ref document: A1)
ENP Entry into the national phase (Ref document number: 2018539051; Country of ref document: JP; Kind code of ref document: A)
REG Reference to national code (Ref country code: BR; Ref legal event code: B01A; Ref document number: 112018015558; Country of ref document: BR)
ENP Entry into the national phase (Ref document number: 20187022905; Country of ref document: KR; Kind code of ref document: A)
WWE Wipo information: entry into national phase (Ref document number: MX/A/2018/009737; Country of ref document: MX)
WWE Wipo information: entry into national phase (Ref document number: 3014332; Country of ref document: CA)
NENP Non-entry into the national phase (Ref country code: DE)
WWE Wipo information: entry into national phase (Ref document number: 2017702390; Country of ref document: EP)
ENP Entry into the national phase (Ref document number: 2017702390; Country of ref document: EP; Effective date: 20180911)
ENP Entry into the national phase (Ref document number: 112018015558; Country of ref document: BR; Kind code of ref document: A2; Effective date: 20180730)