AU2016204699A1 - Method and apparatus for encoding videos sharing sao parameter according to color component - Google Patents

Method and apparatus for encoding videos sharing sao parameter according to color component

Info

Publication number
AU2016204699A1
Authority
AU
Australia
Prior art keywords
sao
offset
current
type
video
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Granted
Application number
AU2016204699A
Other versions
AU2016204699B2
Inventor
Alexander Alshin
Elena Alshina
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Samsung Electronics Co Ltd
Original Assignee
Samsung Electronics Co Ltd
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Samsung Electronics Co Ltd
Priority to AU2016204699A
Publication of AU2016204699A1
Application granted
Publication of AU2016204699B2
Priority to AU2017265158A
Legal status: Active
Anticipated expiration


Classifications

    • HELECTRICITY
    • H04ELECTRIC COMMUNICATION TECHNIQUE
    • H04NPICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N19/00Methods or arrangements for coding, decoding, compressing or decompressing digital video signals
    • H04N19/10Methods or arrangements for coding, decoding, compressing or decompressing digital video signals using adaptive coding
    • H04N19/102Methods or arrangements for coding, decoding, compressing or decompressing digital video signals using adaptive coding characterised by the element, parameter or selection affected or controlled by the adaptive coding
    • H04N19/13Adaptive entropy coding, e.g. adaptive variable length coding [AVLC] or context adaptive binary arithmetic coding [CABAC]
    • HELECTRICITY
    • H04ELECTRIC COMMUNICATION TECHNIQUE
    • H04NPICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N19/00Methods or arrangements for coding, decoding, compressing or decompressing digital video signals
    • H04N19/10Methods or arrangements for coding, decoding, compressing or decompressing digital video signals using adaptive coding
    • H04N19/102Methods or arrangements for coding, decoding, compressing or decompressing digital video signals using adaptive coding characterised by the element, parameter or selection affected or controlled by the adaptive coding
    • H04N19/117Filters, e.g. for pre-processing or post-processing
    • HELECTRICITY
    • H04ELECTRIC COMMUNICATION TECHNIQUE
    • H04NPICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N19/00Methods or arrangements for coding, decoding, compressing or decompressing digital video signals
    • H04N19/10Methods or arrangements for coding, decoding, compressing or decompressing digital video signals using adaptive coding
    • H04N19/134Methods or arrangements for coding, decoding, compressing or decompressing digital video signals using adaptive coding characterised by the element, parameter or criterion affecting or controlling the adaptive coding
    • H04N19/136Incoming video signal characteristics or properties
    • HELECTRICITY
    • H04ELECTRIC COMMUNICATION TECHNIQUE
    • H04NPICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N19/00Methods or arrangements for coding, decoding, compressing or decompressing digital video signals
    • H04N19/10Methods or arrangements for coding, decoding, compressing or decompressing digital video signals using adaptive coding
    • H04N19/134Methods or arrangements for coding, decoding, compressing or decompressing digital video signals using adaptive coding characterised by the element, parameter or criterion affecting or controlling the adaptive coding
    • H04N19/157Assigned coding mode, i.e. the coding mode being predefined or preselected to be further used for selection of another element or parameter
    • HELECTRICITY
    • H04ELECTRIC COMMUNICATION TECHNIQUE
    • H04NPICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N19/00Methods or arrangements for coding, decoding, compressing or decompressing digital video signals
    • H04N19/10Methods or arrangements for coding, decoding, compressing or decompressing digital video signals using adaptive coding
    • H04N19/134Methods or arrangements for coding, decoding, compressing or decompressing digital video signals using adaptive coding characterised by the element, parameter or criterion affecting or controlling the adaptive coding
    • H04N19/157Assigned coding mode, i.e. the coding mode being predefined or preselected to be further used for selection of another element or parameter
    • H04N19/159Prediction type, e.g. intra-frame, inter-frame or bidirectional frame prediction
    • HELECTRICITY
    • H04ELECTRIC COMMUNICATION TECHNIQUE
    • H04NPICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N19/00Methods or arrangements for coding, decoding, compressing or decompressing digital video signals
    • H04N19/10Methods or arrangements for coding, decoding, compressing or decompressing digital video signals using adaptive coding
    • H04N19/169Methods or arrangements for coding, decoding, compressing or decompressing digital video signals using adaptive coding characterised by the coding unit, i.e. the structural portion or semantic portion of the video signal being the object or the subject of the adaptive coding
    • H04N19/17Methods or arrangements for coding, decoding, compressing or decompressing digital video signals using adaptive coding characterised by the coding unit, i.e. the structural portion or semantic portion of the video signal being the object or the subject of the adaptive coding the unit being an image region, e.g. an object
    • H04N19/174Methods or arrangements for coding, decoding, compressing or decompressing digital video signals using adaptive coding characterised by the coding unit, i.e. the structural portion or semantic portion of the video signal being the object or the subject of the adaptive coding the unit being an image region, e.g. an object the region being a slice, e.g. a line of blocks or a group of blocks
    • HELECTRICITY
    • H04ELECTRIC COMMUNICATION TECHNIQUE
    • H04NPICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N19/00Methods or arrangements for coding, decoding, compressing or decompressing digital video signals
    • H04N19/10Methods or arrangements for coding, decoding, compressing or decompressing digital video signals using adaptive coding
    • H04N19/169Methods or arrangements for coding, decoding, compressing or decompressing digital video signals using adaptive coding characterised by the coding unit, i.e. the structural portion or semantic portion of the video signal being the object or the subject of the adaptive coding
    • H04N19/17Methods or arrangements for coding, decoding, compressing or decompressing digital video signals using adaptive coding characterised by the coding unit, i.e. the structural portion or semantic portion of the video signal being the object or the subject of the adaptive coding the unit being an image region, e.g. an object
    • H04N19/176Methods or arrangements for coding, decoding, compressing or decompressing digital video signals using adaptive coding characterised by the coding unit, i.e. the structural portion or semantic portion of the video signal being the object or the subject of the adaptive coding the unit being an image region, e.g. an object the region being a block, e.g. a macroblock
    • HELECTRICITY
    • H04ELECTRIC COMMUNICATION TECHNIQUE
    • H04NPICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N19/00Methods or arrangements for coding, decoding, compressing or decompressing digital video signals
    • H04N19/10Methods or arrangements for coding, decoding, compressing or decompressing digital video signals using adaptive coding
    • H04N19/169Methods or arrangements for coding, decoding, compressing or decompressing digital video signals using adaptive coding characterised by the coding unit, i.e. the structural portion or semantic portion of the video signal being the object or the subject of the adaptive coding
    • H04N19/186Methods or arrangements for coding, decoding, compressing or decompressing digital video signals using adaptive coding characterised by the coding unit, i.e. the structural portion or semantic portion of the video signal being the object or the subject of the adaptive coding the unit being a colour or a chrominance component
    • HELECTRICITY
    • H04ELECTRIC COMMUNICATION TECHNIQUE
    • H04NPICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N19/00Methods or arrangements for coding, decoding, compressing or decompressing digital video signals
    • H04N19/10Methods or arrangements for coding, decoding, compressing or decompressing digital video signals using adaptive coding
    • H04N19/189Methods or arrangements for coding, decoding, compressing or decompressing digital video signals using adaptive coding characterised by the adaptation method, adaptation tool or adaptation type used for the adaptive coding
    • H04N19/196Methods or arrangements for coding, decoding, compressing or decompressing digital video signals using adaptive coding characterised by the adaptation method, adaptation tool or adaptation type used for the adaptive coding being specially adapted for the computation of encoding parameters, e.g. by averaging previously computed encoding parameters
    • HELECTRICITY
    • H04ELECTRIC COMMUNICATION TECHNIQUE
    • H04NPICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N19/00Methods or arrangements for coding, decoding, compressing or decompressing digital video signals
    • H04N19/44Decoders specially adapted therefor, e.g. video decoders which are asymmetric with respect to the encoder
    • HELECTRICITY
    • H04ELECTRIC COMMUNICATION TECHNIQUE
    • H04NPICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N19/00Methods or arrangements for coding, decoding, compressing or decompressing digital video signals
    • H04N19/50Methods or arrangements for coding, decoding, compressing or decompressing digital video signals using predictive coding
    • HELECTRICITY
    • H04ELECTRIC COMMUNICATION TECHNIQUE
    • H04NPICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N19/00Methods or arrangements for coding, decoding, compressing or decompressing digital video signals
    • H04N19/80Details of filtering operations specially adapted for video compression, e.g. for pixel interpolation
    • H04N19/82Details of filtering operations specially adapted for video compression, e.g. for pixel interpolation involving filtering within a prediction loop
    • HELECTRICITY
    • H04ELECTRIC COMMUNICATION TECHNIQUE
    • H04NPICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N19/00Methods or arrangements for coding, decoding, compressing or decompressing digital video signals
    • H04N19/85Methods or arrangements for coding, decoding, compressing or decompressing digital video signals using pre-processing or post-processing specially adapted for video compression
    • H04N19/86Methods or arrangements for coding, decoding, compressing or decompressing digital video signals using pre-processing or post-processing specially adapted for video compression involving reduction of coding artifacts, e.g. of blockiness
    • HELECTRICITY
    • H04ELECTRIC COMMUNICATION TECHNIQUE
    • H04NPICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N19/00Methods or arrangements for coding, decoding, compressing or decompressing digital video signals
    • H04N19/90Methods or arrangements for coding, decoding, compressing or decompressing digital video signals using coding techniques not provided for in groups H04N19/10-H04N19/85, e.g. fractals
    • H04N19/91Entropy coding, e.g. variable length coding [VLC] or arithmetic coding
    • HELECTRICITY
    • H04ELECTRIC COMMUNICATION TECHNIQUE
    • H04NPICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N19/00Methods or arrangements for coding, decoding, compressing or decompressing digital video signals
    • H04N19/46Embedding additional information in the video signal during the compression process

Landscapes

  • Engineering & Computer Science (AREA)
  • Multimedia (AREA)
  • Signal Processing (AREA)
  • Computing Systems (AREA)
  • Theoretical Computer Science (AREA)
  • Compression Or Coding Systems Of Tv Signals (AREA)
  • Color Television Systems (AREA)

Abstract

The present invention discloses a method and an apparatus for encoding a video and a method and an apparatus for decoding a video, which generate a reconstructed image whose error relative to an original image is minimized. Disclosed is the method for decoding the video including sample adaptive offset (SAO) adjustment, comprising: obtaining a slice SAO parameter of a current slice from a slice header of a received bitstream; obtaining, from among the slice SAO parameters, luma SAO usage information for a luma component of the current slice and chroma SAO usage information for chroma components thereof; determining whether to perform an SAO adjustment on the luma component of the current slice on the basis of the luma SAO usage information; and equally determining whether to perform the SAO adjustment on a first chroma component and a second chroma component of the current slice on the basis of the chroma SAO usage information.

Description

METHOD AND APPARATUS FOR ENCODING VIDEOS SHARING SAO
PARAMETER ACCORDING TO COLOR COMPONENT
The present application is a divisional application from Australian Patent Application No. 2013275098, the entire disclosure of which is incorporated herein by reference.
Technical Field
The one or more embodiments relate to video encoding and decoding for minimizing an error between an original image and a reconstructed image.
Background Art
As hardware for reproducing and storing high resolution or high quality video content is being developed and supplied, a need for a video codec for effectively encoding or decoding the high resolution or high quality video content is increasing. According to a conventional video codec, a video is encoded according to a limited encoding method based on a macroblock having a predetermined size.
Image data of the space domain is transformed into coefficients of the frequency domain via frequency transformation. According to a video codec, an image is split into blocks having a predetermined size, discrete cosine transformation (DCT) is performed on each block, and frequency coefficients are encoded in block units, for rapid calculation of frequency transformation. Compared with image data of the space domain, coefficients of the frequency domain are easily compressed. In particular, since an image pixel value of the space domain is expressed according to a prediction error via inter prediction or intra prediction of a video codec, when frequency transformation is performed on the prediction error, a large amount of data may be transformed to 0. According to a video codec, an amount of data may be reduced by replacing data that is consecutively and repeatedly generated with small-sized data.
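The energy-compaction behavior described above can be illustrated with a small sketch. The 8-point orthonormal DCT-II below is the standard textbook construction, not code from the patent; the sample values are invented for illustration:

```python
import numpy as np

def dct2_matrix(n):
    # Orthonormal DCT-II basis matrix (standard definition, for illustration)
    k = np.arange(n).reshape(-1, 1)   # frequency index (rows)
    i = np.arange(n).reshape(1, -1)   # sample index (columns)
    m = np.cos(np.pi * (2 * i + 1) * k / (2 * n))
    m[0, :] *= np.sqrt(1.0 / n)
    m[1:, :] *= np.sqrt(2.0 / n)
    return m

# A smooth 8-sample row of prediction error: small, slowly varying values,
# as typically produced by inter or intra prediction
residual = np.array([2.0, 2.1, 2.2, 2.2, 2.3, 2.3, 2.4, 2.4])
coeffs = dct2_matrix(8) @ residual

# Almost all of the energy lands in the two lowest-frequency coefficients,
# so the remaining coefficients quantize to (near) zero and compress well.
low_energy = np.sum(coeffs[:2] ** 2) / np.sum(coeffs ** 2)
```

Because the transform is orthonormal, total energy is preserved while being concentrated into a few coefficients, which is what makes the frequency domain easier to compress than the space domain.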
Disclosure of the Invention
Technical Problem
The one or more embodiments provide a video encoding method and apparatus, and a video decoding method and apparatus for generating a reconstructed image having a minimized error between an original image and the reconstructed image.
Technical Solution
According to the first aspect, the present invention provides a video decoding apparatus comprising: a decoder which is configured for obtaining slice offset information indicating whether to apply an offset according to an offset type for a current slice, and compensating for samples of the current block by using an offset parameter of the current block, wherein: when the slice offset information indicates that the offset is applied, the decoder is configured for obtaining left offset merging information of a current block among blocks included in the current slice by performing entropy decoding on a bitstream using a context mode, when the left offset merging information indicates that the offset parameter of the current block is determined according to an offset parameter of a left block, the decoder is configured for determining the offset parameter of the current block using the offset parameter of the left block, when the left offset merging information indicates that the offset parameter of the current block is not determined according to the offset parameter of the left block, the decoder is configured for obtaining upper offset merging information of the current block by performing entropy decoding on the bitstream using the context mode, when the upper offset merging information indicates that the offset parameter of the current block is determined according to an offset parameter of an upper block, the decoder is configured for determining the offset parameter of the current block using the offset parameter of the upper block, when the upper offset merging information indicates that the offset parameter of the current block is not determined according to an offset parameter of the upper block, the decoder is configured for obtaining the offset parameter of the current block from the bitstream, the offset parameter comprises at least one of offset type information and offset values, the offset type information indicates the offset type or whether to apply an offset to the current block, and the offset type is one of a band offset type and an edge offset type, when the offset type information indicates the band offset type, absolute values of the offset values are obtained by performing entropy decoding in a bypass mode on the bitstream, and, when the absolute values of the offset values are not zero, signs of the offset values are further obtained from the bitstream.
There may be provided a video decoding apparatus comprising: a storage which is configured for storing reconstructed pixels of a current block; and a processor which is configured for compensating for sample values of the reconstructed pixels of the current block by using an offset parameter of the current block, wherein the processor is configured for obtaining slice offset information indicating whether to apply an offset for a current slice, wherein, when the slice offset information indicates that the offset is applied, the processor is configured for performing entropy decoding on a bitstream using a predetermined context mode and obtaining left offset merging information of a current block among blocks included in the current slice, wherein, when the left offset merging information indicates that an offset parameter of the current block is not determined according to an offset parameter of a left block, the processor is configured for performing entropy decoding on the bitstream using the predetermined context mode, and obtaining upper offset merging information of the current block; wherein, when the left offset merging information indicates that the offset parameter of the current block is determined according to the offset parameter of the left block, the processor is configured for determining the offset parameter of the current block using the offset parameter of the left block, wherein, when the left offset merging information indicates that the offset parameter of the current block is not determined according to the offset parameter of the left block, and when the upper offset merging information indicates that the offset parameter of the current block is determined according to an offset parameter of an upper block, the processor is configured for determining the offset parameter of the current block using the offset parameter of the upper block, wherein, when the upper offset merging information indicates that the offset parameter of the current block is not determined according to the offset parameter of the upper block, the processor is configured for obtaining the offset parameter of the current block from the bitstream; wherein the offset parameter comprises at least one of offset type information and an offset value, and wherein the offset type information of the current block indicates the offset type or whether to apply an offset, and the offset type is one of a band offset type and an edge offset type.
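The left/upper merge cascade that both aspects describe can be sketched as follows. All names here are illustrative stand-ins, not identifiers from the patent or from any codec source; the callbacks stand in for the entropy decoder:

```python
def resolve_sao_params(left_merge, read_up_merge, left_params, up_params, parse_own):
    # Hypothetical sketch of the decision order described in the claims.
    # 1. The left offset merging flag is examined first.
    if left_merge:
        return left_params          # reuse the left block's offset parameters
    # 2. Only when left merging is off is the upper merging flag decoded.
    if read_up_merge():
        return up_params            # reuse the upper block's offset parameters
    # 3. Otherwise the block's own offset parameters are parsed from the bitstream.
    return parse_own()
```

Note the asymmetry: the upper merge flag is never read when left merging succeeds, which is exactly the conditional parsing order the claim text spells out.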
Advantageous Effects
A sample adaptive offset (SAO) adjustment method for each color component according to various embodiments may share various SAO parameters relating to a SAO operation of a first chroma component and a second chroma component of a current sample, thereby simultaneously performing the SAO adjustment on the first chroma component and the second chroma component, and preventing parallel processing latency in advance. Furthermore, compared to separately sending SAO parameters regarding the first chroma component and the second chroma component, a total number of transmission bits of the SAO parameters may be reduced by half.
Brief Description of the Drawings
FIGS. 1A and 1B, respectively, are a block diagram of a video encoding apparatus and a flowchart of a sample adaptive offset (SAO) adjustment method performed by the video encoding apparatus, according to one or more embodiments; FIGS. 2A and 2B, respectively, are a block diagram of a video decoding apparatus and a flowchart of a SAO operation performed by the video decoding apparatus, according to one or more embodiments; FIG. 3 is a block diagram of a video decoding apparatus according to another embodiment; FIG. 4 is a table showing edge classes of edge types, according to one or more embodiments; FIGS. 5A and 5B are a table and a graph showing categories of edge types, according to one or more embodiments; FIGS. 6A through 6C show relationships between first and second chroma components; FIG. 7A is a diagram showing adjacent largest coding units (LCUs) referred to in order to merge SAO parameters, according to one or more embodiments; FIG. 7B shows syntax structures of a slice header and slice data, according to one or more embodiments; FIGS. 7C and 7D show syntax structures of SAO parameters with respect to LCUs, according to one or more embodiments; FIG. 7E shows a syntax structure of context information for context-adaptive binary arithmetic coding (CABAC) encoding of SAO parameters, according to one or more embodiments; FIG. 7F shows a syntax structure of SAO parameters, according to one or more embodiments; FIG. 8 is a block diagram of a video encoding apparatus based on coding units according to a tree structure, according to one or more embodiments; FIG. 9 is a block diagram of a video decoding apparatus based on coding units according to a tree structure, according to one or more embodiments; FIG. 10 is a diagram for describing a concept of coding units, according to one or more embodiments; FIG. 11 is a block diagram of an image encoder based on coding units, according to one or more embodiments; FIG. 12 is a block diagram of an image decoder based on coding units, according to one or more embodiments; FIG. 13 is a diagram illustrating deeper coding units according to depths, and partitions, according to one or more embodiments; FIG. 14 is a diagram for describing a relationship between a coding unit and transformation units, according to one or more embodiments; FIG. 15 is a diagram for describing encoding information of coding units corresponding to a coded depth, according to one or more embodiments; FIG. 16 is a diagram of deeper coding units according to depths, according to one or more embodiments; FIGS. 17 through 19 are diagrams for describing a relationship between coding units, prediction units, and transformation units, according to one or more embodiments; FIG. 20 is a diagram for describing a relationship between a coding unit, a prediction unit, and a transformation unit, according to encoding mode information of Table 1; FIG. 21 is a diagram of a physical structure of a disc in which a program is stored, according to one or more embodiments; FIG. 22 is a diagram of a disc drive for recording and reading a program by using a disc; FIG. 23 is a diagram of an overall structure of a content supply system for providing a content distribution service; FIGS. 24 and 25 are diagrams respectively of an external structure and an internal structure of a mobile phone to which a video encoding method and a video decoding method are applied, according to one or more embodiments; FIG. 26 is a diagram of a digital broadcast system to which a communication system is applied, according to one or more embodiments; and FIG. 27 is a diagram illustrating a network structure of a cloud computing system using a video encoding apparatus and a video decoding apparatus, according to one or more embodiments.
Best mode for carrying out the Invention
According to a first aspect, the present invention provides a video decoding method comprising: obtaining slice offset information indicating whether to apply an offset according to an offset type for a current slice; when the slice offset information indicates that the offset is applied, obtaining left offset merging information of a current block among blocks included in the current slice by performing entropy decoding on a bitstream using a context mode; when the left offset merging information indicates that an offset parameter of the current block is determined according to an offset parameter of a left block, determining the offset parameter of the current block using the offset parameter of the left block; when the left offset merging information indicates that the offset parameter of the current block is not determined according to the offset parameter of a left block, obtaining upper offset merging information of the current block by performing entropy decoding on the bitstream using the context mode; when the upper offset merging information indicates that the offset parameter of the current block is determined according to an offset parameter of an upper block, determining the offset parameter of the current block using the offset parameter of the upper block; when the upper offset merging information indicates that the offset parameter of the current block is not determined according to an offset parameter of an upper block, obtaining the offset parameter of the current block from the bitstream; and compensating for samples of the current block by using the offset parameter, wherein the offset parameter comprises at least one of offset type information and offset values, the offset type information indicates the offset type or whether to apply an offset to the current block, and the offset type is one of a band offset type and an edge offset type, wherein the obtaining the offset parameter of the current block from the bitstream comprises: when the 
offset type information indicates the band offset type, parsing absolute values of the offset values by performing entropy decoding in a bypass mode on the bitstream, and, when the absolute values of the offset values are not zero, further parsing signs of the offset values from the bitstream.
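The sign-conditional parsing rule for band offsets described above (the sign is read only when the bypass-decoded magnitude is nonzero) can be sketched like this, with hypothetical reader callbacks standing in for the entropy decoder:

```python
def parse_band_offsets(read_abs_bypass, read_sign, count=4):
    # read_abs_bypass: returns the next bypass-decoded offset magnitude
    # read_sign: returns True when the next sign bit indicates "negative"
    # (both are illustrative stand-ins for real bitstream parsing)
    offsets = []
    for _ in range(count):
        magnitude = read_abs_bypass()
        if magnitude != 0 and read_sign():
            magnitude = -magnitude   # a sign bit is present only for nonzero values
        offsets.append(magnitude)
    return offsets
```

Skipping the sign bit for zero-valued offsets saves bits, since a zero offset has no meaningful sign to transmit.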
According to an aspect of one or more embodiments, there may be provided a sample adaptive offset (SAO) adjustment method, the method including: obtaining slice SAO parameters with respect to a current slice from a slice header of a received bitstream; obtaining luma SAO use information for a luma component of the current slice and chroma SAO use information for chroma components thereof from among the slice SAO parameters; determining whether to perform a SAO operation on the luma component of the current slice based on the obtained luma SAO use information; and equally determining whether to perform the SAO adjustment on a first chroma component and a second chroma component of the current slice based on the obtained chroma SAO use information.
The method may further include: obtaining SAO parameters of largest coding units (LCUs) with respect to a current LCU from among LCUs of the current slice; obtaining left SAO merging information from among the SAO parameters of the LCUs; and determining whether to predict SAO parameters for a luma component and first and second chroma components of the current LCU by using a luma component and first and second chroma components of a left LCU neighboring the current LCU based on the left SAO merging information.
The determining of whether to predict the SAO parameters may include: if it is determined that the SAO parameters of the current LCU are not predicted by using SAO parameters of a left LCU based on the left SAO merging information, obtaining upper SAO merging information from among the SAO parameters of the LCUs; and determining whether to predict the SAO parameters for the luma component and the first and second chroma components of the current LCU by using the luma component and the first and second chroma components of the upper LCU neighboring the current LCU based on the upper SAO merging information.
The method may further include: obtaining luma SAO type information for a luma component of the current LCU and chroma SAO type information for chroma components thereof from among the SAO parameters of the LCUs; determining whether to perform a SAO operation on the luma component of the current LCU based on the obtained luma SAO type information; and equally determining whether to perform the SAO adjustment on a first chroma component and a second chroma component of the current LCU based on the obtained chroma SAO type information.
The method may further include: determining which one of an edge SAO adjustment and a band SAO adjustment is performed on the luma component of the current LCU based on the obtained luma SAO type information; and determining which one of the edge SAO adjustment and the band SAO adjustment is performed on the first chroma component and the second chroma component of the current LCU based on the obtained chroma SAO type information.
The method may further include: determining the same edge direction on the first chroma component and the second chroma component of the current LCU based on the obtained SAO parameters.
The obtaining of the luma SAO type information and the chroma SAO type information may include: performing context-adaptive binary arithmetic coding (CABAC)-decoding on a first context bin of the luma SAO type information, and obtaining information indicating whether to perform the SAO adjustment on the luma component of the current LCU; performing CABAC-decoding on remaining context bins of the luma SAO type information in a bypass mode, and obtaining information indicating which one of the edge SAO adjustment and the band SAO adjustment is performed on the luma component of the current LCU; performing CABAC-decoding on a first context bin of the chroma SAO type information, and obtaining information indicating whether to perform the SAO adjustment on the chroma components of the current LCU; and performing CABAC-decoding on remaining context bins of the chroma SAO type information in the bypass mode, and obtaining information indicating which one of the edge SAO adjustment and the band SAO adjustment is performed on the chroma components of the current LCU.
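The split described above, a context-coded first bin followed by bypass-coded remaining bins, can be sketched for one component's SAO type information. The exact binarization (one remaining bin, with 0 mapped to band and 1 to edge) is an assumption for illustration; the callbacks stand in for the CABAC engine:

```python
def decode_sao_type(read_context_bin, read_bypass_bin):
    # First bin is context-coded: it only signals whether SAO is applied
    # to this component at all.
    if read_context_bin() == 0:
        return "off"
    # The remaining bin is bypass-coded (no context modeling) and selects
    # the adjustment type. The 0 -> band / 1 -> edge mapping is assumed.
    return "band" if read_bypass_bin() == 0 else "edge"
```

Restricting context modeling to the first bin keeps the probability-adaptive (slow) part of CABAC small, while the cheap bypass path handles the rest, which is the parsing-throughput motivation behind this design.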
The method may further include: performing CABAC-decoding by using the same context mode for the left SAO merging information and upper SAO merging information with respect to the luma component and the chroma components of the current LCU.
The method may further include: performing CABAC-decoding in a bypass mode to obtain magnitude information of an offset from among the SAO parameters of the LCUs, wherein the obtained magnitude information of the offset indicates offset magnitude within a range based on a bit depth of a video, and wherein, if the bit depth is 8 bits, the offset magnitude is equal to or greater than 0 and equal to or smaller than 7, and, if the bit depth is 10 bits, the offset magnitude is equal to or greater than 0 and equal to or smaller than 31.
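The bit-depth-dependent offset range described above can be sketched as follows. This is an illustrative Python sketch; the closed-form rule is an assumption chosen so that it reproduces the 0 to 7 and 0 to 31 ranges stated in this description, not a formula recited here.

```python
def max_sao_offset(bit_depth: int) -> int:
    """Upper bound of the SAO offset magnitude for a given bit depth.

    Assumed rule (not recited above): max = 2^(min(bitDepth, 10) - 5) - 1,
    which reproduces the stated ranges 0..7 (8-bit) and 0..31 (10-bit).
    """
    return (1 << (min(bit_depth, 10) - 5)) - 1
```

For 8-bit video this yields 7, and for 10-bit video 31, matching the ranges given above.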
The method may further include: if it is determined that the band SAO adjustment is performed on the current LCU, performing CABAC decoding on bits of invariable bit lengths in a bypass mode so as to obtain information regarding a band left start position from at least one piece of the obtained luma SAO type information and the obtained chroma SAO type information.
The method may further include: if it is determined that the band SAO adjustment is performed on the current LCU, obtaining an offset value for the SAO adjustment from the SAO parameters of the LCUs; and, if the obtained offset value is not 0, further obtaining sign information of the offset value from the SAO parameters of the LCUs.
The method may further include: obtaining an offset value for the edge type SAO adjustment from the SAO parameters of the LCUs; and determining a sign of the offset value based on the determined edge direction.
According to another aspect of one or more embodiments, there may be provided a SAO adjustment method, the method including: determining whether to perform a SAO operation on a luma component of a current slice; equally determining whether to perform the SAO adjustment on a first chroma component and a second chroma component of the current slice; generating slice SAO parameters with respect to the current slice including luma SAO use information indicating whether to perform the SAO adjustment on the luma component of the current slice and chroma SAO use information indicating whether to perform the SAO adjustment on the first chroma component and the second chroma component; and outputting a slice header including the slice SAO parameters.
The method may further include: determining whether to predict SAO parameters for a luma component and first and second chroma components of a current LCU by using SAO parameters with respect to a luma component and first and second chroma components of a left LCU neighboring the current LCU, from among LCUs of the current slice; generating left SAO merging information for the current LCU based on the determination; determining whether to predict the SAO parameters for the luma component and the first and second chroma components of the current LCU by using SAO parameters with respect to a luma component and first and second chroma components of an upper LCU neighboring the current LCU; generating upper SAO merging information for the current LCU based on the determination; and generating SAO parameters of LCUs with respect to the current LCU including at least one piece of the left SAO merging information and the upper SAO merging information.
The method may further include: determining whether to perform a SAO operation on a luma component of the current LCU; equally determining whether to perform the SAO adjustment on a first chroma component and a second chroma component of the current LCU; and generating SAO parameters of the LCUs with respect to the current LCU including luma SAO type information indicating whether to perform the SAO adjustment on the luma component of the current LCU and chroma SAO type information indicating whether to perform the SAO adjustment on the first chroma component and the second chroma component.

The method may further include: determining which one of an edge SAO adjustment and a band SAO adjustment is performed on the luma component of the current LCU; determining which one of the edge SAO adjustment and the band SAO adjustment is performed on the first chroma component and the second chroma component of the current LCU; and generating luma SAO type information indicating which one of the edge SAO adjustment and the band SAO adjustment is performed on the luma component and chroma SAO type information indicating which one of the edge SAO adjustment and the band SAO adjustment is performed on the first chroma component and the second chroma component.
The method may further include: generating information regarding the same edge direction of the first chroma component and the second chroma component of the current LCU.
The generating of the luma SAO type information and the chroma SAO type information may include: performing CABAC-encoding on a first context bin of information indicating whether to perform the SAO operation on the luma component of the current LCU, and performing CABAC-encoding, in a bypass mode, on remaining context bins of information indicating which one of the edge SAO adjustment and the band SAO adjustment is performed on the luma component of the current LCU.
The generating of the SAO parameters of the LCUs may include: performing CABAC-encoding by using the same context mode for the left SAO merging information and upper SAO merging information from among the SAO parameters of the LCUs with respect to the current LCU.
The method may further include: performing CABAC-encoding in the bypass mode on magnitude information of an offset from among the SAO parameters of the LCUs.
The method may further include: if it is determined that the band SAO adjustment is performed on the current LCU, performing CABAC-encoding on bits of invariable bit lengths of information regarding a band left start position from at least one piece of the obtained luma SAO type information and the obtained chroma SAO type information in the bypass mode.
The generating of the SAO parameters of the LCUs may include: if it is determined that the band SAO adjustment is performed on the current LCU, determining an offset value for the band SAO adjustment; and generating the SAO parameters of the LCUs further including the determined offset value, wherein the generating of the SAO parameters includes: if the obtained offset value is not 0, determining a sign of the offset value; and generating the SAO parameters of the LCUs further including sign information of the offset value.
According to another aspect of one or more embodiments, there may be provided a video decoding apparatus, the apparatus including: a SAO parameter obtainer for obtaining slice SAO parameters with respect to a current slice from a slice header of a received bitstream, and obtaining luma SAO use information for a luma component of the current slice and chroma SAO use information for chroma components thereof from among the slice SAO parameters; a SAO determiner for determining whether to perform a SAO operation on the luma component of the current slice based on the obtained luma SAO use information, and equally determining whether to perform the SAO adjustment on a first chroma component and a second chroma component of the current slice based on the obtained chroma SAO use information; and a SAO adjuster for performing the SAO adjustment on the luma component and the first and second chroma components of the current slice reconstructed by performing decoding on encoded symbols of the current slice obtained from the received bitstream based on a determination of the SAO determiner.
According to another aspect of one or more embodiments, there may be provided a video encoding apparatus, the apparatus including: an encoder for performing prediction, transformation, and quantization on a current slice of a video and performing dequantization, inverse transformation, and motion compensation on quantized transformation coefficients; a SAO determiner for determining whether to perform a SAO operation on a luma component of the current slice, and equally determining whether to perform the SAO adjustment on a first chroma component and a second chroma component of the current slice; and a SAO parameter encoder for generating slice SAO parameters with respect to the current slice, the slice SAO parameters including luma SAO use information indicating whether to perform the SAO adjustment on the luma component and chroma SAO use information indicating whether to perform the SAO adjustment on the first chroma component and the second chroma component based on a determination of the SAO determiner, and generating a slice header including the slice SAO parameters.
According to another aspect of one or more embodiments, there may be provided a non-transitory computer-readable recording medium having recorded thereon a computer program for executing the SAO adjustment method.
Mode for Invention
Hereinafter, video encoding operations and video decoding operations using sample adaptive offset (SAO) operations based on pixel classification, according to one or more embodiments, will be described with reference to FIGS. 1 through 7F. Also, a SAO operation based on pixel classification in video encoding operations and video decoding operations based on coding units having a tree structure, according to one or more embodiments, will be described with reference to FIGS. 8 through 20. Hereinafter, an 'image' may denote a still image or a moving image of a video, or a video itself.
Video encoding operations and video decoding operations using SAO adjustment based on pixel classification, according to one or more embodiments, will now be described with reference to FIGS. 1 through 7F. A video encoding apparatus 10 and a video decoding apparatus 20 that will be described below with reference to FIGS. 1A, 1B, 2A, and 2B perform a SAO operation in order to minimize an error between original pixels and reconstructed pixels. By performing the SAO operation, the video encoding apparatus 10 classifies pixels of each image block into preset pixel groups, allocates each pixel to a corresponding pixel group, and encodes an offset value indicating an average value of errors between the original pixels and the reconstructed pixels included in the same pixel group.

Samples are signaled between the video encoding apparatus 10 and the video decoding apparatus 20. That is, the video encoding apparatus 10 may encode and transmit samples in the form of a bitstream, and the video decoding apparatus 20 may parse and reconstruct the samples from the received bitstream. In order to minimize an error between original pixels and reconstructed pixels by adjusting pixel values of the reconstructed pixels by an offset determined according to pixel classification, the video encoding apparatus 10 and the video decoding apparatus 20 signal SAO parameters for the SAO adjustment. Between the video encoding apparatus 10 and the video decoding apparatus 20, offset values are encoded and transceived as the SAO parameters such that the offset values are decoded from the SAO parameters.
The video decoding apparatus 20 according to an embodiment may generate a reconstructed image having a minimized error between an original image and the reconstructed image by decoding a received bitstream, generating reconstructed pixels of each of image blocks, reconstructing offset values from the bitstream, and adjusting the reconstructed pixels by the offset values.
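The decoder-side adjustment described above, adding a signalled per-category offset to each reconstructed pixel, can be sketched as follows; the function name and data layout are illustrative assumptions, not the apparatus itself:

```python
def apply_sao_offsets(reconstructed, categories, offsets):
    """Adjust each reconstructed pixel by the offset of its category.

    `categories[i]` selects which offset applies to pixel i; category 0
    means no adjustment. All names here are illustrative.
    """
    return [
        pixel + offsets.get(cat, 0) if cat != 0 else pixel
        for pixel, cat in zip(reconstructed, categories)
    ]
```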
An operation of the video encoding apparatus 10 that performs a SAO operation will now be described with reference to FIGS. 1A and 1B. An operation of the video decoding apparatus 20 that performs the SAO adjustment will now be described with reference to FIGS. 2A and 2B.
FIGS. 1A and 1B, respectively, are a block diagram of the video encoding apparatus 10 and a flowchart of a SAO operation performed by the video encoding apparatus 10, according to one or more embodiments.
The video encoding apparatus 10 includes an encoder 12, a SAO determiner 14, and a SAO parameter encoder 16.
The video encoding apparatus 10 receives an input of images such as slices of a video, splits each image into blocks, and encodes each block. A block may have a square shape, a rectangular shape, or an arbitrary geometrical shape, and is not limited to a data unit having a predetermined size. The block according to one or more embodiments may be a largest coding unit (LCU) or a CU among coding units according to a tree structure. Video encoding and decoding methods based on coding units according to a tree structure will be described below with reference to FIGS. 8 through 20.
The video encoding apparatus 10 may split each input image into LCUs, and may output resultant data generated by performing prediction, transformation, and entropy encoding on samples of each LCU, as a bitstream. Samples of an LCU may be pixel value data of pixels included in the LCU.
The encoder 12 may individually encode LCUs of a picture. The encoder 12 may encode a current LCU based on coding units split from the current LCU and having a tree structure.
In order to encode the current LCU, the encoder 12 may encode samples by performing intra prediction, inter prediction, transformation, and quantization on each of coding units included in the current LCU and having a tree structure.
The encoder 12 may reconstruct the encoded samples included in the current LCU by performing dequantization, inverse transformation, and inter prediction or motion compensation on each of the coding units having a tree structure so as to decode the coding units.
In order to minimize an error between original pixels before the current LCU is encoded and reconstructed pixels after the current LCU is decoded, the video encoding apparatus 10 may determine offset values indicating difference values between the original pixels and the reconstructed pixels.
The encoder 12 may perform prediction, transformation, and quantization on a current slice of the video and perform dequantization, inverse transformation, and motion compensation on quantized transformation coefficients. The encoder 12 may firstly perform prediction, transformation, and quantization on each of coding units of the current slice of the video. In order to generate a reference image for inter prediction, the encoder 12 may perform dequantization, inverse transformation, and motion compensation on the quantized transformation coefficients to generate a reconstructed image. A reconstructed image of a previous image may be referred to for inter prediction of a next image.
The SAO determiner 14 may perform SAO operations for each color component. For example, with respect to a YCrCb color image, the SAO operations may be performed on a luma component (a Y component) and first and second chroma components (Cr and Cb components).
The SAO determiner 14 may determine whether to perform the SAO operation on a luma component of the current slice. The SAO determiner 14 may equally determine whether to perform the SAO operation on first and second chroma components of the current slice. That is, if the SAO operation is performed on the first chroma color component, the SAO operation may be performed on the second chroma component, and, if the SAO operation is not performed on the first chroma color component, the SAO operation may not be performed on the second chroma component.
The SAO parameter encoder 16 may generate a slice SAO parameter with respect to the current slice so as to include the slice SAO parameter in a slice header of the current slice.
The SAO parameter encoder 16 may generate luma SAO use information indicating whether to perform the SAO operation on the luma component according to a determination of the SAO determiner 14. The SAO parameter encoder 16 may generate chroma SAO use information indicating whether to perform the SAO operation on the first and second chroma components according to a determination of the SAO determiner 14.
The SAO parameter encoder 16 may include the luma SAO use information and the chroma SAO use information in the slice SAO parameter.
The SAO determiner 14 may determine the offset values with respect to LCUs. SAO parameters including the offset values, a SAO type, and a SAO class may also be determined with respect to LCUs.
The SAO determiner 14 may determine the SAO type according to a pixel value classification method of the current LCU. The SAO type according to embodiments may be determined as an edge type or a band type. According to a pixel value classification method of a current block, it may be determined whether to classify pixels of the current block according to the edge type or the band type. If the SAO type is the edge type, according to a direction and a shape of edges formed between the reconstructed pixels of the current LCU and their adjacent pixels, an offset between the reconstructed pixels and the original pixels may be determined.
If the SAO type is the band type, from among a plurality of bands obtained by dividing a total range of pixel values of the reconstructed pixels of the current LCU, an offset between the reconstructed pixels and the original pixels included in each band may be determined. The bands may be obtained by uniformly or non-uniformly dividing the total range of the pixel values.
Accordingly, the SAO determiner 14 may determine the SAO type of the current LCU, which indicates the edge type or the band type, based on spatial characteristics of pixel values of the current LCU.
The SAO determiner 14 may determine a SAO class of each of the reconstructed pixels according to the SAO type of the current LCU. The SAO class may be determined as an edge class or a band class.
With respect to the edge type, the edge class may indicate a direction of edges formed between the reconstructed pixels and their adjacent pixels. The edge class may indicate an edge direction of 0°, 90°, 45°, or 135°.
If the SAO type is the edge type, the SAO determiner 14 may determine the edge class of each of the reconstructed pixels of the current LCU.
With respect to the band type, from among a plurality of bands that are a predetermined number of continuous pixel value intervals obtained by dividing a total range of pixel values of the current LCU, the band class may indicate positions of the bands to which pixel values of the reconstructed pixels belong.
For example, with respect to a sample having a pixel value of 8 bits, a total range of the pixel value is from 0 to 255 and the pixel value may be classified into a total of 32 bands. In this case, from among the total of 32 bands, a predetermined number of bands to which pixel values of the reconstructed pixels belong may be determined. The band class may indicate a start position (a left start position) of the predetermined number of continuous bands by using one of band indices from 0 to 31.
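The uniform 32-band split described above can be sketched as follows; this is an illustrative Python sketch and the helper name is an assumption:

```python
def band_index(pixel: int, bit_depth: int = 8, num_bands: int = 32) -> int:
    """Map a pixel value to one of `num_bands` uniform bands.

    For 8-bit samples and 32 bands each band spans 8 consecutive
    pixel values, so indices run from 0 to 31 as described above.
    """
    band_width = (1 << bit_depth) // num_bands
    return pixel // band_width
```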
With respect to the edge type, the reconstructed pixels of the current LCU may be classified into a predetermined number of categories according to the shape of edges formed between the reconstructed pixels and their adjacent pixels. For example, according to four edge shapes such as a local valley of a concave edge, a curved corner of a concave edge, a curved corner of a convex edge, and a local peak of a convex edge, the reconstructed pixels may be classified into four categories. According to an edge shape of each of the reconstructed pixels of the current LCU, one of the four categories may be determined.

With respect to the band type, according to positions of bands to which pixel values of the reconstructed pixels of the current LCU belong, the reconstructed pixels may be classified into a predetermined number of categories. For example, according to band indices of four continuous bands from a start band position, i.e., a start position of a leftmost band, indicated by the band class, the reconstructed pixels may be classified into four categories. According to one of the four bands to which each of the reconstructed pixels of the current LCU belongs, one of the four categories may be determined.
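The four-category edge classification described above (local valley, concave corner, convex corner, local peak) can be sketched as follows; the direction table and the category numbering are illustrative assumptions consistent with the shapes listed:

```python
def sign(x: int) -> int:
    """Return -1, 0, or 1 according to the sign of x."""
    return (x > 0) - (x < 0)

# Illustrative neighbour displacements (row, col) for the four edge
# directions mentioned above: 0, 90, 45, and 135 degrees.
EDGE_DIRS = {0: (0, -1), 90: (-1, 0), 45: (-1, -1), 135: (-1, 1)}

def edge_category(center: int, left: int, right: int) -> int:
    """Classify a pixel by the edge shape it forms with its two
    neighbours along the chosen direction: 1 = local valley,
    2 = concave corner, 3 = convex corner, 4 = local peak,
    0 = none of the four shapes (no offset applied)."""
    s = sign(center - left) + sign(center - right)
    return {-2: 1, -1: 2, 1: 3, 2: 4}.get(s, 0)
```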
The SAO determiner 14 may determine a category of each of the reconstructed pixels. With respect to the reconstructed pixels of the current LCU, which belong to the same category, the SAO determiner 14 may determine offset values by using difference values between the reconstructed pixels and the original pixels. In each category, an average of the difference values between the reconstructed pixels and the original pixels, i.e., an average error of the reconstructed pixels, may be determined as an offset value corresponding to the current category. The SAO determiner 14 may determine an offset value of each category and may determine offset values of all categories as the offset values of the current LCU.
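Determining each category's offset as the average error between original and reconstructed pixels, as described above, can be sketched like this; rounding to the nearest integer is an illustrative assumption:

```python
from collections import defaultdict

def category_offsets(original, reconstructed, categories):
    """Offset per category = average of (original - reconstructed)
    over the pixels assigned to that category; category 0 receives
    no offset. All names here are illustrative."""
    sums = defaultdict(int)
    counts = defaultdict(int)
    for orig, rec, cat in zip(original, reconstructed, categories):
        if cat != 0:
            sums[cat] += orig - rec
            counts[cat] += 1
    return {cat: round(sums[cat] / counts[cat]) for cat in sums}
```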
For example, if the SAO type of the current LCU is the edge type and the reconstructed pixels are classified into four categories according to edge shapes, or if the SAO type of the current LCU is the band type and the reconstructed pixels are classified into four categories according to indices of four continuous bands, the SAO determiner 14 may determine four offset values by determining an average error between the reconstructed pixels and the original pixels, which belong to each of the four categories.
Each of the offset values may be greater than or equal to a preset minimum value and may be less than or equal to a preset maximum value.
The SAO parameter encoder 16 may encode and output SAO parameters including the SAO type, the SAO class, and the SAO values of the current LCU, which are determined by the SAO determiner 14. SAO parameters of each block may include a SAO type and SAO values of the block. As the SAO type, an off type, the edge type, or the band type may be output. If the SAO type is the off type, it may be indicated that the SAO operation is not applied to the current LCU. In this case, other SAO parameters of the current LCU do not need to be encoded. If the SAO type is the edge type, the SAO parameters may include offset values individually corresponding to edge classes. Also, if the SAO type is the band type, the SAO parameters may include offset values individually corresponding to bands. That is, the SAO parameter encoder 16 may encode SAO parameters of each block. A process of outputting the SAO parameters will now be described in detail with reference to a flowchart of the SAO operation of FIG. 1B below.
The encoder 12 may encode a current LCU among a plurality of LCUs of the current slice based on coding units having a tree structure.
In operation 11, the SAO parameter determiner 14 may determine whether to perform the SAO operation on the luma component of the current slice. In operation 13, the SAO parameter determiner 14 may equally determine whether to perform the SAO operation on first and second chroma components of the current slice. In operation 15, the SAO parameter determiner 14 may generate the luma SAO use information according to a determination in operation 11, and may generate the chroma SAO use information according to a determination in operation 13. The SAO parameter determiner 14 may generate the slice SAO parameter including the luma SAO use information and the chroma SAO use information regarding the current slice. The SAO parameter determiner 14 may output the slice header generated in operation 15. The SAO parameter determiner 14 may determine a first SAO parameter of the current LCU. The first SAO parameter may include a SAO type indicating whether a pixel value classification method of the current LCU is an edge type or a band type, a SAO class indicating an edge direction according to the edge type or a band range according to the band type, and SAO values indicating difference values between reconstructed pixels and original pixels included in the SAO class.
The SAO parameter encoder 16 may output offset values corresponding to a predetermined number of categories.
In operation 17, if the SAO parameter encoder 16 outputs SAO type information indicating the edge type, according to an edge direction of the reconstructed pixels included in the current LCU, an edge class indicating the direction of 0°, 90°, 45°, or 135° may be output.
In operation 17, if the SAO parameter encoder 16 outputs SAO type information indicating the band type, a band class indicating a band position of the reconstructed pixels included in the current LCU may be output.
In operation 17, if the SAO parameter encoder 16 outputs the SAO type information indicating the band type, as an offset value, zero value information indicating whether the offset value is 0 or not may be output. If the offset value is 0, the SAO parameter encoder 16 may output only the zero value information as the offset value.
If the offset value is not 0, the SAO parameter encoder 16 may further output sign information indicating whether the offset value is a positive number or a negative number, and a remainder, which are followed by the zero value information.
In operation 17, if the SAO parameter encoder 16 outputs SAO type information indicating the edge type, the zero value information and the remainder may be output. With respect to the edge type, the sign information of the offset value does not need to be output because a sign of the offset value is predictable based on only a category according to an edge shape. A process of predicting the sign of the offset value will be described below with reference to FIGS. 5A and 5B.
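The sign prediction mentioned above, applicable to the edge type only, can be sketched as follows; the concrete mapping (non-negative offsets for the concave categories, non-positive for the convex ones) is an assumption consistent with the four edge-shape categories described earlier:

```python
def edge_offset_sign(category: int) -> int:
    """Predicted sign of an edge-type offset: concave shapes
    (categories 1 and 2) are brightened, convex shapes (3 and 4)
    are darkened, so only the magnitude needs to be signalled."""
    return 1 if category in (1, 2) else -1
```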
The SAO determiner 14 may determine whether to perform the SAO operation and SAO types with respect to LCUs according to color components.
The SAO determiner 14 may determine whether to perform the SAO operation on a luma component of the current LCU. The SAO parameter encoder 16 may generate luma SAO type information indicating whether to perform the SAO operation on the luma component of the current LCU. The SAO determiner 14 may equally determine whether to perform the SAO operation on first and second chroma components of the current LCU. The SAO parameter encoder 16 may generate chroma SAO type information indicating whether to perform the SAO operation on the first and second chroma components of the current LCU.
The SAO determiner 14 may determine which one of an edge SAO operation and a band SAO operation is performed on the luma component of the current LCU. The SAO parameter encoder 16 may generate luma SAO type information indicating which one of the edge SAO operation and the band SAO operation is performed on the luma component of the current LCU.
The SAO determiner 14 may determine which one of the edge SAO operation and the band SAO operation is performed on the first and second chroma components of the current LCU. The SAO parameter encoder 16 may generate chroma SAO type information indicating which one of the edge SAO operation and the band SAO operation is performed on the first and second chroma components of the current LCU.

If the SAO determiner 14 determines to perform the edge SAO operation on the first and second chroma components of the current LCU, the SAO determiner 14 may determine the same edge direction with respect to the first and second chroma components of the current LCU. Thus, the SAO parameter encoder 16 may generate a SAO parameter including information on the same edge direction of the first and second chroma components of the current LCU.
The SAO parameter determiner 16 may include the luma SAO type information and the chroma SAO type information in the SAO parameter of the current LCU.
The SAO parameter encoder 16 may output SAO merging information of the current LCU indicating whether to adopt a second SAO parameter of one of a left LCU and an upper LCU neighboring the current LCU as a first SAO parameter of the current LCU, based on sameness between the first SAO parameter and the second SAO parameter.

If the SAO parameters of at least one of the left and upper LCUs of the current LCU are the same as those of the current LCU, the SAO parameter encoder 16 may not encode the SAO parameters of the current LCU and may encode only the SAO merging information. In this case, SAO merging information indicating that the SAO parameters of the left or upper LCU are adopted as the SAO parameters of the current LCU may be output. If the SAO parameters of the left and upper LCUs are different from the SAO parameters of the current LCU, the SAO parameter encoder 16 may encode the SAO merging information and the SAO parameters of the current LCU. In this case, SAO merging information indicating that the SAO parameters of the left or upper LCU are not adopted as the SAO parameters of the current LCU may be output.
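The merge decision described above can be sketched as follows; representing SAO parameters as plain dictionaries and checking the left neighbour before the upper one are illustrative assumptions:

```python
def sao_merge_flags(current, left, upper):
    """Return (merge_left, merge_upper, send_params): if the left or,
    failing that, the upper neighbouring LCU carries identical SAO
    parameters, only merging information is encoded; otherwise the
    current LCU's own SAO parameters must also be sent."""
    merge_left = left is not None and left == current
    merge_upper = (not merge_left) and upper is not None and upper == current
    return merge_left, merge_upper, not (merge_left or merge_upper)
```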
If the second SAO parameter of the left LCU or the upper LCU of the current LCU is the same as the first SAO parameter, the first SAO parameter may be predicted based on the second SAO parameter. When the SAO parameter encoder 16 adopts the second SAO parameter as the first SAO parameter, the SAO parameter encoder 16 may output only the SAO merging information, and may not output the SAO type, the SAO class, and the offset values of the current LCU.
If the second SAO parameter of the left LCU or the upper LCU of the current LCU is not the same as the first SAO parameter, the first SAO parameter may be predicted separately from the second SAO parameter. In operation 19, when the SAO parameter encoder 16 does not adopt the second SAO parameter as the first SAO parameter, the SAO parameter encoder 16 may output the first SAO parameter to include the SAO type, the SAO class, and the offset values of the current LCU, in addition to the SAO merging information of the current LCU.
When the SAO parameter encoder 16 outputs a SAO type, a SAO class, and offset values of the first SAO parameter, the SAO parameter encoder 16 may sequentially output the SAO type, the offset value for each category, and the SAO class of the current LCU.
If the SAO operation is performed, the SAO determiner 14 may determine SAO merging information and SAO parameters of each of the LCUs. In this case, the SAO parameter encoder 16 may output SAO use information indicating that the SAO operation is performed on the current slice, and then may output the SAO merging information and the SAO parameters of each of the LCUs.
If the SAO operation is not performed on the current slice, the SAO determiner 14 may not need to determine an offset of each of the LCUs of the current slice, and the SAO parameter encoder 16 may output only SAO use information indicating that the SAO adjustment is not performed on the current slice.
The SAO determiner 14 may not differently determine the SAO parameters of the current LCU for each color component but may equally determine them with respect to the luma component and the first and second chroma components, based on the SAO parameter of the left LCU or the upper LCU neighboring the current LCU.
The SAO determiner 14 may determine whether to predict the SAO parameters with respect to the luma component and the first and second chroma components of the current LCU by using SAO parameters with respect to a luma component and first and second chroma components of the left LCU of the current LCU among the LCUs of the current slice.
The SAO parameter encoder 16 may generate left SAO merging information for the current LCU based on whether to predict the SAO parameters by using the SAO parameters of the left LCU. That is, the same left SAO merging information may be generated without distinction of the luma component and the first and second chroma components.
The SAO determiner 14 may determine whether to predict the SAO parameters with respect to the luma component and the first and second chroma components of the current LCU by using SAO parameters with respect to a luma component and first and second chroma components of the upper LCU of the current LCU among the LCUs of the current slice.
The SAO parameter encoder 16 may generate upper SAO merging information for the current LCU based on whether to predict the SAO parameters of the current LCU by using the SAO parameters of the upper LCU.
The SAO parameter encoder 16 may generate SAO parameters of the LCUs including the SAO merging information of the left LCU and the SAO merging information of the upper LCU with respect to the current LCU.
The video encoding apparatus 10 may perform entropy encoding on encoding symbols including quantized transformation coefficients and encoding information to generate a bitstream. The video encoding apparatus 10 may perform context-adaptive binary arithmetic coding (CABAC)-based entropy encoding on SAO parameters.
The video encoding apparatus 10 may perform CABAC encoding on a first context bin indicating information included in the luma SAO type information regarding whether to perform the SAO operation on the luma component of the current LCU.
Ida video enoodsng apparatus 10 may perform the CASAG encoding, -n a bypass mode, on remaining context bins indicating information included in the luma SAO type informalioa regarding which one of tho edge SAO operation and the band SAO operation i§ performed: be tho iuma CDmpc-neot of tbo cufrenl LCU,
The video encoding apparatus 10 may perform the CABAC encoding, in a context mode, on the left SAO merging information and the upper SAO merging information among the SAO parameters of the LCUs with respect to the current LCU. The video encoding apparatus 10 may perform the CABAC encoding, in the bypass mode, on magnitude information of offsets included in the SAO parameters of the LCUs. The magnitude information of offsets may indicate offset magnitudes within a range based on a bit depth of a video. For example, when the bit depth is 8 bits, the offset magnitude may be equal to or greater than 0 and equal to or smaller than 7. When the bit depth is 10 bits, the offset magnitude may be equal to or greater than 0 and equal to or smaller than 31.
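The bit-depth-dependent offset range described above can be sketched as follows. This is an illustrative sketch only: the function name and the exact formula (chosen so that 8-bit video yields the range 0 to 7 and 10-bit video yields 0 to 31, with the effective depth clipped at 10 bits) are our assumptions, not part of the claimed apparatus.

```python
def max_sao_offset(bit_depth):
    """Largest encodable SAO offset magnitude for a given bit depth.

    Illustrative formula chosen to reproduce the ranges stated above:
    offsets in [0, 7] for 8-bit video and [0, 31] for 10-bit video.
    """
    return (1 << (min(bit_depth, 10) - 5)) - 1
```

Under this formula, bit depths above 10 are clipped to the 10-bit range.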
When it is determined that the band SAO operation is performed on the current LCU, the video encoding apparatus 10 may perform the CABAC encoding, in the bypass mode, on bits of an invariable bit length of information regarding a band left start position of at least one of the luma SAO type information and the chroma SAO type information. When it is determined that the band SAO operation is performed on the current LCU, the SAO determiner 14 may determine an offset value for the band SAO operation.
Accordingly, the SAO parameter encoder 16 may generate SAO parameters of the LCUs further including the offset value for the band SAO operation.
When the offset value for the band SAO operation is not 0, the SAO determiner 14 may further determine a sign of the offset value. Accordingly, the SAO parameter encoder 16 may generate SAO parameters of the LCUs further including sign information of the offset value.
The video encoding apparatus 10 may include a central processor (not shown) for collectively controlling the encoder 12, the SAO determiner 14, and the SAO parameter encoder 16. Alternatively, the encoder 12, the SAO determiner 14, and the SAO parameter encoder 16 may be driven by their individual processors (not shown) that cooperatively operate to control the video encoding apparatus 10. Alternatively, an external processor (not shown) outside the video encoding apparatus 10 may control the encoder 12, the SAO determiner 14, and the SAO parameter encoder 16.
The video encoding apparatus 10 may include one or more data storages (not shown) for storing input and output data of the encoder 12, the SAO determiner 14, and the SAO parameter encoder 16. The video encoding apparatus 10 may include a memory controller (not shown) for managing data input to and output from the data storages. In order to perform a video encoding operation including transformation and to output a result of the video encoding operation, the video encoding apparatus 10 may operate in association with an internal or external video encoding processor. The internal video encoding processor of the video encoding apparatus 10 may be an independent processor for performing a video encoding operation. Also, the video encoding apparatus 10, a central processing unit, or a graphic processing unit may include a video encoding processor module to perform a basic video encoding operation. FIGS. 2A and 2B, respectively, are a block diagram of a video decoding apparatus 20 and a flowchart of a SAO operation performed by the video decoding apparatus 20, according to one or more embodiments.
The video decoding apparatus 20 includes a SAO parameter obtainer 22, a SAO determiner 24, and a SAO adjuster 26. The video decoding apparatus 20 receives a bitstream including encoded data of a video. The video decoding apparatus 20 may parse encoded video samples from the received bitstream, and may perform entropy decoding, dequantization, inverse transformation, prediction, and motion compensation on each image block to generate reconstructed pixels, and thus may generate a reconstructed image.
The video decoding apparatus 20 may receive offset values indicating difference values between original pixels and reconstructed pixels, and may minimize an error between an original image and the reconstructed image. The video decoding apparatus 20 may receive encoded data of each LCU of the video, and may reconstruct the LCU based on coding units split from the LCU and having a tree structure.
The SAO parameter obtainer 22 may obtain slice SAO parameters with respect to a current slice from a slice header of a received bitstream. The SAO parameter obtainer 22 may obtain luma SAO use information for a luma component of the current slice and chroma SAO use information for chroma components thereof from the slice SAO parameters.
The SAO determiner 24 may determine whether to perform a SAO operation on the luma component of the current slice based on the luma SAO use information obtained by the SAO parameter obtainer 22.
The SAO determiner 24 may equally determine whether to perform the SAO operation on a first chroma component and a second chroma component of the current slice based on the chroma SAO use information obtained by the SAO parameter obtainer 22. That is, if the SAO operation is performed on the first chroma component, the SAO operation may be performed on the second chroma component, and if the SAO operation is not performed on the first chroma component, the SAO operation may not be performed on the second chroma component.
The video decoding apparatus 20 may perform decoding on encoded symbols including encoded samples and encoding information of the current slice obtained from the received bitstream to reconstruct the current slice. The SAO adjuster 26 may perform the SAO operation on each of the luma component and the first and second chroma components of the current slice according to a determination of the SAO determiner 24.
Operations of reconstructing samples of a current LCU and adjusting offsets will now be described with reference to FIG. 2B. In operation 21, the SAO parameter obtainer 22 may obtain the slice SAO parameters with respect to the current slice from the slice header of the received bitstream. In operation 23, the SAO parameter obtainer 22 may obtain the luma SAO use information and the chroma SAO use information from the slice SAO parameters. In operation 25, the SAO determiner 24 may determine whether to perform the SAO operation on the luma component of the current slice based on the luma SAO use information obtained in operation 23. If the luma SAO use information indicates that the SAO operation is performed, the SAO adjuster 26 may perform the SAO operation on the luma color component of the current slice. In operation 27, the SAO determiner 24 may equally determine whether to perform the SAO operation on the first chroma component and the second chroma component of the current slice based on the chroma SAO use information obtained in operation 23. If the chroma SAO use information indicates that the SAO operation is performed, the SAO adjuster 26 may perform the SAO operation on the first chroma component and the second chroma component of the current slice.
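The slice-level decision flow of operations 21 through 27 can be sketched as follows. This is a minimal sketch under stated assumptions: `slice_sao_params` and its flag names are hypothetical stand-ins for the parsed slice SAO parameters, and `sao_adjuster` stands in for the SAO adjuster 26.

```python
def apply_slice_sao(slice_sao_params, sao_adjuster):
    """Sketch of operations 21-27: read the per-slice SAO use flags and
    invoke the SAO operation per color component accordingly.

    `slice_sao_params` is a hypothetical dict standing in for the parsed
    slice header; `sao_adjuster` is any callable taking a component name.
    """
    performed = []
    # Operations 23/25: the luma SAO use flag controls the luma component.
    if slice_sao_params["luma_sao_use"]:
        sao_adjuster("luma")
        performed.append("luma")
    # Operation 27: a single chroma flag jointly controls both chroma components.
    if slice_sao_params["chroma_sao_use"]:
        sao_adjuster("cb")
        sao_adjuster("cr")
        performed.extend(["cb", "cr"])
    return performed
```

Note how the two chroma components are never decided independently, mirroring the shared chroma SAO use information described above.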
The SAO parameter obtainer 22 may extract SAO merging information of the current LCU from the received bitstream. The SAO merging information of the current LCU indicates whether to adopt a second SAO parameter of a left or upper LCU of the current LCU as a first SAO parameter of the current LCU.
The SAO parameter obtainer 22 may reconstruct the first SAO parameter including a SAO type, offset values, and a SAO class of the current LCU, based on the SAO merging information.
The SAO parameter obtainer 22 may determine whether to reconstruct the SAO type, the offset values, and the SAO class of the current LCU to be the same as those of the second SAO parameter, or to extract the SAO type, the offset values, and the SAO class from the bitstream, based on the SAO merging information.
The SAO determiner 24 may determine whether a pixel value classification method of the current LCU is an edge type or a band type, based on the SAO type determined by the SAO parameter obtainer 22. Based on the SAO type, an off type, the edge type, or the band type may be determined. If the SAO type is the off type, it may be determined that the SAO operation is not applied to the current LCU. In this case, the other SAO parameters of the current LCU do not need to be parsed.
The SAO determiner 24 may determine a band range according to an edge direction according to the edge type, or a band range according to the band type, of the current LCU, based on a SAO class determined by the SAO parameter obtainer 22.
The SAO determiner 24 may determine difference values between reconstructed pixels and original pixels included in the above-determined SAO class, based on offset values determined by the SAO parameter obtainer 22.
The SAO adjuster 26 may adjust pixel values of samples reconstructed based on coding units split from the current LCU and having a tree structure, by the difference values determined by the SAO determiner 24.
The SAO parameter obtainer 22 may determine to adopt the second SAO parameter of the left or upper LCU as the first SAO parameter, based on the SAO merging information. In this case, the SAO determiner 24 may not extract the first SAO parameter of the current LCU and may reconstruct the first SAO parameter to be the same as the previously reconstructed second SAO parameter.
The SAO parameter obtainer 22 may determine not to adopt the second SAO parameter as the first SAO parameter, based on the SAO merging information. In this case, the SAO determiner 24 may extract and reconstruct the first SAO parameter, which follows the SAO merging information, from the bitstream.
The SAO parameter obtainer 22 may extract common SAO merging information of the luma component, the first chroma component, and the second chroma component of the current LCU. The SAO determiner 24 may determine whether to reconstruct SAO parameters of the luma component, SAO parameters of the first chroma component, and SAO parameters of the second chroma component to be the same as those of an adjacent LCU, based on the common SAO merging information.
The SAO determiner 24 may reconstruct a common SAO type of the first chroma component and the second chroma component of the current LCU.
The SAO determiner 24 may determine offset values corresponding to a predetermined number of categories, based on the SAO parameters. Each of the offset values may be equal to or greater than a preset minimum value and may be equal to or smaller than a preset maximum value. If SAO type information indicates the edge type, the SAO determiner 24 may determine an edge direction of the reconstructed pixels included in the current LCU as 0°, 90°, 45°, or 135°, based on the SAO class. If the SAO type information indicates the band type, the SAO determiner 24 may determine positions of bands to which pixel values of the reconstructed pixels belong, based on the SAO class. If the SAO type information indicates the band type, the SAO determiner 24 may determine whether an offset value is 0 or not, based on zero value information of the offset value. If the offset value is determined as 0 based on the zero value information, information of the offset value other than the zero value information is not reconstructed. If the offset value is not determined as 0 based on the zero value information, the SAO determiner 24 may determine whether the offset value is a positive number or a negative number based on sign information of the offset value, which follows the zero value information. The SAO determiner 24 may finally determine an offset value by reconstructing a remainder of the offset value, which follows the sign information.
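The parsing order described above for the band type — zero value information first, then sign information, then the remainder of the offset — can be sketched as follows. All names are illustrative, and the rule that the remainder completes a nonzero magnitude as `remainder + 1` is our assumption for illustration, not stated by the specification text.

```python
def reconstruct_offset(zero_flag, sign_is_negative=None, remainder=None):
    """Sketch of the offset reconstruction order described above:
    zero value information first, then sign information, then the
    remainder of the offset magnitude. Names and the remainder rule
    are illustrative assumptions.
    """
    if zero_flag:
        return 0  # offset is 0; nothing further is parsed
    magnitude = 1 + remainder  # assumed: remainder completes a nonzero magnitude
    return -magnitude if sign_is_negative else magnitude
```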
If the SAO type information indicates the edge type and if the offset value is not determined as 0 based on the zero value information of the offset value, the SAO determiner 24 may finally determine the offset value by reconstructing the remainder of the offset value, which follows the zero value information.
The video decoding apparatus 20 may obtain the SAO parameters based on color components to perform the SAO operation. The SAO parameter obtainer 22 may obtain SAO parameters of each of the LCUs of the current slice from a bitstream. The SAO parameter obtainer 22 may obtain at least one of left SAO merging information and upper SAO merging information from the SAO parameters of the LCUs.
The SAO parameter obtainer 22 may determine whether to predict SAO parameters with respect to the luma component and the first and second chroma components of the current LCU by using SAO parameters with respect to a luma component and first and second chroma components of the left LCU neighboring the current LCU, based on the left SAO merging information.
If the left SAO merging information indicates that a current SAO parameter is to be predicted by using the SAO parameters of the left LCU, the SAO parameters for each color component with respect to the left LCU may be adopted as the SAO parameters for each color component of the current LCU, for each color component. If the SAO parameters of the current LCU are determined not to be predicted by using the SAO parameters of the left LCU based on the left SAO merging information, the SAO parameter obtainer 22 may further obtain upper SAO merging information from the bitstream.
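The merge resolution order described above — the left SAO merging information is checked first, and the upper SAO merging information is consulted only when the left merge is not used — can be sketched as follows. The function and parameter names are illustrative.

```python
def resolve_sao_merge(left_merge, upper_merge, left_params, upper_params,
                      parse_own_params):
    """Sketch of the merge resolution order described above: left SAO
    merging information first, then upper SAO merging information, then
    the current LCU's own parameters. All names are illustrative.
    """
    if left_merge:
        return left_params       # adopt the left LCU's per-component parameters
    if upper_merge:
        return upper_params      # otherwise, possibly adopt the upper LCU's
    return parse_own_params()    # else parse this LCU's own SAO parameters
```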
The SAO parameter obtainer 22 may determine whether to predict the SAO parameters of the luma component and the first and second chroma components of the current LCU by using the SAO parameters with respect to the luma component and the first and second chroma components of the upper LCU neighboring the current LCU, based on the upper SAO merging information.
If the upper SAO merging information indicates that the current SAO parameter is to be predicted by using the SAO parameters of the upper LCU, the SAO parameters for each color component with respect to the upper LCU may be adopted as the SAO parameters for each color component of the current LCU, for each color component.
If the upper SAO merging information indicates that the SAO parameters of the current LCU are not to be predicted by using the SAO parameters of the upper LCU, the SAO parameter obtainer 22 may obtain the SAO parameters for each color component of the current LCU from the bitstream.
The SAO parameter obtainer 22 may obtain luma SAO type information for the luma component of the current LCU and chroma SAO type information for the chroma components thereof from the SAO parameters of the LCUs.
The SAO determiner 24 may determine whether to perform the SAO operation on the luma component of the current LCU based on the luma SAO type information. The SAO adjuster 26 may or may not perform the SAO operation on the luma component of the current LCU according to a determination of the SAO determiner 24.
The SAO determiner 24 may equally determine whether to perform the SAO operation on the first and second chroma components of the current LCU based on the chroma SAO type information. The SAO adjuster 26 may or may not perform the SAO operation on the first and second chroma components of the current LCU according to the determination of the SAO determiner 24. The SAO determiner 24 may determine whether to perform the SAO operation based on a first bit of each of the luma SAO type information and the chroma SAO type information. If the SAO operation is determined to be performed for each color component, a second bit and remaining bits of the corresponding SAO type information may be obtained.
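The bit layout described above — a first bin indicating whether SAO is performed, and a second bin selecting between the edge and band operations — can be sketched as follows. The mapping of bin values to "edge" and "band" is our assumption for illustration; only the roles of the first and second bits are taken from the text.

```python
def interpret_sao_type_bits(bits):
    """Sketch of the SAO type information bit layout described above:
    the first bin says whether SAO is performed at all, and the second
    bin selects edge vs. band operation. The value-to-operation mapping
    below is an assumption.
    """
    if not bits[0]:
        return "off"
    return "edge" if bits[1] == 0 else "band"
```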
The SAO determiner 24 may determine which one of an edge SAO operation and a band SAO operation is performed on the luma component of the current LCU based on the luma SAO type information. The second bit of the luma SAO type information may indicate the edge SAO operation or the band SAO operation. The SAO adjuster 26 may perform one of the edge SAO operation and the band SAO operation on the luma component of the current LCU according to a determination of the SAO determiner 24.
The SAO determiner 24 may equally determine which one of the edge SAO operation and the band SAO operation is performed on the first and second chroma components of the current LCU based on the chroma SAO type information. The second bit of the chroma SAO type information may indicate the edge SAO operation or the band SAO operation. The SAO adjuster 26 may simultaneously perform the edge SAO operation or the band SAO operation on the first and second chroma components of the current LCU according to the determination of the SAO determiner 24. If the edge SAO operation is determined to be performed on the first and second chroma components of the current LCU, the SAO determiner 24 may determine the first and second chroma components of the current LCU to have the same edge direction based on the chroma SAO type information.
The SAO parameter obtainer 22 may perform CABAC decoding on a first context bin of the luma SAO type information so as to obtain the luma SAO type information. Information indicating whether to perform the SAO operation on the luma component of the current LCU may be obtained by decoding the first context bin of the luma SAO type information.
The SAO parameter obtainer 22 may perform the CABAC decoding on remaining context bins of the luma SAO type information in a bypass mode. Information indicating which one of the edge SAO operation and the band SAO operation is performed on the luma component of the current LCU may be obtained by decoding the remaining context bins of the luma SAO type information.
Similarly, the SAO parameter obtainer 22 may perform the CABAC decoding on a first context bin of the chroma SAO type information so as to obtain the chroma SAO type information. Information indicating whether to perform the SAO operation on the first and second chroma components of the current LCU may be obtained by decoding the first context bin of the chroma SAO type information.
The SAO parameter obtainer 22 may perform the CABAC decoding on remaining context bins of the chroma SAO type information in the bypass mode. Information indicating which one of the edge SAO operation and the band SAO operation is performed on the first and second chroma components of the current LCU may be obtained by decoding the remaining context bins of the chroma SAO type information.
The SAO parameter obtainer 22 may perform the CABAC decoding by using the same context mode so as to obtain the left SAO merging information and the upper SAO merging information of the current LCU.
The SAO parameter obtainer 22 may perform the CABAC decoding in the bypass mode so as to obtain magnitude information of offsets included in the SAO parameters of the current LCU. The obtained magnitude information of offsets may indicate an offset value equal to or smaller than a restriction value based on a bit depth of a video. The magnitude information of offsets may indicate offset magnitude within a range based on the bit depth of the video. For example, when the bit depth is 8 bits, the offset magnitude may be equal to or greater than 0 and equal to or smaller than 7, and, when the bit depth is 10 bits, the offset magnitude may be equal to or greater than 0 and equal to or smaller than 31.
When it is read from a second bit of the chroma SAO type information that the band SAO operation is performed on the current LCU, the SAO parameter obtainer 22 may perform the CABAC decoding, in the bypass mode, on bits of an invariable bit length following the second bit of the chroma SAO type information. Information regarding a band left start position may be obtained from the bits of the invariable bit length of at least one of the luma SAO type information and the chroma SAO type information.
The SAO parameter obtainer 22 may obtain an offset value for the SAO operation from the SAO parameters of the LCUs.
When the band SAO operation is determined to be performed on the current LCU from the luma SAO type information or the chroma SAO type information, if the obtained offset value is not 0, the SAO parameter obtainer 22 may further obtain sign information of the offset value from the SAO parameters of the LCUs. When the edge SAO operation is determined to be performed on the current LCU from the luma SAO type information or the chroma SAO type information, a sign of the offset value may be determined based on an edge direction determined based on SAO class information.
The video decoding apparatus 20 may include a central processor (not shown) for collectively controlling the SAO parameter obtainer 22, the SAO determiner 24, and the SAO adjuster 26. Alternatively, the SAO parameter obtainer 22, the SAO determiner 24, and the SAO adjuster 26 may be driven by their individual processors (not shown) that cooperatively operate to control the video decoding apparatus 20. Alternatively, an external processor (not shown) outside the video decoding apparatus 20 may control the SAO parameter obtainer 22, the SAO determiner 24, and the SAO adjuster 26.
The video decoding apparatus 20 may include one or more data storages (not shown) for storing input and output data of the SAO parameter obtainer 22, the SAO determiner 24, and the SAO adjuster 26. The video decoding apparatus 20 may include a memory controller (not shown) for managing data input to and output from the data storages. In order to perform a video decoding operation to reconstruct a video, the video decoding apparatus 20 may operate in association with an internal or external video decoding processor. The internal video decoding processor of the video decoding apparatus 20 may be an independent processor for performing a basic video decoding operation. Also, the video decoding apparatus 20, a central processing unit, or a graphic processing unit may include a video decoding processor module to perform a basic video decoding operation.
Video decoding operations using SAO operations will now be described in detail with reference to FIG. 3. FIG. 3 is a block diagram of a video decoding apparatus 30 according to one or more embodiments. The video decoding apparatus 30 includes an entropy decoder 31, a dequantizer 32, an inverse transformer 33, a reconstructor 34, an intra predictor 35, a reference picture buffer 36, a motion compensator 37, a deblocking filter 38, and a SAO performer 39.
The video decoding apparatus 30 may receive a bitstream including encoded video data. The entropy decoder 31 may parse intra mode information, inter mode information, SAO information, and residues from the bitstream.
The residues extracted by the entropy decoder 31 may be quantized transformation coefficients. Accordingly, the dequantizer 32 may perform dequantization on the residues to reconstruct transformation coefficients, and the inverse transformer 33 may perform inverse transformation on the reconstructed coefficients to reconstruct residual values of the space domain.
In order to predict and reconstruct the residual values of the space domain, intra prediction or motion compensation may be performed.
If the intra mode information is extracted by the entropy decoder 31, the intra predictor 35 may determine reference samples to be referred to so as to reconstruct current samples from among samples spatially adjacent to the current samples, by using the intra mode information. The reference samples may be selected from among samples previously reconstructed by the reconstructor 34. The reconstructor 34 may reconstruct the current samples by using the reference samples determined based on the intra mode information and the residual values reconstructed by the inverse transformer 33. If the inter mode information is extracted by the entropy decoder 31, the motion compensator 37 may determine a reference picture to be referred to so as to reconstruct current samples of a current picture from among pictures reconstructed previously to the current picture, by using the inter mode information. The inter mode information may include motion vectors, reference indices, etc. By using the reference indices, from among pictures reconstructed previously to the current picture and stored in the reference picture buffer 36, a reference picture to be used to perform motion compensation on the current samples may be determined. By using the motion vectors, a reference block of the reference picture to be used to perform motion compensation on a current block may be determined. The reconstructor 34 may reconstruct the current samples by using the reference block determined based on the inter mode information and the residual values reconstructed by the inverse transformer 33.
The reconstructor 34 may reconstruct samples and may output reconstructed pixels. The reconstructor 34 may generate reconstructed pixels of each of LCUs based on coding units having a tree structure.
The deblocking filter 38 may perform filtering for reducing a blocking phenomenon of pixels disposed at edge regions of the LCU or each of the coding units having a tree structure. The SAO performer 39 may adjust offsets of reconstructed pixels of each LCU according to a SAO operation. The SAO performer 39 may determine a SAO type, a SAO class, and offset values of a current LCU based on the SAO information extracted by the entropy decoder 31.
An operation of extracting the SAO information by the entropy decoder 31 may correspond to an operation of the SAO parameter obtainer 22 of the video decoding apparatus 20, and operations of the SAO performer 39 may correspond to operations of the SAO determiner 24 and the SAO adjuster 26 of the video decoding apparatus 20. The SAO performer 39 may determine signs and difference values of the offset values with respect to the reconstructed pixels of the current LCU based on the offset values determined from the SAO information. The SAO performer 39 may reduce errors between the reconstructed pixels and original pixels by increasing or reducing pixel values of the reconstructed pixels by the difference values determined based on the offset values. A picture including the reconstructed pixels offset-adjusted by the SAO performer 39 may be stored in the reference picture buffer 36. Thus, by using a reference picture having minimized errors between reconstructed samples and original pixels according to a SAO operation, motion compensation may be performed on a next picture.
In the SAO operations, based on difference values between reconstructed pixels and original pixels, an offset of a pixel group including the reconstructed pixels may be determined. For the SAO operations, embodiments for classifying reconstructed pixels into pixel groups will now be described in detail. According to SAO operations, pixels may be classified (i) based on an edge type of reconstructed pixels, or (ii) based on a band type of reconstructed pixels. Whether pixels are classified based on an edge type or a band type may be defined by using a SAO type.
An embodiment of classifying pixels based on an edge type according to SAO operations will now be described in detail.
When edge-type offsets of a current LCU are determined, an edge class of each of reconstructed pixels included in the current LCU may be determined. That is, by comparing pixel values of current reconstructed pixels and adjacent pixels, an edge class of the current reconstructed pixels may be defined. An example of determining an edge class will now be described with reference to FIG. 4. FIG. 4 is a table showing edge classes of edge types, according to one or more embodiments.
Indices 0, 1, 2, and 3 may be sequentially allocated to edge classes 41, 42, 43, and 44. If an edge type frequently occurs, a small index may be allocated to the edge type.
An edge class may indicate a direction of 1-dimensional edges formed between a current reconstructed pixel X0 and two adjacent pixels. The edge class 41 having the index 0 indicates a case when edges are formed between the current reconstructed pixel X0 and two horizontally adjacent pixels X1 and X2. The edge class 42 having the index 1 indicates a case when edges are formed between the current reconstructed pixel X0 and two vertically adjacent pixels X3 and X4. The edge class 43 having the index 2 indicates a case when edges are formed between the current reconstructed pixel X0 and two 135°-diagonally adjacent pixels X5 and X6. The edge class 44 having the index 3 indicates a case when edges are formed between the current reconstructed pixel X0 and two 45°-diagonally adjacent pixels X7 and X8.
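The four edge-class directions above can be summarized as neighbor position offsets. This table is an illustrative sketch: the (dx, dy) coordinate convention (x increasing rightward, y increasing downward) is our assumption, not part of the specification text.

```python
# Neighbor position offsets (dx, dy) for each edge class index, following
# the directions described for FIG. 4: 0 = horizontal, 1 = vertical,
# 2 = 135-degree diagonal, 3 = 45-degree diagonal. The coordinate
# convention (x right, y down) is an assumption.
EDGE_CLASS_NEIGHBORS = {
    0: ((-1, 0), (1, 0)),    # X1, X2: horizontally adjacent
    1: ((0, -1), (0, 1)),    # X3, X4: vertically adjacent
    2: ((-1, -1), (1, 1)),   # X5, X6: 135-degree diagonal
    3: ((1, -1), (-1, 1)),   # X7, X8: 45-degree diagonal
}
```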
Accordingly, by analyzing edge directions of reconstructed pixels included in the current LCU and thus determining a strong edge direction in the current LCU, an edge class of the current LCU may be determined.
With respect to each edge class, categories may be classified according to an edge shape of a current pixel. An example of categories according to edge shapes will now be described with reference to FIGS. 5A and 5B. FIGS. 5A and 5B are a table and a graph showing categories of edge types, according to one or more embodiments.
An edge category indicates whether a current pixel corresponds to a lowest point of a concave edge, a pixel disposed at a curved corner around a lowest point of a concave edge, a highest point of a convex edge, or a pixel disposed at a curved corner around a highest point of a convex edge. FIG. 5A exemplarily shows conditions for determining categories of edges. FIG. 5B exemplarily shows edge shapes between a reconstructed pixel and adjacent pixels, and their pixel values. c indicates an index of a current reconstructed pixel, and a and b indicate indices of adjacent pixels at two sides of the current reconstructed pixel according to an edge direction. Xa, Xb, and Xc respectively indicate pixel values of reconstructed pixels having the indices a, b, and c. In FIG. 5B, an x axis indicates indices of the current reconstructed pixel and the adjacent pixels at two sides thereof, and a y axis indicates pixel values of samples.
Category 1 indicates a case when a current sample corresponds to a lowest point of a concave edge, i.e., a local valley. As shown in graph 51 (Xc&lt;Xa &amp;&amp; Xc&lt;Xb), if the current reconstructed pixel c between the adjacent pixels a and b corresponds to a lowest point of a concave edge, the current reconstructed pixel may be classified as the category 1.
Category 2 indicates a case when a current sample is disposed at a curved corner around a lowest point of a concave edge, i.e., a concave corner. As shown in graph 52 (Xc&lt;Xa &amp;&amp; Xc==Xb), if the current reconstructed pixel c between the adjacent pixels a and b is disposed at an end point of a downward curve of a concave edge or, as shown in graph 53 (Xc==Xa &amp;&amp; Xc&lt;Xb), if the current reconstructed pixel c is disposed at a start position of an upward curve of a concave edge, the current reconstructed pixel may be classified as the category 2.
Category 3 indicates a case when a current sample is disposed at a curved corner around a highest point of a convex edge, i.e., a convex corner. As shown in graph 54 (Xc&gt;Xb &amp;&amp; Xc==Xa), if the current reconstructed pixel c between the adjacent pixels a and b is disposed at a start position of a downward curve of a convex edge or, as shown in graph 55 (Xc==Xb &amp;&amp; Xc&gt;Xa), if the current reconstructed pixel c is disposed at an end point of an upward curve of a convex edge, the current reconstructed pixel may be classified as the category 3.
Category 4 indicates a case when a current sample corresponds to a highest point of a convex edge, i.e., a local peak. As shown in graph 56 (Xc&gt;Xa &amp;&amp; Xc&gt;Xb), if the current reconstructed pixel c between the adjacent pixels a and b corresponds to a highest point of a convex edge, the current reconstructed pixel may be classified as the category 4.
If the current reconstructed pixel does not satisfy any of the conditions of the categories 1, 2, 3, and 4, the current reconstructed pixel does not correspond to an edge and thus is classified as category 0, and an offset of category 0 does not need to be encoded.
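The category conditions of graphs 51 through 56 can be sketched as a single classifier over the three pixel values Xa, Xc, and Xb along the edge direction. The function name is illustrative; the conditions follow the descriptions above.

```python
def edge_category(xa, xc, xb):
    """Classify the current reconstructed pixel value Xc against its two
    neighbors Xa and Xb along the edge direction, following the
    conditions described for graphs 51-56 of FIGS. 5A and 5B.
    """
    if xc < xa and xc < xb:
        return 1  # local valley (graph 51)
    if (xc < xa and xc == xb) or (xc == xa and xc < xb):
        return 2  # concave corner (graphs 52 and 53)
    if (xc > xa and xc == xb) or (xc == xa and xc > xb):
        return 3  # convex corner (graphs 54 and 55)
    if xc > xa and xc > xb:
        return 4  # local peak (graph 56)
    return 0      # flat: not an edge, no offset encoded
```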
According to one or more embodiments, with respect to reconstructed pixels corresponding to the same category, an average value of difference values between the reconstructed pixels and original pixels may be determined as an offset of a current category. Also, offsets of all categories may be determined. The concave edges of the categories 1 and 2 may be smoothed if reconstructed pixel values are adjusted by using positive offset values, and may be sharpened due to negative offset values. The convex edges of the categories 3 and 4 may be smoothed due to negative offset values and may be sharpened due to positive offset values. The video encoding apparatus 10 may not allow the sharpening effect of edges. Here, the concave edges of the categories 1 and 2 need positive offset values, and the convex edges of the categories 3 and 4 need negative offset values. In this case, if a category of an edge is known, a sign of an offset value may be determined. Accordingly, the video encoding apparatus 10 may not transmit the sign of the offset value and may transmit only an absolute value of the offset value. Also, the video decoding apparatus 20 may receive only the absolute value of the offset value. The video encoding apparatus 10 may encode and transmit offset values according to categories of a current edge class, and the video decoding apparatus 20 may adjust reconstructed pixels of the categories by the received offset values.
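The encoder-side averaging rule above can be sketched as follows. This is a minimal illustration under stated assumptions: the function and parameter names are ours, and pixels of category 0 are simply skipped since that category carries no offset.

```python
def category_offsets(recon, orig, categories, num_categories=4):
    """Sketch of the rule above: for each edge category, the offset is
    the average difference between original and reconstructed pixels of
    that category. Names are illustrative.
    """
    sums = {k: 0 for k in range(1, num_categories + 1)}
    counts = {k: 0 for k in range(1, num_categories + 1)}
    for r, o, c in zip(recon, orig, categories):
        if c == 0:
            continue  # category 0 carries no offset
        sums[c] += o - r
        counts[c] += 1
    # Average per category; 0 when a category has no pixels.
    return {k: (sums[k] / counts[k] if counts[k] else 0) for k in sums}
```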
For example, if an offset value of an edge type is determined as 0, the video encoding apparatus 10 may transmit only zero value information as the offset value.
For example, if an offset value of an edge type is not 0, the video encoding apparatus 10 may transmit zero value information and an absolute value as the offset value. A sign of the offset value does not need to be transmitted. The video decoding apparatus 20 reads the zero value information from the received offset value, and may read the absolute value of the offset value if the offset value is not 0. The sign of the offset value may be predicted according to an edge category based on an edge shape between a reconstructed pixel and adjacent pixels.
Accordingly, the video encoding apparatus 10 may classify pixels according to edge directions and edge shapes, may determine an average error value between pixels having the same characteristics as an offset value, and may determine offset values according to categories. The video encoding apparatus 10 may encode and transmit SAO type information indicating an edge type, SAO class information indicating an edge direction, and the offset values.
The video decoding apparatus 20 may receive the SAO type information, the SAO class information, and the offset values, and may determine an edge direction according to the SAO type information and the SAO class information. The video decoding apparatus 20 may determine an offset value of reconstructed pixels of a category corresponding to an edge shape according to the edge direction, and may adjust pixel values of the reconstructed pixels by the offset value, thereby minimizing an error between an original image and a reconstructed image.
An embodiment of classifying pixels based on a band type according to SAO operations will now be described in detail.
According to one or more embodiments, each of pixel values of reconstructed pixels may belong to one of a plurality of bands. For example, the pixel values may have a total range from a minimum value Min of 0 to a maximum value Max of 2^(p-1) according to p-bit sampling. If the total range (Min, Max) of the pixel values is divided into K intervals, each interval of the pixel values is referred to as a band. If Bk indicates a maximum value of a kth band, bands [B0, B1-1], [B1, B2-1], [B2, B3-1], ..., and [BK-1, BK] may be divided. If a pixel value of a current reconstructed pixel Rec(x,y) belongs to the band [Bk-1, Bk], a current band may be determined as k. The bands may be uniformly or non-uniformly divided.
For example, if pixel values are classified into equal 8-bit pixel bands, the pixel values may be divided into 32 bands. In more detail, they may be classified into bands [0, 7], [8, 15], ..., [240, 247], and [248, 255].
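For uniform bands, the band to which a pixel value belongs is a simple integer division. A sketch (hypothetical names), assuming the 8-bit, 32-band example above:

```python
def band_index(pixel, bit_depth=8, num_bands=32):
    """Uniform band classification: with 8-bit samples and 32 bands,
    each band spans 8 consecutive pixel values ([0, 7], [8, 15], ...)."""
    band_width = (1 << bit_depth) // num_bands  # 8 for the example above
    return pixel // band_width
```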
From among a plurality of bands classified according to a band type, a band to which each of pixel values of reconstructed pixels belongs may be determined. Also, an offset value indicating an average of errors between original pixels and reconstructed pixels in each band may be determined. Accordingly, the video encoding apparatus 10 and the video decoding apparatus 20 may encode and transceive an offset corresponding to each of bands classified according to a current band type, and may adjust reconstructed pixels by the offset.
Accordingly, with respect to a band type, the video encoding apparatus 10 and the video decoding apparatus 20 may classify reconstructed pixels according to bands to which their pixel values belong, may determine an offset as an average of error values of reconstructed pixels that belong to the same band, and may adjust the reconstructed pixels by the offset, thereby minimizing an error between an original image and a reconstructed image.
When an offset according to a band type is determined, the video encoding apparatus 10 and the video decoding apparatus 20 may classify reconstructed pixels into categories according to a band position. For example, if the total range of the pixel values is divided into K bands, categories may be indexed according to a band index k indicating a kth band. The number of categories may be determined to correspond to the number of bands.
However, in order to reduce an amount of data, the video encoding apparatus 10 and the video decoding apparatus 20 may restrict the number of categories used to determine offsets according to SAO operations. For example, a predetermined number of bands that are continuous from a band having a predetermined start position in a direction in which a band index is increased may be allocated as categories, and only an offset of each category may be determined.
For example, if a band having an index of 12 is determined as a start band, four bands from the start band, i.e., bands having indices of 12, 13, 14, and 15, may be allocated as categories 1, 2, 3, and 4. Accordingly, an average error between reconstructed pixels and original pixels included in a band having the index of 12 may be determined as an offset of category 1. Likewise, an average error between reconstructed pixels and original pixels included in a band having the index of 13 may be determined as an offset of category 2, an average error between reconstructed pixels and original pixels included in a band having the index of 14 may be determined as an offset of category 3, and an average error between reconstructed pixels and original pixels included in a band having the index of 15 may be determined as an offset of category 4. In this case, information regarding a band range start position, i.e., a left band position, is required to determine positions of bands allocated as categories. Accordingly, the video encoding apparatus 10 may encode and transmit the information about the start band position as the SAO class. The video encoding apparatus 10 may encode and transmit a SAO type indicating a band type, the SAO class, and offset values according to categories.
The video decoding apparatus 20 may receive the SAO type, the SAO class, and the offset values according to the categories. If the received SAO type is a band type, the video decoding apparatus 20 may read a start band position from the SAO class. The video decoding apparatus 20 may determine a band to which reconstructed pixels belong, from among four bands from the start band, may determine an offset value allocated to a current band from among the offset values according to the categories, and may adjust pixel values of the reconstructed pixels by the offset value.
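Decoder-side band offset adjustment with a start band and four categories, as described above, can be sketched as follows (illustrative helper; the 8-bit, 32-band layout is assumed):

```python
def apply_band_offset(pixel, start_band, offsets, bit_depth=8):
    """offsets holds four values for categories 1..4, i.e. for bands
    start_band .. start_band + 3; other pixels pass through unchanged."""
    band = pixel // ((1 << bit_depth) // 32)
    if start_band <= band < start_band + 4:
        return pixel + offsets[band - start_band]
    return pixel
```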
Hereinabove, an edge type and a band type are introduced as SAO types, and a SAO class and a category according to the SAO type are described in detail. SAO parameters encoded and transceived by the video encoding apparatus 10 and the video decoding apparatus 20 will now be described in detail.
The video encoding apparatus 10 and the video decoding apparatus 20 may determine a SAO type according to a pixel classification method of reconstructed pixels of each LCU.
The SAO type may be determined according to image characteristics of each block. For example, with respect to an LCU including a vertical edge, a horizontal edge, and a diagonal edge, in order to change edge values, offset values may be determined by classifying pixel values according to an edge type. With respect to an LCU not including an edge region, offset values may be determined according to band classification. Accordingly, the video encoding apparatus 10 and the video decoding apparatus 20 may signal the SAO type with respect to each of LCUs.
The video encoding apparatus 10 and the video decoding apparatus 20 may determine SAO parameters with respect to LCUs. That is, SAO types of reconstructed pixels of an LCU may be determined, the reconstructed pixels of the LCU may be classified into categories, and offset values may be determined according to the categories.
From among the reconstructed pixels included in the LCU, the video encoding apparatus 10 may determine an average error of reconstructed pixels classified into the same category, as an offset value. An offset value of each category may be determined.

According to one or more embodiments, the SAO parameters may include a SAO type, offset values, and a SAO class. The video encoding apparatus 10 and the video decoding apparatus 20 may transceive the SAO parameters determined with respect to each LCU. From among SAO parameters of an LCU, the video encoding apparatus 10 may encode and transmit the SAO type and the offset values. If the SAO type is an edge type, the video encoding apparatus 10 may further transmit a SAO class indicating an edge direction, which is followed by the SAO type and the offset values according to categories. If the SAO type is a band type, the video encoding apparatus 10 may further transmit a SAO class indicating a start band position, which is followed by the SAO type and the offset values according to categories.
The video decoding apparatus 20 may receive the SAO parameters of each LCU, which include the SAO type, the offset values, and the SAO class. Also, the video decoding apparatus 20 may select an offset value of a category to which each reconstructed pixel belongs, from among the offset values according to categories, and may adjust the reconstructed pixel by the selected offset value.
An embodiment of signaling offset values from among SAO parameters will now be described. In order to transmit the offset values, the video encoding apparatus 10 may further transmit zero value information. According to the zero value information, sign information and a remainder may be further transmitted.
The zero value information may be a 1-bit flag. That is, a '0' flag indicating that the offset value is 0 or a '1' flag indicating that the offset value is not 0 may be transmitted. If the zero value information is the '0' flag, the sign information or the remainder does not need to be encoded. However, if the zero value information is the '1' flag, the sign information and the remainder may be further transmitted.
However, as described above, with respect to the edge type, since the offset value may be predicted as a positive number or a negative number according to a category, the sign information does not need to be transmitted. Accordingly, if the zero value information is the '1' flag, the remainder may be further transmitted.
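Because the sign of an edge offset is implied by the category, the decoder only needs the zero value flag and the transmitted magnitude. A sketch under those assumptions (illustrative names):

```python
def edge_offset_sign(category):
    """Concave categories 1 and 2 take positive offsets;
    convex categories 3 and 4 take negative offsets."""
    return 1 if category in (1, 2) else -1

def decode_edge_offset(nonzero_flag, magnitude, category):
    """Rebuild a signed edge offset from the zero value flag
    and the unsigned transmitted magnitude."""
    if not nonzero_flag:
        return 0
    return edge_offset_sign(category) * magnitude
```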
According to one or more embodiments, an offset value Off-set may be previously restricted within a range from a minimum value MinOffSet to a maximum value MaxOffSet before the offset value is determined (MinOffSet ≤ Off-Set ≤ MaxOffSet).
For example, with respect to an edge type, offset values of reconstructed pixels of categories 1 and 2 may be determined within a range from a minimum value of 0 to a maximum value of 7. With respect to the edge type, offset values of reconstructed pixels of categories 3 and 4 may be determined within a range from a minimum value of -7 to a maximum value of 0.
For example, with respect to a band type, offset values of reconstructed pixels of all categories may be determined within a range from a minimum value of -7 to a maximum value of 7. In order to reduce transmission bits of an offset value, a remainder may be restricted to a non-negative value instead of a negative number. In this case, the remainder may be greater than or equal to 0 and may be less than or equal to a difference value between the maximum value and the minimum value (0 ≤ Remainder ≤ MaxOffSet - MinOffSet). If the video encoding apparatus 10 transmits the remainder and the video decoding apparatus 20 knows at least one of the maximum value and the minimum value of the offset value, an original offset value may be reconstructed by using only the received remainder.

FIGS. 6A through 6C show relationships between first and second chroma components 61 and 62.
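The remainder mechanism above can be illustrated for the band type range [-7, 7]: the encoder sends a non-negative remainder, and the decoder, knowing MinOffSet, recovers the signed offset (names are illustrative):

```python
MIN_OFFSET, MAX_OFFSET = -7, 7  # band-type range from the example above

def encode_remainder(offset):
    """Map a signed offset to a non-negative remainder."""
    assert MIN_OFFSET <= offset <= MAX_OFFSET
    return offset - MIN_OFFSET

def reconstruct_offset(remainder):
    """Recover the signed offset given the known minimum value."""
    return MIN_OFFSET + remainder
```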
During operations of encoding and decoding a video of a color image, image information is generally classified into a luma component and first and second chroma components for each color component and stored in a memory. In FIGS. 6A through 6C, the first and second chroma components 61 and 62 are stored in the memory in an interleaving order among color components of the same image block.
FIG. 6A shows samples that are referred to among neighboring samples of a left block and an upper block when an intra prediction is performed on the first and second chroma components 61 and 62. The first chroma component 61 may refer to a first chroma component 65 neighboring the left block or a first chroma component 63 neighboring the upper block. The second chroma component 62 may refer to a second chroma component 66 neighboring the left block or a second chroma component 64 neighboring the upper block.
However, in the intra prediction, the first and second chroma components 61 and 62 may share an intra prediction direction. Thus, the intra prediction may be simultaneously performed on the first and second chroma components 61 and 62 by obtaining the first and second chroma components 63, 64, 65, and 66 of the left block or the upper block that are stored in the memory in the interleaving order.
When motion compensation is performed, a luma component and the first and second chroma components 61 and 62 of the same image block share a motion vector, and thus inter prediction may be simultaneously performed on the first and second chroma components 61 and 62. When loop filtering is performed, filters having the same size and coefficient are used for the first and second chroma components 61 and 62, and thus the loop filtering may be simultaneously performed on the first and second chroma components 61 and 62.
For example, when an edge type SAO operation is performed, relationships between SAO operations with respect to the first and second chroma components 61 and 62 will now be described with reference to FIGS. 6B and 6C. It is assumed like FIG. 6B that a SAO edge direction of a current first chroma component 611 is determined as a vertical direction, and the SAO edge direction of a current second chroma component 612 is determined as a horizontal direction. To perform a SAO operation on the current first chroma component 611, first chroma components 613 and 615 disposed above and below the current first chroma component 611 need to be obtained from the memory. To perform the SAO operation on the current second chroma component 612, second chroma components 623 and 625 disposed left and right the current second chroma component 612 need to be obtained from the memory.
The first and second chroma components 61 and 62 are stored in the memory in the interleaving order, and thus samples stored in different directions may not be simultaneously obtained from the memory through a deinterleaving process. After the SAO operation is performed on the first chroma component 61 through the deinterleaving process, the SAO operation is performed on the second chroma component 62, and then the deinterleaving process needs to be performed again.
Thus, when SAO edge directions are different, the SAO operation may not be simultaneously performed on the first and second chroma components 61 and 62. If the SAO operation is sequentially performed on the first and second chroma components 61 and 62, latency occurs during parallel processing of video coding, which may result in a delay in entire video coding operations. However, it is assumed like FIG. 6C that the SAO edge directions of the current first chroma component 611 and the current second chroma component 612 are equally determined as the horizontal directions. To perform the SAO operation on the current first chroma component 611, first chroma components 617 and 619 disposed left and right the current first chroma component 611 may be obtained from the memory. To perform the SAO operation on a current second chroma component 621, second chroma components 623 and 625 disposed left and right the current second chroma component 621 may be obtained from the memory. In this case, samples stored in the same direction may be simultaneously obtained from the memory, and thus the SAO operation may be simultaneously performed on the first and second chroma components 61 and 62.
Thus, if the first and second chroma components 61 and 62 share a SAO type as shown in FIG. 6C, parallel processing latency may be prevented in advance, and a bit number of SAO parameters with respect to chroma components may be reduced by two times.
SAO merging information among SAO parameters according to embodiments will now be described in detail below.

SAO types and/or offset values of adjacent blocks may probably be the same. The video encoding apparatus 10 may compare SAO parameters of a current block to SAO parameters of adjacent blocks and may merge and encode the SAO parameters of the current block and the adjacent blocks if the SAO parameters are the same. If the SAO parameters of the adjacent block are previously encoded, the SAO parameters of the adjacent block may be adopted as the SAO parameters of the current block. Accordingly, the video encoding apparatus 10 may not encode the SAO parameters of the current block and may encode only the SAO merging information of the current block.
Before the SAO parameters are parsed from a received bitstream, the video decoding apparatus 20 may initially parse the SAO merging information and may determine whether to parse the SAO parameters. The video decoding apparatus 20 may determine whether an adjacent block having the same SAO parameters as those of the current block exists based on the SAO merging information. For example, if an adjacent block having the same SAO parameters as those of the current block exists based on the SAO merging information, the video decoding apparatus 20 may not parse the SAO parameters of the current block and may adopt reconstructed SAO parameters of the adjacent block as the SAO parameters of the current block. Accordingly, the video decoding apparatus 20 may reconstruct the SAO parameters of the current block to be the same as those of the adjacent block. Also, based on the SAO merging information, an adjacent block having SAO parameters to be referred to may be determined.
For example, if the SAO parameters of the adjacent blocks are different from the SAO parameters of the current block based on the SAO merging information, the video decoding apparatus 20 may parse and reconstruct the SAO parameters of the current block from the bitstream.
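The decoder-side merging logic described above amounts to: reuse a neighbor's already reconstructed parameters when merging is signaled, otherwise parse fresh parameters from the bitstream. A minimal sketch (hypothetical names and parameter representation):

```python
def resolve_sao_params(merge_left, merge_up, left_params, up_params, parse_fn):
    """Return the SAO parameters of the current block: a neighbor's
    already reconstructed parameters when merging is signaled,
    otherwise freshly parsed ones."""
    if merge_left:
        return left_params
    if merge_up:
        return up_params
    return parse_fn()
```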
FIG. 7A is a diagram showing adjacent LCUs 652 and 653 referred to merge SAO parameters, according to one or more embodiments.
The video encoding apparatus 10 may determine a candidate list of adjacent LCUs to be referred to predict SAO parameters of a current LCU 651 from among adjacent LCUs reconstructed prior to the current LCU 651. The video encoding apparatus 10 may compare SAO parameters of the current LCU 651 and the adjacent LCUs in the candidate list.
For example, simply, the left and upper LCUs 653 and 652 of the current LCU 651 in a current picture may be included in the candidate list.
Accordingly, the video encoding apparatus 10 may compare SAO parameters of the adjacent LCUs included in the candidate list to those of the current LCU 651 according to a reference order. For example, the SAO parameters may be compared to those of the current LCU 651 in the order of the left LCU 653 and the upper LCU 652.
From among the compared left and upper LCUs 653 and 652, an LCU having the same SAO parameters as those of the current LCU 651 may be determined as a reference LCU. In order to predict the SAO parameters of the current LCU 651, the video encoding apparatus 10 and the video decoding apparatus 20 may refer to the same adjacent LCUs. Also, SAO merging information indicating an LCU having SAO parameters to be referred to may be transceived and obtained. The video decoding apparatus 20 may select one of the adjacent LCUs based on the SAO merging information, and may reconstruct the SAO parameters of the current LCU 651 to be the same as those of the selected adjacent LCU.
For example, it is assumed that the left and upper LCUs 653 and 652 are referred to. The SAO parameter encoder 16 may encode left SAO merging information indicating whether the SAO parameters of the left LCU 653 of the current LCU 651 are the same as those of the current LCU 651, and upper SAO merging information indicating whether the SAO parameters of the upper LCU 652 are the same as those of the current LCU 651, as the SAO merging information. In this case, the SAO parameters of the current LCU 651 and the left LCU 653 may be initially compared to determine whether they are the same, and then the SAO parameters of the current LCU 651 and the upper LCU 652 may be compared to determine whether they are the same. According to a comparison result, the SAO merging information may be determined.
If the SAO parameters of at least one of the left and upper LCUs 653 and 652 are the same as those of the current LCU 651, the SAO parameter encoder 16 may encode only the left or upper SAO merging information and may not encode the SAO parameters of the current LCU 651. If the SAO parameters of both of the left and upper LCUs 653 and 652 are different from those of the current LCU 651, the SAO parameter encoder 16 may encode the left or upper SAO merging information and the SAO parameters of the current LCU 651.

SAO parameters according to color components will now be described in detail. The video encoding apparatus 10 and the video decoding apparatus 20 may mutually predict SAO parameters between color components. The video encoding apparatus 10 and the video decoding apparatus 20 may perform a SAO operation on all of a luma block and chroma blocks in a YCrCb color format. Offset values of a luma component Y and chroma components Cr and Cb of a current LCU may be determined, respectively.
For example, common SAO merging information may be applied to the Y component, the Cr component, and the Cb component of the current LCU. That is, based on one piece of SAO merging information, it may be determined whether SAO parameters of the Y component of the current LCU are the same as those of the Y component of an adjacent LCU, it may be determined whether SAO parameters of the Cr component of the current LCU are the same as those of the Cr component of the adjacent LCU, and it may be determined whether SAO parameters of the Cb component of the current LCU are the same as those of the Cb component of the adjacent LCU.

For another example, common SAO type information may be applied to the Cr component and the Cb component of the current LCU. That is, based on one piece of SAO type information, it may be determined whether the SAO operation is simultaneously performed on the Cr component and the Cb component or not. Based on one piece of SAO type information, it may also be determined whether offset values of the Cr component and the Cb component are determined according to an edge type or a band type.

Based on one piece of SAO type information, the Cr component and the Cb component may also share the same SAO class. If the SAO type is the edge type based on one piece of SAO type information, the Cr component and the Cb component may share the same edge direction. If the SAO type is the band type based on one piece of SAO type information, the Cr component and the Cb component may share the same left band start position.

Syntax structures in which SAO parameters according to color components of a current LCU are defined will now be described in detail with reference to FIGS. 7B through 7G below. The video decoding apparatus 20 may parse syntax shown in FIGS. 7B through 7G, obtain the SAO parameters, and perform a SAO operation. FIG. 7B shows syntax structures of a slice header 700 and slice data 705 according to one or more embodiments.
The slice header 700 according to one or more embodiments includes one or more parameters 701, 702, and 703 indicating whether SAO operation is performed on a current slice.
The video decoding apparatus 20 may obtain 'slice_sample_adaptive_offset_flag[0]' 701 from the slice header 700 and determine whether to perform the SAO operation on a luma component. If the SAO operation for the luma component is performed, the video decoding apparatus 20 may obtain 'slice_sample_adaptive_offset_flag[1]' 702 from the slice header 700 and determine whether to perform the SAO operation on a first chroma component.
In this regard, the video decoding apparatus 20 may not further obtain a parameter indicating whether to perform the SAO operation on a second chroma component from the slice header 700. Information 'slice_sample_adaptive_offset_flag[2]' 703 indicating whether to perform the SAO operation on the second chroma component may be equally predicted from the 'slice_sample_adaptive_offset_flag[1]' 702 obtained from the slice header 700. Thus, the SAO operation may or may not be simultaneously performed on the first and second chroma components.
The video decoding apparatus 20 may determine whether to obtain a SAO parameter 706 according to LCUs from the slice data 705 based on 'slice_sample_adaptive_offset_flag[0]' 701, 'slice_sample_adaptive_offset_flag[1]' 702, and 'slice_sample_adaptive_offset_flag[2]' 703 that are determined from the slice header 700.
FIGS. 7C and 7D show syntax structures of SAO parameters 706 and 709 with respect to LCUs according to one or more embodiments.
The video decoding apparatus 20 may obtain left SAO merging information 707 from the SAO parameter 706 'sao_unit_cabac(rx, ry, cIdx)' with respect to LCUs. In this regard, the common left SAO merging information 707 'sao_merge_left_flag[rx][ry]' may be obtained without distinction of a luma component and first and second chroma components. Accordingly, the video decoding apparatus 20 may simultaneously and equally determine whether to use a SAO parameter of a left LCU as SAO parameters of a luma component and first and second chroma components of a current LCU based on the common left SAO merging information 707.
If it is determined that the SAO parameter of the left LCU is not referred to based on the left SAO merging information 707, the video decoding apparatus 20 may obtain upper SAO merging information 708 'sao_merge_up_flag[rx][ry]' from the SAO parameter 706 with respect to the LCUs. Likewise, the common upper SAO merging information 708 may be obtained without distinction of the luma component and the first and second chroma components. Accordingly, the video decoding apparatus 20 may simultaneously and equally determine whether to use a SAO parameter of an upper LCU as SAO parameters of the luma component and the first and second chroma components of the current LCU based on the common upper SAO merging information 708. If it is determined that the SAO parameter of the upper LCU is not also referred to based on the upper SAO merging information 708, the video decoding apparatus 20 may directly obtain a current SAO parameter 709 with respect to the current LCU from the SAO parameter 706 with respect to the LCUs.
The current SAO parameter 709 may include SAO type information 711 of the current LCU. The video decoding apparatus 20 may obtain the SAO type information 711 separately defined with respect to a luma component and chroma components from the current SAO parameter 709. Thus, the common SAO type information 711 'sao_type_idx[cIdx][rx][ry]' may be obtained with respect to the first and second chroma components. For example, if the SAO type information 711 is obtained with respect to the first chroma component of the current LCU, SAO type information with respect to the second chroma component may be predicted from the SAO type information 711 with respect to the first chroma component.

1 bit indicating whether the SAO operation is performed on the current LCU may be obtained from the SAO type information 711. If it is determined that the SAO operation is performed based on a first 1 bit, a second 1 bit may be obtained from the SAO type information 711, and it may be determined whether the SAO type of the current LCU is an edge type or a band type from the second 1 bit.
If the second 1 bit of the SAO type information 711 is determined to be the edge type, the video decoding apparatus 20 may obtain information regarding an edge category from remaining bits of the SAO type information 711. If the second 1 bit of the SAO type information 711 is determined to be the band type, the video decoding apparatus 20 may obtain information regarding a band category from the remaining bits of the SAO type information 711.

The video decoding apparatus 20 may determine whether to perform the SAO operation on the luma component of the current LCU based on the 1 bit of the SAO type information 711 regarding the luma component. The video decoding apparatus 20 may determine whether to perform the SAO operation on the first and second chroma components of the current LCU based on the 1 bit of the SAO type information 711 regarding the chroma components. If it is determined that the SAO operation for the luma component or the chroma components of the current LCU is not performed based on the SAO type information 711 for the luma component or the chroma components, a next bit is not obtained from the SAO type information 711. The SAO type information 711 may be received in a truncated unary code form.
If only one piece of the SAO type information 711 for the chroma components according to an embodiment is encoded, the SAO type information 711 determined for the first chroma component may be determined as the SAO type information for the second chroma component.
The video decoding apparatus 20 may obtain edge class information for the luma component and edge class information for the chroma components from the SAO parameter 709 'sao_offset_cabac(rx, ry, cIdx)' with respect to the current LCU. An edge class may indicate four edge directions including a horizontal edge direction (0°), a vertical edge direction (90°), a 135° diagonal edge direction, and a 45° diagonal edge direction, and thus the edge class may be defined as 2 bits.

FIG. 7F shows a syntax structure of SAO parameters with respect to SAO types according to one or more embodiments. Referring to FIGS. 7D and 7F, if the SAO operation is performed based on the SAO type information 711, the SAO parameters 706 and 709 may further include at least one of an offset value 713 'sao_offset[cIdx][rx][ry][i]' and offset sign information 715 'sao_offset_sign[cIdx][rx][ry][i]'.
Context modeling for CABAC encoding of the offset value 713 will be described with reference to FIG. 7E. FIG. 7E shows a syntax structure of context information for CABAC encoding of SAO parameters according to one or more embodiments.
That is, as shown in FIGS. 7D and 7F, the video decoding apparatus 20 does not obtain the whole offset value 713 at once from the SAO parameters 706 and 709 but may firstly obtain a first 1 bit 721 of a magnitude of the offset value 713 as shown in FIG. 7E. When the first 1 bit is not 0, since the offset value 713 is not 0, the video decoding apparatus 20 obtains remaining bits 723 of the magnitude of the offset value 713.
The first 1 bit and the remaining bits 723 are separated from each other, and thus the remaining bits may be CABAC encoded in a bypass mode.
Only when the offset value 713 is not 0, the video decoding apparatus 20 may obtain the offset sign information 715 'sao_offset_sign[cIdx][rx][ry][i]' of the offset value 713 from the SAO parameters 706 and 709. The offset sign information 715 'sao_offset_sign[cIdx][rx][ry][i]' may be obtained only when a SAO type is a band type and the offset value 713 is not 0. When the SAO type is an edge type, a sign of the offset value 713 may be determined according to whether an edge class is a local peak, a local valley, a concave edge, or a convex edge.
Referring to FIG. 7F, when the SAO type is the band type, information 717 'sao_band_position[cIdx][rx][ry]' regarding a left band start position as well as the offset sign information 715 may be obtained from the SAO parameter 709.

The video decoding apparatus 20 may perform CABAC decoding on the SAO parameters 706 and 709. To perform the CABAC decoding on the SAO parameters 706 and 709, context modeling with respect to the left SAO merging information 707, the upper SAO merging information 708, the information regarding the offset value 713, and the SAO type information 711 among the SAO parameters 706 and 709 may be performed.

The absolute value magnitude of the offset value 713 in the information regarding the offset value 713 may be restricted according to a bit depth. A maximum value of the absolute value magnitude may be determined according to an equation below.
Offset_abs_max = (1 << (Min(bitDepth, 10) - 5)) - 1

For example, in 8-bit bit depth decoding, the absolute value magnitude of the offset value 713 may be from 0 to 7. For another example, in 10-bit bit depth decoding, the absolute value magnitude of the offset value 713 may be from 0 to 31.
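The bit-depth restriction above can be checked directly (a direct transcription of the equation; the function name is illustrative):

```python
def offset_abs_max(bit_depth):
    """Maximum absolute offset magnitude per the equation
    Offset_abs_max = (1 << (Min(bitDepth, 10) - 5)) - 1."""
    return (1 << (min(bit_depth, 10) - 5)) - 1
```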
To guarantee the magnitude restriction of the offset value 713, the information regarding the offset value 713 may be encoded by using truncated unary code.
The video decoding apparatus 20 may use only the context model with respect to the first 1 bit of the information regarding the offset value 713. The video decoding apparatus 20 may perform CABAC decoding on the remaining bits of the information regarding the offset value 713 in the bypass mode.
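The truncated unary binarization mentioned above can be sketched as follows (a sketch, with an illustrative function name; the context/bypass split is noted in the comments):

```python
def truncated_unary(value, max_value):
    """Truncated unary binarization of an offset magnitude.

    `value` is coded as that many 1-bins followed by a terminating
    0-bin; the terminator is omitted when value == max_value, which
    is what bounds the code length. In the scheme described above,
    only the first bin is context coded and the remaining bins are
    coded in the CABAC bypass mode.
    """
    bins = [1] * value
    if value < max_value:
        bins.append(0)  # terminator, absent at the maximum value
    return bins
```

With a maximum magnitude of 7 (8-bit decoding), the value 3 becomes the bins 1,1,1,0 and the value 7 becomes seven 1-bins with no terminator.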
The SAO type information 711 includes values from 0 to 5. CABAC decoding using 2 context models may be performed on the first 1 bit of the SAO type information 711 indicating whether to perform the SAO operation of the current LCU. CABAC decoding may be performed on the remaining bits of the SAO type information 711 other than the first 1 bit in the bypass mode. The left SAO merging information 707 may be CABAC decoded by using a single context model shared by the luma component and the first and second chroma components. The upper SAO merging information 708 may be CABAC decoded by using the single context model shared by the luma component and the first and second chroma components.
Therefore, a total number of 5 context models may be used to perform CABAC decoding on the SAO parameters 706 and 709. Thus, three context models may be reduced compared to a case where context models are determined with respect to all bins of the offset value 713 and the left SAO merging information is not shared among color components. An amount of data that needs to be stored in a memory may be reduced owing to the reduction in the context models for CABAC decoding. Bins of a plurality of SAO parameters are CABAC encoded in the bypass mode, and thus an amount of CABAC calculation and transmission bits may be reduced.
The information 717 'sao_band_position[cIdx][rx][ry]' regarding the left band start position included in the SAO parameter 709 has a 5-bit invariable bit length and a maximum value of 31. The video decoding apparatus 20 may perform CABAC decoding on the information 717 regarding the left band start position in a bypass mode of the invariable bit length. A process of parsing various pieces of SAO related information from SAO parameters through CABAC decoding will now be described below. A SAO type of a luma component is parsed from SAO parameters. If the SAO type is an off type (OFF), since offset adjustment according to SAO operations is not performed on the luma component, SAO parameters of a chroma component may be parsed. If the SAO type is an edge type (EO), luma offset values of
four categories may be parsed. The offset values of the edge type may be parsed without sign information. A luma edge class (luma EO class) of 2 bits may be parsed from SAO parameters. An edge direction of the luma component of the current LCU may be determined based on the luma edge class.
As described above, since offset values of four categories indicating edge shapes are received, a total of four offset values are received. Since each reconstructed luma pixel of the current LCU may be compared to adjacent pixels according to the edge direction and thus its edge shape and its category may be determined, an offset value of a current category may be selected from among the received offset values. A pixel value of the reconstructed luma pixel may be adjusted by using the selected offset value. If the SAO type of the luma component is a band type (BO), luma offset values of four categories may be parsed. The offset values of the band type may be parsed together with sign information. A luma band class of 5 bits may be parsed. A left band start position may be determined from among a plurality of bands of pixel values of reconstructed pixels, based on the luma band class.
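The per-pixel edge classification described above, in which a reconstructed pixel is compared to its two neighbours along the edge direction to pick a category, can be sketched as follows. This is a sketch following the local valley / concave / convex / local peak classes of the edge type; the function and parameter names are illustrative:

```python
def edge_category(cur, left, right):
    """Classify a reconstructed pixel against its two neighbours
    along the edge direction.

    Category 1: local valley  (lower than both neighbours)
    Category 2: concave edge  (lower than one, equal to the other)
    Category 3: convex edge   (higher than one, equal to the other)
    Category 4: local peak    (higher than both neighbours)
    Category 0: none of the above; no offset is applied.
    """
    sign = lambda x: (x > 0) - (x < 0)
    s = sign(cur - left) + sign(cur - right)
    return {-2: 1, -1: 2, 1: 3, 2: 4}.get(s, 0)
```

The selected category then indexes into the four received edge offset values.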
As described above, since offset values of four categories indicating four continuous bands from a start band position are received, a total of four offset values are received. Since a band to which each reconstructed luma pixel of the current LCU belongs may be determined and thus its category may be determined, an offset value of a current category may be selected from among the received offset values. A pixel value of the reconstructed luma pixel may be adjusted by using the selected offset value.
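The band classification and offset application described above can be sketched as follows, assuming the pixel range is divided into 32 equal bands and the four offsets cover four consecutive bands starting at the parsed left band start position. The function and parameter names are illustrative:

```python
def apply_band_offset(pixel, band_position, offsets, bit_depth=8):
    """Band-type SAO sketch: classify the pixel into one of 32 equal
    bands and, if it falls in one of the four consecutive bands
    starting at band_position, add the matching offset."""
    shift = bit_depth - 5          # 32 bands over the full pixel range
    band = pixel >> shift
    idx = band - band_position
    if 0 <= idx < 4:
        pixel += offsets[idx]
    # clip back to the valid sample range
    return max(0, min((1 << bit_depth) - 1, pixel))
```

Pixels outside the four signalled bands pass through unchanged, which is why only a start position and four offsets need to be transmitted.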
Then, a SAO type of a chroma component is parsed from SAO parameters. The SAO type may be commonly applied to a Cr component and a Cb component. If the SAO type is an off type (OFF), since offset adjustment according to SAO operations is not performed on the chroma component, the process on the current LCU is terminated.
If the SAO type of the chroma component is an edge type (EO), Cb offset values of four categories may be parsed from SAO parameters. The Cb offset values of the edge type may be parsed without sign information. A chroma edge class (chroma EO class) of 2 bits may be parsed from SAO parameters. An edge direction of the chroma component of the current LCU may be determined based on the chroma edge class. The chroma edge class may also be commonly applied to the Cr component and the Cb component. Cr offset values of four categories may be parsed from SAO parameters. Like offset adjustment on the edge type of the luma component, on each of the Cr component and the Cb component, an offset value of a current category may be selected from among received offset values. A pixel value of a reconstructed pixel of the Cr component or the Cb component may be adjusted by using the selected offset value.
If the SAO type of the chroma component is a band type (BO), offset values of the Cb component of four categories may be parsed from SAO parameters together with sign information. A Cb band class of 5 bits may be parsed from SAO parameters. A Cb left band start position of reconstructed pixels of the Cb component of the current LCU may be determined based on the Cb band class. Offset values of the Cr component of four categories may be parsed together with sign information. A Cr band class of 5 bits may be parsed. A Cr left band start position of reconstructed pixels of the Cr component of the current LCU may be determined based on the Cr band class.
Like offset adjustment on the band type of the luma component, on each of the Cr component and the Cb component, an offset value of a current category may be selected from among received offset values. A pixel value of a reconstructed pixel of the Cr component or the Cb component may be adjusted by using the selected offset value.
Accordingly, the video encoding apparatus 10 and the video decoding apparatus 20 using SAO operations may classify pixel values of each LCU according to image characteristics such as an edge type or a band type, may signal an offset value that is an average error value of pixel values having the same characteristics, and may adjust unpredictable pixel values of reconstructed pixels by the offset value, thereby minimizing an error between an original image and a reconstructed image.
In the video encoding apparatus 10 and the video decoding apparatus 20 as described above, video data may be split into LCUs, each LCU may be encoded and decoded based on coding units having a tree structure, and each LCU may determine offset values according to pixel classification. Hereinafter, a video encoding method, a video encoding apparatus, a video decoding method, and a video decoding apparatus based on coding units having a tree structure and transformation units will be described with reference to FIGS. 8 through 20. FIG. 8 is a block diagram of a video encoding apparatus 100 based on coding units having a tree structure, according to one or more embodiments. For convenience of explanation, 'video encoding apparatus 100 based on coding units according to a tree structure' is referred to as 'video encoding apparatus 100' hereinafter.
The video encoding apparatus 100 involving video prediction based on coding units according to a tree structure includes a LCU splitter 110, a coding unit determiner 120, and an outputter 130.
The LCU splitter 110 may split a current picture based on a LCU that is a coding unit having a maximum size for a current picture of an image. If the current picture is larger than the LCU, image data of the current picture may be split into the at least one LCU. The LCU according to one or more embodiments may be a data unit having a size of 32x32, 64x64, 128x128, 256x256, etc., wherein a shape of the data unit is a square having a width and length in squares of 2. The image data may be output to the coding unit determiner 120 according to the at least one LCU. A coding unit according to one or more embodiments may be characterized by a maximum size and a depth. The depth denotes the number of times the coding unit is spatially split from the LCU, and as the depth deepens, deeper coding units according to depths may be split from the LCU to a smallest coding unit (SCU). A depth of the LCU is an uppermost depth and a depth of the SCU is a lowermost depth. Since a size of a coding unit corresponding to each depth decreases as the depth of the LCU deepens, a coding unit corresponding to an upper depth may include a plurality of coding units corresponding to lower depths.
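The relationship between depth and coding unit size described above, in which each depth step halves the LCU's width and height, can be sketched as follows (an illustrative helper, not from the patent):

```python
def coding_unit_size(lcu_size, depth):
    """Side length of a deeper coding unit: each increase in depth
    halves the width and height of the square unit."""
    return lcu_size >> depth

# A 64x64 LCU contains 32x32 units at depth 1, 16x16 at depth 2,
# and 8x8 units at depth 3.
```

Because each split produces four lower-depth units, a coding unit of an upper depth contains a plurality of coding units of lower depths, as stated above.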
As described above, the image data of the current picture is split into the LCUs according to a maximum size of the coding unit, and each of the LCUs may include deeper coding units that are split according to depths. Since the LCU according to one or more embodiments is split according to depths, the image data of the spatial domain included in the LCU may be hierarchically classified according to depths. A maximum depth and a maximum size of a coding unit, which limit the total number of times a height and a width of the LCU are hierarchically split, may be predetermined.
The coding unit determiner 120 encodes at least one split region obtained by splitting a region of the LCU according to depths, and determines a depth to output finally encoded image data according to the at least one split region. In other words, the coding unit determiner 120 determines a coded depth by encoding the image data in the deeper coding units according to depths, according to the LCU of the current picture, and selecting a depth having the least encoding error. The determined coded depth and the encoded image data according to the determined coded depth are output to the outputter 130.
The image data in the LCU is encoded based on the deeper coding units corresponding to at least one depth equal to or below the maximum depth, and results of encoding the image data are compared based on each of the deeper coding units. A depth having the least encoding error may be selected after comparing encoding errors of the deeper coding units. At least one coded depth may be selected for each LCU. The size of the LCU is split as a coding unit is hierarchically split according to depths, and as the number of coding units increases. Also, even if coding units correspond to the same depth in one LCU, it is determined whether to split each of the coding units corresponding to the same depth to a lower depth by measuring an encoding error of the image data of each coding unit, separately. Accordingly, even when image data is included in one LCU, the encoding errors may differ according to regions in the one LCU, and thus the coded depths may differ according to regions in the image data. Thus, one or more coded depths may be determined in one LCU, and the image data of the LCU may be divided according to coding units of at least one coded depth.
Accordingly, the coding unit determiner 120 may determine coding units having a tree structure included in the LCU. The 'coding units having a tree structure' according to one or more embodiments include coding units corresponding to a depth determined to be the coded depth, from among all deeper coding units included in the LCU. A coding unit of a coded depth may be hierarchically determined according to depths in the same region of the LCU, and may be independently determined in different regions. Similarly, a coded depth in a current region may be independently determined from a coded depth in another region. A maximum depth according to one or more embodiments is an index related to the number of splitting times from a LCU to an SCU. A first maximum depth according to one or more embodiments may denote the total number of splitting times from the LCU to the SCU. A second maximum depth according to one or more embodiments may denote the total number of depth levels from the LCU to the SCU. For example, when a depth of the LCU is 0, a depth of a coding unit, in which the LCU is split once, may be set to 1, and a depth of a coding unit, in which the LCU is split twice, may be set to 2. Here, if the SCU is a coding unit in which the LCU is split four times, 5 depth levels of depths 0, 1, 2, 3, and 4 exist, and thus the first maximum depth may be set to 4, and the second maximum depth may be set to 5.
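The worked example above (an LCU split four times down to the SCU gives a first maximum depth of 4 and a second maximum depth of 5) can be sketched with an illustrative helper:

```python
def maximum_depths(lcu_size, scu_size):
    """First maximum depth = number of splits from the LCU to the SCU;
    second maximum depth = number of depth levels, i.e. splits + 1.
    Assumes both sizes are powers of two, as in the examples above."""
    splits = lcu_size.bit_length() - scu_size.bit_length()
    return splits, splits + 1
```

For a 64x64 LCU and a 4x4 SCU this yields (4, 5): four splits and the five depth levels 0 through 4.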
Prediction encoding and transformation may be performed according to the LCU. The prediction encoding and the transformation are also performed based on the deeper coding units according to a depth equal to or depths less than the maximum depth, according to the LCU.
Since the number of deeper coding units increases whenever the LCU is split according to depths, encoding, including the prediction encoding and the transformation, is performed on all of the deeper coding units generated as the depth deepens. For convenience of explanation, the prediction encoding and the transformation will now be described based on a coding unit of a current depth, in a LCU. The video encoding apparatus 100 may variously select a size or shape of a data unit for encoding the image data. In order to encode the image data, operations, such as prediction encoding, transformation, and entropy encoding, are performed, and at this time, the same data unit may be used for all operations or different data units may be used for each operation. For example, the video encoding apparatus 100 may select not only a coding unit for encoding the image data, but also a data unit different from the coding unit so as to perform the prediction encoding on the image data in the coding unit. In order to perform prediction encoding in the LCU, the prediction encoding may be performed based on a coding unit corresponding to a coded depth, i.e., based on a coding unit that is no longer split to coding units corresponding to a lower depth. Hereinafter, the coding unit that is no longer split and becomes a basis unit for prediction encoding will now be referred to as a 'prediction unit'. A partition obtained by splitting the prediction unit may include a prediction unit or a data unit obtained by splitting at least one of a height and a width of the prediction unit. A partition is a data unit where a prediction unit of a coding unit is split, and a prediction unit may be a partition having the same size as a coding unit.
For example, when a coding unit of 2Nx2N (where N is a positive integer) is no longer split and becomes a prediction unit of 2Nx2N, a size of a partition may be 2Nx2N, 2NxN, Nx2N, or NxN. Examples of a partition type include symmetrical partitions that are obtained by symmetrically splitting a height or width of the prediction unit, partitions obtained by asymmetrically splitting the height or width of the prediction unit, such as 1:n or n:1, partitions that are obtained by geometrically splitting the prediction unit, and partitions having arbitrary shapes. A prediction mode of the prediction unit may be at least one of an intra mode, an inter mode, and a skip mode. For example, the intra mode or the inter mode may be performed on the partition of 2Nx2N, 2NxN, Nx2N, or NxN. The skip mode may be performed only on the partition of 2Nx2N. The encoding is independently performed on one prediction unit in a coding unit, thereby selecting a prediction mode having a least encoding error.
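The symmetric partition sizes of a 2Nx2N prediction unit listed above can be enumerated as follows (a small sketch; the function name is illustrative):

```python
def symmetric_partitions(n):
    """Symmetric partition sizes (width, height) of a 2Nx2N
    prediction unit: 2Nx2N, 2NxN, Nx2N and NxN."""
    return [(2 * n, 2 * n), (2 * n, n), (n, 2 * n), (n, n)]
```

For N = 8 this lists 16x16, 16x8, 8x16 and 8x8, i.e. the candidates on which the intra or inter mode may be tried, while the skip mode is limited to the 2Nx2N partition.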
The video encoding apparatus 100 may also perform the transformation on the image data in a coding unit based not only on the coding unit for encoding the image data, but also based on a data unit that is different from the coding unit. In order to perform the transformation in the coding unit, the transformation may be performed based on a data unit having a size smaller than or equal to the coding unit. For example, the data unit for the transformation may include a data unit for an intra mode and a data unit for an inter mode.
The transformation unit in the coding unit may be recursively split into smaller sized regions in the similar manner as the coding unit according to the tree structure.
Thus, residues in the coding unit may be divided according to the transformation unit having the tree structure according to transformation depths. A transformation depth indicating the number of splitting times to reach the transformation unit by splitting the height and width of the coding unit may also be set in the transformation unit. For example, in a current coding unit of 2Nx2N, a transformation depth may be 0 when the size of a transformation unit is 2Nx2N, may be 1 when the size of the transformation unit is NxN, and may be 2 when the size of the transformation unit is N/2xN/2. In other words, the transformation unit having the tree structure may be set according to the transformation depths. Encoding information according to coding units corresponding to a coded depth requires not only information about the coded depth, but also information related to prediction encoding and transformation. Accordingly, the coding unit determiner 120 not only determines a coded depth having a least encoding error, but also determines a partition type in a prediction unit, a prediction mode according to prediction units, and a size of a transformation unit for transformation.
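The mapping from transformation depth to transformation unit size given above (2Nx2N at depth 0, NxN at depth 1, N/2xN/2 at depth 2) can be sketched as:

```python
def transformation_unit_size(cu_size, transformation_depth):
    """Side length of the transformation unit inside a coding unit of
    side cu_size (= 2N): each transformation depth step halves the
    height and width, so depth 0 -> 2Nx2N, 1 -> NxN, 2 -> N/2xN/2."""
    return cu_size >> transformation_depth
```

For a 32x32 coding unit (N = 16), the transformation unit is 32 at depth 0, 16 at depth 1 and 8 at depth 2, matching the example above.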
Coding units according to a tree structure in a LCU and methods of determining a prediction unit/partition, and a transformation unit, according to one or more embodiments, will be described in detail below with reference to FIGS. 7 through 11.
The coding unit determiner 120 may measure an encoding error of deeper coding units according to depths by using Rate-Distortion Optimization based on Lagrangian multipliers.
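Rate-distortion optimization with a Lagrangian multiplier is the standard cost J = D + lambda * R; a minimal sketch, not quoted from the patent, with illustrative names:

```python
def rd_cost(distortion, rate_bits, lagrange_multiplier):
    """Lagrangian rate-distortion cost J = D + lambda * R.

    distortion: encoding error of the candidate coding unit (e.g. SSE)
    rate_bits:  bits needed to signal the candidate
    The candidate (here, the split into deeper coding units) with the
    smallest J is chosen, which is how the coded depth is selected.
    """
    return distortion + lagrange_multiplier * rate_bits
```

Comparing J across depths implements "selecting a depth having the least encoding error" once rate is folded into the error measure.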
The outputter 130 outputs the image data of the LCU, which is encoded based on the at least one coded depth determined by the coding unit determiner 120, and information about the encoding mode according to the coded depth, in bitstreams. The encoded image data may be obtained by encoding residues of an image. The information about the encoding mode according to coded depth may include information about the coded depth, about the partition type in the prediction unit, the prediction mode, and the size of the transformation unit.
The information about the coded depth may be defined by using split information according to depths, which indicates whether encoding is performed on coding units of a lower depth instead of a current depth. If the current depth of the current coding unit is the coded depth, image data in the current coding unit is encoded and output, and thus the split information may be defined not to split the current coding unit to a lower depth. Alternatively, if the current depth of the current coding unit is not the coded depth, the encoding is performed on the coding unit of the lower depth, and thus the split information may be defined to split the current coding unit to obtain the coding units of the lower depth. If the current depth is not the coded depth, encoding is performed on the coding unit that is split into the coding unit of the lower depth. Since at least one coding unit of the lower depth exists in one coding unit of the current depth, the encoding is repeatedly performed on each coding unit of the lower depth, and thus the encoding may be recursively performed for the coding units having the same depth.
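The recursion driven by split information can be sketched as a quadtree walk. This is a sketch under the assumption that split flags are available per position and depth; `split_info`, and all other names, are illustrative stand-ins for the signalled flags:

```python
def decode_coding_unit(split_info, size, min_size, out, pos=(0, 0), depth=0):
    """Walk split information according to depths: a flag of 1 splits
    the current square unit into four lower-depth units, a flag of 0
    (or reaching the minimum size) stops, yielding a coding unit of
    the coded depth. split_info(pos, depth) stands in for the parsed
    split flag of the unit at that position and depth."""
    x, y = pos
    if size > min_size and split_info(pos, depth):
        half = size // 2
        for dy in (0, half):
            for dx in (0, half):
                decode_coding_unit(split_info, half, min_size, out,
                                   (x + dx, y + dy), depth + 1)
    else:
        out.append((x, y, size))  # leaf: coding unit of the coded depth
```

Splitting only the root of a 64x64 LCU, for instance, yields four 32x32 coding units at depth 1.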
Since the coding units having a tree structure are determined for one LCU, and information about at least one encoding mode is determined for a coding unit of a coded depth, information about at least one encoding mode may be determined for one LCU. Also, a coded depth of the image data of the LCU may be different according to locations since the image data is hierarchically split according to depths, and thus information about the coded depth and the encoding mode may be set for the image data. Accordingly, the outputter 130 may assign encoding information about a corresponding coded depth and an encoding mode to at least one of the coding unit, the prediction unit, and a minimum unit included in the LCU.
The minimum unit according to one or more embodiments is a square data unit obtained by splitting the SCU constituting the lowermost depth by 4. Alternatively, the minimum unit according to an embodiment may be a maximum square data unit that may be included in all of the coding units, prediction units, partition units, and transformation units included in the LCU.
For example, the encoding information output by the outputter 130 may be classified into encoding information according to deeper coding units, and encoding information according to prediction units. The encoding information according to the deeper coding units may include the information about the prediction mode and about the size of the partitions. The encoding information according to the prediction units may include information about an estimated direction of an inter mode, about a reference image index of the inter mode, about a motion vector, about a chroma component of an intra mode, and about an interpolation method of the intra mode.
Information about a maximum size of the coding unit defined according to pictures, slices, or GOPs, and information about a maximum depth may be inserted into a header of a bitstream, a sequence parameter set, or a picture parameter set. Information about a maximum size of the transformation unit permitted with respect to a current video, and information about a minimum size of the transformation unit may also be output through a header of a bitstream, a sequence parameter set, or a picture parameter set. The outputter 130 may encode and output SAO parameters related to the SAO operation described above with reference to FIGS. 1A through 7F.
In the video encoding apparatus 100, the deeper coding unit may be a coding unit obtained by dividing a height or width of a coding unit of an upper depth, which is one layer above, by two. In other words, when the size of the coding unit of the current depth is 2Nx2N, the size of the coding unit of the lower depth is NxN. Also, the coding unit with the current depth having a size of 2Nx2N may include a maximum of 4 of the coding units with the lower depth.
Accordingly, the video encoding apparatus 100 may form the coding units having the tree structure by determining coding units having an optimum shape and an optimum size for each LCU, based on the size of the LCU and the maximum depth determined considering characteristics of the current picture. Also, since encoding may be performed on each LCU by using any one of various prediction modes and transformations, an optimum encoding mode may be determined considering characteristics of the coding unit of various image sizes.
Thus, if an image having a high resolution or a large data amount is encoded in a conventional macroblock, the number of macroblocks per picture excessively increases. Accordingly, the number of pieces of compressed information generated for each macroblock increases, and thus it is difficult to transmit the compressed information and data compression efficiency decreases. However, by using the video encoding apparatus 100, image compression efficiency may be increased since a coding unit is adjusted while considering characteristics of an image and a maximum size of a coding unit is increased while considering a size of the image.
The video encoding apparatus 100 of FIG. 8 may perform operation of the video encoding apparatus 10 described above with reference to FIG. 1A. The coding unit determiner 120 may perform operation of the SAO parameter determiner 14 of the video encoding apparatus 10. A SAO type, offset values according to categories, and a SAO class may be determined with respect to each LCU.
The outputter 130 may perform operation of the SAO parameter encoder 16. SAO parameters determined with respect to each LCU may be output. SAO merging information indicating whether to adopt SAO parameters of an adjacent LCU of a current LCU as the SAO parameters of the current LCU may be initially output. As a SAO type, an off type, an edge type, or a band type may be output. An offset value may be output in an order of zero value information, sign information, and a remainder. With respect to the edge type, the sign information of the offset value may not be output.
If the SAO merging information of the current LCU allows adoption of the SAO parameters of the adjacent LCU, the SAO type and the offset values of the current LCU may not be output. It may be determined whether to perform a SAO operation according to color components. It may be determined whether to perform the SAO operation for a luma component and first and second chroma components with respect to each slice. The outputter 130 may output a slice header including luma SAO use information and chroma SAO use information.
The outputter 130 may include luma SAO type information indicating whether to perform the SAO operation for the luma component and a SAO type, and chroma SAO type information indicating whether to perform the SAO operation for chroma components and a SAO type, in the SAO parameters determined with respect to each LCU. FIG. 9 is a block diagram of a video decoding apparatus 200 based on coding units having a tree structure, according to one or more embodiments. For convenience of explanation, 'video decoding apparatus 200 based on coding units according to a tree structure' is referred to as 'video decoding apparatus 200' hereinafter.
The video decoding apparatus 200 that involves video prediction based on coding units having a tree structure includes a receiver 210, an image data and encoding information extractor 220, and an image data decoder 230.
Definitions of various terms, such as a coding unit, a depth, a prediction unit, a transformation unit, and information about various encoding modes, for decoding operations of the video decoding apparatus 200 are identical to those described with reference to FIGS. 7A through 7F and the video encoding apparatus 100.
The receiver 210 receives and parses a bitstream of an encoded video. The image data and encoding information extractor 220 extracts encoded image data for each coding unit from the parsed bitstream, wherein the coding units have a tree structure according to each LCU, and outputs the extracted image data to the image data decoder 230. The image data and encoding information extractor 220 may extract information about a maximum size of a coding unit of a current picture, from a header about the current picture, a sequence parameter set, or a picture parameter set.
Also, the image data and encoding information extractor 220 extracts information about a coded depth and an encoding mode for the coding units having a tree structure according to each LCU from the parsed bitstream. The extracted information about the coded depth and the encoding mode is output to the image data decoder 230. In other words, the image data in a bitstream is split into the LCU so that the image data decoder 230 decodes the image data for each LCU.
Ihe information about the coded depth and the encoding mode according to the LCU may be set for toiureatian pbpyt at least one coding uni corresponding to the coded cfeptli, and tofoempon ibotit ah'ι^'έχ^πρ:jnhiay/lrB^udef· - Siboui a pardon type di i eopisponing tofo| unit corresponding to the coded depth, about a '. . a size of a tmnsfor motion unit Also, spilling Information ^spoi^infp -tp·:^^ €&amp;^^»ated'is^=1|^''ipeorf¥is^ion about the coded depth.
The information about the coded depth and the encoding mode according to each LCU extracted by the image data and encoding information extractor 220 is information about a coded depth and an encoding mode determined to generate a minimum encoding error when an encoder, such as the video encoding apparatus 100, repeatedly performs encoding for each deeper coding unit according to depths according to each LCU. Accordingly, the video decoding apparatus 200 may reconstruct an image by decoding the image data according to a coded depth and an encoding mode that generates the minimum encoding error.
Since encoding information about the coded depth and the encoding mode may be assigned to a predetermined data unit from among a corresponding coding unit, a prediction unit, and a minimum unit, the image data and encoding information extractor 220 may extract the information about the coded depth and the encoding mode according to the predetermined data units. If information about a coded depth and encoding mode of a corresponding LCU is recorded according to predetermined data units, the predetermined data units to which the same information about the coded depth and the encoding mode is assigned may be inferred to be the data units included in the same LCU. The image data decoder 230 reconstructs the current picture by decoding the image data in each LCU based on the information about the coded depth and the encoding mode according to the LCUs. In other words, the image data decoder 230 may decode the encoded image data based on the extracted information about the partition type, the prediction mode, and the transformation unit for each coding unit from among the coding units having the tree structure included in each LCU. A decoding process may include a prediction including intra prediction and motion compensation, and an inverse transformation.
The image data decoder 230 may perform intra prediction or motion compensation according to a partition and a prediction mode of each coding unit, based on the information about the partition type and the prediction mode of the prediction unit of the coding unit according to coded depths. In addition, the image data decoder 230 may read information about a transformation unit according to a tree structure for each coding unit so as to perform inverse transformation based on transformation units for each coding unit, for inverse transformation for each LCU. Via the inverse transformation, a pixel value of the space domain of the coding unit may be reconstructed. The image data decoder 230 may determine a coded depth of a current LCU by using split information according to depths. If the split information indicates that image data is no longer split in the current depth, the current depth is a coded depth. Accordingly, the image data decoder 230 may decode encoded data in the current LCU by using the information about the partition type of the prediction unit, the prediction mode, and the size of the transformation unit for each coding unit corresponding to the coded depth. In other words, data units containing the encoding information including the same split information may be gathered by observing the encoding information assigned for the predetermined data unit from among the coding unit, the prediction unit, and the minimum unit, and the gathered data units may be considered to be one data unit to be decoded by the image data decoder 230 in the same encoding mode. As such, the current coding unit may be decoded by obtaining the information about the encoding mode for each coding unit.
Also, the video decoding apparatus 200 of FIG. 9 may perform operation of the video decoding apparatus 20 described above with reference to FIG. 2A.
The image data and encoding information extractor 220 and the receiver 210 may perform operation of the SAO parameter extractor 22 of the video decoding apparatus 20. The image data decoder 230 may perform operations of the SAO determiner 24 and the SAO adjuster 26 of the video decoding apparatus 20.
It may be determined whether to perform a SAO operation according to color components. The image data and encoding information extractor 220 may obtain luma SAO use information and chroma SAO use information from a slice header. It may be determined whether to perform the SAO operation for a luma component from the luma SAO use information and for first and second chroma components from the chroma SAO use information.
The image data and encoding information extractor 220 may obtain luma SAO type information indicating whether to perform the SAO operation for the luma component and a SAO type from SAO parameters determined with respect to each LCU. The image data and encoding information extractor 220 may obtain chroma SAO type information indicating whether to perform the SAO operation for the first and second chroma components and a SAO type from the SAO parameters determined with respect to each LCU.
If only SAO merging information is parsed from a bitstream without SAO parameters of a current LCU, the image data and encoding information extractor 220 may reconstruct the SAO parameters of the current LCU to be the same as those of at least one of adjacent LCUs. Based on the SAO merging information, an adjacent LCU having SAO parameters to be referred to may be determined. If it is determined that the SAO parameters of the current LCU are different from those of the adjacent LCUs based on the SAO merging information of the current LCU, which is parsed from the bitstream, the image data and encoding information extractor 220 may parse and reconstruct the SAO parameters of the current LCU from the bitstream.
The image data and encoding information extractor 220 may parse SAO parameters of each LCU from the bitstream. Based on the SAO parameters, a SAO type, offset values according to categories, and a SAO class may be determined. If the SAO type of the current LCU is an off type, offset adjustment on the current LCU may be terminated. If the SAO type is an edge type, a current offset value may be selected from among received offset values based on a category indicating an edge class, which indicates an edge direction of each of reconstructed pixels, and an edge shape. If the SAO type is a band type, a band to which each of the reconstructed pixels belongs is determined, and an offset value corresponding to a current band may be selected from among the offset values.
The image data decoder 230 may generate a reconstructed pixel capable of minimizing an error between an original pixel and the reconstructed pixel, by adjusting a pixel value of the reconstructed pixel by a corresponding offset value. Offsets of reconstructed pixels of each LCU may be adjusted based on the parsed SAO parameters.
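The per-pixel adjustment for the off, edge, and band types can be illustrated as below. The 32-band split and the four signalled band offsets mirror HEVC's SAO design, which this description parallels; all function and parameter names are hypothetical.

```python
def sao_adjust_pixel(pixel, sao_type, offsets, band_pos=0, edge_category=0):
    """Add the offset selected by the SAO type to one reconstructed pixel."""
    if sao_type == "off":
        return pixel                        # offset adjustment terminated
    if sao_type == "edge":
        if edge_category == 0:              # category 0: no edge shape, no offset
            return pixel
        return pixel + offsets[edge_category - 1]
    if sao_type == "band":
        band = pixel >> 3                   # 8-bit samples fall into 32 bands of width 8
        if band_pos <= band < band_pos + 4: # offsets are signalled for 4 consecutive bands
            return pixel + offsets[band - band_pos]
        return pixel                        # pixel outside the signalled bands
    raise ValueError("unknown SAO type: %s" % sao_type)

assert sao_adjust_pixel(100, "off", []) == 100
assert sao_adjust_pixel(100, "edge", [3, 1, -1, -3], edge_category=1) == 103
assert sao_adjust_pixel(100, "band", [5, 0, 0, 0], band_pos=12) == 105  # 100 >> 3 == 12
```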
Thus, the video decoding apparatus 200 may obtain information about at least one coding unit that generates the minimum encoding error when encoding is recursively performed for each LCU, and may use the information to decode the current picture. In other words, the coding units having the tree structure determined to be the optimum coding units in each LCU may be decoded. Accordingly, even if image data has high resolution and a large amount of data, the image data may be efficiently decoded and reconstructed by using a size of a coding unit and an encoding mode, which are adaptively determined according to characteristics of the image data, by using information about an optimum encoding mode received from an encoder.

FIG. 10 is a diagram for describing a concept of coding units, according to one or more embodiments.

A size of a coding unit may be expressed by width x height, and may be 64x64, 32x32, 16x16, and 8x8. A coding unit of 64x64 may be split into partitions of 64x64, 64x32, 32x64, or 32x32, a coding unit of 32x32 may be split into partitions of 32x32, 32x16, 16x32, or 16x16, a coding unit of 16x16 may be split into partitions of 16x16, 16x8, 8x16, or 8x8, and a coding unit of 8x8 may be split into partitions of 8x8, 8x4, 4x8, or 4x4.

In video data 310, a resolution is 1920x1080, a maximum size of a coding unit is 64, and a maximum depth is 2. In video data 320, a resolution is 1920x1080, a maximum size of a coding unit is 64, and a maximum depth is 3. In video data 330, a resolution is 352x288, a maximum size of a coding unit is 16, and a maximum depth is 1. The maximum depth shown in FIG. 10 denotes a total number of splits from a LCU to a minimum decoding unit. If a resolution is high or a data amount is large, a maximum size of a coding unit may be large so as to not only increase encoding efficiency but also to accurately reflect characteristics of an image.
Accordingly, the maximum size of the coding unit of the video data 310 and 320 having a higher resolution than the video data 330 may be 64. Since the maximum depth of the video data 310 is 2, coding units 315 of the video data 310 may include a LCU having a long axis size of 64, and coding units having long axis sizes of 32 and 16, since depths are deepened to two layers by splitting the LCU twice. Since the maximum depth of the video data 330 is 1, coding units 335 of the video data 330 may include a LCU having a long axis size of 16, and coding units having a long axis size of 8, since depths are deepened to one layer by splitting the LCU once.
Since the maximum depth of the video data 320 is 3, coding units 325 of the video data 320 may include a LCU having a long axis size of 64, and coding units having long axis sizes of 32, 16, and 8, since the depths are deepened to 3 layers by splitting the LCU three times. As a depth deepens, detailed information may be expressed more precisely.

FIG. 11 is a block diagram of an image encoder 400 based on coding units, according to one or more embodiments.
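The relationship between the maximum depth and the long-axis sizes listed above can be checked with a one-line sketch (the function name is illustrative):

```python
def long_axis_sizes(lcu_size, max_depth):
    """Long-axis sizes obtained by halving an LCU once per depth layer."""
    return [lcu_size >> depth for depth in range(max_depth + 1)]

assert long_axis_sizes(64, 2) == [64, 32, 16]     # video data 310
assert long_axis_sizes(64, 3) == [64, 32, 16, 8]  # video data 320
assert long_axis_sizes(16, 1) == [16, 8]          # video data 330
```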
The image encoder 400 performs operations of the coding unit determiner 120 of the video encoding apparatus 100 to encode image data. In other words, an intra predictor 410 performs intra prediction on coding units in an intra mode, from among a current frame 405, and a motion estimator 420 and a motion compensator 425 respectively perform inter estimation and motion compensation on coding units in an inter mode, by using the current frame 405 and a reference frame 495.
Data output from the intra predictor 410, the motion estimator 420, and the motion compensator 425 is output as a quantized transformation coefficient through a transformer 430 and a quantizer 440. The quantized transformation coefficient is reconstructed as data in the space domain through a dequantizer 460 and an inverse transformer 470, and the reconstructed data in the space domain is output as the reference frame 495 after being post-processed through a deblocking filter 480 and an offset adjuster 490. The quantized transformation coefficient may be output as a bitstream 455 through an entropy encoder 450.

In order for the image encoder 400 to be applied in the video encoding apparatus 100, all elements of the image encoder 400, i.e., the intra predictor 410, the motion estimator 420, the motion compensator 425, the transformer 430, the quantizer 440, the entropy encoder 450, the dequantizer 460, the inverse transformer 470, the deblocking filter 480, and the offset adjuster 490, perform operations based on each coding unit from among coding units having a tree structure while considering the maximum depth of each LCU.
Specifically, the intra predictor 410, the motion estimator 420, and the motion compensator 425 determine partitions and a prediction mode of each coding unit from among the coding units having a tree structure while considering the maximum size and the maximum depth of a current LCU, and the transformer 430 determines the size of the transformation unit in each coding unit from among the coding units having a tree structure.
Specifically, when the motion estimator 420 performs inter prediction by using a long-term reference frame, POC information of the long-term reference frame may be output as a long-term reference index. The entropy encoder 450 may encode and output the LSB information of the POC information of the long-term reference frame, as the long-term reference index. The LSB information of the POC information of the long-term reference frames for the prediction units of the current slice may be included in the slice header and then transmitted.

The offset adjuster 490 may classify pixels according to an edge type (or a band type) of each LCU of the reference frame 495, may determine an edge direction (or a start band position), and may determine an average error value of reconstructed pixels included in each category. With respect to each LCU, SAO merging information, a SAO type, and offset values may be encoded and signaled.
The entropy encoder 450 may perform CABAC encoding on SAO parameters including SAO merging information for a SAO operation, SAO type information, and offset values. For example, the entropy encoder 450 may perform CABAC encoding on a first bit of the SAO type information by using one context model, and on other bits thereof in a bypass mode. Two context models may be used for the offset values. One context model may be used for each of left SAO merging information and upper SAO merging information. Thus, a total of five context models may be used to perform CABAC encoding on the SAO parameters.

FIG. 12 is a block diagram of an image decoder 500 based on coding units, according to one or more embodiments.

A parser 510 parses encoded image data to be decoded and information about encoding required for decoding from a bitstream 505. The encoded image data is output as inverse quantized data through an entropy decoder 520 and a dequantizer 530, and the inverse quantized data is reconstructed to image data in the space domain through an inverse transformer 540.
An intra predictor 550 performs intra prediction on coding units in an intra mode with respect to the image data in the space domain, and a motion compensator 560 performs motion compensation on coding units in an inter mode by using a reference frame 585.
The image data in the space domain, which passed through the intra predictor 550 and the motion compensator 560, may be output as a reconstructed frame 595 after being post-processed through a deblocking filter 570 and an offset adjuster 580. Also, the image data that is post-processed through the deblocking filter 570 and the offset adjuster 580 may be output as the reference frame 585.
In order to decode the image data in the image data decoder 230 of the video decoding apparatus 200, the image decoder 500 may perform operations that are performed after the parser 510. In order for the image decoder 500 to be applied in the video decoding apparatus 200, all elements of the image decoder 500, i.e., the parser 510, the entropy decoder 520, the dequantizer 530, the inverse transformer 540, the intra predictor 550, the motion compensator 560, the deblocking filter 570, and the offset adjuster 580, perform operations based on coding units having a tree structure for each LCU.

Specifically, the intra predictor 550 and the motion compensator 560 perform operations based on partitions and a prediction mode for each of the coding units having a tree structure, and the inverse transformer 540 performs operations based on a size of a transformation unit for each coding unit.
The entropy decoder 520 may perform CABAC decoding on SAO parameters, and parse SAO merging information for a SAO operation, SAO type information, and offset values from the SAO parameters. For example, the entropy decoder 520 may perform CABAC decoding on a first bit of the SAO type information by using one context model, and on other bits thereof in a bypass mode. Two context models may be used for the offset values. One context model may be used for each of left SAO merging information and upper SAO merging information. Thus, a total of five context models may be used to perform CABAC decoding on the SAO parameters.
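The five-context-model budget described above can be tallied in a short sketch. The dictionary is purely illustrative bookkeeping, not a CABAC implementation.

```python
# One context model for the first (context-coded) bit of the SAO type
# information, two for the offset values, and one each for the left and
# upper SAO merging information; the remaining SAO type bits are
# bypass-coded and therefore need no context model.
sao_context_models = {
    "sao_type_first_bit": 1,
    "offset_values": 2,
    "left_sao_merging": 1,
    "upper_sao_merging": 1,
}
assert sum(sao_context_models.values()) == 5
```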
The offset adjuster 580 may extract SAO parameters of LCUs from a bitstream. Based on SAO merging information from among the SAO parameters of a current LCU, the SAO parameters of the current LCU, which are the same as those of an adjacent LCU, may be reconstructed. By using a SAO type and offset values from among the SAO parameters of each LCU, each of reconstructed pixels of LCUs of the reconstructed frame 595 may be adjusted by an offset value corresponding to a category according to the edge type or the band type.

FIG. 13 is a diagram illustrating deeper coding units according to depths, and partitions, according to one or more embodiments.
The video encoding apparatus 100 and the video decoding apparatus 200 use hierarchical coding units so as to consider characteristics of an image. A maximum height, a maximum width, and a maximum depth of coding units may be adaptively determined according to the characteristics of the image, or may be differently set by a user. Sizes of deeper coding units according to depths may be determined according to the predetermined maximum size of the coding unit.

In a hierarchical structure 600 of coding units, according to one or more embodiments, the maximum height and the maximum width of the coding units are each 64, and the maximum depth is 3. In this case, the maximum depth refers to a total number of times the coding unit is split from the LCU to the SCU. Since a depth deepens along a vertical axis of the hierarchical structure 600, a height and a width of the deeper coding unit are each split. Also, a prediction unit and partitions, which are bases for prediction encoding of each deeper coding unit, are shown along a horizontal axis of the hierarchical structure 600.
In other words, a coding unit 610 is a LCU in the hierarchical structure 600, wherein a depth is 0 and a size, i.e., a height by width, is 64x64. The depth deepens along the vertical axis, and there exist a coding unit 620 having a size of 32x32 and a depth of 1, a coding unit 630 having a size of 16x16 and a depth of 2, and a coding unit 640 having a size of 8x8 and a depth of 3. The coding unit 640 having the size of 8x8 and the depth of 3 is an SCU.
The prediction unit and the partitions of a coding unit are arranged along the horizontal axis according to each depth. In other words, if the coding unit 610 having a size of 64x64 and a depth of 0 is a prediction unit, the prediction unit may be split into partitions included in the coding unit 610, i.e., a partition 610 having a size of 64x64, partitions 612 having a size of 64x32, partitions 614 having a size of 32x64, or partitions 616 having a size of 32x32.
Similarly, a prediction unit of the coding unit 620 having the size of 32x32 and the depth of 1 may be split into partitions included in the coding unit 620, i.e., a partition 620 having a size of 32x32, partitions 622 having a size of 32x16, partitions 624 having a size of 16x32, and partitions 626 having a size of 16x16.

Similarly, a prediction unit of the coding unit 630 having the size of 16x16 and the depth of 2 may be split into partitions included in the coding unit 630, i.e., a partition having a size of 16x16 included in the coding unit 630, partitions 632 having a size of 16x8, partitions 634 having a size of 8x16, and partitions 636 having a size of 8x8.

Similarly, a prediction unit of the coding unit 640 having the size of 8x8 and the depth of 3 may be split into partitions included in the coding unit 640, i.e., a partition having a size of 8x8 included in the coding unit 640, partitions 642 having a size of 8x4, partitions 644 having a size of 4x8, and partitions 646 having a size of 4x4.
In order to determine the at least one coded depth of the coding units constituting the LCU 610, the coding unit determiner 120 of the video encoding apparatus 100 performs encoding for coding units corresponding to each depth included in the LCU 610.

A number of deeper coding units according to depths including data in the same range and the same size increases as the depth deepens. For example, four coding units corresponding to a depth of 2 are required to cover data that is included in one coding unit corresponding to a depth of 1. Accordingly, in order to compare encoding results of the same data according to depths, the coding unit corresponding to the depth of 1 and four coding units corresponding to the depth of 2 are each encoded.
In order to perform encoding for a current depth from among the depths, a least encoding error may be selected for the current depth by performing encoding for each prediction unit in the coding units corresponding to the current depth, along the horizontal axis of the hierarchical structure 600. Alternatively, the minimum encoding error may be searched for by comparing the least encoding errors according to depths, by performing encoding for each depth as the depth deepens along the vertical axis of the hierarchical structure 600. A depth and a partition having the minimum encoding error in the coding unit 610 may be selected as the coded depth and a partition type of the coding unit 610.
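The depth search along the vertical axis can be sketched as a recursion. `encode_cost` is a hypothetical callback standing in for the rate-distortion measurement, and the sketch simplifies by assuming the four sub-units of a split share one cost; the real determiner evaluates each sub-unit independently.

```python
def best_depth(depth, max_depth, encode_cost):
    """Return (error, depth) of the cheapest way to encode a block.

    Compares encoding the block as one unit at the current depth against
    splitting it into four sub-units one depth deeper.
    """
    cost_here = encode_cost(depth)
    if depth == max_depth:
        return cost_here, depth
    sub_cost, sub_depth = best_depth(depth + 1, max_depth, encode_cost)
    split_cost = 4 * sub_cost            # four sub-blocks of the lower depth
    if cost_here <= split_cost:
        return cost_here, depth          # keep the coarser coding unit
    return split_cost, sub_depth         # splitting yields a smaller error

# hypothetical per-depth costs: depth-2 units are cheap enough to justify splitting
assert best_depth(0, 2, lambda d: [100, 20, 3][d]) == (48, 2)
```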
FIG. 14 is a diagram for describing a relationship between a coding unit 710 and transformation units 720, according to one or more embodiments.
The video encoding apparatus 100 or the video decoding apparatus 200 encodes or decodes an image according to coding units having sizes smaller than or equal to a LCU for each LCU. Sizes of transformation units for transformation during encoding may be selected based on data units that are not larger than a corresponding coding unit.
For example, in the video encoding apparatus 100 or the video decoding apparatus 200, if a size of the coding unit 710 is 64x64, transformation may be performed by using the transformation units 720 having a size of 32x32.
Also, data of the coding unit 710 having the size of 64x64 may be encoded by performing the transformation on each of the transformation units having the sizes of 32x32, 16x16, 8x8, and 4x4, which are smaller than 64x64, and then a transformation unit having the least coding error may be selected.
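Selecting the transformation unit with the least error can be sketched as below. `transform_error` is a hypothetical callback for the coding-error measurement, and the candidate sizes simply halve down from half the coding unit size, matching the 64x64 example above.

```python
def pick_tu_size(cu_size, transform_error, min_size=4):
    """Try each TU size smaller than the CU and keep the one with least error."""
    best_size, best_err, size = None, None, cu_size // 2
    while size >= min_size:
        err = transform_error(size)
        if best_err is None or err < best_err:
            best_size, best_err = size, err
        size //= 2
    return best_size

# hypothetical error model in which 16x16 transforms suit the content best
assert pick_tu_size(64, lambda s: abs(s - 16)) == 16
```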
FIG. 15 is a diagram for describing encoding information of coding units corresponding to a coded depth, according to one or more embodiments.

The outputter 130 of the video encoding apparatus 100 may encode and transmit information 800 about a partition type, information 810 about a prediction mode, and information 820 about a size of a transformation unit for each coding unit corresponding to a coded depth, as information about an encoding mode.
The information 800 indicates information about a shape of a partition obtained by splitting a prediction unit of a current coding unit, wherein the partition is a data unit for prediction encoding the current coding unit. For example, a current coding unit CU_0 having a size of 2Nx2N may be split into any one of a partition 802 having a size of 2Nx2N, a partition 804 having a size of 2NxN, a partition 806 having a size of Nx2N, and a partition 808 having a size of NxN. Here, the information 800 about a partition type is set to indicate one of the partition 804 having a size of 2NxN, the partition 806 having a size of Nx2N, and the partition 808 having a size of NxN.
The information 810 indicates a prediction mode of each partition. For example, the information 810 may indicate a mode of prediction encoding performed on a partition indicated by the information 800, i.e., an intra mode 812, an inter mode 814, or a skip mode 816.
The information 820 indicates a transformation unit to be based on when transformation is performed on a current coding unit. For example, the transformation unit may be a first intra transformation unit 822, a second intra transformation unit 824, a first inter transformation unit 826, or a second inter transformation unit 828.
The image data and encoding information extractor 220 of the video decoding apparatus 200 may extract and use the information 800, 810, and 820 for decoding, according to each deeper coding unit.
FIG. 16 is a diagram of deeper coding units according to depths, according to one or more embodiments.
Split information may be used to indicate a change of a depth. The split information indicates whether a coding unit of a current depth is split into coding units of a lower depth.

A prediction unit 910 for prediction encoding a coding unit 900 having a depth of 0 and a size of 2N_0x2N_0 may include partitions of a partition type 912 having a size of 2N_0x2N_0, a partition type 914 having a size of 2N_0xN_0, a partition type 916 having a size of N_0x2N_0, and a partition type 918 having a size of N_0xN_0. FIG. 16 only illustrates the partition types 912 through 918 which are obtained by symmetrically splitting the prediction unit 910, but a partition type is not limited thereto, and the partitions of the prediction unit 910 may include asymmetrical partitions, partitions having a predetermined shape, and partitions having a geometrical shape.
Prediction encoding is repeatedly performed on one partition having a size of 2N_0x2N_0, two partitions having a size of 2N_0xN_0, two partitions having a size of N_0x2N_0, and four partitions having a size of N_0xN_0, according to each partition type. The prediction encoding in an intra mode and an inter mode may be performed on the partitions having the sizes of 2N_0x2N_0, N_0x2N_0, 2N_0xN_0, and N_0xN_0. The prediction encoding in a skip mode is performed only on the partition having the size of 2N_0x2N_0.

If an encoding error is smallest in one of the partition types 912 through 916, the prediction unit 910 may not be split into a lower depth.

If the encoding error is the smallest in the partition type 918, a depth is changed from 0 to 1 to split the partition type 918 in operation 920, and encoding is repeatedly performed on coding units 930 having a depth of 1 and a size of N_0xN_0 to search for a minimum encoding error.

A prediction unit 940 for prediction encoding the coding unit 930 having a depth of 1 and a size of 2N_1x2N_1 (=N_0xN_0) may include partitions of a partition type 942 having a size of 2N_1x2N_1, a partition type 944 having a size of 2N_1xN_1, a partition type 946 having a size of N_1x2N_1, and a partition type 948 having a size of N_1xN_1.
If an encoding error is the smallest in the partition type 948, a depth is changed from 1 to 2 to split the partition type 948 in operation 950, and encoding is repeatedly performed on coding units 960, which have a depth of 2 and a size of N_2xN_2, to search for a minimum encoding error.

When a maximum depth is d, split operation according to each depth may be performed up to when a depth becomes d-1, and split information may be encoded up to when a depth is one of 0 to d-2. In other words, when encoding is performed up to when the depth is d-1 after a coding unit corresponding to a depth of d-2 is split in operation 970, a prediction unit 990 for prediction encoding a coding unit 980 having a depth of d-1 and a size of 2N_(d-1)x2N_(d-1) may include partitions of a partition type 992 having a size of 2N_(d-1)x2N_(d-1), a partition type 994 having a size of 2N_(d-1)xN_(d-1), a partition type 996 having a size of N_(d-1)x2N_(d-1), and a partition type 998 having a size of N_(d-1)xN_(d-1).
Prediction encoding may be repeatedly performed on one partition having a size of 2N_(d-1)x2N_(d-1), two partitions having a size of 2N_(d-1)xN_(d-1), two partitions having a size of N_(d-1)x2N_(d-1), and four partitions having a size of N_(d-1)xN_(d-1) from among the partition types 992 through 998 to search for a partition type having a minimum encoding error.

Even when the partition type 998 has the minimum encoding error, since a maximum depth is d, a coding unit CU_(d-1) having a depth of d-1 is no longer split to a lower depth, and a coded depth for the coding units constituting a current LCU 900 is determined to be d-1 and a partition type of the current LCU 900 may be determined to be N_(d-1)xN_(d-1). Also, since the maximum depth is d and an SCU 980 having a lowermost depth of d-1 is no longer split to a lower depth, split information for the SCU 980 is not set.

A data unit 999 may be a 'minimum unit' for the current LCU. A minimum unit according to one or more embodiments may be a square data unit obtained by splitting an SCU 980 by 4. By performing the encoding repeatedly, the video encoding apparatus 100 may select a depth having the least encoding error by comparing encoding errors according to depths of the coding unit 900 to determine a coded depth, and set a corresponding partition type and a prediction mode as an encoding mode of the coded depth.
As such, the minimum encoding errors according to depths are compared in all of the depths of 1 through d, and a depth having the least encoding error may be determined as a coded depth. The coded depth, the partition type of the prediction unit, and the prediction mode may be encoded and transmitted as information about an encoding mode. Also, since a coding unit is split from a depth of 0 to a coded depth, only split information of the coded depth is set to 0, and split information of depths excluding the coded depth is set to 1.
The image data and encoding information extractor 220 of the video decoding apparatus 200 may extract and use the information about the coded depth and the prediction unit of the coding unit 900 to decode the partition 912. The video decoding apparatus 200 may determine a depth, in which split information is 0, as a coded depth by using split information according to depths, and use information about an encoding mode of the corresponding depth for decoding.
FIGS. 17 through 19 are diagrams for describing a relationship between coding units 1010, prediction units 1060, and transformation units 1070, according to one or more embodiments.
The coding units 1010 are coding units having a tree structure, corresponding to coded depths determined by the video encoding apparatus 100, in a LCU. The prediction units 1060 are partitions of prediction units of each of the coding units 1010, and the transformation units 1070 are transformation units of each of the coding units 1010.
When a depth of a LCU is 0 in the coding units 1010, depths of coding units 1012 and 1054 are 1, depths of coding units 1014, 1016, 1018, 1028, 1050, and 1052 are 2, depths of coding units 1020, 1022, 1024, 1026, 1030, 1032, and 1048 are 3, and depths of coding units 1040, 1042, 1044, and 1046 are 4.

In the prediction units 1060, some coding units 1014, 1016, 1022, 1032, 1048, 1050, 1052, and 1054 are obtained by splitting the coding units in the coding units 1010. In other words, partition types in the coding units 1014, 1022, 1050, and 1054 have a size of 2NxN, partition types in the coding units 1016, 1048, and 1052 have a size of Nx2N, and a partition type of the coding unit 1032 has a size of NxN. Prediction units and partitions of the coding units 1010 are smaller than or equal to each coding unit.
Inverse transformation is performed on image data of the coding unit 1052 in the transformation units 1070 in a data unit that is smaller than the coding unit 1052. Also, the coding units 1014, 1016, 1022, 1032, 1048, 1050, and 1052 in the transformation units 1070 are different from those in the prediction units 1060 in terms of sizes and shapes. In other words, the video encoding and decoding apparatuses 100 and 200 may perform intra prediction, motion estimation, motion compensation, transformation, and inverse transformation individually on a data unit in the same coding unit.
Accordingly, encoding is recursively performed on each of coding units having a hierarchical structure in each region of a LCU to determine an optimum coding unit, and thus coding units having a recursive tree structure may be obtained. Encoding information may include split information about a coding unit, information about a partition type, information about a prediction mode, and information about a size of a transformation unit. Table 1 shows the encoding information that may be set by the video encoding and decoding apparatuses 100 and 200.

Table 1

Split Information 0 (Encoding on Coding Unit having Size of 2Nx2N and Current Depth of d):
  Prediction Mode: Intra / Inter / Skip (2Nx2N only)
  Partition Type:
    Symmetrical Partition Type: 2Nx2N, 2NxN, Nx2N, NxN
    Asymmetrical Partition Type: 2NxnU, 2NxnD, nLx2N, nRx2N
  Size of Transformation Unit:
    Split Information 0 of Transformation Unit: 2Nx2N
    Split Information 1 of Transformation Unit: NxN (Symmetrical Type), N/2xN/2 (Asymmetrical Type)
Split Information 1:
  Repeatedly Encode Coding Units having Lower Depth of d+1
The outputter 130 of the video encoding apparatus 100 may output the encoding information about the coding units having a tree structure, and the image data and encoding information extractor 220 of the video decoding apparatus 200 may extract the encoding information about the coding units having a tree structure from a received bitstream.
Split information indicates whether a current coding unit is split into coding units of a lower depth. If split information of a current depth d is 0, a depth, in which a current coding unit is no longer split into a lower depth, is a coded depth, and thus information about a partition type, a prediction mode, and a size of a transformation unit may be defined for the coded depth. If the current coding unit is further split according to the split information, encoding is independently performed on four split coding units of a lower depth.

A prediction mode may be one of an intra mode, an inter mode, and a skip mode.
The intra mode and the inter mode may be defined in all partition types, and the skip mode is defined only in a partition type having a size of 2Nx2N.
The information about the partition type may indicate symmetrical partition types having sizes of 2Nx2N, 2NxN, Nx2N, and NxN, which are obtained by symmetrically splitting a height or a width of a prediction unit, and asymmetrical partition types having sizes of 2NxnU, 2NxnD, nLx2N, and nRx2N, which are obtained by asymmetrically splitting the height or width of the prediction unit. The asymmetrical partition types having the sizes of 2NxnU and 2NxnD may be respectively obtained by splitting the height of the prediction unit in 1:3 and 3:1, and the asymmetrical partition types having the sizes of nLx2N and nRx2N may be respectively obtained by splitting the width of the prediction unit in 1:3 and 3:1.

The size of the transformation unit may be set to be two types in the intra mode and two types in the inter mode. In other words, if split information of the transformation unit is 0, the size of the transformation unit may be 2Nx2N, which is the size of the current coding unit. If split information of the transformation unit is 1, the transformation units may be obtained by splitting the current coding unit. Also, if a partition type of the current coding unit having the size of 2Nx2N is a symmetrical partition type, a size of a transformation unit may be NxN, and if the partition type of the current coding unit is an asymmetrical partition type, the size of the transformation unit may be N/2xN/2.
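The symmetric halves and the 1:3/3:1 asymmetric splits can be made concrete with a small sketch. The function, and the convention of returning the first partition's width x height, are assumptions for illustration only.

```python
def first_partition_dims(n, partition_type):
    """Width x height of the first partition of a 2Nx2N prediction unit."""
    two_n = 2 * n
    return {
        "2Nx2N": (two_n, two_n),
        "2NxN":  (two_n, n),            # height split in half
        "Nx2N":  (n, two_n),            # width split in half
        "NxN":   (n, n),
        "2NxnU": (two_n, n // 2),       # height split 1:3, upper part is a quarter
        "2NxnD": (two_n, 3 * n // 2),   # height split 3:1, upper part is three quarters
        "nLx2N": (n // 2, two_n),       # width split 1:3, left part is a quarter
        "nRx2N": (3 * n // 2, two_n),   # width split 3:1, left part is three quarters
    }[partition_type]

assert first_partition_dims(16, "2NxN") == (32, 16)
assert first_partition_dims(16, "2NxnU") == (32, 8)   # quarter of the 32-pixel height
assert first_partition_dims(16, "nRx2N") == (24, 32)
```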
The encoding information about coding units having a tree structure may include at least one of a coding unit corresponding to a coded depth, a prediction unit, and a minimum unit. The coding unit corresponding to the coded depth may include at least one of a prediction unit and a minimum unit containing the same encoding information.
Accordingly, it is determined whether adjacent data units are included in the same coding unit corresponding to the coded depth by comparing encoding information of the adjacent data units. Also, a corresponding coding unit corresponding to a coded depth is determined by using encoding information of a data unit, and thus a distribution of coded depths in a LCU may be determined.
Accordingly, if a current coding unit is predicted based on encoding information of adjacent data units, encoding information of data units in deeper coding units adjacent to the current coding unit may be directly referred to and used.
Alternatively, if a current coding unit is predicted based on encoding information of adjacent data units, data units adjacent to the current coding unit are searched by using encoded information of the data units, and the searched adjacent coding units may be referred to for predicting the current coding unit.

FIG. 20 is a diagram for describing a relationship between a coding unit, a prediction unit, and a transformation unit, according to encoding mode information of Table 1.

A LCU 1300 includes coding units 1302, 1304, 1306, 1312, 1314, 1316, and 1318 of coded depths. Here, since the coding unit 1318 is a coding unit of a coded depth, split information may be set to 0. Information about a partition type of the coding unit 1318 having a size of 2Nx2N may be set to be one of a partition type 1322 having a size of 2Nx2N, a partition type 1324 having a size of 2NxN, a partition type 1326 having a size of Nx2N, a partition type 1328 having a size of NxN, a partition type 1332 having a size of 2NxnU, a partition type 1334 having a size of 2NxnD, a partition type 1336 having a size of nLx2N, and a partition type 1338 having a size of nRx2N.
Split information (TU size flag) of a transformation unit is a type of a transformation index. The size of the transformation unit corresponding to the transformation index may be changed according to a prediction unit type or partition type of the coding unit.
For example, when the partition type is set to be symmetrical, i.e., the partition type 1322, 1324, 1326, or 1328, a transformation unit 1342 having a size of 2Nx2N is set if a TU size flag of a transformation unit is 0, and a transformation unit 1344 having a size of NxN is set if a TU size flag is 1.
When the partition type is set to be asymmetrical, i.e., the partition type 1332, 1334, 1336, or 1338, a transformation unit 1352 having a size of 2Nx2N is set if a TU size flag is 0, and a transformation unit 1354 having a size of N/2xN/2 is set if a TU size flag is 1.

Referring to FIG. 20, the TU size flag is a flag having a value of 0 or 1, but the TU size flag is not limited to 1 bit, and a transformation unit may be hierarchically split having a tree structure while the TU size flag increases from 0. Split information (TU size flag) of a transformation unit may be an example of a transformation index.

In this case, the size of a transformation unit that has been actually used may be expressed by using a TU size flag of a transformation unit, according to one or more embodiments, together with a maximum size and minimum size of the transformation unit. The video encoding apparatus 100 is capable of encoding maximum transformation unit size information, minimum transformation unit size information, and a maximum TU size flag. The result of encoding the maximum transformation unit size information, the minimum transformation unit size information, and the maximum TU size flag may be inserted into an SPS. The video decoding apparatus 200 may decode video by using the maximum transformation unit size information, the minimum transformation unit size information, and the maximum TU size flag.
For example, (a) if the size of a current coding unit is 64x64 and a maximum transformation unit size is 32x32, (a-1) then the size of a transformation unit may be 32x32 when a TU size flag is 0, (a-2) may be 16x16 when the TU size flag is 1, and (a-3) may be 8x8 when the TU size flag is 2.

As another example, (b) if the size of the current coding unit is 32x32 and a minimum transformation unit size is 32x32, (b-1) then the size of the transformation unit may be 32x32 when the TU size flag is 0. Here, the TU size flag cannot be set to a value other than 0, since the size of the transformation unit cannot be less than 32x32.

As another example, (c) if the size of the current coding unit is 64x64 and a maximum TU size flag is 1, then the TU size flag may be 0 or 1. Here, the TU size flag cannot be set to a value other than 0 or 1.
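Examples (a) and (b) follow one rule: each increment of the TU size flag halves the transformation unit, clamped by the maximum and minimum TU sizes. A sketch (the function name is illustrative):

```python
def tu_size(cu_size, tu_size_flag, max_tu_size, min_tu_size):
    """Halve the largest admissible TU once per TU size flag increment."""
    size = min(cu_size, max_tu_size) >> tu_size_flag
    return max(size, min_tu_size)

# example (a): 64x64 coding unit, maximum transformation unit size 32x32
assert tu_size(64, 0, 32, 4) == 32
assert tu_size(64, 1, 32, 4) == 16
assert tu_size(64, 2, 32, 4) == 8
# example (b): 32x32 coding unit with minimum transformation unit size 32x32
assert tu_size(32, 0, 32, 32) == 32
```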
Thus, if it is defined that the maximum TU size flag is 'MaxTransformSizeIndex', a minimum transformation unit size is 'MinTransformSize', and a transformation unit size is 'RootTuSize' when the TU size flag is 0, then a current minimum transformation unit size 'CurrMinTuSize' that can be determined in a current coding unit may be defined by Equation (1):
CurrMinTuSize = max(MinTransformSize, RootTuSize/(2^MaxTransformSizeIndex)) ... (1)
Compared to the current minimum transformation unit size 'CurrMinTuSize' that can be determined in the current coding unit, a transformation unit size 'RootTuSize' when the TU size flag is 0 may denote a maximum transformation unit size that can be selected in the system. In Equation (1), 'RootTuSize/(2^MaxTransformSizeIndex)' denotes a transformation unit size when the transformation unit size 'RootTuSize', when the TU size flag is 0, is split a number of times corresponding to the maximum TU size flag, and 'MinTransformSize' denotes a minimum transformation size. Thus, a larger value from among 'RootTuSize/(2^MaxTransformSizeIndex)' and 'MinTransformSize' may be the current minimum transformation unit size 'CurrMinTuSize' that can be determined in the current coding unit. According to one or more embodiments, the maximum transformation unit size RootTuSize may vary according to the type of a prediction mode.
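Equation (1) can be sketched directly; note that dividing by 2^MaxTransformSizeIndex is a right shift for power-of-two sizes. This is an illustrative sketch, not part of the claimed apparatus, and the function name is an assumption.

```python
def curr_min_tu_size(min_transform_size, root_tu_size, max_transform_size_index):
    # Equation (1): RootTuSize halved once per split level allowed by the
    # maximum TU size flag, bounded below by the system minimum TU size.
    return max(min_transform_size, root_tu_size >> max_transform_size_index)
```

For instance, with RootTuSize 32 and a maximum TU size flag of 2, the fully split size is 8; if MinTransformSize is 16, the bound dominates and CurrMinTuSize is 16.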
For example, if a current prediction mode is an inter mode, then 'RootTuSize' may be determined by using Equation (2) below. In Equation (2), 'MaxTransformSize' denotes a maximum transformation unit size, and 'PUSize' denotes a current prediction unit size.
RootTuSize = min(MaxTransformSize, PUSize) ... (2)
That is, if the current prediction mode is the inter mode, the transformation unit size 'RootTuSize', when the TU size flag is 0, may be a smaller value from among the maximum transformation unit size and the current prediction unit size. If a prediction mode of a current partition unit is an intra mode, 'RootTuSize' may be determined by using Equation (3) below. In Equation (3), 'PartitionSize' denotes the size of the current partition unit.
RootTuSize = min(MaxTransformSize, PartitionSize) ... (3)
That is, if the current prediction mode is the intra mode, the transformation unit size 'RootTuSize', when the TU size flag is 0, may be a smaller value from among the maximum transformation unit size and the size of the current partition unit.
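Equations (2) and (3) share one form: a min() against the maximum transformation unit size, with the second operand being PUSize in inter mode and PartitionSize in intra mode. The sketch below is illustrative only; the function name and the string-valued mode argument are assumptions.

```python
def root_tu_size(prediction_mode, max_transform_size, unit_size):
    # Equations (2) and (3): RootTuSize is capped by MaxTransformSize;
    # 'unit_size' is PUSize for inter mode and PartitionSize for intra mode.
    if prediction_mode not in ("inter", "intra"):
        raise ValueError("unknown prediction mode")
    return min(max_transform_size, unit_size)
```

So a 64×64 inter prediction unit under a 32×32 maximum yields RootTuSize 32, while a 16×16 intra partition yields 16.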
However, the maximum transformation unit size 'RootTuSize' that varies according to the type of a prediction mode in a partition unit is just an example, and the embodiments are not limited thereto.
According to the video encoding method based on coding units having a tree structure as described with reference to FIGS. 8 through 20, image data of the space domain is encoded for each coding unit of a tree structure. According to the video decoding method based on coding units having a tree structure, decoding is performed for each LCU to reconstruct image data of the space domain. Thus, a picture and a video that is a picture sequence may be reconstructed. The reconstructed video may be reproduced by a reproducing apparatus, stored in a storage medium, or transmitted through a network.
Also, SAO parameters may be signaled with respect to each picture, each slice, each LCU, each of coding units having a tree structure, each prediction unit of the coding units, or each transformation unit of the coding units. For example, pixel values of reconstructed pixels of each LCU may be adjusted by using offset values reconstructed based on received SAO parameters, and thus an LCU having a minimized error between an original block and the LCU may be reconstructed.
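The per-pixel adjustment described above can be sketched as adding the signaled offset for each reconstructed pixel's SAO category and clipping back to the sample range. This is a simplified, hypothetical sketch (flat pixel list, dictionary of offsets per category), not the disclosed apparatus; real SAO classification into band or edge categories is omitted.

```python
def apply_sao(pixels, categories, offsets, bit_depth=8):
    # Add the signaled offset for each reconstructed pixel's category
    # (band or edge class) and clip to the valid sample range.
    max_val = (1 << bit_depth) - 1
    return [min(max_val, max(0, p + offsets.get(c, 0)))
            for p, c in zip(pixels, categories)]
```

A pixel whose category has no signaled offset passes through unchanged, and offsets never push a sample outside [0, 2^bit_depth - 1].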
The embodiments may be written as computer programs and may be implemented in general-use digital computers that execute the programs using a computer-readable recording medium. Examples of the computer-readable recording medium include magnetic storage media (e.g., ROM, floppy discs, hard discs, etc.) and optical recording media (e.g., CD-ROMs or DVDs).
While the one or more embodiments have been particularly shown and described with reference to exemplary embodiments thereof, it will be understood by one of ordinary skill in the art that various changes in form and details may be made therein without departing from the spirit and scope of the invention as defined by the following claims. The exemplary embodiments should be considered in a descriptive sense only and not for purposes of limitation. Therefore, the scope of the invention is defined not by the detailed description of the invention but by the following claims, and all differences within the scope will be construed as being included in the one or more embodiments.
For convenience of description, the video encoding method according to adjustment of a sample offset, which is described above with reference to FIGS. 1A through 20, will be referred to as a 'video encoding method according to the one or more embodiments'. In addition, the video decoding method according to adjustment of a sample offset, which is described above with reference to FIGS. 1A through 20, will be referred to as a 'video decoding method according to the one or more embodiments'. Also, the video encoding apparatus 10, the video encoding apparatus 100, or the image encoder 400, which is described above with reference to FIGS. 1A through 20, will be referred to as a 'video encoding apparatus according to the one or more embodiments'. In addition, a video decoding apparatus including the video decoding apparatus 20, the video decoding apparatus 200, or the image decoder 500, which is described above with reference to FIGS. 1A through 20, will be referred to as a 'video decoding apparatus according to the one or more embodiments'. A computer-readable recording medium storing a program, e.g., a disc 26000, according to one or more embodiments will now be described in detail. FIG. 21 is a diagram of a physical structure of the disc 26000 in which a program
is stored, according to one or more embodiments. The disc 26000, which is a storage medium, may be a hard drive, a compact disc-read only memory (CD-ROM) disc, a Blu-ray disc, or a digital versatile disc (DVD). The disc 26000 includes a plurality of concentric tracks Tr that are each divided into a specific number of sectors Se in a circumferential direction of the disc 26000. In a specific region of the disc 26000, a program that executes the quantization parameter determination method, the video encoding method, and the video decoding method described above may be assigned and stored. A computer system embodied using a storage medium that stores a program for executing the video encoding method and the video decoding method as described above will now be described with reference to FIG. 22. FIG. 22 is a diagram of a disc drive 26800 for recording and reading a program by using the disc 26000. A computer system 26700 may store a program that executes at least one of a video encoding method and a video decoding method according to one or more embodiments, in the disc 26000 via the disc drive 26800. To run the program stored in the disc 26000 in the computer system 26700, the program may be read from the disc 26000 and be transmitted to the computer system 26700 by using the disc drive 26800. The program that executes at least one of a video encoding method and a video decoding method according to one or more embodiments may be stored not only in the disc 26000 illustrated in FIG. 21 or 22 but also in a memory card, a ROM cassette, or a solid state drive (SSD). A system to which the video encoding method and a video decoding method described above are applied will be described below. FIG. 23 is a diagram of an overall structure of a content supply system 11000 for providing a content distribution service. A service area of a communication system is divided into predetermined-sized cells, and wireless base stations 11700, 11800, 11900, and 12000 are installed in these cells, respectively.
The content supply system 11000 includes a plurality of independent devices. For example, the plurality of independent devices, such as a computer 12100, a personal digital assistant (PDA) 12200, a video camera 12300, and a mobile phone 12500, are connected
to the Internet 11100 via an internet service provider 11200, a communication network 11400, and the wireless base stations 11700, 11800, 11900, and 12000.
However, the content supply system 11000 is not limited to as illustrated in FIG. 24, and devices may be selectively connected thereto. The plurality of independent devices may be directly connected to the communication network 11400, not via the wireless base stations 11700, 11800, 11900, and 12000. The video camera 12300 is an imaging device, e.g., a digital video camera, which is capable of capturing video images. The mobile phone 12500 may employ at least one communication method from among various protocols, e.g., Personal Digital Communications (PDC), Code Division Multiple Access (CDMA), Wideband-Code Division Multiple Access (W-CDMA), Global System for Mobile Communications (GSM), and Personal Handyphone System (PHS).
The video camera 12300 may be connected to a streaming server 11300 via the wireless base station 11900 and the communication network 11400. The streaming server 11300 allows content received from a user via the video camera 12300 to be streamed via a real-time broadcast. The content received from the video camera 12300 may be encoded using the video camera 12300 or the streaming server 11300. Video data captured by the video camera 12300 may be transmitted to the streaming server 11300 via the computer 12100.
Video data captured by a camera 12600 may also be transmitted to the streaming server 11300 via the computer 12100. The camera 12600 is an imaging device capable of capturing both still images and video images, similar to a digital camera. The video data captured by the camera 12600 may be encoded using the camera 12600 or the computer 12100. Software that performs encoding and decoding video may be stored in a computer-readable recording medium, e.g., a CD-ROM disc, a floppy disc, a hard disc drive, an SSD, or a memory card, which may be accessible by the computer 12100. If video data is captured by a camera built in the mobile phone 12500, the video data may be received from the mobile phone 12500. The video data may also be encoded by a large scale integrated circuit (LSI) system installed in the video camera 12300, the mobile phone 12500, or the camera 12600.
The content supply system 11000 may encode content data recorded by a user using the video camera 12300, the camera 12600, the mobile phone 12500, or another imaging device, e.g., content recorded during a concert, and transmit the encoded content data to the streaming server 11300. The streaming server 11300 may transmit the encoded content data in a type of a streaming content to other clients that request the content data.
The clients are devices capable of decoding the encoded content data, e.g., the computer 12100, the PDA 12200, the video camera 12300, or the mobile phone 12500. Thus, the content supply system 11000 allows the clients to receive and reproduce the encoded content data. Also, the content supply system 11000 allows the clients to receive the encoded content data and decode and reproduce the encoded content data in real time, thereby enabling personal broadcasting.
Encoding and decoding operations of the plurality of independent devices included in the content supply system 11000 may be similar to those of a video encoding apparatus and a video decoding apparatus according to one or more embodiments.
The mobile phone 12500 included in the content supply system 11000 according to one or more embodiments will now be described in greater detail with reference to FIGS. 24 and 25. FIG. 24 illustrates an external structure of the mobile phone 12500 to which a video encoding method and a video decoding method are applied, according to one or more embodiments. The mobile phone 12500 may be a smart phone, the functions of which are not limited and a large number of the functions of which may be changed or expanded.
The mobile phone 12500 includes an internal antenna 12510 via which a radio-frequency (RF) signal may be exchanged with the wireless base station 12000 of FIG. 21, and includes a display screen 12520 for displaying images captured by a camera 12530 or images that are received via the antenna 12510 and decoded, e.g., a liquid crystal display (LCD) or an organic light-emitting diode (OLED) screen. The mobile phone 12500 includes an operation panel 12540 including a control button and a touch panel. If the display screen 12520 is a touch screen, the operation panel 12540 further includes a touch sensing panel of the display screen 12520. The mobile phone 12500 includes a speaker 12580 for outputting voice and sound or another type of sound outputter, and a microphone 12550 for inputting voice and sound or another type of sound inputter. The mobile phone 12500 further includes the camera 12530, such as a charge-coupled device (CCD) camera, to capture video and still images. The mobile phone 12500 may further include a storage medium 12570 for storing encoded/decoded data, e.g., video or still images captured by the camera 12530, received via email, or obtained according to various ways; and a slot 12560 via which the storage medium 12570 is loaded into the mobile phone 12500. The storage medium 12570 may be a flash memory, e.g., a secure digital (SD) card or an electrically erasable and programmable read only memory (EEPROM) included in a plastic case. FIG. 25 illustrates an internal structure of the mobile phone 12500, according to one or more embodiments. To systemically control parts of the mobile phone 12500 including the display screen 12520 and the operation panel 12540, a power supply circuit 12700, an operation input controller 12640, an image encoder 12720, a camera interface 12630, an LCD controller 12620, an image decoder 12690, a multiplexer/demultiplexer 12680, a recorder/reader 12670, a modulator/demodulator 12660, and a sound processor 12650 are connected to a central controller 12710 via a synchronization bus 12730.
If a user operates a power button and sets from a 'power off' state to a 'power on' state, the power supply circuit 12700 supplies power to all the parts of the mobile phone 12500 from a battery pack, thereby setting the mobile phone 12500 in an operation mode.
The central controller 12710 includes a central processing unit (CPU), a ROM, and a RAM. While the mobile phone 12500 transmits communication data to the outside, a digital signal is generated by the mobile phone 12500 under control of the central controller 12710. For example, the sound processor 12650 may generate a digital sound signal, the image encoder 12720 may generate a digital image signal, and text data of a message may be generated via the operation panel 12540 and the operation input controller 12640. When a digital signal is transmitted to the modulator/demodulator 12660 under control of the central controller 12710, the modulator/demodulator 12660 modulates a frequency band of the digital signal, and a communication circuit 12610 performs digital-to-analog conversion (DAC) and frequency conversion on the frequency band-modulated digital sound signal. A transmission signal output from the communication circuit 12610 may be transmitted to a voice communication base station or the wireless base station 12000 via the antenna 12510.
For example, when the mobile phone 12500 is in a conversation mode, a sound signal obtained via the microphone 12550 is transformed into a digital sound signal by the sound processor 12650, under control of the central controller 12710. The digital sound signal may be transformed into a transformation signal via the modulator/demodulator 12660 and the communication circuit 12610, and may be transmitted via the antenna 12510.
When a text message, e.g., email, is transmitted in a data communication mode, text data of the text message is input via the operation panel 12540 and is transmitted to the central controller 12710 via the operation input controller 12640. Under control of the central controller 12710, the text data is transformed into a transmission signal via the modulator/demodulator 12660 and the communication circuit 12610 and is transmitted to the wireless base station 12000 via the antenna 12510.
To transmit image data in the data communication mode, image data captured by the camera 12530 is provided to the image encoder 12720 via the camera interface 12630. The captured image data may be directly displayed on the display screen 12520 via the camera interface 12630 and the LCD controller 12620. A structure of the image encoder 12720 may correspond to that of the above-described video encoding method according to the one or more embodiments. The image encoder 12720 may transform the image data received from the camera 12530 into compressed and encoded image data based on the above-described video encoding method according to the one or more embodiments, and then output the encoded image data to the multiplexer/demultiplexer 12680. During a recording operation of the camera 12530, a sound signal obtained by the microphone 12550 of the mobile phone 12500 may be transformed into digital sound data via the sound processor 12650, and the digital sound data may be transmitted to the multiplexer/demultiplexer 12680.
The multiplexer/demultiplexer 12680 multiplexes the encoded image data received from the image encoder 12720, together with the sound data received from the sound processor 12650. A result of multiplexing the data may be transformed into a transmission signal via the modulator/demodulator 12660 and the communication circuit 12610, and may then be transmitted via the antenna 12510. While the mobile phone 12500 receives communication data from the outside, frequency recovery and ADC are performed on a signal received via the antenna 12510 to transform the signal into a digital signal. The modulator/demodulator 12660 modulates a frequency band of the digital signal. The frequency-band modulated digital signal is transmitted to the video decoding unit 12690, the sound processor 12650, or the LCD controller 12620, according to the type of the digital signal. In the conversation mode, the mobile phone 12500 amplifies a signal received via the antenna 12510, and obtains a digital sound signal by performing frequency conversion and ADC on the amplified signal. A received digital sound signal is transformed into an analog sound signal via the modulator/demodulator 12660 and the sound processor 12650, and the analog sound signal is output via the speaker 12580, under control of the central controller 12710.
When, in the data communication mode, data of a video file accessed at an Internet website is received, a signal received from the wireless base station 12000 via
the antenna 12510 is output as multiplexed data via the modulator/demodulator 12660, and the multiplexed data is transmitted to the multiplexer/demultiplexer 12680.
To decode the multiplexed data received via the antenna 12510, the multiplexer/demultiplexer 12680 demultiplexes the multiplexed data into an encoded video data stream and an encoded audio data stream. Via the synchronization bus 12730, the encoded video data stream and the encoded audio data stream are provided to the video decoding unit 12690 and the sound processor 12650, respectively. A structure of the image decoder 12690 may correspond to that of the above-described video decoding method according to the one or more embodiments. The image decoder 12690 may decode the encoded video data to obtain reconstructed video data and provide the reconstructed video data to the display screen 12520 via the LCD controller 12620, by using the above-described video decoding method according to the one or more embodiments.
Thus, the data of the video file accessed at the Internet website may be displayed on the display screen 12520. At the same time, the sound processor 12650 may transform audio data into an analog sound signal, and provide the analog sound signal to the speaker 12580. Thus, audio data contained in the video file accessed at the Internet website may also be reproduced via the speaker 12580.
The mobile phone 12500 or another type of communication terminal may be a transceiving terminal including both a video encoding apparatus and a video decoding apparatus according to one or more embodiments, may be a transceiving terminal including only the video encoding apparatus, or may be a transceiving terminal including only the video decoding apparatus.
A communication system according to the one or more embodiments is not limited to the communication system described above with reference to FIG. 24. For example, FIG. 26 illustrates a digital broadcasting system employing a communication system, according to one or more embodiments. The digital broadcasting system of FIG. 26 may receive a digital broadcast transmitted via a satellite or a terrestrial network by using a video encoding apparatus and a video decoding apparatus according to one or more embodiments.
Specifically, a broadcasting station 12890 transmits a video data stream to a communication satellite or a broadcasting satellite 12900 by using radio waves. The broadcasting satellite 12900 transmits a broadcast signal, and the broadcast signal is transmitted to a satellite broadcast receiver via a household antenna 12860. In every house, an encoded video stream may be decoded and reproduced by a TV receiver 12810, a set-top box 12870, or another device.
When a video decoding apparatus according to one or more embodiments is implemented in a reproducing apparatus 12830, the reproducing apparatus 12830 may parse and decode an encoded video stream recorded on a storage medium 12820, such as a disc or a memory card, to reconstruct digital signals. Thus, the reconstructed video signal may be reproduced, for example, on a monitor 12840.
In the set-top box 12870 connected to the antenna 12860 for a satellite/terrestrial broadcast or a cable antenna 12850 for receiving a cable television (TV) broadcast, a video decoding apparatus according to one or more embodiments may be installed. Data output from the set-top box 12870 may also be reproduced on a TV monitor 12880.
As another example, a video decoding apparatus according to one or more embodiments may be installed in the TV receiver 12810 instead of the set-top box 12870.
An automobile 12920 that has an appropriate antenna 12910 may receive a signal transmitted from the satellite 12900 or the wireless base station 11700 of FIG. 21. A decoded video may be reproduced on a display screen of an automobile navigation system 12930 installed in the automobile 12920. A video signal may be encoded by a video encoding apparatus according to one or more embodiments and may then be stored in a storage medium. Specifically, an image signal may be stored in a DVD disc 12960 by a DVD recorder or may be stored in a hard disc by a hard disc recorder 12950. As another example, the video signal may be stored in an SD card 12970. If the hard disc recorder 12950 includes a video decoding apparatus according to one or more embodiments, a video signal recorded on the DVD disc 12960, the SD card 12970, or another storage medium may be reproduced on the TV monitor 12880.
The automobile navigation system 12930 may not include the camera 12530 of FIG. 24, and the camera interface 12630 and the image encoder 12720 of FIG. 25. For example, the computer 12100 and the TV receiver 12810 may not include the camera 12530, the camera interface 12630, and the image encoder 12720.
FIG. 27 is a diagram illustrating a network structure of a cloud computing system using a video encoding apparatus and a video decoding apparatus, according to one or more embodiments.
The cloud computing system may include a cloud computing server 14000, a user database (DB) 14100, a plurality of computing resources 14200, and a user terminal.
The cloud computing system provides an on-demand outsourcing service of the plurality of computing resources 14200 via a data communication network, e.g., the Internet, in response to a request from the user terminal. Under a cloud computing environment, a service provider provides users with desired services by combining computing resources at data centers located at physically different locations by using virtualization technology. A service user does not have to install computing resources, e.g., an application, a storage, an operating system (OS), and security, into his/her own terminal in order to use them, but may select and use desired services from among services in a virtual space generated through the virtualization technology, at a desired point in time. A user terminal of a specified service user is connected to the cloud computing server 14000 via a data communication network including the Internet and a mobile telecommunication network. User terminals may be provided cloud computing services, and particularly video reproduction services, from the cloud computing server 14000. The user terminals may be various types of electronic devices capable of being connected to the Internet, e.g., a desktop PC 14300, a smart TV 14400, a smart phone 14500, a notebook computer 14600, a portable multimedia player (PMP) 14700, a tablet PC 14800, and the like.
The cloud computing server 14000 may combine the plurality of computing resources 14200 distributed in a cloud network, and provide user terminals with a result of combining. The plurality of computing resources 14200 may include various data services, and may include data uploaded from user terminals. As described above, the cloud computing server 14000 may provide user terminals with desired services by combining video databases distributed in different regions according to the virtualization technology.
User information about users who have subscribed for a cloud computing service is stored in the user DB 14100. The user information may include logging information, addresses, names, and personal credit information of the users. The user information may further include indexes of videos. Here, the indexes may include a list of videos that have already been reproduced, a list of videos that are being reproduced, a pausing point of a video that was being reproduced, and the like. Information about a video stored in the user DB 14100 may be shared between user devices. For example, when a video service is provided to the notebook computer 14600 in response to a request from the notebook computer 14600, a reproduction history of the video service is stored in the user DB 14100. When a request to reproduce this video service is received from the smart phone 14500, the cloud computing server 14000 searches for and reproduces this video service, based on the user DB 14100. When the smart phone 14500 receives a video data stream from the cloud computing server 14000, a process of reproducing video by decoding the video data stream is similar to an operation of the mobile phone 12500 described above with reference to FIG. 24.
The cloud computing server 14000 may refer to a reproduction history of a desired video service, stored in the user DB 14100. For example, the cloud computing server 14000 receives a request to reproduce a video stored in the user DB 14100, from a user terminal. If this video was being reproduced, then a method of streaming this video, performed by the cloud computing server 14000, may vary according to the request from the user terminal, i.e., according to whether the video will be reproduced starting from a start thereof or a pausing point thereof. For example, if the user terminal requests to reproduce the video starting from the start thereof, the cloud computing server 14000 transmits streaming data of the video starting from a first frame thereof to the user terminal. If the user terminal requests to reproduce the video starting from the pausing point thereof, the cloud computing server 14000 transmits streaming data of the video starting from a frame corresponding to the pausing point, to the user terminal. In this case, the user terminal may include a video decoding apparatus as described above with reference to FIGS. 1A through 20. As another example, the user terminal may include a video encoding apparatus as described above with reference to FIGS. 1A through 20. Alternatively, the user terminal may include both the video decoding apparatus and the video encoding apparatus as described above with reference to FIGS. 1A through 20.
Various applications of a video encoding method, a video decoding method, a video encoding apparatus, and a video decoding apparatus according to the one or more embodiments described above with reference to FIGS. 1A through 20 have been described above with reference to FIGS. 21 to 27. However, methods of storing the video encoding method and the video decoding method in a storage medium, or methods of implementing the video encoding apparatus and the video decoding apparatus in a device, according to various embodiments, are not limited to the embodiments described above with reference to FIGS. 21 to 27. While the one or more embodiments have been particularly shown and described with reference to embodiments thereof, it will be understood by one of ordinary skill in the art that various changes in form and details may be made therein without departing from the spirit and scope of the invention as defined by the following claims. The embodiments should be considered in a descriptive sense only and not for purposes of limitation. Therefore, the scope of the invention is defined not by the detailed description of the invention but by the following claims, and all differences within the scope will be construed as being included in the one or more embodiments.

Claims (2)

The claims defining the invention are as follows:
    1. A video decoding apparatus comprising: a decoder which is configured for obtaining slice offset information indicating whether to apply an offset according to an offset type for a current slice, and compensating for samples of the current block by using an offset parameter of the current block, wherein: when the slice offset information indicates that the offset is applied, the decoder is configured for obtaining left offset merging information of a current block among blocks included in the current slice by performing entropy decoding on a bitstream using a context mode, when the left offset merging information indicates that the offset parameter of the current block is determined according to an offset parameter of a left block, the decoder is configured for determining the offset parameter of the current block using the offset parameter of the left block, when the left offset merging information indicates that the offset parameter of the current block is not determined according to the offset parameter of the left block, the decoder is configured for obtaining upper offset merging information of the current block by performing entropy decoding on the bitstream using the context mode, when the upper offset merging information indicates that the offset parameter of the current block is determined according to an offset parameter of an upper block, the decoder is configured for determining the offset parameter of the current block using the offset parameter of the upper block, when the upper offset merging information indicates that the offset parameter of the current block is not determined according to an offset parameter of the upper block, the decoder is configured for obtaining the offset parameter of the current block from the bitstream, the offset parameter comprises at least one of offset type information and offset values, the offset type information indicates the offset type or whether to apply an offset to the current block, and the offset type is 
one of a band offset type and an edge offset type, when the offset type information indicates the band offset type, absolute values of the offset values are obtained by performing entropy decoding in a bypass mode on the bitstream, and, when the absolute values of the offset values are not zero, signs of the offset values are further obtained from the bitstream.
  2. The video decoding apparatus of claim 1, wherein, when the upper offset merging information indicates that the offset parameter of the current block is determined according to the offset parameter of the upper block, the offset parameter of the current block is not obtained from the bitstream.
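The merge-flag ordering recited in claim 1 can be sketched as a small decision function. This is a hypothetical illustration of the control flow only (the function and argument names are not from the claims, and entropy decoding of the flags themselves is abstracted away as booleans).

```python
def resolve_sao_parameter(left_merge, upper_merge, left_param, upper_param,
                          parse_explicit):
    # Order mirrors the claim: the left merge flag is checked first; the
    # upper merge flag is consulted only when left merging is off; an
    # explicit SAO parameter is parsed from the bitstream only when both
    # merge flags are off.
    if left_merge:
        return left_param
    if upper_merge:
        return upper_param
    return parse_explicit()
```

Claim 2 corresponds to the second branch: when upper merging applies, `parse_explicit` is never called, i.e., no offset parameter is read from the bitstream for the current block.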
AU2016204699A 2012-06-11 2016-07-07 Method and apparatus for encoding videos sharing sao parameter according to color component Active AU2016204699B2 (en)

Priority Applications (2)

Application Number Priority Date Filing Date Title
AU2016204699A AU2016204699B2 (en) 2012-06-11 2016-07-07 Method and apparatus for encoding videos sharing sao parameter according to color component
AU2017265158A AU2017265158B2 (en) 2012-06-11 2017-11-24 Method and apparatus for encoding videos sharing sao parameter according to color component

Applications Claiming Priority (5)

Application Number Priority Date Filing Date Title
US201261657967P 2012-06-11 2012-06-11
US61/657,967 2012-06-11
PCT/KR2013/005112 WO2013187654A1 (en) 2012-06-11 2013-06-11 Method and apparatus for encoding videos sharing sao parameter according to color component
AU2013275098A AU2013275098B2 (en) 2012-06-11 2013-06-11 Method and apparatus for encoding videos sharing SAO parameter according to color component
AU2016204699A AU2016204699B2 (en) 2012-06-11 2016-07-07 Method and apparatus for encoding videos sharing sao parameter according to color component

Related Parent Applications (1)

Application Number Title Priority Date Filing Date
AU2013275098A Division AU2013275098B2 (en) 2012-06-11 2013-06-11 Method and apparatus for encoding videos sharing SAO parameter according to color component

Related Child Applications (1)

Application Number Title Priority Date Filing Date
AU2017265158A Division AU2017265158B2 (en) 2012-06-11 2017-11-24 Method and apparatus for encoding videos sharing sao parameter according to color component

Publications (2)

Publication Number Publication Date
AU2016204699A1 true AU2016204699A1 (en) 2016-07-21
AU2016204699B2 AU2016204699B2 (en) 2017-08-31

Family

ID=49758427

Family Applications (3)

Application Number Title Priority Date Filing Date
AU2013275098A Active AU2013275098B2 (en) 2012-06-11 2013-06-11 Method and apparatus for encoding videos sharing SAO parameter according to color component
AU2016204699A Active AU2016204699B2 (en) 2012-06-11 2016-07-07 Method and apparatus for encoding videos sharing sao parameter according to color component
AU2017265158A Active AU2017265158B2 (en) 2012-06-11 2017-11-24 Method and apparatus for encoding videos sharing sao parameter according to color component

Family Applications Before (1)

Application Number Title Priority Date Filing Date
AU2013275098A Active AU2013275098B2 (en) 2012-06-11 2013-06-11 Method and apparatus for encoding videos sharing SAO parameter according to color component

Family Applications After (1)

Application Number Title Priority Date Filing Date
AU2017265158A Active AU2017265158B2 (en) 2012-06-11 2017-11-24 Method and apparatus for encoding videos sharing sao parameter according to color component

Country Status (27)

Country Link
US (6) US9807392B2 (en)
EP (5) EP3300365B1 (en)
JP (5) JP5990326B2 (en)
KR (6) KR101529994B1 (en)
CN (10) CN104902273B (en)
AU (3) AU2013275098B2 (en)
BR (2) BR112014030935B1 (en)
CA (2) CA2876167C (en)
CY (1) CY1119864T1 (en)
DK (5) DK3297284T3 (en)
ES (5) ES2746341T3 (en)
HR (1) HRP20171936T1 (en)
HU (5) HUE047442T2 (en)
IN (1) IN2014MN02643A (en)
LT (1) LT2854398T (en)
MX (2) MX352119B (en)
MY (2) MY171147A (en)
NO (1) NO2938424T3 (en)
PH (5) PH12014502761B1 (en)
PL (5) PL2854398T3 (en)
PT (1) PT2854398T (en)
RS (1) RS56675B1 (en)
RU (4) RU2595578C2 (en)
SG (4) SG11201408246QA (en)
SI (1) SI2854398T1 (en)
TW (5) TWI604718B (en)
WO (1) WO2013187654A1 (en)

Families Citing this family (38)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
KR20120035096A (en) * 2010-10-04 2012-04-13 한국전자통신연구원 A method and apparatus of side information signaling for quadtree transform
TW201322772A (en) 2011-06-23 2013-06-01 Panasonic Corp Image decoding method, image coding method, image decoding apparatus, image coding apparatus, and image coding and decoding apparatus
USRE47366E1 (en) 2011-06-23 2019-04-23 Sun Patent Trust Image decoding method and apparatus based on a signal type of the control parameter of the current block
KR102062283B1 (en) 2011-06-24 2020-01-03 선 페이턴트 트러스트 Image decoding method, image encoding method, image decoding device, image encoding device, and image encoding/decoding device
TWI581615B (en) * 2011-06-24 2017-05-01 Sun Patent Trust A decoding method, a coding method, a decoding device, an encoding device, and a coding / decoding device
EP4270950A3 (en) 2011-06-27 2024-02-14 Sun Patent Trust Encoding apparatus and decoding apparatus
BR122022013246B1 (en) 2011-06-28 2022-11-16 Sun Patent Trust DECODING APPARATUS FOR DECODING A CONTROL PARAMETER FOR CONTROLLING DECODING OF AN IMAGE, AND CODING APPARATUS FOR ENCODING A CONTROL PARAMETER FOR CONTROLLING ENCODING OF AN IMAGE
WO2013001767A1 (en) 2011-06-29 2013-01-03 パナソニック株式会社 Image decoding method, image encoding method, image decoding device, image encoding device, and image encoding/decoding device
AU2012277219A1 (en) 2011-06-30 2013-09-19 Sun Patent Trust Image decoding method, image encoding method, image decoding device, image encoding device, and image encoding/decoding device
AU2012277220B2 (en) 2011-06-30 2016-08-11 Sun Patent Trust Image decoding method, image coding method, image decoding apparatus, image coding apparatus, and image coding and decoding apparatus
MX2013013941A (en) 2011-07-11 2014-01-31 Panasonic Corp Image decoding method, image encoding method, image decoding apparatus, image encoding apparatus, and image encoding/decoding apparatus.
US9253482B2 (en) 2011-11-08 2016-02-02 Texas Instruments Incorporated Method and apparatus for sample adaptive offset without sign coding
US9560362B2 (en) * 2011-12-22 2017-01-31 Mediatek Inc. Method and apparatus of texture image compression in 3D video coding
DK3297284T3 (en) 2012-06-11 2019-09-23 Samsung Electronics Co Ltd ENCODING AND DECODING VIDEOS SHARING SAO PARAMETERS ACCORDING TO A COLOR COMPONENT
TWI618404B (en) * 2012-06-27 2018-03-11 Sony Corp Image processing device and method
TWI557727B (en) 2013-04-05 2016-11-11 杜比國際公司 An audio processing system, a multimedia processing system, a method of processing an audio bitstream and a computer program product
EP2996269A1 (en) 2014-09-09 2016-03-16 Fraunhofer-Gesellschaft zur Förderung der angewandten Forschung e.V. Audio splicing concept
US9942551B2 (en) * 2015-01-30 2018-04-10 Qualcomm Incorporated Palette index grouping for video coding
US9729885B2 (en) * 2015-02-11 2017-08-08 Futurewei Technologies, Inc. Apparatus and method for compressing color index map
US9877024B2 (en) 2015-03-06 2018-01-23 Qualcomm Incorporated Low complexity sample adaptive offset (SAO) coding
CN105049853A (en) * 2015-07-06 2015-11-11 深圳市云宙多媒体技术有限公司 SAO coding method and system based on fragment source analysis
WO2017014585A1 (en) * 2015-07-21 2017-01-26 엘지전자(주) Method and device for processing video signal using graph-based transform
CA2997193C (en) * 2015-09-03 2021-04-06 Mediatek Inc. Method and apparatus of neural network based processing in video coding
WO2017083553A1 (en) * 2015-11-10 2017-05-18 Vid Scale, Inc. Systems and methods for coding in super-block based video coding framework
US10841581B2 (en) * 2016-07-14 2020-11-17 Arris Enterprises Llc Region specific encoding and SAO-sensitive-slice-width-adaptation for improved-quality HEVC encoding
US10567775B2 (en) * 2016-10-01 2020-02-18 Intel Corporation Method and system of hardware accelerated video coding with per-frame parameter control
KR20230143626A (en) * 2017-03-22 2023-10-12 한양대학교 산학협력단 Image encoding/decoding method using pixel value range constituting image
KR102302643B1 (en) * 2017-04-21 2021-09-14 제니맥스 미디어 인크. Systems and Methods for Player Input Motion Compensation by Motion Vectors Prediction
CN115714862A (en) * 2017-05-31 2023-02-24 交互数字麦迪逊专利控股公司 Method and apparatus for picture coding and decoding
EP3750320B1 (en) * 2018-02-09 2023-11-08 Fraunhofer-Gesellschaft zur Förderung der angewandten Forschung e.V. Adaptive in-loop filter with multiple feature-based classifications
KR20230031992A (en) 2018-06-26 2023-03-07 후아웨이 테크놀러지 컴퍼니 리미티드 High-level syntax designs for point cloud coding
EP3846470A4 (en) * 2018-08-31 2022-05-11 Samsung Electronics Co., Ltd. Video decoding method and apparatus, and video encoding method and apparatus
KR20200044662A (en) * 2018-10-19 2020-04-29 삼성전자주식회사 Apparatus and method for performing artificial intelligence encoding and artificial intelligence decoding of image
WO2020094049A1 (en) 2018-11-06 2020-05-14 Beijing Bytedance Network Technology Co., Ltd. Extensions of inter prediction with geometric partitioning
US10727868B2 (en) * 2018-12-03 2020-07-28 Samsung Electronics Co., Ltd. Apparatus and method for offset optimization for low-density parity-check (LDPC) code
WO2020140862A1 (en) * 2018-12-30 2020-07-09 Beijing Bytedance Network Technology Co., Ltd. Conditional application of inter prediction with geometric partitioning in video processing
WO2022115698A1 (en) * 2020-11-30 2022-06-02 Beijing Dajia Internet Information Technology Co., Ltd. Chroma coding enhancement in cross-component sample adaptive offset
CN113099221B (en) * 2021-02-22 2023-06-02 浙江大华技术股份有限公司 Cross-component sample point self-adaptive compensation method, coding method and related device

Family Cites Families (43)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
EP0896300B1 (en) * 1997-08-07 2002-01-30 Matsushita Electric Industrial Co., Ltd. Device and method for motion vector detection
KR20060105407A (en) 2005-04-01 2006-10-11 엘지전자 주식회사 Method for scalably encoding and decoding video signal
US7010044B2 (en) 2003-07-18 2006-03-07 Lsi Logic Corporation Intra 4×4 modes 3, 7 and 8 availability determination intra estimation and compensation
KR101082233B1 (en) 2004-01-20 2011-11-09 파나소닉 주식회사 Picture coding method, picture decoding method, picture coding apparatus, picture decoding apparatus, and program thereof
EP2373033A3 (en) 2004-01-30 2011-11-30 Panasonic Corporation Picture coding and decoding method, apparatus, and program thereof
US7933327B2 (en) * 2004-01-30 2011-04-26 Panasonic Corporation Moving picture coding method and moving picture decoding method
KR100657268B1 (en) * 2004-07-15 2006-12-14 학교법인 대양학원 Scalable encoding and decoding method of color video, and apparatus thereof
WO2006104366A1 (en) 2005-04-01 2006-10-05 Lg Electronics Inc. Method for scalably encoding and decoding video signal
TW200806040A (en) * 2006-01-05 2008-01-16 Nippon Telegraph & Telephone Video encoding method and decoding method, apparatuses therefor, programs therefor, and storage media for storing the programs
JP4650894B2 (en) * 2006-04-03 2011-03-16 三菱電機株式会社 Image decoding device
CN100578618C (en) * 2006-12-04 2010-01-06 华为技术有限公司 Decoding method and device
JP4956180B2 (en) * 2006-12-26 2012-06-20 株式会社東芝 Progressive scan conversion apparatus and progressive scan conversion method
JP5467637B2 (en) 2007-01-04 2014-04-09 トムソン ライセンシング Method and apparatus for reducing coding artifacts for illumination compensation and / or color compensation in multi-view coded video
US8938009B2 (en) 2007-10-12 2015-01-20 Qualcomm Incorporated Layered encoded bitstream structure
BRPI0818444A2 (en) 2007-10-12 2016-10-11 Qualcomm Inc adaptive encoding of video block header information
KR101624649B1 (en) 2009-08-14 2016-05-26 삼성전자주식회사 Method and apparatus for video encoding considering hierarchical coded block pattern, and method and apparatus for video decoding considering hierarchical coded block pattern
US9406252B2 (en) 2009-12-31 2016-08-02 Broadcom Corporation Adaptive multi-standard video coder supporting adaptive standard selection and mid-stream switch-over
KR101529992B1 (en) 2010-04-05 2015-06-18 삼성전자주식회사 Method and apparatus for video encoding for compensating pixel value of pixel group, method and apparatus for video decoding for the same
JP5676744B2 (en) * 2010-04-13 2015-02-25 フラウンホーファー−ゲゼルシャフト・ツール・フェルデルング・デル・アンゲヴァンテン・フォルシュング・アインゲトラーゲネル・フェライン Entropy coding
US9094658B2 (en) * 2010-05-10 2015-07-28 Mediatek Inc. Method and apparatus of adaptive loop filtering
US8660174B2 (en) * 2010-06-15 2014-02-25 Mediatek Inc. Apparatus and method of adaptive offset for video coding
US8902978B2 (en) 2010-05-30 2014-12-02 Lg Electronics Inc. Enhanced intra prediction mode signaling
US9456111B2 (en) * 2010-06-15 2016-09-27 Mediatek Inc. System and method for content adaptive clipping
JP5165083B2 (en) 2010-06-15 2013-03-21 聯發科技股▲ふん▼有限公司 Apparatus and method for adaptive offset of video coding
KR101967817B1 (en) * 2010-09-02 2019-04-11 엘지전자 주식회사 Method for encoding and decoding video, and apparatus using same
WO2012044093A2 (en) * 2010-09-29 2012-04-05 한국전자통신연구원 Method and apparatus for video-encoding/decoding using filter information prediction
CN106878729B (en) 2010-10-05 2019-09-24 寰发股份有限公司 Adaptive loop filter method and apparatus based on subregion basis
US8861617B2 (en) * 2010-10-05 2014-10-14 Mediatek Inc Method and apparatus of region-based adaptive loop filtering
KR102146117B1 (en) 2010-11-25 2020-08-20 엘지전자 주식회사 Method for signaling image information, and method for decoding image information using same
CN102595119B (en) * 2011-01-14 2014-03-12 华为技术有限公司 Stripe coding method and device thereof as well as stripe decoding method and device thereof
EP2882190B1 (en) * 2011-04-21 2018-11-14 HFI Innovation Inc. Method and apparatus for improved in-loop filtering
US9008170B2 (en) * 2011-05-10 2015-04-14 Qualcomm Incorporated Offset type and coefficients signaling method for sample adaptive offset
US20120294353A1 (en) * 2011-05-16 2012-11-22 Mediatek Inc. Apparatus and Method of Sample Adaptive Offset for Luma and Chroma Components
MX338669B (en) 2011-06-28 2016-04-27 Samsung Electronics Co Ltd Video encoding method using offset adjustments according to pixel classification and apparatus therefor, video decoding method and apparatus therefor.
US20130003829A1 (en) * 2011-07-01 2013-01-03 Kiran Misra System for initializing an arithmetic coder
US10070152B2 (en) * 2011-08-24 2018-09-04 Texas Instruments Incorporated Sample adaptive offset (SAO) parameter signaling
US9591331B2 (en) 2012-03-28 2017-03-07 Qualcomm Incorporated Merge signaling and loop filter on/off signaling
WO2013175756A1 (en) 2012-05-25 2013-11-28 パナソニック株式会社 Image encoding method, image decoding method, image encoding device, image decoding device, and image encoding/decoding device
CA2841112C (en) * 2012-05-25 2019-09-17 Panasonic Corporation Moving picture coding and decoding using context adaptive binary arithmetic coding with fixed probability for some sample adaptive offset parameters
CN106851275A (en) * 2012-05-29 2017-06-13 寰发股份有限公司 The processing unit and method of the adaptively sampled point skew of video data
MX2013014752A (en) * 2012-06-08 2014-03-06 Panasonic Corp Image encoding method, image decoding method, image encoding device, image decoding device, and image encoding and decoding device.
DK3297284T3 (en) 2012-06-11 2019-09-23 Samsung Electronics Co Ltd ENCODING AND DECODING VIDEOS SHARING SAO PARAMETERS ACCORDING TO A COLOR COMPONENT
CN103634603B (en) 2012-08-29 2017-07-07 中兴通讯股份有限公司 Video coding-decoding method, apparatus and system

Also Published As

Publication number Publication date
US20150189330A1 (en) 2015-07-02
JP2015519853A (en) 2015-07-09
SG10201605050RA (en) 2016-08-30
JP6034535B1 (en) 2016-11-30
KR20130138701A (en) 2013-12-19
KR101603703B1 (en) 2016-03-15
US20150172678A1 (en) 2015-06-18
KR101603687B1 (en) 2016-03-15
ES2652608T3 (en) 2018-02-05
CN104869396B (en) 2018-10-19
JP2016226021A (en) 2016-12-28
CY1119864T1 (en) 2018-06-27
PH12014502761A1 (en) 2015-03-02
HUE047442T2 (en) 2020-04-28
EP3300365A1 (en) 2018-03-28
CN104902273A (en) 2015-09-09
DK3300365T3 (en) 2019-09-16
PL3300364T3 (en) 2019-12-31
ES2746341T3 (en) 2020-03-05
TW201640896A (en) 2016-11-16
KR101529994B1 (en) 2015-06-29
TW201743613A (en) 2017-12-16
SG11201408246QA (en) 2015-03-30
MX342394B (en) 2016-09-28
US20180352233A1 (en) 2018-12-06
BR122019022458B1 (en) 2021-10-05
PH12018500803A1 (en) 2018-09-24
US20150181252A1 (en) 2015-06-25
JP2016226022A (en) 2016-12-28
CN104869394B (en) 2018-10-19
TWI519135B (en) 2016-01-21
KR101603704B1 (en) 2016-03-25
TWI549484B (en) 2016-09-11
CN104471938A (en) 2015-03-25
CN104869395A (en) 2015-08-26
MX352119B (en) 2017-11-09
CN104902272B (en) 2017-09-01
EP2854398A4 (en) 2015-12-23
US10609375B2 (en) 2020-03-31
CN108600752A (en) 2018-09-28
MY192340A (en) 2022-08-17
KR102028688B1 (en) 2019-10-04
BR112014030935B1 (en) 2022-06-14
IN2014MN02643A (en) 2015-10-16
PL3300365T3 (en) 2020-01-31
CN108965874B (en) 2021-04-27
CA2985277A1 (en) 2013-12-19
PH12014502761B1 (en) 2015-03-02
RS56675B1 (en) 2018-03-30
SG10201510803WA (en) 2016-01-28
EP2854398B1 (en) 2017-12-13
PH12018500805B1 (en) 2018-09-24
RU2666311C1 (en) 2018-09-06
RU2693307C1 (en) 2019-07-02
TW201537955A (en) 2015-10-01
PT2854398T (en) 2018-01-16
AU2013275098A1 (en) 2015-01-29
WO2013187654A1 (en) 2013-12-19
PH12018500804A1 (en) 2018-09-24
KR20150009497A (en) 2015-01-26
EP3297284B1 (en) 2019-09-11
CN108600752B (en) 2021-09-21
PL3297283T3 (en) 2019-12-31
CN108965875B (en) 2022-02-01
JP6034533B1 (en) 2016-11-30
CA2876167A1 (en) 2013-12-19
EP2854398A1 (en) 2015-04-01
CN108965876B (en) 2022-02-01
RU2595578C2 (en) 2016-08-27
CN104471938B (en) 2018-08-21
PL3297284T3 (en) 2019-12-31
PH12018500805A1 (en) 2018-09-24
HRP20171936T1 (en) 2018-02-09
CN104869394A (en) 2015-08-26
CA2876167C (en) 2018-01-02
CN108965876A (en) 2018-12-07
US20150189284A1 (en) 2015-07-02
PH12018500803B1 (en) 2018-09-24
EP3297284A1 (en) 2018-03-21
US10075718B2 (en) 2018-09-11
KR20150041768A (en) 2015-04-17
AU2016204699B2 (en) 2017-08-31
RU2638742C1 (en) 2017-12-15
CA2985277C (en) 2019-06-04
HUE047710T2 (en) 2020-05-28
JP5990326B2 (en) 2016-09-14
EP3300365B1 (en) 2019-09-04
RU2014153865A (en) 2016-08-10
TWI632805B (en) 2018-08-11
ES2746936T3 (en) 2020-03-09
AU2017265158B2 (en) 2018-11-08
LT2854398T (en) 2018-01-10
JP2016208545A (en) 2016-12-08
HUE037047T2 (en) 2018-08-28
HUE047709T2 (en) 2020-05-28
DK3297284T3 (en) 2019-09-23
PL2854398T3 (en) 2018-02-28
JP6081648B2 (en) 2017-02-15
KR20150041771A (en) 2015-04-17
CN108965875A (en) 2018-12-07
AU2017265158A1 (en) 2017-12-14
DK3300364T3 (en) 2019-09-16
EP3300364A1 (en) 2018-03-28
CN104869395B (en) 2018-12-14
KR101603705B1 (en) 2016-03-15
BR112014030935A8 (en) 2019-12-10
CN104902272A (en) 2015-09-09
TWI604718B (en) 2017-11-01
US9807392B2 (en) 2017-10-31
BR112014030935A2 (en) 2017-06-27
KR20150041770A (en) 2015-04-17
PH12018500808B1 (en) 2018-09-24
ES2755811T3 (en) 2020-04-23
CN104869396A (en) 2015-08-26
US20150181251A1 (en) 2015-06-25
EP3297283B1 (en) 2019-09-04
US9826235B2 (en) 2017-11-21
TWI687091B (en) 2020-03-01
MX2014015240A (en) 2015-02-20
NO2938424T3 (en) 2018-08-18
DK3297283T3 (en) 2019-09-16
CN104902273B (en) 2019-03-15
US9807393B2 (en) 2017-10-31
AU2013275098B2 (en) 2016-05-05
EP3297283A1 (en) 2018-03-21
CN108965874A (en) 2018-12-07
TW201412123A (en) 2014-03-16
DK2854398T3 (en) 2018-01-08
ES2746824T3 (en) 2020-03-09
KR20150041769A (en) 2015-04-17
HUE047873T2 (en) 2020-05-28
EP3300364B1 (en) 2019-09-04
PH12018500808A1 (en) 2018-09-24
TW201836355A (en) 2018-10-01
PH12018500804B1 (en) 2018-09-24
MY171147A (en) 2019-09-27
SG10201503487VA (en) 2015-06-29
US9219918B2 (en) 2015-12-22
JP6034534B1 (en) 2016-11-30
JP2016220238A (en) 2016-12-22
SI2854398T1 (en) 2018-02-28

Similar Documents

Publication Publication Date Title
AU2016204699A1 (en) Method and apparatus for encoding videos sharing sao parameter according to color component
US9854243B2 (en) Image processing apparatus and method
US9538195B2 (en) Video encoding method using offset adjustment according to classification of pixels by maximum encoding units and apparatus thereof, and video decoding method and apparatus thereof
US20160044334A1 (en) Method and apparatus for determining reference images for inter-prediction
EP3512203A1 (en) Image processing device and method
JP6193365B2 (en) Scalability information signaling in parameter sets
KR102336932B1 (en) Image processing device and method
RU2653308C2 (en) Device and method for image processing
CN104469376A (en) Image processing device and method
JP2012019490A (en) Image processing device and image processing method
KR102111175B1 (en) Image processing device and method
US20150117514A1 (en) Three-dimensional video encoding method using slice header and method therefor, and three-dimensional video decoding method and device therefor
US20160219275A1 (en) Image processing device and method
US10397583B2 (en) Image processing apparatus and method
US10764595B2 (en) Method and apparatus for encoding image by using atypical split, and method and apparatus for decoding image by using atypical split
JP2013251759A (en) Electronic apparatus and decoding method
Paulson New compression technology would improve online video quality

Legal Events

Date Code Title Description
FGA Letters patent sealed or granted (standard patent)