WO2020013498A1 - Method for processing an image service in a content service system, and associated device - Google Patents

Method for processing an image service in a content service system, and associated device

Info

Publication number
WO2020013498A1
WO2020013498A1 (PCT/KR2019/007970)
Authority
WO
WIPO (PCT)
Prior art keywords
intra prediction
prediction mode
information
image
mpm
Prior art date
Application number
PCT/KR2019/007970
Other languages
English (en)
Korean (ko)
Inventor
김승환
이령
임재현
Original Assignee
엘지전자 주식회사
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by 엘지전자 주식회사
Publication of WO2020013498A1

Classifications

    • H ELECTRICITY; H04 ELECTRIC COMMUNICATION TECHNIQUE; H04N PICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N 19/00 Methods or arrangements for coding, decoding, compressing or decompressing digital video signals
    • H04N 19/105 Selection of the reference unit for prediction within a chosen coding or prediction mode, e.g. adaptive choice of position and number of pixels used for prediction
    • H04N 19/11 Selection of coding mode or of prediction mode among a plurality of spatial predictive coding modes
    • H04N 19/132 Sampling, masking or truncation of coding units, e.g. adaptive resampling, frame skipping, frame interpolation or high-frequency transform coefficient masking
    • H04N 19/176 Adaptive coding characterised by the coding unit, the unit being an image region, the region being a block, e.g. a macroblock
    • H04N 19/593 Predictive coding involving spatial prediction techniques
    • H04N 19/70 Characterised by syntax aspects related to video coding, e.g. related to compression standards
    • H04N 21/2343 Processing of video elementary streams involving reformatting operations of video signals for distribution or compliance with end-user requests or end-user device requirements

Definitions

  • the present invention relates to a method for processing a content service using image coding technology and, more particularly, to a video service processing method and apparatus using image information including intra prediction information.
  • the demand for high resolution and high quality images such as high definition (HD) images and ultra high definition (UHD) images is increasing in various fields.
  • as the resolution and quality of image data increase, the amount of information or bits to be transmitted increases relative to existing image data. Therefore, when image data is transmitted using a medium such as a conventional wired/wireless broadband line, or stored using a conventional storage medium, transmission and storage costs increase.
  • a high efficiency image compression technique is required to effectively transmit, store, and reproduce high resolution, high quality image information.
  • An object of the present invention is to provide a method and apparatus for improving image coding efficiency.
  • Another object of the present invention is to provide a method and apparatus for coding intra prediction information.
  • Another object of the present invention is to provide a method and apparatus for coding information indicating an intra prediction mode of a current block among remaining intra prediction modes.
  • Another object of the present invention is to provide an image service processing method and apparatus for displaying a main image and / or an auxiliary image related to the main image derived based on the received image information.
  • a video service processing method performed by a digital device includes receiving image information, decoding a main image based on the image information, displaying the decoded main image in a first region of a display, and displaying an auxiliary image in a second region of the display.
  • decoding the main image comprises deriving an intra prediction mode of a current block based on remaining intra prediction mode information included in the image information, deriving a prediction sample of the current block based on the intra prediction mode, and deriving a reconstructed picture based on the prediction sample, wherein the remaining intra prediction mode information is coded through a truncated binary (TB) binarization process.
  • a digital device for performing video service processing may include a network interface unit, a wireless communication unit, or a broadcast receiving unit for receiving image information, a controller for decoding a main image based on the image information, and a display unit configured to display the decoded main image in a first region of the display and an auxiliary image in a second region of the display, wherein the controller derives an intra prediction mode of the current block based on remaining intra prediction mode information included in the image information, derives a prediction sample of the current block based on the intra prediction mode, and derives a reconstructed picture based on the prediction sample, and the remaining intra prediction mode information is coded through a truncated binary (TB) binarization process.
  • intra prediction information can be coded based on a truncated binary code, which is a variable-length binary code, thereby reducing the signaling overhead of the intra prediction information for indicating an intra prediction mode and improving overall coding efficiency.
  • an intra prediction mode with a high probability of being selected can be represented by intra prediction information corresponding to a binary value of fewer bits, thereby reducing the signaling overhead of the intra prediction information and improving overall coding efficiency.
  • an image service processing method for displaying a main image and an auxiliary image related to the main image for image information received by a digital device may be proposed, thereby providing a video service to a user more efficiently.
  • FIG. 1 is a diagram schematically illustrating a configuration of a video encoding apparatus to which the present invention may be applied.
  • FIG. 2 shows an example of an image encoding method performed by a video encoding apparatus.
  • FIG. 3 is a diagram schematically illustrating a configuration of a video decoding apparatus to which the present invention may be applied.
  • FIG. 4 shows an example of an image decoding method performed by a decoding apparatus.
  • FIG. 5 shows an example of an intra prediction based image encoding method.
  • FIG. 6 shows an example of an intra prediction based image decoding method.
  • FIG. 9 exemplarily shows the neighboring samples used for intra prediction of the current block.
  • FIG. 11 exemplarily illustrates a method of coding information for indicating n intra prediction modes including MPM candidates and remaining intra prediction modes.
  • FIG. 12 exemplarily illustrates a method of coding information for indicating n intra prediction modes including MPM candidates and remaining intra prediction modes.
  • FIG. 13 schematically illustrates an image encoding method by an encoding apparatus according to the present invention.
  • FIG. 14 schematically illustrates an encoding apparatus for performing an image encoding method according to the present invention.
  • FIG. 16 schematically shows a decoding apparatus for performing an image decoding method according to the present invention.
  • FIG. 17 exemplarily shows a structure diagram of a content streaming system to which the present invention is applied.
  • FIG. 18 is a diagram schematically illustrating an example of a service system including a digital device.
  • FIG. 19 is a block diagram illustrating an embodiment of a digital device.
  • FIG. 20 is a block diagram illustrating another embodiment of a digital device.
  • FIG. 21 is a block diagram illustrating another embodiment of a digital device.
  • FIG. 22 is a block diagram illustrating an exemplary embodiment of a detailed configuration of a controller of the digital apparatus illustrated in FIGS. 19 to 21.
  • FIG. 23 is a diagram illustrating an example in which a screen of a digital device simultaneously displays a main image and a sub image, according to an exemplary embodiment.
  • FIG. 24 schematically illustrates a video service processing method by a digital device according to the present invention.
  • the configurations in the drawings described in the present invention are shown independently for convenience of description of their different characteristic functions; this does not mean that each configuration is implemented as separate hardware or separate software.
  • two or more of the configurations may be combined to form one configuration, or one configuration may be divided into a plurality of configurations.
  • Embodiments in which the configurations are integrated and/or separated are also included in the scope of the present invention without departing from the spirit of the present invention.
  • the present invention relates to video / image coding.
  • the methods/embodiments disclosed in the present invention may be applied to methods disclosed in the versatile video coding (VVC) standard, the essential video coding (EVC) standard, the AOMedia Video 1 (AV1) standard, the 2nd generation of audio video coding standard (AVS2), or a next-generation video/image coding standard (e.g., H.267, H.268, etc.).
  • a picture generally refers to a unit representing one image in a specific time period, and a slice is a unit constituting part of a picture in coding.
  • one picture may be composed of a plurality of slices and, if necessary, the terms picture and slice may be used interchangeably.
  • a pixel or a pel may refer to the minimum unit constituting one picture (or image). Also, 'sample' may be used as a term corresponding to a pixel.
  • a sample may generally represent a pixel or a value of a pixel, may represent only a pixel/pixel value of the luma component, or may represent only a pixel/pixel value of the chroma component.
  • a unit represents the basic unit of image processing.
  • the unit may include at least one of a specific region of the picture and information related to the region.
  • the unit may be used interchangeably with terms such as block or area in some cases.
  • an M×N block may represent a set of samples or transform coefficients composed of M columns and N rows.
  • FIG. 1 is a diagram schematically illustrating a configuration of a video encoding apparatus to which the present invention may be applied.
  • the video encoding apparatus 100 may include a picture splitter 105, a predictor 110, a residual processor 120, an entropy encoder 130, an adder 140, a filter 150, and a memory 160.
  • the residual processor 120 may include a subtraction unit 121, a transform unit 122, a quantization unit 123, a reordering unit 124, an inverse quantization unit 125, and an inverse transform unit 126.
  • the picture splitter 105 may split the input picture into at least one processing unit.
  • the processing unit may be called a coding unit (CU).
  • the coding unit may be recursively split from the largest coding unit (LCU) according to a quad-tree binary-tree (QTBT) structure.
  • one coding unit may be divided into a plurality of coding units of a deeper depth based on a quad tree structure and / or a binary tree structure.
  • the quad tree structure may be applied first and the binary tree structure may be applied later.
  • the binary tree structure may be applied first.
  • the coding procedure according to the present invention may be performed based on the final coding unit that is no longer split.
  • the largest coding unit may be used directly as the final coding unit based on coding efficiency according to image characteristics or, if necessary, the coding unit may be recursively split into coding units of lower depths so that a coding unit of optimal size is used as the final coding unit.
  • the coding procedure may include a procedure of prediction, transform, and reconstruction, which will be described later.
  • the processing unit may include a coding unit (CU), a prediction unit (PU), or a transform unit (TU).
  • the coding unit may be split from the largest coding unit (LCU) into coding units of deeper depths along the quad tree structure.
  • the largest coding unit may be used directly as the final coding unit based on coding efficiency according to image characteristics or, if necessary, the coding unit may be recursively split into coding units of lower depths so that a coding unit of optimal size is used as the final coding unit. If a smallest coding unit (SCU) is set, the coding unit may not be split into coding units smaller than the smallest coding unit.
  • the final coding unit refers to a coding unit that is the basis of partitioning or partitioning into a prediction unit or a transform unit.
  • the prediction unit is a unit partitioned from the coding unit and may be a unit of sample prediction. In this case, the prediction unit may be divided into sub blocks.
  • the transform unit may be divided along the quad tree structure from the coding unit, and may be a unit for deriving a transform coefficient and / or a unit for deriving a residual signal from the transform coefficient.
  • a coding unit may be called a coding block (CB), a prediction unit a prediction block (PB), and a transform unit a transform block (TB).
  • a prediction block or prediction unit may mean a specific area in the form of a block within a picture, and may include an array of prediction samples.
  • a transform block or a transform unit may mean a specific area in a block form within a picture, and may include an array of transform coefficients or residual samples.
  • the prediction unit 110 may perform a prediction on a block to be processed (hereinafter, referred to as a current block) and generate a predicted block including prediction samples of the current block.
  • the unit of prediction performed by the prediction unit 110 may be a coding block, a transform block, or a prediction block.
  • the prediction unit 110 may determine whether intra prediction or inter prediction is applied to the current block. As an example, the prediction unit 110 may determine whether intra prediction or inter prediction is applied on a CU basis.
  • the prediction unit 110 may derive a prediction sample for the current block based on reference samples outside the current block in the picture to which the current block belongs (hereinafter, referred to as the current picture). In this case, the prediction unit 110 may (i) derive the prediction sample based on an average or interpolation of neighboring reference samples of the current block, or (ii) derive the prediction sample based on a reference sample present in a specific (prediction) direction with respect to the prediction sample among the neighboring reference samples of the current block. Case (i) may be called a non-directional or non-angular mode, and case (ii) may be called a directional or angular mode.
  • the prediction mode may have, for example, 33 directional prediction modes and at least two non-directional modes.
  • the non-directional mode may include a DC prediction mode and a planar mode.
  • the prediction unit 110 may determine the prediction mode applied to the current block by using the prediction mode applied to the neighboring block.
  • the prediction unit 110 may derive the prediction sample for the current block based on the sample specified by the motion vector on the reference picture.
  • the prediction unit 110 may apply one of a skip mode, a merge mode, and a motion vector prediction (MVP) mode to derive a prediction sample for the current block.
  • the prediction unit 110 may use the motion information of the neighboring block as the motion information of the current block.
  • in the skip mode, unlike the merge mode, the difference (residual) between the prediction sample and the original sample is not transmitted.
  • in the MVP mode, the motion vector of the current block may be derived using the motion vector of a neighboring block as a motion vector predictor.
  • the neighboring block may include a spatial neighboring block existing in the current picture and a temporal neighboring block present in the reference picture.
  • a reference picture including the temporal neighboring block may be called a collocated picture (colPic).
  • the motion information may include a motion vector and a reference picture index.
  • information such as prediction mode information and motion information may be entropy-encoded and output in the form of a bitstream.
  • the highest picture on the reference picture list may be used as the reference picture.
  • Reference pictures included in a reference picture list may be sorted based on a difference in a picture order count (POC) between a current picture and a corresponding reference picture.
  • the subtraction unit 121 generates a residual sample which is a difference between the original sample and the prediction sample.
  • when the skip mode is applied, however, residual samples may not be generated as described above.
  • the transform unit 122 generates transform coefficients by transforming the residual sample in units of transform blocks.
  • the transform unit 122 may perform the transform according to the size of the transform block and the prediction mode applied to the coding block or prediction block that spatially overlaps the transform block. For example, if intra prediction is applied to the coding block or prediction block that overlaps the transform block and the transform block is a 4×4 residual array, the residual sample may be transformed using a discrete sine transform (DST) kernel; in other cases, the residual sample may be transformed using a discrete cosine transform (DCT) kernel.
  • the quantization unit 123 may quantize the transform coefficients to generate quantized transform coefficients.
  • the reordering unit 124 rearranges the quantized transform coefficients.
  • the reordering unit 124 may reorder the quantized transform coefficients in block form into a one-dimensional vector form through a coefficient scanning method. Although the reordering unit 124 is described as a separate component, it may be part of the quantization unit 123.
  • the entropy encoding unit 130 may perform entropy encoding on the quantized transform coefficients.
  • Entropy encoding may include, for example, encoding methods such as exponential Golomb, context-adaptive variable length coding (CAVLC), context-adaptive binary arithmetic coding (CABAC), and the like.
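  • as an illustrative aside (not part of the claimed method), a minimal Python sketch of order-0 exponential Golomb binarization, one of the entropy coding building blocks named above, may look as follows; the function names are hypothetical.

        def exp_golomb_encode(value: int) -> str:
            # order-0 exp-Golomb code: N leading zeros followed by
            # the (N + 1)-bit binary representation of value + 1
            assert value >= 0
            code = bin(value + 1)[2:]              # binary string of value + 1
            return "0" * (len(code) - 1) + code

        def exp_golomb_decode(bits: str) -> int:
            # inverse mapping: count leading zeros, then read that many more bits
            zeros = len(bits) - len(bits.lstrip("0"))
            return int(bits[zeros:2 * zeros + 1], 2) - 1

        # values 0..4 map to 1, 010, 011, 00100, 00101
        assert [exp_golomb_encode(v) for v in range(5)] == ["1", "010", "011", "00100", "00101"]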
  • the entropy encoding unit 130 may encode information necessary for video reconstruction other than the quantized transform coefficient (for example, a value of a syntax element) together or separately. Entropy encoded information may be transmitted or stored in units of network abstraction layer (NAL) units in the form of bitstreams.
  • the inverse quantization unit 125 inversely quantizes the values quantized by the quantization unit 123 (the quantized transform coefficients), and the inverse transform unit 126 inversely transforms the values inversely quantized by the inverse quantization unit 125 to generate a residual sample.
  • the adder 140 reconstructs the picture by combining the residual sample and the predictive sample.
  • the residual sample and the predictive sample may be added in units of blocks to generate a reconstructed block.
  • the adder 140 may be part of the predictor 110.
  • the adder 140 may be called a reconstruction unit or a reconstructed block generator.
  • the filter unit 150 may apply a deblocking filter and / or a sample adaptive offset to the reconstructed picture. Through deblocking filtering and / or sample adaptive offset, the artifacts of the block boundaries in the reconstructed picture or the distortion in the quantization process can be corrected.
  • the sample adaptive offset may be applied on a sample basis and may be applied after the process of deblocking filtering is completed.
  • the filter unit 150 may apply an adaptive loop filter (ALF) to the reconstructed picture. ALF may be applied to the reconstructed picture after the deblocking filter and / or sample adaptive offset is applied.
  • the memory 160 may store reconstructed pictures (decoded pictures) or information necessary for encoding / decoding.
  • the reconstructed picture may be a reconstructed picture after the filtering process is completed by the filter unit 150.
  • the stored reconstructed picture may be used as a reference picture for (inter) prediction of another picture.
  • the memory 160 may store (reference) pictures used for inter prediction.
  • pictures used for inter prediction may be designated by a reference picture set or a reference picture list.
  • the image encoding method may include block partitioning, intra / inter prediction, transform, quantization, and entropy encoding.
  • the current picture may be divided into a plurality of blocks, a prediction block of the current block may be generated through intra/inter prediction, and a residual block of the current block may be generated by subtracting the prediction block from the input block of the current block.
  • a coefficient block, that is, the transform coefficients of the current block, may be generated through the transform on the residual block.
  • the transform coefficients may be quantized and entropy encoded and stored in the bitstream.
  • FIG. 3 is a diagram schematically illustrating a configuration of a video decoding apparatus to which the present invention may be applied.
  • the video decoding apparatus 300 may include an entropy decoding unit 310, a residual processor 320, a predictor 330, an adder 340, a filter 350, and a memory 360.
  • the residual processor 320 may include a reordering unit 321, an inverse quantization unit 322, and an inverse transform unit 323.
  • the video decoding apparatus 300 may reconstruct the video in response to a process in which the video information is processed in the video encoding apparatus.
  • the video decoding apparatus 300 may perform video decoding using a processing unit applied in the video encoding apparatus.
  • the processing unit block of video decoding may be, for example, a coding unit, and in another example, a coding unit, a prediction unit, or a transform unit.
  • the coding unit may be split along the quad tree structure and / or binary tree structure from the largest coding unit.
  • the prediction unit and the transform unit may be further used in some cases, in which case the prediction block is a block derived or partitioned from the coding unit and may be a unit of sample prediction. At this point, the prediction unit may be divided into subblocks.
  • the transform unit may be divided along the quad tree structure from the coding unit, and may be a unit for deriving a transform coefficient or a unit for deriving a residual signal from the transform coefficient.
  • the entropy decoding unit 310 may parse the bitstream and output information necessary for video reconstruction or picture reconstruction. For example, the entropy decoding unit 310 may decode information in the bitstream based on a coding method such as exponential Golomb coding, CAVLC, or CABAC, and output quantized values of syntax elements required for video reconstruction and transform coefficients for the residual.
  • more specifically, the CABAC entropy decoding method may receive a bin corresponding to each syntax element in the bitstream, determine a context model using decoding-target syntax element information, decoding information of neighboring blocks and the decoding-target block, or information on symbols/bins decoded in a previous step, predict the occurrence probability of a bin according to the determined context model, and perform arithmetic decoding of the bin to generate a symbol corresponding to the value of each syntax element.
  • in this case, the CABAC entropy decoding method may, after determining the context model, update the context model using information on the decoded symbol/bin for the context model of the next symbol/bin.
  • information related to prediction among the information decoded by the entropy decoding unit 310 may be provided to the prediction unit 330, and the residual values on which entropy decoding has been performed by the entropy decoding unit 310, that is, the quantized transform coefficients, may be input to the reordering unit 321.
  • the reordering unit 321 may rearrange the quantized transform coefficients into a two-dimensional block.
  • the reordering unit 321 may perform reordering in response to the coefficient scanning performed by the encoding apparatus. Although the reordering unit 321 is described as a separate component, it may be part of the inverse quantization unit 322.
  • the inverse quantization unit 322 may output the transform coefficients by inversely quantizing the quantized transform coefficients based on the (inverse) quantization parameter.
  • information for deriving a quantization parameter may be signaled from the encoding apparatus.
  • the inverse transform unit 323 may derive residual samples by inversely transforming the transform coefficients.
  • the prediction unit 330 may perform prediction on the current block and generate a predicted block including prediction samples for the current block.
  • the unit of prediction performed by the prediction unit 330 may be a coding block, a transform block, or a prediction block.
  • the prediction unit 330 may determine whether to apply intra prediction or inter prediction based on the information about the prediction.
  • a unit for determining which of intra prediction and inter prediction is to be applied and a unit for generating a prediction sample may be different.
  • the unit for generating a prediction sample in inter prediction and intra prediction may also be different.
  • whether to apply inter prediction or intra prediction may be determined in units of CUs.
  • for example, in inter prediction, a prediction mode may be determined and a prediction sample generated in units of PUs, while in intra prediction, a prediction mode may be determined in units of PUs and a prediction sample generated in units of TUs.
  • the prediction unit 330 may derive the prediction sample for the current block based on the neighbor reference samples in the current picture.
  • the prediction unit 330 may derive the prediction sample for the current block by applying the directional mode or the non-directional mode based on the neighbor reference samples of the current block.
  • the prediction mode to be applied to the current block may be determined using the intra prediction mode of the neighboring block.
  • the prediction unit 330 may derive the prediction sample for the current block based on the sample specified on the reference picture by the motion vector on the reference picture.
  • the prediction unit 330 may derive the prediction sample for the current block by applying any one of a skip mode, a merge mode, and an MVP mode.
  • motion information required for inter prediction of the current block provided by the video encoding apparatus, for example, information on a motion vector and a reference picture index, may be obtained or derived based on the information about the prediction.
  • the motion information of the neighboring block may be used as the motion information of the current block.
  • the neighboring block may include a spatial neighboring block and a temporal neighboring block.
  • the predictor 330 may construct a merge candidate list using motion information of available neighboring blocks, and may use information indicated by the merge index on the merge candidate list as a motion vector of the current block.
  • the merge index may be signaled from the encoding device.
  • the motion information may include a motion vector and a reference picture. When the motion information of the temporal neighboring block is used in the skip mode and the merge mode, the highest picture on the reference picture list may be used as the reference picture.
  • the difference (residual) between the prediction sample and the original sample is not transmitted.
  • the motion vector of the current block may be derived using the motion vector of the neighboring block as a motion vector predictor.
  • the neighboring block may include a spatial neighboring block and a temporal neighboring block.
  • a merge candidate list may be generated by using a motion vector of a reconstructed spatial neighboring block and / or a motion vector corresponding to a Col block, which is a temporal neighboring block.
  • the motion vector of the candidate block selected from the merge candidate list is used as the motion vector of the current block.
  • the information about the prediction may include a merge index indicating a candidate block having an optimal motion vector selected from candidate blocks included in the merge candidate list.
  • the prediction unit 330 may derive the motion vector of the current block by using the merge index.
  • a motion vector predictor candidate list may be generated using a motion vector of a reconstructed spatial neighboring block and / or a motion vector corresponding to a Col block which is a temporal neighboring block.
  • the prediction information may include a prediction motion vector index indicating an optimal motion vector selected from the motion vector candidates included in the list.
  • the prediction unit 330 may select the predicted motion vector of the current block from the motion vector candidates included in the motion vector candidate list using the motion vector index.
  • the prediction unit of the encoding apparatus may obtain a motion vector difference (MVD) between the motion vector of the current block and the motion vector predictor, encode the MVD, and output it in the form of a bitstream. That is, the MVD may be obtained by subtracting the motion vector predictor from the motion vector of the current block.
  • the prediction unit 330 may obtain a motion vector difference included in the information about the prediction, and derive the motion vector of the current block by adding the motion vector difference and the motion vector predictor.
  • the prediction unit may also obtain or derive a reference picture index or the like indicating a reference picture from the information about the prediction.
  • the adder 340 may reconstruct the current block or the current picture by adding the residual sample and the predictive sample.
  • the adder 340 may reconstruct the current picture by adding the residual sample and the predictive sample in units of blocks. Since the residual is not transmitted when the skip mode is applied, the prediction sample may be a reconstruction sample.
  • although the adder 340 is described as a separate component, it may be part of the predictor 330. Meanwhile, the adder 340 may be called a reconstruction unit or a reconstructed block generator.
  • the filter unit 350 may apply deblocking filtering, sample adaptive offset, and/or ALF to the reconstructed picture.
  • the sample adaptive offset may be applied in units of samples and may be applied after deblocking filtering.
  • ALF may be applied after deblocking filtering and / or sample adaptive offset.
  • the memory 360 may store reconstructed pictures (decoded pictures) or information necessary for decoding.
  • the reconstructed picture may be a reconstructed picture after the filtering process is completed by the filter 350.
  • the memory 360 may store pictures used for inter prediction.
  • pictures used for inter prediction may be designated by a reference picture set or a reference picture list.
  • the reconstructed picture can be used as a reference picture for another picture.
  • the memory 360 may output the reconstructed picture in the output order.
  • the image decoding method may include entropy decoding, inverse quantization, inverse transform, and intra / inter prediction.
  • the reverse process of the encoding method may be performed in the decoding apparatus.
  • quantized transform coefficients may be obtained through entropy decoding of the bitstream, a coefficient block of the current block, that is, the transform coefficients, may be derived through inverse quantization of the quantized transform coefficients, the residual block of the current block may be derived through inverse transform of the transform coefficients, and a reconstructed block may be derived by adding the prediction block of the current block derived through intra/inter prediction to the residual block.
  • in prediction, the correlation between samples may be used, and the difference between the original block and the prediction block, that is, the residual, may be obtained.
  • the above-described transformation and quantization may be applied to the residual, and thus spatial redundancy may be removed.
  • the encoding method and decoding method in which intra prediction is used may be as described below.
  • the encoding apparatus may derive an intra prediction mode for the current block (S500) and derive neighboring reference samples of the current block (S510).
  • the encoding apparatus may generate prediction samples in the current block based on the intra prediction mode and the neighboring reference samples (S520).
  • the encoding apparatus may perform a prediction sample filtering procedure (S530). Predictive sample filtering may be referred to as post filtering. Some or all of the prediction samples may be filtered by the prediction sample filtering procedure. In some cases, the S530 procedure may be omitted.
  • the encoding apparatus may generate residual samples for the current block based on the (filtered) prediction sample (S540).
  • the encoding apparatus may encode image information including prediction mode information indicating the intra prediction mode and residual information regarding the residual samples (S550).
  • the encoded image information may be output in the form of a bitstream.
  • the output bitstream may be delivered to the decoding apparatus via a storage medium or a network.
  • the decoding apparatus may perform an operation corresponding to the operation performed by the encoding apparatus.
  • the decoding apparatus may derive the intra prediction mode for the current block based on the received prediction mode information (S600).
  • the decoding apparatus may derive neighboring reference samples of the current block (S610).
  • the decoding apparatus may generate prediction samples in the current block based on the intra prediction mode and the neighboring reference samples (S620).
  • the decoding apparatus may perform a prediction sample filtering procedure (S630). Predictive sample filtering may be referred to as post filtering. Some or all of the prediction samples may be filtered by the prediction sample filtering procedure. In some cases, the S630 procedure may be omitted.
  • the decoding apparatus may generate residual samples for the current block based on the received residual information (S640).
  • the decoding apparatus may generate reconstructed samples for the current block based on the (filtered) prediction samples and the residual samples, and generate a reconstructed picture based on these (S650).
  • the encoding apparatus/decoding apparatus may derive an intra prediction mode for the current block and derive a prediction sample of the current block based on the intra prediction mode. That is, the encoding apparatus/decoding apparatus may derive the prediction sample of the current block by applying a directional mode or a non-directional mode based on the neighboring reference samples of the current block.
  • for example, the intra prediction modes may include two non-directional (or non-angular) intra prediction modes and 65 directional (or angular) intra prediction modes.
  • the non-directional intra prediction modes may include planar intra prediction mode 0 and DC intra prediction mode 1, and the directional intra prediction modes may include the 65 intra prediction modes numbered 2 to 66.
  • the present invention may be applied to a case where the number of intra prediction modes is different.
  • the intra prediction mode 67 may further be used, and the intra prediction mode 67 may represent a linear model (LM) mode.
  • an intra prediction mode having horizontal directionality and an intra prediction mode having vertical directionality may be distinguished from the intra prediction mode 34 having a left upward diagonal prediction direction.
  • H and V denote horizontal and vertical directions, respectively, and numbers from -32 to 32 represent a displacement of 1/32 on a sample grid position.
  • intra prediction modes 2 to 33 have horizontal directionality, and intra prediction modes 34 to 66 have vertical directionality.
  • Intra prediction mode 18 and intra prediction mode 50 indicate a horizontal intra prediction mode and a vertical intra prediction mode, respectively, and an intra prediction mode 2 indicates a left downward diagonal intra prediction mode
  • the 34th intra prediction mode may be referred to as a left upward diagonal intra prediction mode
  • the 66th intra prediction mode may be referred to as a right upward diagonal intra prediction mode.
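  • as a small illustrative sketch (a summary under the mode layout described above, not text from the specification), the classification of the 67 modes can be captured in Python; the helper name classify_intra_mode is hypothetical.

        def classify_intra_mode(mode: int) -> str:
            # planar (0) and DC (1) are non-directional; 2..66 are angular
            if mode == 0:
                return "planar (non-directional)"
            if mode == 1:
                return "DC (non-directional)"
            if 2 <= mode <= 33:
                return "angular, horizontal directionality"  # mode 18: pure horizontal
            if 34 <= mode <= 66:
                return "angular, vertical directionality"    # mode 50: pure vertical
            raise ValueError("mode out of range 0..66")

        assert classify_intra_mode(18) == "angular, horizontal directionality"
        assert classify_intra_mode(50) == "angular, vertical directionality"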
  • the prediction mode information may include flag information (e.g., prev_intra_luma_pred_flag) indicating whether the most probable mode (MPM) or a remaining mode is applied to the current block.
  • the prediction mode information may further include index information (eg, mpm_idx) indicating one of the intra prediction mode candidates (eg, MPM candidates).
  • the intra prediction mode candidates for the current block may be composed of an MPM candidate list or an MPM list. That is, the MPM candidate list or the MPM list for the current block may be configured, and the MPM candidate list or the MPM list may include the intra prediction mode candidates.
  • the prediction mode information may further include remaining intra prediction mode information (e.g., rem_intra_luma_pred_mode) indicating one of the remaining intra prediction modes other than the intra prediction mode candidates.
  • the remaining intra prediction mode information may also be referred to as MPM remainder information.
  • the decoding apparatus may determine the intra prediction mode of the current block based on the prediction mode information.
  • the prediction mode information may be encoded / decoded through a coding method described below.
  • the prediction mode information may be encoded/decoded through entropy coding (e.g., CABAC, CAVLC) based on a truncated binary code or a truncated rice binary code.
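  • assuming the signaling just described (an MPM flag, an MPM index, and remaining intra prediction mode information), the decoder-side mode derivation can be sketched as below. This is a simplified illustration under those assumptions, not the normative parsing process.

        def derive_intra_mode(mpm_flag, mpm_list, mpm_idx, rem_mode, num_modes=67):
            # MPM flag == 1: the mode is one of the MPM candidates
            if mpm_flag == 1:
                return mpm_list[mpm_idx]
            # otherwise rem_mode indexes the remaining modes, i.e. all
            # intra prediction modes with the MPM candidates removed
            remaining = [m for m in range(num_modes) if m not in mpm_list]
            return remaining[rem_mode]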
  • the encoding apparatus/decoding apparatus may construct reference samples (S800), derive a prediction sample for the current block based on the reference samples (S810), and, if necessary, perform post filtering on the prediction sample.
  • the prediction unit of the encoding apparatus/decoding apparatus may use the intra prediction mode and the known neighboring reference samples to generate the unknown samples of the current block.
  • the neighboring samples of the current block may include 2W upper neighboring samples, 2H left neighboring samples, and one upper-left corner neighboring sample. For example, when the size of the current block is W×H and the top-left sample position of the current block is (0, 0), the left neighboring samples are p[-1][0] to p[-1][2H-1], the upper-left corner neighboring sample is p[-1][-1], and the upper neighboring samples are p[0][-1] to p[2W-1][-1].
  • a prediction sample of a target sample of the current block may be derived based on a neighboring sample located in the prediction direction of the intra prediction mode of the current block from the position of the target sample. Meanwhile, neighboring samples of a plurality of lines may be used for intra prediction of the current block.
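  • for concreteness, the coordinate layout above (2W upper samples, 2H left samples, and one corner sample, with the top-left sample of the current block at (0, 0)) can be enumerated with the following illustrative sketch.

        def neighbor_sample_positions(w: int, h: int):
            # positions of the reference samples described above
            left   = [(-1, y) for y in range(2 * h)]   # p[-1][0] .. p[-1][2H-1]
            corner = [(-1, -1)]                        # p[-1][-1]
            top    = [(x, -1) for x in range(2 * w)]   # p[0][-1] .. p[2W-1][-1]
            return left, corner, top

        left, corner, top = neighbor_sample_positions(4, 4)
        assert len(left) == 8 and len(top) == 8 and corner == [(-1, -1)]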
  • the encoding apparatus may jointly optimize bit rate and distortion to determine an optimal intra prediction mode for the current block. Thereafter, the encoding apparatus may code the prediction mode information for the optimal intra prediction mode into the bitstream. The decoding apparatus may derive the optimal intra prediction mode by parsing the prediction mode information and perform intra prediction of the current block based on the intra prediction mode.
  • the increased number of intra prediction modes requires efficient intra prediction mode coding to minimize signaling overhead.
  • the present invention proposes embodiments for reducing signaling overhead in transmitting information on intra prediction.
  • Floor(x) may represent the maximum integer value not greater than x, Log2(u) may represent the base-2 logarithm of u, and Ceil(x) may represent the minimum integer value not less than x. For example, Floor(5.93) represents 5, since the maximum integer value not greater than 5.93 is 5.
  • x >> y may represent an operator for right-shifting x by y, and x << y may represent an operator for left-shifting x by y.
  • the current block and the neighboring block to be coded may have similar image characteristics, and thus, the current block and the neighboring block have a high probability of having the same or similar intra prediction mode.
  • the MPM list of the current block may be determined based on the intra prediction mode of the neighboring block. That is, for example, the MPM list may include the intra prediction mode of the neighboring block as the MPM candidate.
  • the neighboring blocks of the current block used to construct the MPM list of the current block may be represented as follows.
  • the neighboring blocks of the current block may include a left neighboring block, an upper neighboring block, a lower-left neighboring block, an upper-right neighboring block, and/or an upper-left neighboring block.
  • for example, when the size of the current block is W×H and the top-left sample position of the current block is (0, 0), the left neighboring block is a block including the sample at coordinates (-1, H-1), the upper neighboring block is a block including the sample at coordinates (W-1, -1), the upper-right neighboring block is a block including the sample at coordinates (W, -1), the lower-left neighboring block is a block including the sample at coordinates (-1, H), and the upper-left neighboring block is a block including the sample at coordinates (-1, -1).
  • the decoding apparatus may construct an MPM list of the current block and derive the MPM candidate indicated by the MPM index among the MPM candidates of the MPM list as the intra prediction mode of the current block. Signaling overhead can be minimized by signaling the MPM index when one of the MPM candidates is the optimal intra prediction mode for the current block.
  • Indexes indicating the MPM candidates may be coded with a truncated unary code. In other words, the MPM index may be binarized using the truncated unary code.
  • the value of the MPM index binarized through the truncated unary code may be represented as in the following table.
  • the MPM index may be derived as a binary value of 1 to 5 bins according to the value it represents. Since a smaller MPM index value binarized through the truncated unary code yields a binary value with fewer bins, the order of the MPM candidates may be important for reducing the amount of bits.
  • the truncated unary code may also be referred to as a truncated rice code.
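  • a minimal sketch of the truncated unary binarization of the MPM index (cMax of 5 for a six-candidate list, giving the 1-bin to 5-bin codes mentioned above); the code below is illustrative only.

        def truncated_unary(value: int, c_max: int) -> str:
            # 'value' ones followed by a terminating 0; the maximum
            # value c_max drops the terminator
            return "1" * value + ("" if value == c_max else "0")

        # MPM index values 0..5 -> 1 to 5 bins
        assert [truncated_unary(v, 5) for v in range(6)] == ["0", "10", "110", "1110", "11110", "11111"]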
  • the list of Most Probable Modes (MPMs) of the current block may include six MPM candidates, and the MPM candidates may include the intra prediction mode of the left neighboring block, the intra prediction mode of the upper neighboring block, the planar intra prediction mode, the DC intra prediction mode, the intra prediction mode of the lower-left neighboring block, the intra prediction mode of the upper-right neighboring block, and the intra prediction mode of the upper-left neighboring block.
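  • under the candidate order just listed (left, above, planar, DC, below-left, above-right, above-left), a six-entry MPM list can be assembled as in the hedged sketch below; the duplicate-skipping policy is an assumption for illustration.

        PLANAR, DC = 0, 1

        def build_mpm_list(left, above, below_left, above_right, above_left):
            # collect candidates in the stated order, skip duplicates,
            # and keep the first six unique modes
            order = [left, above, PLANAR, DC, below_left, above_right, above_left]
            mpm = []
            for mode in order:
                if mode is not None and mode not in mpm:
                    mpm.append(mode)
                if len(mpm) == 6:
                    break
            return mpm

        print(build_mpm_list(50, 8, 1, 66, 54))  # -> [50, 8, 0, 1, 66, 54]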
  • the MPM flag may be signaled to indicate an exception. That is, the MPM flag may indicate whether an intra prediction mode applied to the current block is included in the MPM candidates or in other intra prediction modes not included in the MPM candidates.
  • for example, when the value of the MPM flag is 1, the MPM flag may indicate that the intra prediction mode of the current block is included in the MPM candidates (MPM list), and when the value of the MPM flag is 0, the MPM flag may indicate that the intra prediction mode of the current block is included in the remaining intra prediction modes rather than in the MPM candidates (MPM list).
  • an optimal intra prediction mode for the current block may be coded using variable length coding or fixed length coding.
  • the number of MPM candidates included in the MPM list may be determined based on the number of intra prediction modes. For example, the number of MPM candidates may increase as the number of intra prediction modes increases, but this is not necessarily the case.
  • for example, the MPM list may include three, five, or six MPM candidates.
  • an index indicating an intra prediction mode applied to the current block may be coded using variable length coding or fixed length coding.
  • when the index is coded with variable length coding, the higher the probability of selecting the intra prediction modes of the preceding order (that is, the intra prediction modes corresponding to small index values), the more the bit amount of the prediction mode information indicating the intra prediction mode of the image can be reduced, so coding efficiency can be improved compared to using fixed length coding.
  • Truncated binary coding may be used with the variable length coding.
  • for example, when truncated binary coding is applied to a total of u symbols, the first l symbols may be coded using k bits, and the remaining u - l symbols, that is, the symbols other than the first l symbols among all u symbols, may be coded using k + 1 bits.
  • the first l symbols may represent the l symbols in the preceding order. Meanwhile, the symbols may be the values that the information can represent.
  • k may be derived as in the following equation: k = Floor(Log2(u)).
  • l may be derived as in the following equation: l = 2^(k+1) - u.
  • k and l depending on the number of symbols for which the truncated binary coding is used may be derived as shown in the following table.
  • the binarization value for each symbol according to the truncated binary coding may be derived as shown in the following table.
  • for example, when u is 61, symbols 0 to 2 may be coded with a binarization value having 5 bits, and the remaining symbols may be coded with a binarization value having 6 (i.e., k + 1) bits.
  • the symbols may indicate an index of an intra prediction mode list. That is, the symbols may indicate indices of intra prediction modes in a specific order.
  • the intra prediction mode list may be a list configured in increasing order of mode numbers as follows.
  • the intra prediction mode list may be a list configured in a pre-defined order as follows.
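  • putting the definitions of k and l together, truncated binary binarization of a symbol can be sketched as follows (matching the u equal to 61, k equal to 5, l equal to 3 example used below). The exact 6-bit codewords for the longer symbols depend on the offset convention; this sketch uses the common prefix-free assignment symbol + l.

        from math import floor, log2

        def tb_binarize(symbol: int, u: int) -> str:
            # first l symbols get k bits, the remaining u - l symbols get k + 1 bits
            k = floor(log2(u))        # k = Floor(Log2(u))
            l = (1 << (k + 1)) - u    # l = 2^(k+1) - u
            if symbol < l:
                return format(symbol, "0{}b".format(k))
            # offset the remaining symbols so the code stays prefix-free
            return format(symbol + l, "0{}b".format(k + 1))

        # u = 61 -> k = 5, l = 3: symbols 0..2 use 5 bits, symbols 3..60 use 6 bits
        assert tb_binarize(0, 61) == "00000"
        assert tb_binarize(2, 61) == "00010"
        assert tb_binarize(3, 61) == "000110"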
  • the present invention proposes a method of coding information for indicating an intra prediction mode using the truncated binary coding described above.
  • 11 exemplarily illustrates a method of coding information for indicating n intra prediction modes including MPM candidates and remaining intra prediction modes.
  • the encoding apparatus constructs an MPM list including m MPM candidates (S1100). Thereafter, the encoding apparatus may remove the MPM candidates from the predefined intra prediction mode list (S1110). Thereafter, an index representing the (n-m) remaining intra prediction modes may be coded using the truncated binary code (S1120). That is, an index representing one of the (n-m) remaining intra prediction modes may be coded using the truncated binary code. For example, when the value of the index is N, the remaining intra prediction mode information may indicate an N + 1th intra prediction mode in the (n-m) remaining intra prediction modes. As described above, an index representing the (n-m) remaining intra prediction modes may be coded by the truncated binary code. That is, for example, when the value of the index is N, the index may be binarized to a binary value corresponding to the N in the truncated binary code.
  • the intra prediction mode list may also be referred to as an intra mode map.
  • the intra mode map may indicate a pre-defined order of all u intra prediction modes. That is, the intra mode map may represent, in a predetermined order, the intra prediction modes excluding the MPM candidates. The remaining intra prediction modes other than the m MPM candidates among all intra prediction modes may be mapped to symbols of the index in the order according to the intra mode map (that is, the predetermined order). For example, the index of the intra prediction mode that comes first in the intra mode map among the intra prediction modes excluding the m MPM candidates may be 0, and the index of the intra prediction mode that comes n-th may be n-1.
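  • a hedged sketch of the index-to-mode mapping just described: remove the m MPM candidates from the pre-defined intra mode map, then let index N select the (N + 1)-th surviving mode. The toy mode map below is illustrative only.

        def remaining_mode_from_index(index, intra_mode_map, mpm_list):
            # skip the MPM candidates in the pre-defined mode order
            remaining = [m for m in intra_mode_map if m not in mpm_list]
            return remaining[index]

        mode_map = [0, 1, 50, 18, 2, 34, 66, 10]  # illustrative order only
        assert remaining_mode_from_index(0, mode_map, [50, 18]) == 0
        assert remaining_mode_from_index(2, mode_map, [50, 18]) == 2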
  • an optimal intra prediction mode may be selected in a rate-distortion optimization (RDO) process.
  • an intra mode map in which intra prediction modes with a high probability of being selected are placed in the preceding order may be proposed.
  • the intra mode map may be as follows. That is, the intra prediction modes of the predetermined order may be as follows.
  • the 61 remaining intra prediction modes may be coded using truncated binary code. That is, the indexes for the remaining intra prediction modes may be coded based on the truncated binary code.
  • the six MPM candidates may be removed from the intra mode map.
  • in this case, the three intra prediction modes of the preceding order (l for u equal to 61 is 3), that is, the first three intra prediction modes in the intra mode map among the remaining intra prediction modes, can be coded with 00000, 00001, and 00010 (k for u equal to 61 is 5).
  • that is, the index of the first intra prediction mode according to the intra mode map may be coded as the binarization value 00000, the index of the second intra prediction mode as the binarization value 00001, and the index of the third intra prediction mode as the binarization value 00010.
  • 58 intra prediction modes other than the three intra prediction modes may be coded with 6-bit truncated binary code such as 000100, 000101. That is, the index of 58 intra prediction modes other than the three intra prediction modes may be coded with a 6-bit binarization value such as 000100, 000101.
  • the present invention also proposes another embodiment of coding information for indicating an intra prediction mode using the truncated binary coding.
  • FIG. 12 exemplarily illustrates a method of coding information for indicating n intra prediction modes including MPM candidates and remaining intra prediction modes.
• the encoding apparatus constructs an MPM list including m MPM candidates (S1200). Thereafter, the encoding apparatus may add intra prediction modes derived by applying an offset to the directional intra prediction modes among the MPM candidates to the TBC list (S1210). For example, when a directional intra prediction mode that is an MPM candidate is intra prediction mode n, intra prediction mode n + offset may be derived by adding the offset to n, and a TBC list including intra prediction mode n + offset may be constructed.
• the offset may be applied in the order of -1, +1, -2, +2, -4, +4.
  • indices representing the (n-m) remaining intra prediction modes may be coded using the truncated binary code (S1220). As described above, an index representing the (n-m) remaining intra prediction modes may be coded by the truncated binary code.
• the 61 remaining intra prediction modes may be coded using the truncated binary code. That is, the indexes for the remaining intra prediction modes may be coded based on the truncated binary code. For example, if the six MPM candidates included in the MPM list are {50, 8, 0, 1, 66, 54}, the TBC list may be {49, 51, 7, 9, 65, 53, 55, …}.
• In this case, the directional intra prediction modes among the MPM candidates are intra prediction mode 50, intra prediction mode 8, intra prediction mode 66, and intra prediction mode 54, and the intra prediction modes derived by adding the offset to intra prediction mode 50, intra prediction mode 8, intra prediction mode 66, and intra prediction mode 54 may be added to the TBC list.
• Thereafter, the three intra prediction modes earliest in the TBC list among the remaining intra prediction modes (u for 61 remaining intra prediction modes is 3) may be coded; that is, the indexes of the three intra prediction modes may be coded with the binarization values 00000, 00001, and 00010.
  • 58 intra prediction modes other than the three intra prediction modes may be coded with 6-bit truncated binary code such as 000100, 000101. That is, the index of 58 intra prediction modes other than the three intra prediction modes may be coded with a 6-bit binarization value such as 000100, 000101.
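• The following Python sketch illustrates the TBC-list construction of this embodiment under stated assumptions: the offsets are applied per magnitude (±1, then ±2, then ±4) to the directional MPM candidates, and non-directional, out-of-range, or duplicate modes are skipped. The function name and the edge-case handling are illustrative, not taken from the embodiment:

```python
def build_tbc_list(mpm_list, max_mode=66):
    """Hedged sketch of the TBC-list construction described above.

    Directional MPM candidates (mode >= 2) are expanded with offsets
    -d, +d for d in (1, 2, 4); non-directional modes (planar = 0,
    DC = 1), out-of-range modes, and duplicates are skipped."""
    directional = [m for m in mpm_list if m >= 2]
    tbc, seen = [], set(mpm_list)
    for d in (1, 2, 4):
        for m in directional:
            for cand in (m - d, m + d):
                if 2 <= cand <= max_mode and cand not in seen:
                    tbc.append(cand)
                    seen.add(cand)
    return tbc

print(build_tbc_list([50, 8, 0, 1, 66, 54])[:7])
# -> [49, 51, 7, 9, 65, 53, 55], matching the example above
```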
• the MPM index may be signaled in the form of an mpm_idx[x0 + i][y0 + j] (or mpm_idx) syntax element, and the remaining intra prediction mode information may be signaled in the form of a rem_intra_luma_pred_mode[x0 + i][y0 + j] (or rem_intra_luma_pred_mode) syntax element.
• the MPM index may be signaled in the form of an intra_luma_mpm_idx[xCb][yCb] syntax element, and the remaining intra prediction mode information may be signaled in the form of an intra_luma_mpm_remainder[xCb][yCb] syntax element.
  • the MPM index may indicate one of the MPM candidates, and the remaining intra prediction mode information may indicate one of the remaining intra prediction modes other than the MPM candidates.
• the array indices (x0 + i, y0 + j) may indicate the position (x0 + i, y0 + j) of the upper left luma sample of the prediction block with respect to the upper left luma sample of the picture.
  • array indices (xCb, yCb) may indicate positions (xCb, yCb) of the upper left luma sample of the prediction block with respect to the upper left luma sample of the picture.
  • binarization for remaining mode coding invokes a truncated binary (TB) binarization process with a cMax value equal to (num_intra_mode-mpm_idx).
  • binarization for remaining mode coding may be performed by a truncated binary binarization process in which a cMax value is obtained by subtracting the number of MPM candidates from the total number of intra prediction modes.
  • the num_intra_mode may indicate the total number of intra prediction modes
  • the mpm_idx may indicate the number of MPM candidates.
  • the truncated binary binarization process may be performed as follows.
  • the input of the process may be a request for TB binarization for a syntax element having a synVal value and a cMax value.
  • synVal may represent a value of the syntax element
  • cMax may represent a maximum value that the syntax element can represent.
  • the output of the process may be TB binarization of the syntax element.
  • the bin string of the TB binarization process of the syntax element synVal may be specified as described below.
• when cMax is equal to 0, the TB binarization of the syntax element may be a null (empty) bin string.
• when cMax is not 0 and synVal is smaller than u, the TB bin string may be derived by invoking the fixed length (FL) binarization process for synVal with the input symbolVal set to synVal and cMax set to k. According to Equation 4, which derives the length of the binarization value (that is, the number of bits) in the FL binarization process described below, the number of bits for cMax set to k is derived as k. Therefore, when synVal is smaller than u, a k-bit binarization value for synVal can be derived.
• when cMax is not 0 and synVal is greater than or equal to u, the TB bin string may be derived by invoking the FL binarization process for (synVal + u) with the input symbolVal set to (synVal + u) and cMax set to (k + 1).
• According to Equation 4, which derives the length of the binarization value (that is, the number of bits) in the FL binarization process described below, the number of bits for cMax set to (k + 1) is derived as (k + 1). Therefore, when synVal is greater than or equal to u, a (k + 1)-bit binarization value for synVal can be derived.
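• A minimal Python sketch of the TB binarization process described above follows. Since Equation 4 is not reproduced in this extract, the sketch assumes the standard FL derivation (the number of bits is Ceil(Log2(cMax + 1))) and invokes FL with cMax values chosen so that k-bit and (k + 1)-bit codewords result:

```python
import math

def fl_bin(symbol_val, c_max):
    """Fixed-length (FL) binarization: symbol_val rendered as a fixed
    number of bits derived from c_max, i.e. Ceil(Log2(c_max + 1))."""
    n_bits = max(1, math.ceil(math.log2(c_max + 1)))
    return format(symbol_val, '0{}b'.format(n_bits))

def tb_bin(syn_val, c_max):
    """Truncated binary (TB) binarization of syn_val with maximum c_max."""
    if c_max == 0:
        return ''                       # null bin string
    n = c_max + 1                       # alphabet size
    k = math.floor(math.log2(n))
    u = (1 << (k + 1)) - n              # values coded with k bits
    if syn_val < u:
        return fl_bin(syn_val, (1 << k) - 1)        # k-bit codeword
    return fl_bin(syn_val + u, (1 << (k + 1)) - 1)  # (k + 1)-bit codeword

for v in (0, 1, 2, 3):
    print(v, tb_bin(v, 60))  # -> 00000, 00001, 00010, then a 6-bit codeword
```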
  • binarization for remaining mode coding invokes a fixed length (FL) binarization process with a cMax value equal to (num_intra_mode-mpm_idx).
  • binarization for remaining mode coding may be performed by an FL binarization process in which a cMax value is obtained by subtracting the number of MPM candidates from the total number of intra prediction modes.
  • the num_intra_mode may indicate the total number of intra prediction modes
  • the mpm_idx may indicate the number of MPM candidates.
  • the FL binarization process may be performed as follows.
• the inputs of the process may be cMax and a request for FL binarization.
  • the output of the process may also be FL binarization that associates each symbolVal value with a corresponding bin string.
• FL binarization may be constructed by using the fixed-length-bit unsigned integer bin string of the symbol value symbolVal.
  • the fixed length may be derived as in the following equation.
  • the fixedLength may represent the fixed length.
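• The equation referenced here is not reproduced in this extract. Assuming the standard FL binarization derivation, the fixed length would be fixedLength = Ceil(Log2(cMax + 1)); under that assumption, the k-bit and (k + 1)-bit codeword lengths described above correspond to invoking the FL process with cMax set to (1 << k) − 1 and (1 << (k + 1)) − 1, respectively.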
• the remaining intra prediction mode information may be binarized and coded by the TB binarization process or the FL binarization process.
  • the MPM index and remaining intra prediction mode information may be binarized as shown in the following table.
  • rem_intra_luma_pred_mode [] [] is a syntax element indicating the remaining intra prediction mode information
  • mpm_idx [] [] is a syntax element indicating the MPM index.
  • the remaining intra prediction mode information may be binarized by an FL binarization process
• cMax, an input parameter of the FL binarization process, may be a value obtained by subtracting the number of MPM candidates from the total number of intra prediction modes. For example, if the total number of intra prediction modes is 67 and the number of MPM candidates is 6, considering the 61 remaining intra prediction modes indexed from 0 to 60 (that is, the values of the index indicating the remaining intra prediction modes are 0 to 60), the cMax may be 60.
• Alternatively, the cMax may be 61. That is, the cMax may be the maximum value that can be represented by the remaining intra prediction mode information.
• the MPM index may be binarized by a truncated Rice (TR) binarization process; cMax, an input parameter of the TR binarization process, may be a value obtained by subtracting 1 from the number of MPM candidates, and cRiceParam may be 0. For example, when the number of MPM candidates is six, the cMax may be five.
  • the MPM index and remaining intra prediction mode information may be binarized as shown in the following table.
  • rem_intra_luma_pred_mode [] [] is a syntax element indicating the remaining intra prediction mode information
  • mpm_idx [] [] is a syntax element indicating the MPM index.
  • the remaining intra prediction mode information may be binarized by a TB binarization process
• cMax, an input parameter of the TB binarization process, may be a value obtained by subtracting the number of MPM candidates from the total number of intra prediction modes. For example, if the total number of intra prediction modes is 67 and the number of MPM candidates is 6, the cMax may be 60.
• Alternatively, the cMax may be 61. That is, the cMax may be the maximum value that can be represented by the remaining intra prediction mode information.
• the MPM index may be binarized by a truncated Rice (TR) binarization process; cMax, an input parameter of the TR binarization process, may be a value obtained by subtracting 1 from the number of MPM candidates, and cRiceParam may be 0. For example, when the number of MPM candidates is six, the cMax may be five.
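• For illustration, the following Python sketch shows TR binarization in the cRiceParam = 0 case used for the MPM index, where TR reduces to a truncated unary code with maximum cMax (a sketch of the standard process, not text from the embodiment):

```python
def tr_bin(syn_val, c_max, c_rice_param=0):
    """Truncated Rice (TR) binarization sketch.

    With c_rice_param = 0 (as used for the MPM index above), TR reduces
    to truncated unary: syn_val ones followed by a terminating zero,
    with the zero omitted when syn_val == c_max."""
    assert c_rice_param == 0, "only the cRiceParam = 0 case is sketched"
    if syn_val < c_max:
        return '1' * syn_val + '0'
    return '1' * c_max

for v in range(6):
    print(v, tr_bin(v, 5))  # -> 0, 10, 110, 1110, 11110, 11111
```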
  • the MPM index may be encoded / decoded based on a context model.
  • the present invention proposes a method of deriving the context model based on an intra prediction mode in relation to a method of encoding / decoding the MPM index based on a context model.
  • the assignment of the context model to the MPM index may be as shown in the following table.
• NUM_INTRA_MODE may indicate the mode number of the intra prediction mode indicated by the M-th MPM candidate included in the MPM list. That is, when the M-th MPM candidate is intra prediction mode N, NUM_INTRA_MODE may represent N.
  • mpmCtx may represent the context model for the MPM index.
  • a context model for the Mth bin of the MPM index may be derived based on the Mth MPM candidate included in the MPM list.
  • M may be 3 or less.
  • the context model for the first bin in the intra prediction mode information for the current block may be derived based on the first candidate included in the MPM list.
  • the context model for the second bin may be derived based on the second candidate included in the MPM list
  • the context model for the third bin may be derived based on the third candidate included in the MPM list.
  • the number of intra prediction modes may be as shown in the following table.
• For example, when the mode number of the intra prediction mode indicated by the M-th MPM candidate is the number of the DC intra prediction mode (that is, 1) or the number of the planar intra prediction mode (that is, 0), the context model for the M-th bin of the MPM index may be derived as context model 1. When the M-th MPM candidate is neither the DC intra prediction mode nor the planar intra prediction mode and is one of intra prediction mode 2 to intra prediction mode 34, the context model for the M-th bin of the MPM index may be derived as context model 2. When the M-th MPM candidate is one of intra prediction mode 35 to intra prediction mode 66, the context model for the M-th bin of the MPM index may be derived as context model 2 or context model 3.
  • the assignment of the context model to the MPM index may be as shown in the following table.
• For example, when the intra prediction mode indicated by the M-th MPM candidate is the planar intra prediction mode, the context model for the M-th bin of the MPM index may be derived as context model 1. When the M-th MPM candidate is not the planar intra prediction mode but is the DC intra prediction mode, the context model for the M-th bin of the MPM index may be derived as context model 2. When the M-th MPM candidate is neither the DC intra prediction mode nor the planar intra prediction mode and is one of intra prediction mode 2 to intra prediction mode 34, the context model for the M-th bin of the MPM index may be derived as context model 3. When the M-th MPM candidate is not the DC intra prediction mode, the planar intra prediction mode, or one of intra prediction mode 2 to intra prediction mode 34, but is one of intra prediction mode 35 to intra prediction mode 66, the context model for the M-th bin of the MPM index may be derived as context model 4.
• ctxInc for the context-coded bins of the syntax element for the MPM index may be allocated as shown in the following table.
  • rem_intra_luma_pred_mode [] [] may be a syntax element indicating remaining intra prediction mode information
  • mpm_idx [] [] may be a syntax element indicating an MPM index
• binIdx may indicate the index of a bin of the syntax element.
• bin 0, bin 1, and bin 2 of the MPM index may be coded based on a context model; the ctxInc for bin 0 may be derived as 0, the ctxInc for bin 1 as 1, and the ctxInc for bin 2 as 2.
  • bypass coding may be applied to bins 3 and 4 of the MPM index.
  • the bypass coding may represent a method of coding by applying a uniform probability distribution (for example, 50:50) instead of applying a context model having a specific probability distribution.
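• The following Python sketch illustrates one of the context-model assignments described above (the four-model variant: planar, DC, modes 2 to 34, and modes 35 to 66), with bins 3 and 4 bypass coded. The function name and the return convention are illustrative assumptions:

```python
def mpm_bin_ctx(bin_idx, mpm_list):
    """Sketch of the context-model assignment for MPM-index bins:
    the model for bin M depends on the (M + 1)-th MPM candidate
    (planar -> model 1, DC -> model 2, modes 2..34 -> model 3,
    modes 35..66 -> model 4); bins 3 and 4 are bypass coded and
    return None here."""
    if bin_idx > 2:
        return None                  # bypass coded (uniform probability)
    mode = mpm_list[bin_idx]
    if mode == 0:                    # planar intra prediction mode
        return 1
    if mode == 1:                    # DC intra prediction mode
        return 2
    if 2 <= mode <= 34:
        return 3
    return 4                         # directional modes 35..66

print([mpm_bin_ctx(i, [50, 8, 0, 1, 66, 54]) for i in range(5)])
# -> [4, 3, 1, None, None]
```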
  • FIG. 13 schematically illustrates an image encoding method by an encoding apparatus according to the present invention.
  • the method disclosed in FIG. 13 may be performed by the encoding apparatus disclosed in FIG. 1.
  • S1300 to S1320 of FIG. 13 may be performed by the prediction unit of the encoding apparatus
  • S1330 may be performed by the entropy encoding unit of the encoding apparatus.
  • a process of deriving a residual sample for the current block based on an original sample and a prediction sample for the current block may be performed by a subtractor of the encoding apparatus.
• the generating of the information about the residual for the current block may be performed by a transform unit of the encoding apparatus, and the encoding of the information about the residual may be performed by an entropy encoding unit of the encoding apparatus.
  • the encoding apparatus configures a Most Probable Mode (MPM) list of the current block based on the neighboring blocks of the current block (S1300).
  • the MPM list may include three MPM candidates, five MPM candidates, or six MPM candidates.
  • the encoding apparatus may construct the MPM list of the current block based on the neighboring block of the current block, and the MPM list may include six MPM candidates.
• the neighboring blocks may include the left neighboring block, the upper neighboring block, the lower left neighboring block, the upper right neighboring block, and/or the upper left neighboring block of the current block.
  • the encoding apparatus may search the neighboring blocks of the current block in a specific order, and may derive the intra prediction mode of the neighboring block as the MPM candidate in the derived order.
• For example, the encoding apparatus may search the intra prediction mode of the left neighboring block, the intra prediction mode of the upper neighboring block, the planar intra prediction mode, the DC intra prediction mode, the intra prediction mode of the lower left neighboring block, the intra prediction mode of the upper right neighboring block, and the intra prediction mode of the upper left neighboring block in that order to derive MPM candidates and construct the MPM list of the current block. Meanwhile, when six MPM candidates are not derived after the search, an MPM candidate may be derived based on an intra prediction mode already derived as an MPM candidate. For example, when the intra prediction mode derived as the MPM candidate is intra prediction mode N, the encoding apparatus may derive intra prediction mode N + 1 and/or intra prediction mode N − 1 as an MPM candidate of the current block, as sketched below.
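• A hedged Python sketch of this MPM-list construction follows; the candidate search order and the ±1 derivation mirror the description above, while the wrap-around of derived modes into the directional range 2 to 66 and the function name are illustrative assumptions:

```python
def build_mpm_list(cand_modes, num_mpm=6):
    """Hedged sketch of the MPM-list construction described above.

    cand_modes is the search order given in the text: left mode,
    above mode, planar (0), DC (1), below-left mode, above-right mode,
    above-left mode (None = unavailable). If fewer than num_mpm
    candidates result, modes N - 1 and N + 1 of directional candidates
    already in the list are added; a real implementation would also
    pad with defaults if the list were still short."""
    mpm = []
    for m in cand_modes:
        if m is not None and m not in mpm:
            mpm.append(m)
        if len(mpm) == num_mpm:
            return mpm
    i = 0
    while len(mpm) < num_mpm and i < len(mpm):
        m, i = mpm[i], i + 1
        if m < 2:
            continue                      # planar/DC have no +/-1 neighbours
        for d in (2 + (m - 1 - 2) % 65, 2 + (m + 1 - 2) % 65):
            if d not in mpm and len(mpm) < num_mpm:
                mpm.append(d)             # derived modes wrap into 2..66
    return mpm

# left = 50, above = 50, the other neighbouring blocks unavailable:
print(build_mpm_list([50, 50, 0, 1, None, None, None]))
# -> [50, 0, 1, 49, 51, 48] under these assumptions
```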
  • the encoding apparatus determines an intra prediction mode of the current block (S1310).
  • the encoding apparatus may perform various intra prediction modes to derive an intra prediction mode having an optimal RD cost as an intra prediction mode for the current block.
• the intra prediction mode may be one of two non-directional intra prediction modes and 65 directional intra prediction modes. As described above, the two non-directional intra prediction modes may include the intra DC mode and the intra planar mode.
  • the intra prediction mode of the current block may be one of the remaining intra prediction modes.
  • the remaining intra prediction modes may be intra prediction modes except for MPM candidates included in the MPM list in all intra prediction modes.
  • the encoding apparatus may encode remaining intra prediction mode information indicating the intra prediction mode of the current block among the remaining intra prediction modes.
  • the encoding apparatus may select an MPM candidate having an optimal RD cost among the MPM candidates of the MPM list, and determine the selected MPM candidate as an intra prediction mode for the current block.
  • the encoding apparatus may encode an MPM index indicating the selected MPM candidate among the MPM candidates.
  • the encoding apparatus generates a prediction sample of the current block based on the intra prediction mode (S1320).
  • the encoding apparatus may derive at least one neighboring sample of the neighboring samples of the current block based on the intra prediction mode, and generate the predictive sample based on the neighboring sample.
• the neighboring samples may include the upper left corner neighboring sample, the upper neighboring samples, and the left neighboring samples of the current block. For example, when the size of the current block is WxH and the x component and the y component of the top-left sample position of the current block are 0, the left neighboring samples may be p[-1][0] to p[-1][2H-1], the upper left corner neighboring sample may be p[-1][-1], and the upper neighboring samples may be p[0][-1] to p[2W-1][-1].
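• The positions of these reference samples can be enumerated directly, as in the following illustrative Python sketch:

```python
def reference_sample_coords(w, h):
    """Coordinates of the neighbouring reference samples described
    above, for a WxH block whose top-left sample is at (0, 0)."""
    left = [(-1, y) for y in range(2 * h)]   # p[-1][0] .. p[-1][2H-1]
    top_left = (-1, -1)                      # p[-1][-1]
    top = [(x, -1) for x in range(2 * w)]    # p[0][-1] .. p[2W-1][-1]
    return left, top_left, top

left, tl, top = reference_sample_coords(4, 4)
print(len(left), tl, len(top))  # -> 8 (-1, -1) 8
```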
  • the encoding apparatus encodes image information including intra prediction information about the current block (S1330).
  • the encoding apparatus may output image information including the intra prediction information about the current block in the form of a bitstream.
  • the intra prediction information may include a Most Probable Mode (MPM) flag for the current block.
  • the MPM flag may indicate whether the intra prediction mode of the current block is included in the MPM candidates or in the remaining intra prediction modes not included in the MPM candidates. Specifically, when the value of the MPM flag is 1, the MPM flag may indicate that the intra prediction mode of the current block is included in the MPM candidates, and when the value of the MPM flag is 0, the MPM flag is the current It may indicate that the intra prediction mode of the block is not included in the MPM candidates, that is, included in the remaining intra prediction modes. Alternatively, when the intra prediction mode of the current block is included in the MPM candidates, the encoding apparatus may not encode the MPM flag. That is, when the intra prediction mode of the current block is included in the MPM candidates, the intra prediction information may not include the MPM flag.
  • the encoding apparatus may encode remaining intra prediction mode information for the current block. That is, when the intra prediction mode of the current block is one of the remaining intra prediction modes, the intra prediction information may include the remaining intra prediction mode information.
  • the remaining intra prediction mode information may indicate the intra prediction mode of the current block among the remaining intra prediction modes.
  • the remaining intra prediction modes may indicate remaining intra prediction modes not included in the MPM candidates of the MPM list.
  • the remaining intra prediction mode information may be signaled in the form of rem_intra_luma_pred_mode or intra_luma_mpm_remainder syntax elements.
• the remaining intra prediction mode information may be coded through a truncated binary (TB) binarization process.
  • the binarization parameter for the TB binarization process may be preset.
  • the value of the binarization parameter may be 60 or 61.
  • the value of the parameter may be set to a value obtained by subtracting the number of MPM candidates from the total number of intra prediction modes.
  • the binarization parameter may indicate the cMax described above.
  • the binarization parameter may indicate a maximum value of the remaining intra prediction mode information that is coded.
• the remaining intra prediction mode information may be coded through the TB binarization process. When the value of the remaining intra prediction mode information is less than a specific value, the remaining intra prediction mode information may be binarized to a k-bit binarization value, and when the value of the remaining intra prediction mode information is greater than or equal to the specific value, the remaining intra prediction mode information may be binarized to a (k + 1)-bit binarization value.
  • the specific value and k may be derived based on the binarization parameter. For example, the specific value and k may be derived based on Equation 3 described above. When the value of the binarization parameter is 61, the specific value may be derived as 3, and k may be derived as 5.
  • the encoding apparatus may encode the MPM index. That is, when the intra prediction mode of the current block is included in the MPM candidates, the intra prediction information of the current block may include the MPM index.
  • the MPM index may indicate an MPM index indicating one of the MPM candidates of the MPM list.
  • the MPM index may be signaled in the form of an mpm_idx or intra_luma_mpm_idx syntax element.
• the MPM index may be binarized through a truncated Rice (TR) binarization process.
  • the binarization parameter for the TR binarization process may be preset.
  • the value of the binarization parameter may be set to a value obtained by subtracting 1 from the number of MPM candidates.
  • the binarization parameter may be set to five.
  • the binarization parameter may indicate the cMax described above.
  • the binarization parameter may indicate a maximum value of the MPM index to be coded.
  • the cRiceParam for the TR binarization process may be preset to zero.
  • the MPM index may be coded based on a context model.
  • a context model for the Nth bin for the MPM index may be derived based on the Nth MPM candidate included in the MPM list.
  • the context model for the Nth bin derived based on the Nth candidate may be as follows.
• For example, when the intra prediction mode indicated by the N-th MPM candidate is the DC intra prediction mode or the planar intra prediction mode, the context model for the N-th bin may be derived as context model 1; when the intra prediction mode indicated by the N-th MPM candidate is neither the DC intra prediction mode nor the planar intra prediction mode and is one of intra prediction mode 2 to intra prediction mode 34, the context model for the N-th bin may be derived as context model 2; and when the intra prediction mode indicated by the N-th MPM candidate is not the DC intra prediction mode, the planar intra prediction mode, or one of intra prediction mode 2 to intra prediction mode 34, but is one of intra prediction mode 35 to intra prediction mode 66, the context model for the N-th bin may be derived as context model 3.
• Alternatively, when the intra prediction mode indicated by the N-th MPM candidate is the planar intra prediction mode, the context model for the N-th bin may be derived as context model 1; when the intra prediction mode indicated by the N-th MPM candidate is not the planar intra prediction mode but is the DC intra prediction mode, the context model for the N-th bin may be derived as context model 2; when the intra prediction mode indicated by the N-th MPM candidate is neither the planar intra prediction mode nor the DC intra prediction mode and is one of intra prediction mode 2 to intra prediction mode 34, the context model for the N-th bin may be derived as context model 3; and when the intra prediction mode indicated by the N-th MPM candidate is not the planar intra prediction mode, the DC intra prediction mode, or one of intra prediction mode 2 to intra prediction mode 34, but is one of intra prediction mode 35 to intra prediction mode 66, the context model for the N-th bin may be derived as context model 4.
• the encoding apparatus may derive a residual sample for the current block based on an original sample and the prediction sample for the current block, and may encode information about the residual for the current block based on the residual sample.
  • the image information may include information about the residual.
  • the bitstream may be transmitted to a decoding device through a network or a (digital) storage medium.
  • the network may include a broadcasting network and / or a communication network
  • the digital storage medium may include various storage media such as USB, SD, CD, DVD, Blu-ray, HDD, SSD, and the like.
  • FIG. 14 schematically illustrates an encoding apparatus for performing an image encoding method according to the present invention.
  • the method disclosed in FIG. 13 may be performed by the encoding apparatus disclosed in FIG. 14.
  • the prediction unit of the encoding apparatus of FIG. 14 may perform S1300 to S1320 of FIG. 13, and the entropy encoding unit of the encoding apparatus of FIG. 14 may perform S1330 of FIG. 13.
• the process of deriving the residual sample for the current block based on the original sample and the prediction sample for the current block may be performed by the subtraction unit of the encoding apparatus of FIG. 14, the generating of the information about the residual of the current block based on the residual sample may be performed by the transform unit of the encoding apparatus of FIG. 14, and the encoding of the information about the residual may be performed by the entropy encoding unit of the encoding apparatus of FIG. 14.
  • FIG. 15 schematically illustrates an image decoding method by a decoding apparatus according to the present invention.
  • the method disclosed in FIG. 15 may be performed by the decoding apparatus disclosed in FIG. 3.
  • S1500 of FIG. 15 may be performed by the entropy decoding unit of the decoding apparatus
• S1510 to S1530 of FIG. 15 may be performed by the prediction unit of the decoding apparatus.
  • a process of obtaining information on prediction and / or residual information of a current block through a bitstream may be performed by an entropy decoding unit of the decoding apparatus.
  • the process of deriving the residual sample for the current block may be performed by an inverse transform unit of the decoding apparatus, and the process of generating a reconstructed picture based on the prediction sample and the residual sample of the current block may be performed. It may be performed by an adder of the decoding apparatus.
  • the decoding apparatus obtains intra prediction information of the current block from the bitstream (S1500).
  • the decoding apparatus may obtain image information including intra prediction information of the current block from the bitstream.
  • the intra prediction information may include a Most Probable Mode (MPM) flag for the current block.
  • the decoding apparatus may obtain the MPM index for the current block from the bitstream. That is, when the value of the MPM flag is 1, the intra prediction information of the current block may include the MPM index. Alternatively, the intra prediction information may not include the MPM flag, and in this case, the decoding apparatus may derive the value of the MPM flag as 1.
  • the MPM index may indicate an MPM index indicating one of the MPM candidates of the MPM list.
  • the MPM index may be signaled in the form of an mpm_idx or intra_luma_mpm_idx syntax element.
  • the decoding apparatus may obtain the remaining intra prediction mode information for the current block from the bitstream. That is, when the value of the MPM flag is 0, the intra prediction information may include remaining intra prediction mode information indicating one of the remaining intra prediction modes. In this case, the decoding apparatus may derive the intra prediction mode indicated by the remaining intra prediction mode information among the remaining intra prediction modes as the intra prediction mode for the current block.
  • the remaining intra prediction modes may indicate remaining intra prediction modes not included in the MPM candidates of the MPM list.
  • the remaining intra prediction mode information may be signaled in the form of rem_intra_luma_pred_mode or intra_luma_mpm_remainder syntax elements.
• the remaining intra prediction mode information may be coded through a truncated binary (TB) binarization process.
  • the binarization parameter for the TB binarization process may be preset.
  • the value of the binarization parameter may be 60 or 61.
  • the value of the parameter may be set to a value obtained by subtracting the number of MPM candidates from the total number of intra prediction modes.
  • the binarization parameter may indicate the cMax described above.
  • the binarization parameter may indicate a maximum value of the remaining intra prediction mode information that is coded.
• the remaining intra prediction mode information may be coded through the TB binarization process. When the value of the remaining intra prediction mode information is less than a specific value, the remaining intra prediction mode information may be binarized to a k-bit binarization value, and when the value of the remaining intra prediction mode information is greater than or equal to the specific value, the remaining intra prediction mode information may be binarized to a (k + 1)-bit binarization value.
  • the specific value and k may be derived based on the binarization parameter. For example, the specific value and k may be derived based on Equation 3 described above. When the value of the binarization parameter is 61, the specific value may be derived as 3, and k may be derived as 5.
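• On the decoding side, a TB-binarized value can be parsed by first reading k bits and conditionally reading one more, as in the following Python sketch (assuming the standard truncated binary construction; read_bit is an illustrative bit-reader callback):

```python
import math

def tb_decode(read_bit, c_max):
    """Sketch of truncated binary decoding for the remaining-mode
    syntax element: read k bits; if the value falls in the short-code
    range it is final, otherwise read one more bit and subtract u."""
    n = c_max + 1
    k = math.floor(math.log2(n))
    u = (1 << (k + 1)) - n
    val = 0
    for _ in range(k):
        val = (val << 1) | read_bit()
    if val >= u:                          # long (k + 1)-bit codeword
        val = ((val << 1) | read_bit()) - u
    return val

bits = iter([0, 0, 0, 1, 1, 0])           # the codeword 000110
print(tb_decode(lambda: next(bits), 60))  # -> 3
```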
• the MPM index may be binarized through a truncated Rice (TR) binarization process.
  • the binarization parameter for the TR binarization process may be preset.
  • the value of the binarization parameter may be set to a value obtained by subtracting 1 from the number of MPM candidates.
  • the binarization parameter may be set to five.
  • the binarization parameter may indicate the cMax described above.
  • the binarization parameter may indicate a maximum value of the MPM index to be coded.
  • the cRiceParam for the TR binarization process may be preset to zero.
  • the MPM index may be coded based on a context model.
  • a context model for the Nth bin for the MPM index may be derived based on the Nth MPM candidate included in the MPM list.
  • the context model for the Nth bin derived based on the Nth candidate may be as follows.
• For example, when the intra prediction mode indicated by the N-th MPM candidate is the DC intra prediction mode or the planar intra prediction mode, the context model for the N-th bin may be derived as context model 1; when the intra prediction mode indicated by the N-th MPM candidate is neither the DC intra prediction mode nor the planar intra prediction mode and is one of intra prediction mode 2 to intra prediction mode 34, the context model for the N-th bin may be derived as context model 2; and when the intra prediction mode indicated by the N-th MPM candidate is not the DC intra prediction mode, the planar intra prediction mode, or one of intra prediction mode 2 to intra prediction mode 34, but is one of intra prediction mode 35 to intra prediction mode 66, the context model for the N-th bin may be derived as context model 3.
• Alternatively, when the intra prediction mode indicated by the N-th MPM candidate is the planar intra prediction mode, the context model for the N-th bin may be derived as context model 1; when the intra prediction mode indicated by the N-th MPM candidate is not the planar intra prediction mode but is the DC intra prediction mode, the context model for the N-th bin may be derived as context model 2; when the intra prediction mode indicated by the N-th MPM candidate is neither the planar intra prediction mode nor the DC intra prediction mode and is one of intra prediction mode 2 to intra prediction mode 34, the context model for the N-th bin may be derived as context model 3; and when the intra prediction mode indicated by the N-th MPM candidate is not the planar intra prediction mode, the DC intra prediction mode, or one of intra prediction mode 2 to intra prediction mode 34, but is one of intra prediction mode 35 to intra prediction mode 66, the context model for the N-th bin may be derived as context model 4.
  • the decoding apparatus may configure a Most Probable Mode (MPM) list of the current block based on the neighboring block of the current block.
  • the MPM list may include three MPM candidates, five MPM candidates, or six MPM candidates.
  • the decoding apparatus may configure the MPM list of the current block based on the neighboring block of the current block, and the MPM list may include six MPM candidates.
• the neighboring blocks may include the left neighboring block, the upper neighboring block, the lower left neighboring block, the upper right neighboring block, and/or the upper left neighboring block of the current block.
  • the decoding apparatus may search the neighboring blocks of the current block in a specific order, and may derive the intra prediction mode of the neighboring block as the MPM candidate in the derived order.
• For example, the decoding apparatus may search the intra prediction mode of the left neighboring block, the intra prediction mode of the upper neighboring block, the planar intra prediction mode, the DC intra prediction mode, the intra prediction mode of the lower left neighboring block, the intra prediction mode of the upper right neighboring block, and the intra prediction mode of the upper left neighboring block in that order to derive MPM candidates and construct the MPM list of the current block. Meanwhile, when six MPM candidates are not derived after the search, an MPM candidate may be derived based on an intra prediction mode already derived as an MPM candidate. For example, when the intra prediction mode derived as the MPM candidate is intra prediction mode N, the decoding apparatus may derive intra prediction mode N + 1 and/or intra prediction mode N − 1 as an MPM candidate of the current block.
  • the decoding apparatus derives the intra prediction mode of the current block based on the remaining intra prediction mode information (S1510).
  • the decoding apparatus may derive the intra prediction mode indicated by the remaining intra prediction mode information as the intra prediction mode of the current block.
  • the remaining intra prediction mode information may indicate one of the remaining intra prediction modes.
  • the remaining intra prediction modes may be intra prediction modes excluding the MPM candidates among all intra prediction modes.
• For example, when the value of the remaining intra prediction mode information is N, the remaining intra prediction mode information may indicate intra prediction mode N. Alternatively, when the value of the remaining intra prediction mode information is N, the remaining intra prediction mode information may indicate the (N + 1)-th intra prediction mode in the intra mode map.
  • the intra mode map may represent intra prediction modes excluding MPM candidates in intra prediction modes of a predetermined order.
  • the intra prediction modes of the predetermined order may be as follows.
• Alternatively, the remaining intra prediction mode information may indicate the (N + 1)-th intra prediction mode in the TBC list.
  • the TBC list may be composed of intra prediction modes derived based on a directional intra prediction mode and an offset among MPM candidates.
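• The following Python sketch illustrates this derivation: the MPM candidates are removed from a predefined mode order (the intra mode map or the TBC list), and the value of the remaining intra prediction mode information selects the (N + 1)-th surviving mode. The simple ascending order used here is only for illustration; the actual order is given by the embodiments above:

```python
def remaining_info_to_mode(rem_value, mpm_list, mode_order):
    """Sketch: remove the MPM candidates from the predefined mode
    order and let the value N select the (N + 1)-th surviving mode."""
    remaining = [m for m in mode_order if m not in set(mpm_list)]
    return remaining[rem_value]

# with a simple ascending order 0..66 and MPM candidates {50, 8, 0, 1, 66, 54}:
order = list(range(67))
print(remaining_info_to_mode(0, [50, 8, 0, 1, 66, 54], order))  # -> 2
print(remaining_info_to_mode(3, [50, 8, 0, 1, 66, 54], order))  # -> 5
```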
  • the decoding apparatus may obtain an MPM index for the current block from a bitstream, and derive an intra prediction mode of the current block based on the MPM index.
• the decoding apparatus may derive the MPM candidate indicated by the MPM index as the intra prediction mode of the current block.
  • the MPM index may indicate one of the MPM candidates of the MPM list.
  • the decoding apparatus derives a prediction sample of the current block based on the intra prediction mode (S1520).
  • the decoding apparatus may derive at least one neighboring sample of the neighboring samples of the current block based on the intra prediction mode, and generate the predictive sample based on the neighboring sample.
• the neighboring samples may include the upper left corner neighboring sample, the upper neighboring samples, and the left neighboring samples of the current block. For example, when the size of the current block is WxH and the x component and the y component of the top-left sample position of the current block are 0, the left neighboring samples may be p[-1][0] to p[-1][2H-1], the upper left corner neighboring sample may be p[-1][-1], and the upper neighboring samples may be p[0][-1] to p[2W-1][-1].
  • the decoding apparatus derives a reconstructed picture based on the prediction sample (S1530).
• the decoding apparatus may directly use the prediction sample as a reconstructed sample according to the prediction mode, or may generate a reconstructed sample by adding a residual sample to the prediction sample. When there is a residual sample for the current block, the decoding apparatus may receive information about the residual for the current block, and the information about the residual may be included in the image information.
  • the information about the residual may include transform coefficients regarding the residual sample.
  • the image information may include information about the residual.
  • the decoding apparatus may derive the residual sample (or residual sample array) for the current block based on the residual information.
  • the decoding apparatus may generate a reconstructed sample based on the prediction sample and the residual sample, and may derive a reconstructed block or a reconstructed picture based on the reconstructed sample.
  • the decoding apparatus may apply an in-loop filtering procedure such as a deblocking filtering and / or SAO procedure to the reconstructed picture in order to improve subjective / objective picture quality as necessary.
  • FIG. 16 schematically shows a decoding apparatus for performing an image decoding method according to the present invention.
  • the method disclosed in FIG. 15 may be performed by the decoding apparatus disclosed in FIG. 16.
  • the entropy decoding unit of the decoding apparatus of FIG. 16 may perform S1500 of FIG. 15, and the prediction unit of the decoding apparatus of FIG. 16 may perform S1510 to S1530 of FIG. 15.
  • a process of acquiring image information including information on the residual of the current block through the bitstream may be performed by the entropy decoding unit of the decoding apparatus of FIG. 16.
• the process of deriving the residual sample for the current block based on the information about the residual may be performed by the inverse transform unit of the decoding apparatus of FIG. 16, and the process of generating a reconstructed picture based on the prediction sample and the residual sample may be performed by the adder of the decoding apparatus of FIG. 16.
  • the embodiments described herein may be implemented and performed on a processor, microprocessor, controller, or chip.
  • the functional units shown in each drawing may be implemented and performed on a computer, processor, microprocessor, controller, or chip.
• information for implementation (e.g., information on instructions) or an algorithm may be stored in a digital storage medium.
• the decoding apparatus and encoding apparatus to which the present invention is applied may be included in a multimedia broadcasting transmitting and receiving device, a mobile communication terminal, a home cinema video device, a digital cinema video device, a surveillance camera, a video chat device, a real-time communication device such as one for video communication, and a mobile streaming device.
  • the OTT video device may include a game console, a Blu-ray player, an internet access TV, a home theater system, a smartphone, a tablet PC, a digital video recorder (DVR), and the like.
  • the processing method to which the present invention is applied can be produced in the form of a program executed by a computer, and can be stored in a computer-readable recording medium.
  • Multimedia data having a data structure according to the present invention can also be stored in a computer-readable recording medium.
  • the computer readable recording medium includes all kinds of storage devices and distributed storage devices in which computer readable data is stored.
• the computer-readable recording medium may include, for example, a Blu-ray disc (BD), a universal serial bus (USB), a ROM, a PROM, an EPROM, an EEPROM, a RAM, a CD-ROM, a magnetic tape, a floppy disk, and an optical data storage device.
  • the computer-readable recording medium also includes media embodied in the form of a carrier wave (eg, transmission over the Internet).
  • the bitstream generated by the encoding method may be stored in a computer-readable recording medium or transmitted through a wired or wireless communication network.
• an embodiment of the present invention may be implemented as a computer program product by program code, and the program code may be executed on a computer according to an embodiment of the present invention.
  • the program code may be stored on a carrier readable by a computer.
  • FIG. 17 exemplarily shows a structure diagram of a content streaming system to which the present invention is applied.
  • the content streaming system to which the present invention is applied may largely include an encoding server, a streaming server, a web server, a media storage, a user device, and a multimedia input device.
  • the encoding server compresses content input from multimedia input devices such as a smart phone, a camera, a camcorder, etc. into digital data to generate a bitstream and transmit the bitstream to the streaming server.
  • multimedia input devices such as smart phones, cameras, camcorders, etc. directly generate a bitstream
  • the encoding server may be omitted.
  • the bitstream may be generated by an encoding method or a bitstream generation method to which the present invention is applied, and the streaming server may temporarily store the bitstream in the process of transmitting or receiving the bitstream.
• the streaming server transmits multimedia data to the user device based on the user's request through the web server, and the web server serves as a medium informing the user of which services are available. When the user requests a desired service from the web server, the web server delivers the request to the streaming server, and the streaming server transmits multimedia data to the user.
  • the content streaming system may include a separate control server.
  • the control server plays a role of controlling a command / response between devices in the content streaming system.
  • the streaming server may receive content from a media store and / or an encoding server. For example, when the content is received from the encoding server, the content may be received in real time. In this case, in order to provide a smooth streaming service, the streaming server may store the bitstream for a predetermined time.
• Examples of the user device may include a mobile phone, a smartphone, a laptop computer, a digital broadcasting terminal, a personal digital assistant (PDA), a portable multimedia player (PMP), a navigation device, a slate PC, a tablet PC, an ultrabook, a wearable device (for example, a smartwatch, smart glasses, or a head mounted display), a digital TV, a desktop computer, digital signage, and the like.
  • Each server in the content streaming system may be operated as a distributed server, in which case data received from each server may be distributed.
  • the term " digital device" includes all digital devices capable of performing at least one of transmitting, receiving, processing and outputting data, content, services, and the like.
  • processing of data, content, services, and the like by the digital device includes an operation of encoding and / or decoding data, content, services, and the like.
• These digital devices are paired or connected (hereinafter, 'paired') with other digital devices, external servers, and the like through a wired/wireless network to transmit and receive data, converting the data as necessary.
• Digital devices include, for example, standing devices such as a network TV, a hybrid broadcast broadband TV, a smart TV, an internet protocol television (IPTV), and a personal computer, and mobile or handheld devices such as a personal digital assistant (PDA), a smartphone, a tablet PC, a notebook computer, and the like, and an encoding and/or decoding apparatus.
  • a digital TV is described with reference to FIG. 19 and a mobile device is described with reference to FIG.
  • wired / wireless network refers to a communication network supporting various communication standards or protocols for interconnection and / or data transmission and reception between digital devices or digital devices and external servers.
  • wired / wireless networks may include both communication networks to be supported by the specification now or in the future and communication protocols therefor.
• Meanwhile, the term digital device described in this specification may mean a fixed device or a mobile device, or may include both, depending on the context.
• the digital device is, for example, an intelligent device that supports a broadcast reception function, a computer function, and at least one external input, and can support e-mail, web browsing, banking, games, applications, and the like through the above-described wired/wireless network.
  • the digital device may include an interface for supporting at least one input or control means (hereinafter, input means) such as a handwritten input device, a touch screen, and a spatial remote controller.
• Digital devices can use a standardized general-purpose operating system (OS). For example, digital devices can add, delete, amend, and update various applications on a general-purpose OS kernel, thereby providing a more user-friendly environment.
• the external input described in this specification includes an external input device, that is, any input means or digital device that is connected to the above-mentioned digital device by wire/wireless and can transmit/receive and process related data through it.
• the external input includes, for example, all digital devices such as a high definition multimedia interface (HDMI) device, a game device such as a PlayStation or an X-Box, a smartphone, a tablet PC, a printer, and a smart TV.
• the term 'server' described in this specification includes all digital devices or systems that supply data to a client, that is, to the above-mentioned digital device, and may also be called a processor.
• Examples of such a server include a portal server providing a web page or web content, an advertising server providing advertising data, a content server providing content, an SNS server providing a social network service (SNS), and a service server (or manufacturing server) provided by a manufacturer.
  • the term "channel" described herein refers to a path, means, and the like for transmitting and receiving data, and may include, for example, a broadcasting channel.
  • the broadcast channel is expressed in terms of a physical channel, a virtual channel, a logical channel, etc. according to the activation of digital broadcasting.
  • the broadcast channel may be called a broadcast network.
  • a broadcast channel refers to a channel for providing or accessing broadcast content provided by a broadcasting station.
• the broadcast channel is also called a live channel because the broadcast content is mainly based on real-time broadcasting. However, the live channel may be understood as a term meaning the entire broadcast channel, including not only real-time broadcasting but also, in some cases, non-real-time broadcasting.
  • the present specification further defines an "arbitrary channel" in relation to the channel in addition to the above-described broadcast channel.
• the arbitrary channel may be provided in a service guide such as an electronic program guide (EPG) together with broadcast channels, and a service guide, a graphical user interface (GUI), or an on-screen display (OSD) screen may be configured/provided using only arbitrary channels.
• an arbitrary channel is a channel arbitrarily allocated by the receiver, and is assigned a channel number that basically does not overlap the channel numbers used to represent broadcast channels.
  • the receiver receives a broadcast signal for transmitting broadcast content and signaling information therefor through the tuned channel.
• the receiver parses the channel information from the signaling information, and configures a channel browser, an EPG, and the like based on the parsed channel information and provides them to the user. When the user makes a channel switching request through the input means, the receiver responds thereto.
• Since a broadcast channel number is promised in advance between the transmitting end and the receiving end, if an arbitrary channel number is allocated to overlap a broadcast channel number, confusion may arise or may possibly arise; therefore, it is preferable not to allocate overlapping numbers as described above.
• Meanwhile, even if the arbitrary channel number does not overlap the broadcast channel number as described above, there is still a risk of confusion during the user's channel surfing, so it is required to allocate the arbitrary channel number in consideration of this. This is because an arbitrary channel according to the present invention can be implemented to be accessed like a broadcast channel in response to the user's channel switching request through the input means, in the same manner as a conventional broadcast channel.
• Accordingly, the arbitrary channel number may be defined and displayed in a form in which characters are written together, such as arbitrary channel-1 or arbitrary channel-2, rather than in numeric form like a broadcast channel, for the convenience of the user's access to arbitrary channels and for ease of distinction from broadcast channel numbers.
• In this case, although the arbitrary channel number may be displayed in a form with letters, such as arbitrary channel-1, the receiver may internally recognize and implement it in numeric form, like a broadcast channel number.
• Meanwhile, the arbitrary channel number may also be provided in numeric form like a broadcast channel, and the channel number may be defined and displayed in various ways distinguishable from broadcast channels, such as video channel-1, title-1, or video-1.
  • the digital device provides a user with various types of web pages by executing a web browser for a web service.
  • the web page also includes a web page including a video content.
• in the present invention, the video may be separated from the web page and processed separately or independently.
• the separated video may be implemented by allocating the aforementioned arbitrary channel number to it, providing it through a service guide, and outputting it according to the user's channel switching request during service guide viewing or broadcast channel viewing.
• Besides web services, for services such as broadcast content, games, and applications, predetermined content, images, audio, items, and the like may be processed separately for playback, processing, and the like, and an arbitrary channel number may be allocated and implemented for them as described above.
  • FIG. 18 is a diagram schematically illustrating an example of a service system including a digital device.
• a service system including a digital device includes a content provider (CP) 1810, a service provider (SP) 1820, a network provider (NP) 1830, and a home network end user (HNED) (customer) 1840.
  • the HNED 1840 is, for example, a client 1800, that is, a digital device.
  • the content provider 1810 produces and provides various contents.
• Examples of the content provider 1810 include, as shown in FIG. 18, a terrestrial broadcaster, a cable system operator (SO) or multiple system operator (MSO), a satellite broadcaster, various Internet broadcasters, private CPs, and the like.
  • the content provider 1810 provides various applications in addition to broadcast content.
  • the service provider 1820 may service-package content provided by the content provider 1810 to the HNED 1840.
  • the service provider 1820 of FIG. 18 packages the first terrestrial broadcast, the second terrestrial broadcast, the cable MSO, satellite broadcast, various Internet broadcasts, applications, and the like to the HNED 1840.
  • the service provider 1820 provides a service to the client 1800 in a uni-cast or multi-cast manner. Meanwhile, the service provider 1820 may transmit data to a plurality of clients 1800 registered in advance. For this, the service provider 1820 may use an Internet Group Management Protocol (IGMP) protocol.
  • the above-described content provider 1810 and service provider 1820 may be the same entity (same or single entity).
  • the content produced by the content provider 1810 may be packaged as a service and provided to the HNED 1840 to perform the functions of the service provider 1820 together or vice versa.
  • the network provider 1830 provides a network for data exchange between the content provider 1810 or / and the service provider 1820 and the client 1800.
  • the client 1800 may establish a home network and transmit and receive data.
  • the content provider 1810 or the service provider 1820 in the service system may use conditional access or content protection means to protect the transmitted content.
• the client 1800 may use processing means such as a cable card (point of deployment, POD) or a downloadable CAS (DCAS) in response to the conditional access or content protection.
• the client 1800 may also use a bidirectional service through a network (or a communication network). In this case, the client 1800 may conversely perform the function of a content provider, and the existing service provider 1820 may receive it and transmit it to another client.
  • FIG. 19 is a block diagram illustrating an embodiment of a digital device.
  • FIG. 19 may correspond to, for example, the client 1800 of FIG. 18 and refers to the aforementioned digital device.
• the digital device 1900 includes a network interface 1901, a TCP/IP manager 1902, a service delivery manager 1903, an SI decoder 1904, a demultiplexer (demux) 1905, an audio decoder 1906, a video decoder 1907, a display (A/V and OSD module) 1908, a service control manager 1909, a service discovery manager 1910, an SI & metadata database (SI & metadata DB) 1911, a metadata manager 1912, a service manager 1913, a UI manager 1914, and the like.
  • the network interface unit 1901 receives or transmits IP packets (internet protocol (IP) packets) through a network. That is, the network interface unit 1901 receives a service, content, and the like from the service provider 1820 through a network.
• the TCP/IP manager 1902 is involved in packet delivery between a source and a destination for the IP packets received by the digital device 1900 and the IP packets transmitted by the digital device 1900.
• the TCP/IP manager 1902 classifies the received packet(s) according to an appropriate protocol and outputs the classified packet(s) to the service delivery manager 1903, the service discovery manager 1910, the service control manager 1909, or the metadata manager 1912. The service delivery manager 1903 is responsible for controlling the received service data.
  • the service delivery manager 1903 may use RTP / RTCP when controlling real-time streaming data.
  • the service delivery manager 1903 parses the received data packets according to RTP and transmits them to the demultiplexer 1905, or stores them in the SI & metadata database 1911 under the control of the service manager 1913.
  • the service delivery manager 1903 feeds network reception information back to the server providing the service using RTCP.
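  • The RTP parsing and RTCP feedback described above can be sketched as follows: a minimal parser for the fixed 12-byte RTP header (RFC 3550), assuming no CSRC entries and no header extension; this is an illustration, not the specific implementation of the service delivery manager.

```python
import struct

def parse_rtp_header(packet: bytes) -> dict:
    """Parse the fixed 12-byte RTP header (RFC 3550), assuming CC == 0
    and no extension, so the payload can be handed to the demultiplexer."""
    if len(packet) < 12:
        raise ValueError("packet shorter than the fixed RTP header")
    b0, b1, seq, ts, ssrc = struct.unpack("!BBHII", packet[:12])
    return {
        "version": b0 >> 6,
        "marker": (b1 >> 7) & 0x1,
        "payload_type": b1 & 0x7F,
        "sequence": seq,   # gaps here feed packet-loss stats back via RTCP
        "timestamp": ts,   # media clock used for real-time playout
        "ssrc": ssrc,
        "payload": packet[12:],
    }
```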
  • the demultiplexer 1905 demultiplexes the received packets into audio data, video data, SI (System Information) data, and the like, and transmits them to the audio/video decoders 1906/1907 and the SI decoder 1904, respectively.
  • the SI decoder 1904 decodes service information such as, for example, Program Specific Information (PSI), Program and System Information Protocol (PSIP), and Digital Video Broadcasting-Service Information (DVB-SI).
  • the SI decoder 1904 stores the decoded service information, for example, in the SI & metadata database 1911.
  • the stored service information may be read and used by a corresponding configuration, for example, at the request of a user.
  • the audio/video decoders 1906/1907 decode the audio data and video data demultiplexed by the demultiplexer 1905, respectively.
  • the decoded audio data and video data are provided to the user through the display unit 1908.
  • the application manager may include, for example, a UI manager 1914 and a service manager 1913.
  • the application manager may manage the overall state of the digital device 1900, provide a user interface, and manage other managers.
  • the UI manager 1914 provides a Graphic User Interface (GUI) for a user by using an OSD (On Screen Display) and the like, and receives a key input from the user to perform a device operation according to the input. For example, the UI manager 1914 transmits the key input signal to the service manager 1913 when receiving a key input related to channel selection from a user.
  • the service manager 1913 controls a manager associated with a service, such as a service delivery manager 1903, a service discovery manager 1910, a service control manager 1909, and a metadata manager 1912.
  • the service manager 1913 creates a channel map and selects a channel using the channel map according to a key input received from the user interface manager 1914.
  • the service manager 1913 receives service information of a channel from the SI decoder 1904 and sets an audio / video packet identifier (PID) of the selected channel to the demultiplexer 1905.
  • the PID thus set is used in the demultiplexing process described above. Accordingly, the demultiplexer 1905 filters the audio data, the video data, and the SI data by using the PID.
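  • The PID-based filtering just described might look like the following sketch; the PID values are illustrative placeholders standing in for values set by the service manager, not values taken from this description.

```python
# Queues feeding the audio decoder, video decoder, and SI decoder.
audio_queue, video_queue, si_queue = [], [], []

AUDIO_PID, VIDEO_PID, SI_PID = 0x1100, 0x1011, 0x0000  # illustrative PIDs

def route_ts_packet(packet: bytes, pid: int) -> None:
    """Route one transport stream packet by the PID set for the selected
    channel; packets on unknown PIDs are simply discarded."""
    if pid == VIDEO_PID:
        video_queue.append(packet)
    elif pid == AUDIO_PID:
        audio_queue.append(packet)
    elif pid == SI_PID:
        si_queue.append(packet)
```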
  • the service discovery manager 1910 provides information necessary for selecting a service provider providing a service. Upon receiving a signal regarding channel selection from the service manager 1913, the service discovery manager 1910 finds a service using the information.
  • the service control manager 1909 is in charge of selecting and controlling a service.
  • the service control manager 1909 uses IGMP or RTSP when the user selects a live broadcasting service of the conventional broadcasting type, and uses RTSP to perform service selection and control when a service such as VOD (Video on Demand) is selected.
  • the RTSP protocol may provide a trick mode for real time streaming.
  • the service control manager 1909 may initialize and manage a session through the IMS gateway 1950 using an IP multimedia subsystem (IMS) or a session initiation protocol (SIP).
  • the above protocols are only one embodiment, and other protocols may be used depending on the implementation.
  • the metadata manager 1912 manages metadata associated with the service and stores the metadata in the SI & Metadata Database 1911.
  • the SI & Metadata Database 1911 includes service information decoded by the SI decoder 1904, metadata managed by the metadata manager 1912, and information necessary to select a service provider provided by the service discovery manager 1910. Save it.
  • the SI & Metadata Database 1911 can also store set-up data and the like for the system.
  • the SI & metadata database 1911 may be implemented using non-volatile memory (NVRAM), flash memory, or the like.
  • the IMS gateway 1950 is a gateway that collects functions necessary for accessing an IMS-based IPTV service.
  • FIG. 20 is a block diagram illustrating another embodiment of a digital device.
  • FIG. 20 illustrates a block diagram of a mobile device as another embodiment of the digital device.
  • the mobile device 2000 may include a wireless communication unit 2010, an audio/video (A/V) input unit 2020, a user input unit 2030, a sensing unit 2040, an output unit 2050, a memory 2060, an interface unit 2070, a controller 2080, a power supply unit 2090, and the like.
  • the components shown in FIG. 20 are not essential, so that a mobile device having more or fewer components may be implemented.
  • the wireless communication unit 2010 may include one or more modules that enable wireless communication between the mobile device 2000 and the wireless communication system or between the mobile device and the network in which the mobile device is located.
  • the wireless communication unit 2010 may include a broadcast receiving module 2011, a mobile communication module 2012, a wireless internet module 2013, a short range communication module 2014, a location information module 2015, and the like.
  • the broadcast receiving module 2011 receives a broadcast signal and / or broadcast related information from an external broadcast management server through a broadcast channel.
  • the broadcast channel may include a satellite channel and a terrestrial channel.
  • the broadcast management server may mean a server that generates and transmits a broadcast signal and / or broadcast related information or a server that receives a previously generated broadcast signal and / or broadcast related information and transmits the same to a terminal.
  • the broadcast signal may include not only a TV broadcast signal, a radio broadcast signal, and a data broadcast signal, but also a broadcast signal having a data broadcast signal combined with a TV broadcast signal or a radio broadcast signal.
  • the broadcast related information may mean information related to a broadcast channel, a broadcast program, or a broadcast service provider.
  • the broadcast related information may also be provided through a mobile communication network. In this case, it may be received by the mobile communication module 2012.
  • the broadcast related information may exist in various forms, for example, in the form of an electronic program guide (EPG) or an electronic service guide (ESG).
  • the broadcast receiving module 2011 may receive digital broadcast signals using a digital broadcasting system such as, for example, ATSC, DVB-T (Digital Video Broadcasting-Terrestrial), DVB-S (Satellite), MediaFLO (Media Forward Link Only), DVB-H (Handheld), or ISDB-T (Integrated Services Digital Broadcast-Terrestrial).
  • the broadcast receiving module 2011 may be configured to be suitable for not only the above-described digital broadcasting system but also other broadcasting systems.
  • the broadcast signal and / or broadcast related information received through the broadcast receiving module 2011 may be stored in the memory 2060.
  • the mobile communication module 2012 transmits / receives a radio signal with at least one of a base station, an external terminal, and a server on a mobile communication network.
  • the wireless signal may include various types of data according to transmission and reception of a voice signal, a video call signal, or a text / multimedia message.
  • the wireless internet module 2013 may include a module for wireless internet access and may be embedded or external to the mobile device 2000.
  • Wireless Internet technologies may include Wireless LAN (Wi-Fi), Wireless Broadband (Wibro), World Interoperability for Microwave Access (Wimax), High Speed Downlink Packet Access (HSDPA), and the like.
  • the short range communication module 2014 refers to a module for short range communication.
  • Short range communication technologies may include Bluetooth, Radio Frequency Identification (RFID), Infrared Data Association (IrDA), Ultra Wideband (UWB), ZigBee, RS-232, RS-485, and the like.
  • the location information module 2015 may be a module for acquiring location information of the mobile device 2000 and may be a Global Position System (GPS) module as an example.
  • the A/V input unit 2020 is for inputting an audio or video signal, and may include a camera 2021, a microphone 2022, and the like.
  • the camera 2021 processes image frames such as still images or moving images obtained by the image sensor in the video call mode or the imaging mode.
  • the processed image frame may be displayed on the display portion 2051.
  • the image frame processed by the camera 2021 may be stored in the memory 2060 or transmitted to the outside through the wireless communication unit 2010. Two or more cameras 2021 may be provided depending on the use environment.
  • the microphone 2022 receives an external sound signal in a call mode, a recording mode, a voice recognition mode, or the like, and processes it into electrical voice data.
  • the processed voice data may be converted into a form transmittable to the mobile communication base station through the mobile communication module 2012 and output in the call mode.
  • the microphone 2022 may be implemented with various noise removing algorithms for removing noise generated while receiving an external sound signal.
  • the user input unit 2030 generates input data for the user to control the operation of the terminal.
  • the user input unit 2030 may include a key pad, a dome switch, a touch pad (static pressure / capacitance), a jog wheel, a jog switch, and the like.
  • the sensing unit 2040 may determine the current state of the mobile device 2000, such as an open / closed state of the mobile device 2000, a location of the mobile device 2000, presence or absence of user contact, orientation of the mobile device, and acceleration / deceleration of the mobile device.
  • the sensing unit generates a sensing signal for controlling the operation of the mobile device 2000. For example, when the mobile device 2000 is moved or tilted, the position or tilt of the mobile device may be sensed. Also, whether the power supply unit 2090 is supplied with power or whether the interface unit 2070 is coupled to an external device may be sensed.
  • the sensing unit 2040 may include a proximity sensor 2041 including near field communication (NFC).
  • the output unit 2050 is for generating output related to the visual, auditory, or tactile sense, and may include a display unit 2051, a sound output module 2052, an alarm unit 2053, and a haptic module 2054.
  • the display unit 2051 displays (outputs) information processed by the mobile device 2000. For example, when the mobile device is in a call mode, a user interface (UI) or a graphic user interface (GUI) related to a call is displayed. When the mobile device 2000 is in a video call mode or a shooting mode, the mobile device 2000 displays a captured image and / or a received image, UI, or GUI.
  • the display unit 2051 may include at least one of a liquid crystal display (LCD), a thin film transistor-liquid crystal display (TFT LCD), an organic light-emitting diode (OLED) display, a flexible display, and a 3D display.
  • Some of these displays can be configured to be transparent or light transmissive so that they can be seen from the outside. This may be referred to as a transparent display.
  • a representative example of the transparent display is the TOLED (Transparent OLED).
  • the rear structure of the display portion 2051 may also be configured as a light transmissive structure. With this structure, the user can see the object located behind the terminal body through the area occupied by the display unit 2051 of the terminal body.
  • two or more display units 2051 may exist.
  • a plurality of display units may be spaced apart or integrally disposed on one surface of the mobile device 2000, or may be disposed on different surfaces, respectively.
  • when the display unit 2051 and a sensor for detecting a touch operation (hereinafter referred to as a "touch sensor") form a mutual layer structure (hereinafter referred to as a "touch screen"), the display unit 2051 may also be used as an input device in addition to an output device.
  • the touch sensor may have, for example, a form of a touch film, a touch sheet, a touch pad, or the like.
  • the touch sensor may be configured to convert a change in pressure applied to a specific portion of the display portion 2051 or capacitance generated at a specific portion of the display portion 2051 into an electrical input signal.
  • the touch sensor may be configured to detect not only the position and area of the touch but also the pressure at the touch.
  • when there is a touch input to the touch sensor, the corresponding signal(s) is sent to the touch controller.
  • the touch controller processes the signal (s) and then transmits the corresponding data to the controller 2080.
  • the controller 2080 may determine which area of the display unit 2051 is touched.
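  • A minimal sketch of that last step follows: the touch controller reports a coordinate, and the controller hit-tests it against display regions to decide which area of the display unit 2051 was touched. The region names and coordinates below are hypothetical, not part of this description.

```python
from dataclasses import dataclass
from typing import Optional

@dataclass
class TouchEvent:
    x: int            # coordinates reported by the touch controller
    y: int
    pressure: float   # some panels also report pressure at the touch point

# Hypothetical display regions as (x0, y0, x1, y1) rectangles on the screen.
REGIONS = {
    "channel_list": (0, 0, 400, 600),
    "video_area": (400, 0, 1280, 600),
}

def hit_test(event: TouchEvent) -> Optional[str]:
    """Return the name of the touched region, or None for a miss."""
    for name, (x0, y0, x1, y1) in REGIONS.items():
        if x0 <= event.x < x1 and y0 <= event.y < y1:
            return name
    return None
```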
  • the proximity sensor 2041 may be disposed in an inner region of the mobile device surrounded by the touch screen or near the touch screen.
  • the proximity sensor refers to a sensor that detects the presence or absence of an object approaching a predetermined detection surface or an object present in the vicinity without using a mechanical contact by using an electromagnetic force or infrared rays.
  • Proximity sensors have a longer life and higher utilization than touch sensors.
  • the proximity sensor examples include a transmission photoelectric sensor, a direct reflection photoelectric sensor, a mirror reflection photoelectric sensor, a high frequency oscillation proximity sensor, a capacitive proximity sensor, a magnetic proximity sensor, and an infrared proximity sensor.
  • when the touch screen is capacitive, it is configured to detect the proximity of the pointer by a change in the electric field according to the proximity of the pointer.
  • In this case, the touch screen (touch sensor) may be classified as a proximity sensor.
  • the act of causing the pointer to be recognized as being located on the touch screen without actually contacting the touch screen is referred to as a "proximity touch", and the act of actually bringing the pointer into contact with the touch screen is referred to as a "contact touch".
  • the position at which a proximity touch is made by the pointer on the touch screen is the position at which the pointer is perpendicular to the touch screen when the proximity touch occurs.
  • the proximity sensor detects a proximity touch and a proximity touch pattern (for example, a proximity touch distance, a proximity touch direction, a proximity touch speed, a proximity touch time, a proximity touch position, and a proximity touch movement state).
  • Information corresponding to the sensed proximity touch operation and proximity touch pattern may be output on the touch screen.
  • the sound output module 2052 may output audio data received from the wireless communication unit 2010 or stored in the memory 2060 in a call signal reception, a call mode or a recording mode, a voice recognition mode, a broadcast reception mode, and the like.
  • the sound output module 2052 may also output a sound signal related to a function (eg, a call signal reception sound, a message reception sound, etc.) performed by the mobile device 2000.
  • the sound output module 2052 may include a receiver, a speaker, a buzzer, and the like.
  • the alarm unit 2053 outputs a signal for notifying occurrence of an event of the mobile device 2000. Examples of events occurring in the mobile device include call signal reception, message reception, key signal input, and touch input.
  • the alarm unit 2053 may output a signal for notifying the occurrence of an event in a form other than a video signal or an audio signal, for example, vibration.
  • the video signal or the audio signal may also be output through the display unit 2051 or the audio output module 2052, and they may be classified as part of the alarm unit 2053.
  • the haptic module 2054 generates various tactile effects that a user can feel. Vibration is a representative example of the haptic effect generated by the haptic module 2054.
  • the intensity and pattern of vibration generated by the haptic module 2054 can be controlled. For example, different vibrations may be synthesized and output or may be sequentially output.
  • In addition to vibration, the haptic module 2054 may generate various tactile effects, such as the effect of a pin arrangement moving vertically against the contacted skin surface, a jet or suction force of air through a nozzle or inlet, grazing against the skin surface, contact of an electrode, an electrostatic force, and the reproduction of a sense of cold or warmth using an element capable of absorbing or generating heat.
  • the haptic module 2054 may not only transmit a tactile effect through direct contact, but also may allow a user to feel the tactile effect through a muscle sense such as a finger or an arm. Two or more haptic modules 2054 may be provided according to a configuration aspect of the mobile device 2000.
  • the memory 2060 may store a program for the operation of the controller 2080 and may temporarily store input / output data (eg, a phone book, a message, a still image, a video, etc.).
  • the memory 2060 may store data regarding vibration and sound of various patterns output when a touch input on the touch screen is performed.
  • the memory 2060 may include at least one type of storage medium among a flash memory type, a hard disk type, a multimedia card micro type, a card type memory (for example, SD or XD memory), RAM (Random Access Memory), SRAM (Static Random Access Memory), ROM (Read-Only Memory), EEPROM (Electrically Erasable Programmable Read-Only Memory), PROM (Programmable Read-Only Memory), magnetic memory, a magnetic disk, and an optical disk.
  • the mobile device 2000 may operate in association with a web storage that performs a storage function of the memory 2060 on the Internet.
  • the interface unit 2070 serves as a path with all external devices connected to the mobile device 2000.
  • the interface unit 2070 may receive data from an external device, receive power, transfer the power to each component inside the mobile device 2000, or transmit data within the mobile device 2000 to an external device.
  • Wired/wireless headset ports, external charger ports, wired/wireless data ports, memory card ports, ports for connecting devices with identification modules, audio input/output (I/O) ports, video I/O ports, earphone ports, and the like may be included in the interface unit 2070.
  • the identification module is a chip that stores various types of information for authenticating the use authority of the mobile device 2000.
  • the identification module may include a user identity module (UIM), a subscriber identity module (SIM), a universal subscriber identity module (USIM), and the like.
  • a device equipped with an identification module (hereinafter referred to as an 'identification device') may be manufactured in the form of a smart card. Therefore, the identification device may be connected to the mobile device 2000 through the port.
  • when the mobile device 2000 is connected to an external cradle, the interface unit 2070 may be a path through which power from the cradle is supplied to the mobile device 2000, or a path through which various command signals input by the user from the cradle are transmitted to the mobile device. The various command signals or the power input from the cradle may operate as signals for recognizing that the mobile device is correctly mounted on the cradle.
  • the controller 2080 generally controls the overall operation of the mobile device. For example, it performs related control and processing for voice calls, data communication, video calls, and the like.
  • the controller 2080 may include a multimedia module 2081 for multimedia playback.
  • the multimedia module 2081 may be implemented in the controller 2080 or may be implemented separately from the controller 2080.
  • the controller 2080 may perform a pattern recognition process for recognizing a writing input or a drawing input performed on a touch screen as a character and an image, respectively.
  • the power supply unit 2090 receives an external power source and an internal power source under the control of the controller 2080 to supply power for operation of each component.
  • Various embodiments described herein may be implemented in a recording medium readable by a computer or similar device using, for example, software, hardware or a combination thereof.
  • embodiments described herein may be implemented using at least one of application specific integrated circuits (ASICs), digital signal processors (DSPs), digital signal processing devices (DSPDs), programmable logic devices (PLDs), field programmable gate arrays (FPGAs), processors, controllers, micro-controllers, microprocessors, and other electrical units for performing functions. In some cases, such embodiments may be implemented by the controller 2080 itself.
  • embodiments such as the procedures and functions described herein may be implemented as separate software modules.
  • Each of the software modules may perform one or more functions and operations described herein.
  • Software code may be implemented in software applications written in the appropriate programming languages.
  • the software code may be stored in the memory 2060 and executed by the controller 2080.
  • FIG. 21 is a block diagram illustrating another embodiment of a digital device.
  • the digital device 2100 may include a broadcast receiving unit 2105, an external device interface unit 2135, a storage unit 2140, a user input interface unit 2150, a control unit 2170, a display unit 2180, an audio output unit 2185, a power supply unit 2190, and a photographing unit (not shown).
  • the broadcast receiver 2105 may include at least one tuner 2110, a demodulator 2120, and a network interface 2130. However, in some cases, the broadcast receiver 2105 may include a tuner 2110 and a demodulator 2120, but may not include the network interface 2130, or vice versa.
  • the broadcast receiving unit 2105 may include a multiplexer and may multiplex a signal demodulated by the demodulator 2120 via the tuner 2110 with a signal received through the network interface unit 2130.
  • the broadcast receiving unit 2105 may also include a demultiplexer and may demultiplex the multiplexed signal, the demodulated signal, or the signal passed through the network interface unit 2130.
  • the tuner 2110 receives an RF broadcast signal by tuning a channel selected by a user or all previously stored channels among radio frequency (RF) broadcast signals received through an antenna.
  • the tuner 2110 converts the received RF broadcast signal into an intermediate frequency (IF) signal or a baseband signal.
  • If the received RF broadcast signal is a digital broadcast signal, it is converted into a digital IF signal (DIF); if it is an analog broadcast signal, it is converted into an analog baseband video or audio signal (CVBS/SIF). That is, the tuner 2110 may process both digital and analog broadcast signals.
  • the analog baseband video or audio signal CVBS / SIF output from the tuner 2110 may be directly input to the controller 2170.
  • the tuner 2110 may receive an RF broadcast signal of a single carrier according to an Advanced Television System Committee (ATSC) scheme or an RF broadcast signal of multiple carriers according to a digital video broadcasting (DVB) scheme.
  • the tuner 2110 may sequentially tune and receive the RF broadcast signals of all broadcast channels stored through a channel memory function among the RF broadcast signals received through the antenna, and convert them into intermediate frequency signals or baseband signals.
  • the demodulator 2120 receives and demodulates the digital IF signal (DIF) converted by the tuner 2110. For example, when the digital IF signal output from the tuner 2110 is of the ATSC scheme, the demodulator 2120 performs 8-VSB (8-Vestigial Side Band) demodulation. In addition, the demodulator 2120 may perform channel decoding; to this end, it may include a trellis decoder, a de-interleaver, a Reed-Solomon decoder, and the like, to perform trellis decoding, de-interleaving, and Reed-Solomon decoding.
  • when the digital IF signal output from the tuner 2110 is of the DVB scheme, the demodulator 2120 performs, for example, coded orthogonal frequency division multiplexing (COFDM) demodulation.
  • the demodulator 2120 may perform channel decoding.
  • the demodulator 2120 may include a convolution decoder, a de-interleaver, a Reed-Solomon decoder, and the like, to perform convolutional decoding, de-interleaving, and Reed-Solomon decoding.
  • the demodulator 2120 may output a stream signal TS after performing demodulation and channel decoding.
  • the stream signal may be a signal multiplexed with a video signal, an audio signal, or a data signal.
  • the stream signal may be an MPEG-2 Transport Stream (TS) multiplexed with an MPEG-2 standard video signal, a Dolby AC-3 standard audio signal, and the like.
  • the MPEG-2 TS may include a header of 4 bytes and a payload of 184 bytes.
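  • The 4-byte header plus 184-byte payload layout can be made concrete with the sketch below, which parses the header fields of one 188-byte MPEG-2 TS packet (ISO/IEC 13818-1); it is a simplified illustration that ignores adaptation fields.

```python
def parse_ts_packet(packet: bytes) -> dict:
    """Parse the 4-byte MPEG-2 TS packet header; the remaining 184 bytes
    are the payload. Adaptation fields are ignored for simplicity."""
    if len(packet) != 188 or packet[0] != 0x47:  # 0x47 is the sync byte
        raise ValueError("not a valid 188-byte TS packet")
    pid = ((packet[1] & 0x1F) << 8) | packet[2]  # 13-bit packet identifier
    return {
        "pusi": (packet[1] >> 6) & 0x1,   # payload unit start indicator
        "pid": pid,                       # used for demultiplexing
        "continuity": packet[3] & 0x0F,   # 4-bit continuity counter
        "payload": packet[4:],            # 184-byte payload
    }
```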
  • the demodulation unit 2120 described above may be provided separately according to the ATSC method and the DVB method. That is, the digital device may include an ATSC demodulator and a DVB demodulator separately.
  • the stream signal output from the demodulator 2120 may be input to the controller 2170.
  • the controller 2170 may control demultiplexing, video/audio signal processing, and the like, and may control image output through the display unit 2180 and audio output through the audio output unit 2185.
  • the external device interface unit 2135 provides an environment in which various external devices are interfaced to the digital device 2100.
  • the external device interface unit 2135 may include an A / V input / output unit (not shown) or a wireless communication unit (not shown).
  • the external device interface unit 2135 may be connected through a wired/wireless connection to an external device such as a digital versatile disk (DVD) player, a Blu-ray player, a game device, a camera, a camcorder, a computer (laptop, tablet), a smartphone, a Bluetooth device, or a cloud.
  • the external device interface unit 2135 transmits a video, audio, or data (including image) signal input from the outside through the connected external device to the controller 2170 of the digital device.
  • the controller 2170 may control the processed video, audio, or data signal to be output to the connected external device.
  • the external device interface unit 2135 may further include an A / V input / output unit (not shown) or a wireless communication unit (not shown).
  • the A/V input/output unit may include a USB terminal, a CVBS (Composite Video Banking Sync) terminal, a component terminal, an S-video terminal (analog), a DVI (Digital Visual Interface) terminal, an HDMI (High Definition Multimedia Interface) terminal, an RGB terminal, a D-SUB terminal, and the like.
  • the wireless communication unit may perform near field communication with another electronic device.
  • the digital device 2100 may be networked with other electronic devices according to a communication protocol such as, for example, Bluetooth, Radio Frequency Identification (RFID), Infrared Data Association (IrDA), Ultra Wideband (UWB), ZigBee, or Digital Living Network Alliance (DLNA).
  • the external device interface unit 2135 may also be connected to one of various set-top boxes through at least one of the above-described terminals to perform input/output operations with the set-top box.
  • the external device interface unit 2135 may receive an application or a list of applications in an adjacent external device and transmit the received application or application list to the control unit 2170 or the storage unit 2140.
  • the network interface unit 2130 provides an interface for connecting the digital device 2100 with a wired / wireless network including an internet network.
  • for connection with a wired network, the network interface unit 2130 may include, for example, an Ethernet terminal; for connection with a wireless network, it may use, for example, communication standards such as WLAN (Wireless LAN, Wi-Fi), Wibro (Wireless broadband), Wimax (World Interoperability for Microwave Access), and HSDPA (High Speed Downlink Packet Access).
  • the network interface unit 2130 may transmit or receive data with another user or another digital device through the connected network or another network linked to the connected network.
  • For example, some of the content data stored in the digital device 2100 may be transmitted to a user or a digital device selected in advance from among other users or other digital devices registered in the digital device 2100.
  • the network interface unit 2130 may access a predetermined web page through a connected network or another network linked to the connected network. That is, by accessing a predetermined web page through the network, it is possible to send or receive data with the server.
  • content or data provided by a content provider or a network operator may be received. That is, content such as a movie, an advertisement, a game, a VOD, a broadcast signal, and related information provided from a content provider or a network provider may be received through a network.
  • the network interface unit 2130 may select and receive a desired application from among applications that are open to the public through the network.
  • the storage unit 2140 may store a program for processing and controlling each signal in the controller 2170, or may store a signal-processed video, audio, or data signal.
  • the storage unit 2140 may perform a function for temporarily storing an image, audio, or data signal input from the external device interface unit 2135 or the network interface unit 2130.
  • the storage unit 2140 may store information about a predetermined broadcast channel through a channel storage function.
  • the storage unit 2140 may store an application or a list of applications input from the external device interface unit 2135 or the network interface unit 2130.
  • the storage unit 2140 may store various platforms described below.
  • the storage unit 2140 may include at least one type of storage medium among, for example, a flash memory type, a hard disk type, a multimedia card micro type, a card type memory (for example, SD or XD memory), RAM, and ROM (EEPROM, etc.).
  • the digital device 2100 may reproduce and provide a content file (video file, still image file, music file, document file, application file, etc.) stored in the storage unit 2140 to the user.
  • FIG. 21 illustrates an embodiment in which the storage unit 2140 is provided separately from the control unit 2170, but the scope of the present invention is not limited thereto. That is, the storage unit 2140 may be included in the control unit 2170.
  • the user input interface unit 2150 transmits a signal input by the user to the controller 2170 or transmits a signal of the controller 2170 to the user.
  • the user input interface unit 2150 may receive and process a control signal for power on/off, channel selection, screen setting, and the like from the remote controller 2195 according to various communication methods such as RF communication or infrared (IR) communication, or may process a control signal from the controller 2170 to be transmitted to the remote controller 2195.
  • the user input interface unit 2150 may transmit a control signal input from a local key (not shown), such as a power key, a channel key, a volume key, and a set value, to the controller 2170.
  • the user input interface unit 2150 transmits a control signal input from a sensing unit (not shown) that senses a user's gesture to the controller 2170, or transmits a signal of the controller 2170 to the sensing unit.
  • the sensing unit may include a touch sensor, a voice sensor, a position sensor, an operation sensor, and the like.
  • the controller 2170 demultiplexes the stream input through the tuner 2110, the demodulator 2120, or the external device interface unit 2135, or processes the demultiplexed signals, to generate and output a signal for video or audio output.
  • the image signal processed by the controller 2170 may be input to the display unit 2180 and displayed as an image corresponding to the image signal.
  • the image signal processed by the controller 2170 may be input to the external output device through the external device interface 2135.
  • the audio signal processed by the controller 2170 may be output as audio through the audio output unit 2185.
  • the voice signal processed by the controller 2170 may be input to the external output device through the external device interface 2135.
  • the controller 2170 may include a demultiplexer, an image processor, and the like.
  • the controller 2170 may control overall operations of the digital device 2100. For example, the controller 2170 may control the tuner 2110 to tune the RF broadcast corresponding to a channel selected by a user or a previously stored channel.
  • the controller 2170 may control the digital device 2100 by a user command or an internal program input through the user input interface 2150. In particular, it is possible to connect to the network so that the user can download the desired application or application list into the digital device 2100.
  • the controller 2170 controls the tuner 2110 such that a signal of a selected channel is input according to a predetermined channel selection command received through the user input interface 2150. It processes the video, audio or data signal of the selected channel.
  • the controller 2170 allows the channel information selected by the user to be output through the display unit 2180 or the audio output unit 2185 together with the processed image or audio signal.
  • in accordance with an external device image playback command received through the user input interface unit 2150, the controller 2170 may allow a video or audio signal input from an external device, for example, a camera or a camcorder, through the external device interface unit 2135 to be output through the display unit 2180 or the audio output unit 2185.
  • the controller 2170 may control the display unit 2180 to display an image, for example, a broadcast image input through the tuner 2110, an external input image input through the external device interface unit 2135, an image input through the network interface unit, or an image stored in the storage unit 2140.
  • the image displayed on the display unit 2180 may be a still image or a video, and may be a 2D image or a 3D image.
  • the controller 2170 may control to reproduce the content.
  • the content may be content stored in the digital device 2100, received broadcast content, or external input content input from the outside.
  • the content may be at least one of a broadcast image, an external input image, an audio file, a still image, a connected web screen, and a document file.
  • the controller 2170 may control to display an application or a list of applications that can be downloaded from the digital device 2100 or from an external network.
  • the controller 2170 may control an application downloaded from an external network to be installed and run, along with various user interfaces. In addition, according to a user's selection, an image related to the application being executed may be controlled to be displayed on the display unit 2180.
  • a channel browsing processor may be further provided to generate a thumbnail image corresponding to the channel signal or the external input signal.
  • the channel browsing processor may receive a stream signal (TS) output from the demodulator 2120 or a stream signal output from the external device interface unit 2135, and extract an image from the input stream signal to generate a thumbnail image.
  • the generated thumbnail image may be input as it is or encoded to the controller 2170.
  • the generated thumbnail image may be encoded in a stream form and input to the controller 2170.
  • the controller 2170 may display a thumbnail list including a plurality of thumbnail images on the display unit 2180 using the input thumbnail image. Meanwhile, the thumbnail images in the thumbnail list may be updated sequentially or simultaneously. Accordingly, the user can easily grasp the contents of the plurality of broadcast channels.
  • the display unit 2180 converts an image signal, a data signal, or an OSD signal processed by the controller 2170, or an image signal, a data signal, and the like received from the external device interface unit 2135, into R, G, and B signals, respectively, to generate a drive signal.
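  • One concrete piece of that conversion is mapping decoded YCbCr samples to R, G, and B drive values; the sketch below assumes limited-range BT.601 coefficients, since the actual matrix depends on the colorimetry of the content (e.g., BT.709 for HD) and is not specified in this description.

```python
def ycbcr_to_rgb_bt601(y: int, cb: int, cr: int) -> tuple:
    """Limited-range BT.601 YCbCr -> 8-bit RGB, one step of generating
    R/G/B drive signals; BT.709/BT.2020 content uses other coefficients."""
    c, d, e = y - 16, cb - 128, cr - 128
    clip = lambda v: max(0, min(255, int(round(v))))
    r = clip(1.164 * c + 1.596 * e)
    g = clip(1.164 * c - 0.392 * d - 0.813 * e)
    b = clip(1.164 * c + 2.017 * d)
    return r, g, b
```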
  • the display unit 2180 may be a PDP, an LCD, an OLED, a flexible display, a 3D display, or the like.
  • the display unit 2180 may be configured as a touch screen and used as an input device in addition to the output device.
  • the audio output unit 2185 receives a signal processed by the controller 2170, for example, a stereo signal, a 3.1 channel signal, or a 5.1 channel signal, and outputs the voice.
  • the audio output unit 2185 may be implemented as various types of speakers.
  • a sensing unit including at least one of a touch sensor, a voice sensor, a position sensor, and a motion sensor may be further provided in the digital device 2100. .
  • the signal detected by the sensing unit may be transmitted to the control unit 2170 through the user input interface unit 2150.
  • a photographing unit (not shown) for photographing the user may be further provided.
  • the image information photographed by the photographing unit may be input to the controller 2170.
  • the controller 2170 may detect a user's gesture by combining or combining the image photographed by the photographing unit (not shown) or the detected signal from the sensing unit (not shown).
  • the power supply unit 2190 supplies the corresponding power throughout the digital device 2100.
  • In particular, power may be supplied to the controller 2170, which may be implemented in the form of a System on Chip (SoC), to the display unit 2180 for displaying images, and to the audio output unit 2185 for audio output.
  • the power supply unit 2190 may include a converter (not shown) for converting AC power into DC power.
  • In addition, an inverter (not shown) capable of a PWM operation may be further included for variable luminance or dimming driving.
  • the remote control device 2195 transmits the user input to the user input interface unit 2150.
  • the remote controller 2195 may use Bluetooth, RF (Radio Frequency) communication, infrared (IR) communication, UWB (Ultra Wideband), ZigBee, or the like.
  • the remote controller 2195 may receive an image, an audio or a data signal output from the user input interface unit 2150, display the same on the remote controller 2195, or output a voice or vibration.
  • the digital device 2100 described above may be a digital broadcast receiver capable of processing a fixed or mobile ATSC or DVB digital broadcast signal.
  • the digital device according to the present invention may omit some of the components of the illustrated components as necessary or further include components not shown on the contrary.
  • the digital device does not include a tuner and a demodulator, and may receive and play content through a network interface unit or an external device interface unit.
  • FIG. 22 is a block diagram illustrating an exemplary embodiment of a detailed configuration of a controller of the digital apparatus illustrated in FIGS. 19 to 21.
  • an example of the control unit may include a demultiplexer 2210, an image processor 2220, an on-screen display (OSD) generator 2240, a mixer 2250, a frame rate converter (FRC) 2255, and a formatter 2260.
  • the controller may further include a voice processor and a data processor.
  • the demultiplexer 2210 demultiplexes an input stream.
  • the demultiplexer 2210 may demultiplex the input MPEG-2 TS video, audio, and data signals.
  • the stream signal input to the demultiplexer 2210 may be a stream signal output from a tuner, a demodulator, or an external device interface unit.
  • the image processor 2220 performs image processing of the demultiplexed image signal.
  • the image processor 2220 may include an image decoder 2225 and a scaler 2235.
  • the image decoder 2225 decodes the demultiplexed image signal, and the scaler 2235 scales the resolution of the decoded image signal so that the display unit can output it.
  • the image decoder may represent the decoding device described above.
  • the image decoder 2225 may support various standards.
  • for example, the video decoder 2225 performs the function of an MPEG-2 decoder when the video signal is encoded in the MPEG-2 standard, and performs the function of an H.264 decoder when the video signal is encoded in the Digital Multimedia Broadcasting (DMB) method or the H.264 standard.
  • in addition, when the video signal is encoded in the versatile video coding (VVC) standard, the essential video coding (EVC) standard, the AOMedia Video 1 (AV1) standard, the second generation of audio video coding standard (AVS2), or a next-generation video/image coding standard (e.g., H.267, H.268, etc.), the image decoder 2225 may perform the function of a decoder of the corresponding standard.
  • the video signal decoded by the video processor 2220 is input to the mixer 2250.
  • the OSD generator 2240 generates OSD data according to a user input or itself. For example, the OSD generator 2240 generates data for displaying various data in the form of a graphic or text on the screen of the display 2180 based on a control signal of the user input interface unit.
  • the generated OSD data includes various data such as a user interface screen of a digital device, various menu screens, widgets, icons, viewing rate information, and the like.
  • the OSD generator 2240 may generate data for displaying broadcast information based on subtitles or EPGs of a broadcast image.
  • the mixer 2250 mixes the OSD data generated by the OSD generator 2240 with the image signal processed by the image processor, and provides the result to the formatter 2260. Since the decoded video signal and the OSD data are mixed, the OSD is overlaid and displayed on the broadcast video or the external input video.
  • the frame rate converter (FRC) 2255 converts the frame rate of the input video.
  • the frame rate converter 2255 may convert the frame rate of the input 60Hz image to have a frame rate of, for example, 120 Hz or 240 Hz according to the output frequency of the display unit.
  • various methods may exist for converting the frame rate. For example, when converting the frame rate from 60 Hz to 120 Hz, the frame rate converter 2255 may insert the same first frame between the first frame and the second frame, or may insert a third frame predicted from the first frame and the second frame.
  • as another example, when converting the frame rate from 60 Hz to 240 Hz, the frame rate converter 2255 may insert three more identical or predicted frames between existing frames. Meanwhile, when no separate frame conversion is performed, the frame rate converter 2255 may be bypassed.
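  • The two insertion strategies above (repeating a frame versus inserting a predicted frame) can be sketched as follows; the per-pixel average stands in for a real motion-compensated prediction and is only an illustrative simplification.

```python
def frc_60_to_120_repeat(frames):
    """Double the frame rate by inserting a copy of each frame."""
    out = []
    for frame in frames:
        out.extend([frame, frame])  # original frame plus an identical one
    return out

def midpoint_frame(f1, f2):
    """Naive 'predicted' in-between frame: per-pixel average of the two
    neighbouring frames (a real FRC would use motion compensation)."""
    return [(a + b) // 2 for a, b in zip(f1, f2)]
```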
  • the formatter 2260 changes the output of the input frame rate converter 2255 according to the output format of the display unit.
  • the formatter 2260 may output R, G, and B data signals, and these R, G, and B data signals may be output as low voltage differential signaling (LVDS) or mini-LVDS.
  • the formatter 2260 may support a 3D service through the display by configuring the output in a 3D form according to the output format of the display.
  • the voice processing unit (not shown) in the controller may perform voice processing of the demultiplexed voice signal.
  • the voice processor (not shown) may support processing of various audio formats. For example, even when a voice signal is encoded in a format such as MPEG-2, MPEG-4, AAC, HE-AAC, AC-3, BSAC, etc., a decoder corresponding thereto may be provided.
  • the voice processing unit (not shown) in the controller may also process bass, treble, volume control, and the like.
  • the data processor in the control unit may perform data processing of the demultiplexed data signal.
  • the data processor may decode the demultiplexed data signal even when it is encoded.
  • the encoded data signal may be EPG information including broadcast information such as a start time and an end time of a broadcast program broadcasted in each channel.
  • each component may be integrated, added, or omitted according to the specifications of the digital device that is actually implemented. That is, as needed, two or more components may be combined into one component or one component may be subdivided into two or more components.
  • the function performed in each block is for explaining an embodiment of the present invention, the specific operation or device does not limit the scope of the present invention.
  • the digital device may be an image signal processing apparatus that performs signal processing of an image stored in the apparatus or an input image.
  • Examples of such an image signal processing apparatus may further include a set-top box (STB) excluding the display unit 2180 and the audio output unit 2185 shown in FIG. 21, the above-described DVD player, a Blu-ray player, a game machine, a computer, and the like.
  • FIG. 23 is a diagram illustrating an example in which a screen of a digital device simultaneously displays a main image and a sub image, according to an exemplary embodiment.
  • the digital device may simultaneously display the main image 2310 and the sub image 2320 on the screen 2300.
  • the main image 2310 may be called a first image
  • the auxiliary image 2320 may be called a second image.
  • the main image 2310 and the auxiliary image 2320 may include a video, a still image, an electronic program guide (EPG), a graphical user interface (GUI), an on-screen display (OSD), and the like, but are not limited thereto.
  • the main image 2310 may mean an image that is relatively smaller in size than the screen 2300 of the electronic device while being displayed on the screen 2300 simultaneously with the auxiliary image 2320.
  • In FIG. 23, the main image 2310 is displayed on the upper left of the screen 2300 of the digital device, but the position where the main image 2310 is displayed is not limited thereto; the main image 2310 may be displayed at any location within the screen 2300 of the digital device.
  • the main image 2310 and the auxiliary image 2320 may be directly or indirectly related to each other.
  • the main image 2310 may be a streaming video
  • the auxiliary image 2320 may be a GUI that sequentially displays thumbnails of videos including information similar to the streaming video.
  • the main image 2310 may be a broadcast image
  • the auxiliary image 2320 may be an EPG.
  • the main image 2310 may be a broadcast image
  • the auxiliary image 2320 may be a GUI. Examples of the main image 2310 and the auxiliary image 2320 are not limited thereto.
  • the main image 2310 may be a broadcast image received through a broadcasting channel
  • the auxiliary image 2320 may be information related to a broadcast image received through a broadcast channel.
  • the information related to the broadcast image received through the broadcast channel may include, for example, EPG information including a comprehensive channel schedule, detailed information of a broadcast program, information about a broadcast program review, and the like, but is not limited thereto.
  • the main image 2310 may be a broadcast image received through a broadcast channel
  • the auxiliary image 2320 may be an image generated based on information previously stored in a digital device.
  • an image generated based on information previously stored in the digital device may include, for example, a basic user interface (UI) of the EPG, basic channel information, an image resolution manipulation UI, a sleep reservation UI, and the like, but is not limited thereto.
  • the main image 2310 may be a broadcast image received through a broadcast channel
  • the auxiliary image 2320 may be information related to a broadcast image received through a network.
  • the information related to the broadcast image received through the network may be, for example, information obtained through a search engine based on the network. More specifically, for example, information related to the person currently being displayed on the main image 2310 may be obtained through a search engine based on a network.
  • the example is not limited thereto, and the information related to the broadcast image received through the network may be obtained, for example, by using an artificial intelligence (AI) system. More specifically, an estimated location on a map of a place currently displayed on the main image 2310 may be obtained by using network-based deep learning, and the digital device may receive, through the network, information about the estimated location on the map of the place currently displayed on the main image 2310.
  • the digital device may receive at least one of image information of the main image 2310 and image information of the auxiliary image 2320 from the outside.
  • the image information of the main image 2310 may include, for example, a broadcast signal received through a broadcast channel, source code information of the main image 2310, and IP packet (Internet Protocol packet) information of the main image 2310 received through a network, but is not limited thereto.
  • similarly, the image information of the auxiliary image 2320 may include, for example, a broadcast signal received through a broadcast channel, source code information of the auxiliary image 2320, and IP packet information of the auxiliary image 2320 received through a network, but is not limited thereto.
  • the digital device may decode the image information of the main image 2310 or the image information of the auxiliary image 2320 received from the outside. However, in some cases, the digital device may internally store video information of the main video 2310 or video information of the auxiliary video 2320.
  • the digital device may display the main image 2310 and the auxiliary image 2320 on the screen 2300 of the digital device based on the image information of the main image 2310 and the information related to the auxiliary image 2320.
  • for example, the decoding apparatus 300 of the digital device may include a main image decoding apparatus and an auxiliary image decoding apparatus, which may decode the image information of the main image 2310 and the image information of the auxiliary image 2320, respectively.
  • likewise, the renderer may include a main image renderer (first renderer) and an auxiliary image renderer (second renderer); the main image renderer may cause the main image 2310 to be displayed on a first area of the screen 2300 of the digital device based on the information decoded by the main image decoding apparatus, and the auxiliary image renderer may cause the auxiliary image 2320 to be displayed on a second area of the screen 2300 based on the information decoded by the auxiliary image decoding apparatus.
  • the decoding apparatus 300 of the digital device may decode the image information of the main image 2310 and the image information of the auxiliary image 2320. Based on the information decoded by the decoding apparatus 300, the renderer may process the main image 2310 and the auxiliary image 2320 together so that they are simultaneously displayed on the screen 2300 of the digital device.
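  • As a rough sketch of this pipeline, the code below composes a decoded main picture and an auxiliary picture into two areas of one screen; the Screen class, the pixel representation, and the region coordinates are illustrative stand-ins for the device's actual display path, not part of this description.

```python
from dataclasses import dataclass, field

@dataclass
class Screen:
    width: int
    height: int
    fb: dict = field(default_factory=dict)  # {(x, y): pixel} framebuffer

    def blit(self, picture: dict, origin: tuple) -> None:
        """Copy a picture ({(x, y): pixel}) into the framebuffer at origin."""
        ox, oy = origin
        for (x, y), px in picture.items():
            self.fb[(ox + x, oy + y)] = px

def present(screen: Screen, main_picture: dict, aux_picture: dict) -> None:
    # First area: the decoded main image (here the left half of the screen).
    screen.blit(main_picture, origin=(0, 0))
    # Second area: the auxiliary image (EPG / OSD / GUI) beside it, so both
    # images are displayed on the screen simultaneously.
    screen.blit(aux_picture, origin=(screen.width // 2, 0))
```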
  • FIG. 24 schematically illustrates a video service processing method by a digital device according to the present invention.
  • the method disclosed in FIG. 24 may be performed by the digital device disclosed in FIGS. 19 to 22.
  • S2400 of FIG. 24 may be performed by a network interface unit, a wireless communication unit, or a broadcast receiving unit of the digital device
  • S2410 of FIG. 24 may be performed by a controller of the digital device
  • S2420 of FIG. 24 may be performed by the display unit of the digital device.
  • the digital device receives the image information (S2400).
  • the image information may be received through a broadcast network.
  • the image information may be received through a communication network.
  • the main video may be a streaming video.
  • the video information may include a broadcasting signal received through a broadcasting channel, source code information of the main video, and an IP packet of the main video received through a network.
  • the digital device decodes a main image based on the image information (S2410).
  • the decoding of the main image based on the image information may follow the image decoding method by the decoding apparatus of FIG. 3 and FIG. 15.
  • for example, decoding the main image may include deriving prediction samples for the current block based on inter or intra prediction, (optionally) deriving residual samples for the current block based on received residual information, and generating reconstructed samples based on the prediction samples and/or the residual samples.
  • the decoding of the first image may include performing an in-loop filtering procedure on the reconstructed picture including the reconstructed samples.
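  • A minimal sketch of the sample reconstruction step above: reconstructed samples are the prediction samples plus the (optional) residual samples, clipped to the valid sample range before any in-loop filtering runs on the reconstructed picture.

```python
def reconstruct_block(pred, resid=None, bit_depth=8):
    """recon = clip(pred + resid); with no residual, the prediction is
    used as-is. Operates on 2-D lists of integer samples."""
    max_val = (1 << bit_depth) - 1
    if resid is None:
        return [row[:] for row in pred]
    return [
        [max(0, min(max_val, p + r)) for p, r in zip(prow, rrow)]
        for prow, rrow in zip(pred, resid)
    ]
```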
  • as another example, the decoding of the main image may include deriving an intra prediction mode of the current block based on remaining intra prediction mode information included in the image information, deriving a prediction sample of the current block based on the intra prediction mode, and deriving a reconstructed picture based on the prediction sample.
  • the remaining intra prediction mode information may be coded through a truncated binary (TB) binarization process.
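Truncated binary binarization gives the first u = 2^(k+1) - n symbol values k-bit codewords and all later values (k+1)-bit codewords, where n is the alphabet size and k = floor(log2 n). A minimal sketch, assuming n = 61 remaining modes (67 intra modes minus 6 MPM candidates, as in VVC-style designs); the function name is illustrative:

```python
def tb_encode(x, n):
    """Truncated binary (TB) binarization of x in [0, n-1]."""
    assert 0 <= x < n
    k = n.bit_length() - 1          # floor(log2(n))
    u = (1 << (k + 1)) - n          # count of shorter, k-bit codewords
    if x < u:
        return format(x, 'b').zfill(k)        # k-bit codeword
    return format(x + u, 'b').zfill(k + 1)    # (k+1)-bit codeword

# e.g. with n = 61: values 0-2 get 5-bit codes, values 3-60 get 6-bit codes
assert tb_encode(0, 61) == '00000' and len(tb_encode(60, 61)) == 6
```

Since the earliest index values receive the shorter 5-bit codewords, ordering the remaining-mode list so that likelier modes come first saves bits on average.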
  • the digital device displays the decoded main image in a first area of the display (S2420), and displays an auxiliary image in a second area of the display (S2430).
  • the digital device may render or display the decoded main image in a first area of the display and render or display an auxiliary image in a second area of the display.
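Putting the steps together, the flow of FIG. 24 is: receive (S2400), decode (S2410), display the main image in a first area (S2420), and display the auxiliary image in a second area (S2430). A minimal sketch, in which receiver, decoder, display, the Area type, and the screen geometry are all hypothetical stand-ins for the digital device's network interface/broadcast receiver, decoding apparatus, and display unit:

```python
from dataclasses import dataclass

@dataclass
class Area:
    x: int
    y: int
    width: int
    height: int

def process_image_service(receiver, decoder, display):
    """Sketch of the FIG. 24 flow; the interfaces are assumptions,
    not APIs defined by this document."""
    image_info = receiver.receive()                        # S2400
    main_image = decoder.decode_main(image_info)           # S2410
    display.render(main_image, Area(0, 0, 1440, 1080))     # S2420: first area
    aux_image = decoder.decode_aux(image_info)             # e.g. EPG, OSD, GUI
    display.render(aux_image, Area(1440, 0, 480, 1080))    # S2430: second area
```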
  • the auxiliary image may be an electronic program guide (EPG), an on-screen display (OSD), or a graphical user interface (GUI).
  • the main image and the auxiliary image may be directly or indirectly related to each other.
  • the main video may be a streaming video
  • the auxiliary video may be a GUI that sequentially displays thumbnails of videos including information similar to the streaming video.
  • the information about the auxiliary image may be received through a communication network.
  • the information about the auxiliary image may be pre-stored in a storage medium in the digital device.
  • the main image may be a broadcast image
  • the auxiliary image may be an EPG associated with the broadcast image.
  • the image information and the information about the auxiliary image may be received through a broadcast network.
  • the information about the auxiliary image may be pre-stored in a storage medium in the digital device.
  • the information about the auxiliary video may include EPG information such as a comprehensive channel schedule, detailed program information, broadcast program review information, and the like.
  • the main video may be a broadcast video
  • the auxiliary video may be a GUI.
  • Examples of the main image and the auxiliary image are not limited thereto.
  • the video information may be received through a broadcast network
  • the information on the auxiliary video may be pre-stored in a storage medium in a digital device
  • the main video may be a broadcast video.
  • the auxiliary image may be an image generated based on information about the auxiliary image previously stored in the digital device.
  • An image generated based on the pre-stored auxiliary image information may include, for example, a basic user interface (UI) of the EPG, basic channel information, an image resolution adjustment UI, a sleep timer UI, and the like, but is not limited thereto.
  • the main video may be a broadcast video received through a broadcast channel
  • the auxiliary video may be information related to the broadcast video received through a network.
  • the information related to the broadcast image received through the network may be information obtained through a network-based search engine.
  • the obtained information may be, for example, information about a person appearing in the main image, obtained through a network-based search engine.
  • the information related to the broadcast image received through the network may be information obtained by using an artificial intelligence (AI) system.
  • the obtained information may be, for example, an estimated location on a map of a place shown in the main image, derived using network-based deep learning.
  • the digital device may receive, through the network, information about the estimated location on the map of the place shown in the main image.
  • the information about the auxiliary video may include a broadcast signal received through a broadcast channel, source code information of the auxiliary video, and/or IP packet information of the auxiliary video received through a network.
  • the digital device may display the main image and the auxiliary image on the screen of the digital device based on the image information of the main image and the information about the auxiliary image.
  • an intra prediction mode that is highly likely to be selected can be represented with a binary codeword of fewer bits, thereby reducing the signaling overhead of intra prediction information and improving overall coding efficiency.
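Concretely, the likely modes (the MPM candidates) get short truncated-unary indices, while the rest fall back to the TB-coded remaining index sketched earlier. A sketch under the same assumptions (67 modes, 6 MPM candidates); the one-bit flag prefix, the truncated-unary shape, and all names follow common VVC-style practice rather than anything this document specifies:

```python
def signal_intra_mode(mode, mpm_list, n_modes=67):
    """Sketch: MPM hit -> flag '1' + truncated-unary index;
    miss -> flag '0' + tb_encode of the remaining-mode index."""
    if mode in mpm_list:
        idx = mpm_list.index(mode)
        # truncated unary: idx ones, then a terminating zero unless
        # idx is the last possible index
        tu = '1' * idx + ('0' if idx < len(mpm_list) - 1 else '')
        return '1' + tu
    # remaining modes, re-indexed after removing the MPM candidates
    remaining = sorted(set(range(n_modes)) - set(mpm_list))
    return '0' + tb_encode(remaining.index(mode), len(remaining))

# an MPM hit can cost as little as 2 bins; a miss costs 1 + 5 or 6 bins
assert signal_intra_mode(0, [0, 1, 50, 18, 46, 54]) == '10'
```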
  • According to the present invention, a video service processing method is provided that displays, for the image information received by the digital device, a main image together with an auxiliary image related to the main image, thereby providing a video service to the user more efficiently.

Landscapes

  • Engineering & Computer Science (AREA)
  • Multimedia (AREA)
  • Signal Processing (AREA)
  • Compression Or Coding Systems Of Tv Signals (AREA)

Abstract

An image service processing method implemented by a digital device according to the present invention comprises the steps of: receiving image information; decoding a main image based on the image information; displaying the decoded main image in a first area of a display; and displaying an auxiliary image in a second area of the display. The step of decoding the main image comprises the steps of: deriving an intra prediction mode for the current block based on remaining intra prediction mode information contained in the image information; deriving a prediction sample for the current block based on the intra prediction mode; and deriving a reconstructed picture based on the prediction sample, the remaining intra prediction mode information having been coded through a truncated binary (TB) binarization process.
PCT/KR2019/007970 2018-07-13 2019-07-01 Procédé de traitement de service d'image dans un système de service de contenu, et dispositif associé WO2020013498A1 (fr)

Applications Claiming Priority (2)

Application Number Priority Date Filing Date Title
US201862698013P 2018-07-13 2018-07-13
US62/698,013 2018-07-13

Publications (1)

Publication Number Publication Date
WO2020013498A1 true WO2020013498A1 (fr) 2020-01-16

Family

ID=69141863

Family Applications (1)

Application Number Title Priority Date Filing Date
PCT/KR2019/007970 WO2020013498A1 (fr) 2018-07-13 2019-07-01 Procédé de traitement de service d'image dans un système de service de contenu, et dispositif associé

Country Status (1)

Country Link
WO (1) WO2020013498A1 (fr)

Patent Citations (4)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US20040128317A1 (en) * 2000-07-24 2004-07-01 Sanghoon Sull Methods and apparatuses for viewing, browsing, navigating and bookmarking videos and displaying images
KR20180070716A (ko) * 2009-07-31 2018-06-26 Velos Media International Limited Image decoding device, image decoding method, and recording medium
US20150341635A1 (en) * 2014-05-23 2015-11-26 Qualcomm Incorporated Coding run values based on palette index in palette-based video coding
US20170041616A1 (en) * 2015-08-03 2017-02-09 Arris Enterprises Llc Intra prediction mode selection in video coding

Non-Patent Citations (1)

* Cited by examiner, † Cited by third party
Title
AKULA, SRI NITCHITH ET AL.: "Description of SDR, HDR and 360° video coding technology proposal by Samsung, Huawei, GoPro, and HiSilicon - mobile application scenario", JOINT VIDEO EXPLORATION TEAM (JVET) OF ITU-T SG 16 WP 3 AND ISO/IEC JTC 1/SC 29/WG 11, 14 April 2018 (2018-04-14), San Diego, US, pages 1 - 115, XP030151190 *

Legal Events

Date Code Title Description
121 Ep: the epo has been informed by wipo that ep was designated in this application

Ref document number: 19834833

Country of ref document: EP

Kind code of ref document: A1

NENP Non-entry into the national phase

Ref country code: DE

122 Ep: pct application non-entry in european phase

Ref document number: 19834833

Country of ref document: EP

Kind code of ref document: A1