WO2015076616A1 - Signal transmission and reception apparatus and signal transmission and reception method - Google Patents
Info
- Publication number
- WO2015076616A1 (PCT/KR2014/011262)
- Authority
- WO
- WIPO (PCT)
- Prior art keywords
- video data
- video
- wcg
- information
- color
- Prior art date
Classifications
-
- H—ELECTRICITY
- H04—ELECTRIC COMMUNICATION TECHNIQUE
- H04N—PICTORIAL COMMUNICATION, e.g. TELEVISION
- H04N19/00—Methods or arrangements for coding, decoding, compressing or decompressing digital video signals
- H04N19/30—Methods or arrangements for coding, decoding, compressing or decompressing digital video signals using hierarchical techniques, e.g. scalability
-
- H—ELECTRICITY
- H04—ELECTRIC COMMUNICATION TECHNIQUE
- H04N—PICTORIAL COMMUNICATION, e.g. TELEVISION
- H04N19/00—Methods or arrangements for coding, decoding, compressing or decompressing digital video signals
- H04N19/46—Embedding additional information in the video signal during the compression process
-
- H—ELECTRICITY
- H04—ELECTRIC COMMUNICATION TECHNIQUE
- H04N—PICTORIAL COMMUNICATION, e.g. TELEVISION
- H04N21/00—Selective content distribution, e.g. interactive television or video on demand [VOD]
- H04N21/20—Servers specifically adapted for the distribution of content, e.g. VOD servers; Operations thereof
- H04N21/23—Processing of content or additional data; Elementary server operations; Server middleware
- H04N21/234—Processing of video elementary streams, e.g. splicing of video streams or manipulating encoded video stream scene graphs
- H04N21/2343—Processing of video elementary streams, e.g. splicing of video streams or manipulating encoded video stream scene graphs involving reformatting operations of video signals for distribution or compliance with end-user requests or end-user device requirements
- H04N21/234327—Processing of video elementary streams, e.g. splicing of video streams or manipulating encoded video stream scene graphs involving reformatting operations of video signals for distribution or compliance with end-user requests or end-user device requirements by decomposing into layers, e.g. base layer and one or more enhancement layers
-
- H—ELECTRICITY
- H04—ELECTRIC COMMUNICATION TECHNIQUE
- H04N—PICTORIAL COMMUNICATION, e.g. TELEVISION
- H04N21/00—Selective content distribution, e.g. interactive television or video on demand [VOD]
- H04N21/20—Servers specifically adapted for the distribution of content, e.g. VOD servers; Operations thereof
- H04N21/23—Processing of content or additional data; Elementary server operations; Server middleware
- H04N21/234—Processing of video elementary streams, e.g. splicing of video streams or manipulating encoded video stream scene graphs
- H04N21/2343—Processing of video elementary streams, e.g. splicing of video streams or manipulating encoded video stream scene graphs involving reformatting operations of video signals for distribution or compliance with end-user requests or end-user device requirements
- H04N21/234363—Processing of video elementary streams, e.g. splicing of video streams or manipulating encoded video stream scene graphs involving reformatting operations of video signals for distribution or compliance with end-user requests or end-user device requirements by altering the spatial resolution, e.g. for clients with a lower screen resolution
-
- H—ELECTRICITY
- H04—ELECTRIC COMMUNICATION TECHNIQUE
- H04N—PICTORIAL COMMUNICATION, e.g. TELEVISION
- H04N21/00—Selective content distribution, e.g. interactive television or video on demand [VOD]
- H04N21/40—Client devices specifically adapted for the reception of or interaction with content, e.g. set-top-box [STB]; Operations thereof
- H04N21/43—Processing of content or additional data, e.g. demultiplexing additional data from a digital video stream; Elementary client operations, e.g. monitoring of home network or synchronising decoder's clock; Client middleware
- H04N21/431—Generation of visual interfaces for content selection or interaction; Content or additional data rendering
- H04N21/4312—Generation of visual interfaces for content selection or interaction; Content or additional data rendering involving specific graphical features, e.g. screen layout, special fonts or colors, blinking icons, highlights or animations
-
- H—ELECTRICITY
- H04—ELECTRIC COMMUNICATION TECHNIQUE
- H04N—PICTORIAL COMMUNICATION, e.g. TELEVISION
- H04N21/00—Selective content distribution, e.g. interactive television or video on demand [VOD]
- H04N21/40—Client devices specifically adapted for the reception of or interaction with content, e.g. set-top-box [STB]; Operations thereof
- H04N21/43—Processing of content or additional data, e.g. demultiplexing additional data from a digital video stream; Elementary client operations, e.g. monitoring of home network or synchronising decoder's clock; Client middleware
- H04N21/44—Processing of video elementary streams, e.g. splicing a video clip retrieved from local storage with an incoming video stream or rendering scenes according to encoded video stream scene graphs
- H04N21/4402—Processing of video elementary streams, e.g. splicing a video clip retrieved from local storage with an incoming video stream or rendering scenes according to encoded video stream scene graphs involving reformatting operations of video signals for household redistribution, storage or real-time display
- H04N21/440227—Processing of video elementary streams, e.g. splicing a video clip retrieved from local storage with an incoming video stream or rendering scenes according to encoded video stream scene graphs involving reformatting operations of video signals for household redistribution, storage or real-time display by decomposing into layers, e.g. base layer and one or more enhancement layers
-
- H—ELECTRICITY
- H04—ELECTRIC COMMUNICATION TECHNIQUE
- H04N—PICTORIAL COMMUNICATION, e.g. TELEVISION
- H04N21/00—Selective content distribution, e.g. interactive television or video on demand [VOD]
- H04N21/80—Generation or processing of content or additional data by content creator independently of the distribution process; Content per se
- H04N21/81—Monomedia components thereof
- H04N21/8166—Monomedia components thereof involving executable data, e.g. software
Definitions
- the present invention relates to a signal transmission and reception apparatus and a signal transmission and reception method.
- A wide color gamut (WCG) service refers to a video service or content that provides a color gamut based on WCG.
- An image acquisition device or display device capable of accurately acquiring or expressing WCG-based colors is under development. For a certain period of time, services including wide color gamut images such as UHD broadcasting will be provided through existing image capturing devices and display devices.
- An object of the present invention is to provide a signal transmission/reception method and a signal transmission/reception apparatus for displaying an image service based on a wide color gamut.
- Another object of the present invention is to provide a signal transmission/reception method and a signal transmission/reception apparatus capable of providing WCG content that is compatible with existing receivers.
- Another object of the present invention is to provide a signal transmission/reception method and a signal transmission/reception apparatus in which a video service based on a wide color gamut can be provided compatibly across a plurality of display devices.
- Another object of the present invention is to provide a signal transmission/reception method and a signal transmission/reception apparatus capable of providing a broadcast service that expresses the WCG information of content in a compatible manner.
- one embodiment provides a signal transmission method comprising: encoding base layer video data and enhancement layer video data capable of providing a scalable wide color gamut (WCG) video service; generating signaling information capable of composing the scalable WCG video data; outputting a stream in which the generated signaling information, the encoded base layer video data, and the encoded enhancement layer video data are multiplexed; and transmitting the multiplexed stream.
- the signaling information may include information for identifying the scalable wide color gamut (WCG) video service.
- the encoded base layer video data or the encoded enhancement layer video data may include color gamut mapping information, color bit depth information, or color mapping information that can be used to render the scalable wide color gamut (WCG) video service.
- such information may be included as metadata.
- another embodiment provides a signal transmission apparatus comprising: an encoder for encoding base layer video data and enhancement layer video data capable of providing a scalable wide color gamut (WCG) video service; a signaling information encoder for encoding signaling information capable of composing the scalable WCG video data; a multiplexer for outputting a stream in which the generated signaling information, the encoded base layer video data, and the encoded enhancement layer video data are multiplexed; and a transmitter for transmitting the multiplexed stream.
- another embodiment provides a signal reception method comprising: receiving a stream including base layer video data and enhancement layer video data that can constitute scalable wide color gamut (WCG) video data; demultiplexing the received stream and outputting the video data, including the base layer video data and the enhancement layer video data, and signaling information; decoding the demultiplexed signaling information; and decoding the base layer video data or the enhancement layer video data, based on the decoded signaling information, to output legacy UHD video or WCG video.
- another embodiment provides a signal reception apparatus comprising: a receiver for receiving a stream including base layer video data and enhancement layer video data capable of composing scalable wide color gamut (WCG) video data; a demultiplexer configured to demultiplex the received stream and output the video data, including the base layer video data and the enhancement layer video data, and signaling information; a decoder for decoding the demultiplexed signaling information; and a video decoder for decoding the base layer video data or the enhancement layer video data, based on the decoded signaling information, to output legacy UHD video or WCG video.
- the WCG video can be configured, based on the enhancement layer video data, by color gamut mapping of the base layer video data using the color gamut mapping information, or by upscaling the color bit depth of the base layer video data using the color bit depth information.
- a video service based on a wide color gamut can be displayed regardless of the display device.
- WCG content that is compatible with existing receivers can be provided.
- an image service based on a wide color gamut can be provided compatibly across a plurality of display devices.
- FIG. 1 illustrates an embodiment of a signal transmission method according to the present invention.
- FIG. 2 is a diagram illustrating a method of expressing WCG content according to an embodiment of the present invention.
- FIG. 3 is a diagram illustrating an example of a WCG video configuration unit according to an embodiment of the present invention.
- FIG. 4 illustrates another example of a WCG video configuration unit according to an embodiment of the present invention.
- FIG. 5 illustrates a post processing unit 190 according to an embodiment of the present invention.
- FIG. 6 is a diagram illustrating an example of generating scalable WCG video according to an embodiment of the present invention.
- FIG. 7 illustrates another example of configuring WCG video according to an embodiment of the present invention.
- FIG. 8 is a diagram illustrating broadcast signaling information (PMT) according to an embodiment of the present invention.
- FIG. 9 illustrates a case where a stream descriptor describing a scalable WCG video service is located in a PMT according to an embodiment of the present invention.
- FIG. 10 illustrates an example of a descriptor WCG_sub_stream_descriptor disclosed in accordance with an embodiment of the present invention.
- FIG. 11 is a diagram illustrating syntax for a payload of an SEI region of video data according to an embodiment of the present invention.
- FIG. 12 is a diagram illustrating metadata of a scalable WCG video included in a payload of an SEI region disclosed according to an embodiment of the present invention.
- FIG. 13 illustrates a method of arbitrarily representing color gamut information of base layer video data or enhancement layer video data in metadata of scalable WCG video according to an embodiment of the present invention
- FIG. 14 is a diagram illustrating information (original_UD_video_type) of an original UHD video format among metadata of scalable WCG video according to an embodiment of the present invention.
- FIG. 16 illustrates in detail color gamut information of a video of an enhancement layer of metadata of a scalable WCG video according to an embodiment of the present invention
- FIG. 17 is a diagram illustrating in detail color gamut mapping function information for obtaining a WCG video of metadata of a scalable WCG video according to an embodiment of the present invention.
- FIG. 18 is a diagram illustrating broadcast signaling information as an embodiment of the present invention that can implement such an embodiment.
- FIG. 19 illustrates another syntax for a payload of an SEI region of video data according to an embodiment of the present invention.
- FIG. 20 illustrates another example of metadata of scalable WCG video included in a payload of an SEI region disclosed according to an embodiment of the present invention.
- FIG. 21 is a view illustrating an example of color gamut mapping information included in metadata of a scalable WCG video according to an embodiment of the present invention.
- FIG. 22 is a diagram illustrating color gamut mapping matrix type information (matrix_composition_type) that can be used to map color gamut information according to an embodiment of the present invention.
- FIG. 23 is a table illustrating color gamut mapping matrix type information included in metadata of WCG video according to an embodiment of the present invention, showing a detailed embodiment of the color mapping matrix for the normalized primary matrix according to BT.709.
- FIG. 24 is a diagram illustrating an embodiment in which a color gamut mapping matrix type information included in metadata of a WCG video is normalized based on a color primary value of a current image according to an embodiment of the present invention
- FIG. 25 is a diagram illustrating a conversion equation representing a coefficient (gamut_mapping_coeff [i]) for color space conversion among color gamut mapping information included in metadata of a WCG video according to an embodiment of the present invention.
- FIG. 26 is a diagram illustrating types of Look Up Table (LUT) according to LUT_type field among color gamut mapping information included in metadata of WCG video according to an embodiment of the present invention.
- FIG. 27 is a diagram illustrating broadcast signaling information as an embodiment of the present invention.
- FIG. 28 is a diagram for describing a specific example including a descriptor for signaling a scalable WCG video included in such broadcast signaling information as an embodiment of the present invention.
- FIG. 29 is a diagram illustrating another example in which signaling information for signaling scalable WCG video is included in broadcast signaling information according to an embodiment of the present invention.
- FIG. 30 is a diagram illustrating another example in which signaling information for signaling scalable WCG video is included in broadcast signaling information as an embodiment of the present invention.
- FIG. 31 is a diagram illustrating another example in which signaling information for signaling scalable WCG video is included in broadcast signaling information as an embodiment of the present invention.
- FIG. 33 is a view showing another embodiment of a signal transmission apparatus according to an embodiment of the present invention.
- FIG. 34 is a diagram illustrating an example of a signal receiving apparatus according to an embodiment of the present invention.
- FIG. 35 is a diagram illustrating an example of a signal receiving method according to an embodiment of the present invention.
- as content having a wide color gamut (WCG) and display devices capable of expressing it are introduced into broadcasting, it is necessary to supply appropriate content according to the color representation range of the display that each consumer has.
- that is, content corresponding to each display's color expression characteristics should be supplied.
- however, since the bandwidth for a broadcast service is limited, generating and providing the same content in different color spaces uses twice the bandwidth, which is a big burden for a broadcaster or a content provider.
- accordingly, an example is disclosed in which a broadcast service having different color spaces is provided using multiple layers of data according to scalable coding of the color gamut of content, so that bandwidth can be used efficiently.
- the WCG video refers to a video (content) in which the color representation of the content is expressed according to the range of the WCG.
- FIG. 1 is a view showing an example of a signal transmission method according to an embodiment of the present invention.
- Base layer video data and enhancement layer video data, which may constitute scalable WCG video data, are encoded respectively (S110).
- signaling information constituting scalable WCG video data may be included in metadata in base layer video data or enhancement video data. Examples of metadata will be described with reference to FIGS. 11 to 17 and 19 to 26.
- Signaling information capable of composing scalable WCG video data is generated (S120).
- the signaling information of this step refers to system level signaling information as broadcast signaling information. A detailed example thereof will be described with reference to FIGS. 8, 10, 18, and 27-31.
- the generated signaling information, the encoded base layer video data, and the enhancement video data are multiplexed and output (S130).
- the multiplexed stream is transmitted (S140).
- the receiver may reconstruct the WCG video from the enhancement layer video data and the base layer video data, compatible with legacy UHD video, whose color bit depth has been upscaled.
- alternatively, the receiver may reconstruct the WCG video from the enhancement layer video data and the color gamut mapped base layer video data compatible with legacy UHD video.
- the WCG video may be displayed according to the performance of the display device of the receiver, or the legacy UHD video may be output using only base layer video data.
- FIG. 2 is a diagram illustrating a method of expressing WCG content according to an embodiment of the present invention. This figure illustrates an embodiment of the operation of the receiver considering backward compatibility for WCG content.
- the receiver decodes the video data, and the base layer video data (legacy UHD color gamut video) is delivered to an existing display.
- thus the image may be displayed on an existing display, and the WCG image may be displayed on a display capable of expressing WCG content by using the enhancement layer video data.
- the first demultiplexer 110 demultiplexes the UHD base layer video stream from the stream.
- the base layer video stream transmits UHD video (hereinafter, legacy UHD video or legacy UHD color gamut video) data that can be displayed on an existing display.
- the base layer decoder 120 decodes the demultiplexed UHD base layer video stream and outputs legacy UHD video data.
- the base layer decoder 120 may be a codec capable of HEVC decoding.
- the color conversion unit (EOTF) 130 converts and outputs legacy UHD video data. Then, the existing UHD display (legacy UHD display) 200 can express the color-converted legacy UHD video data.
- the upscaler 150 upscales the color bit depth of the legacy UHD video data output by the color conversion unit (EOTF) 130 and outputs upscaled UHD base layer video data whose bit depth representing color has been increased.
- for example, UHD base layer video data having an 8-bit color depth may be upscaled to UHD base layer video data having a 10-bit color depth.
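- as an illustration of this upscaling step, the following is a minimal sketch assuming simple bit replication (the patent does not fix a particular upscaling rule); the function and variable names are illustrative only:

```python
import numpy as np

def upscale_bit_depth(samples_8bit: np.ndarray) -> np.ndarray:
    """Map 8-bit code values (0..255) to 10-bit code values (0..1023) by bit replication."""
    s = samples_8bit.astype(np.uint16)
    return (s << 2) | (s >> 6)  # replicating the top bits keeps black and white endpoints exact

frame = np.array([[0, 128, 255]], dtype=np.uint8)
print(upscale_bit_depth(frame))  # [[   0  514 1023]]
```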
- the second demultiplexer 160 demultiplexes the UHD enhancement layer video stream from the stream.
- the first demultiplexer 110 and the second demultiplexer 160 may operate as one demultiplexer.
- the enhancement layer decoder 120 decodes the demultiplexed UHD enhancement layer video stream and outputs WCG enhancement layer video data so that content can be represented in the WCG color space.
- the WCG video configuration unit 180 outputs the WCG video using the WCG enhancement layer video data and the upscaled UHD base layer video data output by the upscaler 150.
- WCG video refers to video in which the color representation of the content is represented according to the range of WCG.
- WCG video data compatible with an existing display using scalable coding is referred to as scalable WCG video hereinafter.
- the post processing unit 190 post-processes the scalable WCG video configured using the data of the different layers so that the converted colors are output more naturally to the WCG display 300.
- signaling information on the UHD broadcasting service may be used.
- a receiver may identify the UHD service and, if the receiver itself cannot process the enhancement layer video data, or if it determines that the WCG video obtainable through the enhancement layer video data cannot be displayed on its display device, it may decode and output only the base layer video.
- using the signaling information about the UHD service, for example the UHD service type in the signaling information or service descriptors describing the UHD service (hereinafter, UD_program_descriptor() or UD_program_format_type, etc., described below), the receiver can identify video data that it cannot process.
- signaling information on the broadcast service will be described later.
- the WCG display 300 plays back the final WCG video obtained using the enhancement layer video data.
- the post processing unit 190 may not perform separate image processing. However, if the display device can express more various colors, or if the manufacturer can provide a color expression suitable for the characteristics of the display panel, the improved color may be provided through the post processing unit 190 related to the WCG information. In this case, signaling information of a broadcast service may be used as a criterion for determining color gamut of content, which will be described later.
- the WCG video configuration unit 180 outputs WCG video, which is WCG content, using the upscaled UHD base layer video and WCG enhancement layer video.
- WCG video may be configured by enhancing the color of the video in detail using signaling information on the broadcast service.
- the WCG video composition unit 180 may include a color detail enhancement unit for recovering the color of the original image from the base layer video using the residual video data of the enhancement layer. This will be described later with reference to FIG. 6.
- the WCG video composition unit 180 may include a color gamut mapping unit 182 and a color enhancement processor 184.
- the color gamut mapping unit 182 maps the color gamut of the base layer video data to an area capable of expressing WCG color for the upscaled UHD base layer video using the signaling information on the broadcast service. Then, the color enhancement processor 184 composes and outputs the WCG video by enhancing the color of the video by using the mapped base layer video data and the residual video data of the enhancement layer.
- the residual video data is composed of a difference between base layer data in which color gamut is mapped in the WCG region and original video data.
- the final WCG video can be composed by adding the residual video data to the base layer data whose color gamut has been mapped to the WCG region. This will be described in more detail below.
- the color gamut mapping unit 182 expands the color gamut of the base layer video and maps the color gamut to the video of the color gamut adjacent to the WCG video of the original image.
- the color gamut mapping unit 182 may learn the color gamut information of each layer's data through signaling information (the BL_video_color_gamut_type field and EL_video_color_gamut_type field) to be described later, and can obtain information about the start and end points of the color gamut mapping.
- for example, if the base layer video has a video color format defined by BT.709 and the enhancement layer video has a video color format defined by BT.2020, the color gamut mapping function of the color gamut mapping unit 182 maps the video defined by BT.709 to video defined by BT.2020.
- Color gamut mapping can be implemented in a variety of ways: no separate mapping between the two layers may be needed (when the enhancement information of the enhancement layer is not required), mapping may be performed independently for each channel, mapping may use a linear matrix, or, in some cases, mapping may be done point-by-point using a lookup table (LUT).
- Such a color gamut mapping method may be signaled with signaling information (EL_gamut_mapping_type) to be described later, and the color gamut mapping unit may obtain a specific parameter through this signaling information.
- the color gamut mapping may be added as part of scalable coding, or may operate in conjunction with a color correction matrix of a post processing part used for existing image quality processing. That is, the post processing unit may perform gamut mapping by recognizing the coefficients of the color gamut mapping function according to the signaling information EL_gamut_mapping_type. This will be described in detail as follows.
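- as a hedged sketch of the linear-matrix case mentioned above, the following converts linear BT.709 RGB to linear BT.2020 RGB with a 3x3 matrix, as the color gamut mapping unit 182 might when matrix conversion is signaled; the coefficients are the commonly cited BT.709-to-BT.2020 values (e.g. from ITU-R BT.2087), not values defined in this document:

```python
import numpy as np

# Commonly cited linear-RGB BT.709 -> BT.2020 conversion coefficients.
M_709_TO_2020 = np.array([
    [0.6274, 0.3293, 0.0433],
    [0.0691, 0.9195, 0.0114],
    [0.0164, 0.0880, 0.8956],
])

def map_gamut(rgb_709: np.ndarray) -> np.ndarray:
    """rgb_709: array of shape (..., 3) with linear RGB in [0, 1]; returns BT.2020 linear RGB."""
    return np.clip(rgb_709 @ M_709_TO_2020.T, 0.0, 1.0)

# A saturated BT.709 red maps to a point inside the wider BT.2020 gamut.
print(map_gamut(np.array([1.0, 0.0, 0.0])))  # [0.6274 0.0691 0.0164]
```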
- FIG. 5 is a diagram illustrating a post processing unit 190 according to an embodiment of the present invention.
- the post processing unit may include a tone mapping unit, a transfer curve unit, and / or a color correction matrix unit.
- the post processing unit 190 may perform tone mapping on the WCG video, or perform post-processing such as applying a color correction matrix that changes colors using a color conversion curve or performs color gamut mapping. Accordingly, the post processing unit 190 may output WCG video with enhanced color.
- the base layer video data is video data defined by BT.709 with 8-bit color depth, and the base layer video data upscaled to 10-bit color depth is BT.709 video with 10-bit depth.
- the WCG video data may be 10-bit depth video defined by BT.2020. Therefore, the difference between the WCG video data and the base layer video data upscaled to 10-bit depth may be the residual video data of scalable video coding.
- FIG. 3 illustrates a process of reconstructing WCG video data using the difference between the two videos.
- FIG. 7 is a diagram illustrating another example of configuring WCG video according to an embodiment of the present invention.
- the WCG video data is 10-bit depth video defined by BT.2020.
- the base layer video data is 8-bit video data defined by BT.709.
- the base layer video data can be color gamut mapped to video having a 10-bit depth defined by BT.2020.
- the difference between the WCG video data and the color gamut-mapped base layer video data may be residual video data of scalable video coding.
- the WCG video data can be reconstructed by adding color gamut-mapped base layer video data and residual video data.
- the description of this figure corresponds to the embodiment of FIG. 4.
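- a minimal sketch of this reconstruction step, assuming the residual is a signed difference added to the gamut-mapped (or bit-depth upscaled) base layer; the names, data types, and clipping range are illustrative assumptions, not defined by this document:

```python
import numpy as np

def reconstruct_wcg(mapped_base: np.ndarray, residual: np.ndarray, bit_depth: int = 10) -> np.ndarray:
    """mapped_base: 10-bit base layer after gamut mapping or upscaling;
    residual: signed difference carried by the enhancement layer."""
    out = mapped_base.astype(np.int32) + residual.astype(np.int32)
    return np.clip(out, 0, (1 << bit_depth) - 1).astype(np.uint16)

base = np.array([512, 100, 900], dtype=np.uint16)
res = np.array([7, -3, 150], dtype=np.int16)
print(reconstruct_wcg(base, res))  # [ 519   97 1023]
```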
- the embodiment disclosed below may provide signaling information according to the example of FIG. 3 or 6 or the example of FIG. 4 or 7.
- the disclosed embodiment delivers a color gamut scalable video configuration method of composing WCG video using residual data of an enhancement layer at a system level of a broadcast, and transmits it to a decoder when decoding enhancement layer video.
- an embodiment of the present invention may transmit signaling information for WCG video configuration in an SEI message.
- the PMT can convey the codec type, profile information, level information, and tier information of the video data to the enhancement layer decoder through the WCG_sub_stream_descriptor to be described later.
- video related metadata such as color gamut information and gamut mapping parameters of the original video and the WCG video may be transmitted and received.
- the signaling information of the broadcast system level of the WCG video transmitted by scalable coding is as follows.
- FIG. 8 is a diagram illustrating broadcast signaling information as an embodiment of the present invention.
- the PMT and the signaling information included therein among the broadcast signaling information are described below.
- the PMT may be in accordance with ISO/IEC 13818-1. Its fields are described as follows.
- the table_id field indicates an 8-bit identifier indicating the type of a PMT table section (table_id-This is an 8-bit field, which in the case of a TS_program_map_section shall be always set to 0x02).
- the section_syntax_indicator field is a 1-bit field set to '1' for a PMT table section (section_syntax_indicator - The section_syntax_indicator is a 1-bit field which shall be set to '1'.)
- the program_number field is a 16-bit field (program_number - program_number is a 16-bit field. It specifies the program to which the program_map_PID is applicable. One program definition shall be carried within only one TS_program_map_section. This implies that a program definition is never longer than 1016 (0x3F8). See Informative Annex C for ways to deal with the cases when that length is not sufficient.)
- the program_number may be used as a designation for a broadcast channel, for example. (By describing the different program elements belonging to a program, data from different sources (e.g. sequential events) can be concatenated together to form a continuous set of streams using a program_number.)
- the version_number field indicates the version number of the PMT (version_number - This 5-bit field is the version number of the TS_program_map_section. The version number shall be incremented by 1 modulo 32 when a change in the information carried within the section occurs. Version number refers to the definition of a single program, and therefore to a single section. When the current_next_indicator is set to '1', then the version_number shall be that of the currently applicable TS_program_map_section. When the current_next_indicator is set to '0', then the version_number shall be that of the next applicable TS_program_map_section.)
- the section_number field is an 8-bit field (section_number - The value of this 8-bit field shall be 0x00.)
- the last_section_number field indicates the number of the last section (last_section_number-The value of this 8-bit field shall be 0x00.)
- PCR_PID indicates the PID of the TS packet including the PCR field of the program specified by the program number (PCR_PID-This is a 13-bit field indicating the PID of the Transport Stream packets which shall contain the PCR fields valid for the program specified by program_number. If no PCR is associated with a program definition for private streams, then this field shall take the value of 0x1FFF.)
- the program_info_length field indicates the length of the program level descriptor after this field (program_info_length-This is a 12-bit field, the first two bits of which shall be '00'.The remaining 10 bits specify the number of bytes of the descriptors immediately following the program_info_length field)
- the stream_type field indicates the type of the program element stream (stream_type-This is an 8-bit field specifying the type of program element carried within the packets with the PID whose value is specified by the elementary_PID).
- ES_info_length field indicates the length of the program element level descriptor (ES_info_length-This is a 12-bit field, the first two bits of which shall be '00'.The remaining 10 bits specify the number of bytes of the descriptors of the associated program element immediately following the ES_info_length field)
- the CRC_32 field is a 32-bit field containing a CRC value (CRC_32 - This is a 32-bit field that contains the CRC value that gives a zero output of the registers in the decoder).
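- as an illustration of how a receiver reads these fields, the following minimal sketch extracts the fixed PMT header defined in ISO/IEC 13818-1; the descriptor loops and the elementary stream loop, where UD_program_descriptor and WCG_sub_stream_descriptor would appear, are omitted:

```python
def parse_pmt_header(section: bytes) -> dict:
    """Extract the fixed header fields of a TS_program_map_section (ISO/IEC 13818-1)."""
    return {
        "table_id": section[0],
        "section_syntax_indicator": section[1] >> 7,
        "section_length": ((section[1] & 0x0F) << 8) | section[2],
        "program_number": (section[3] << 8) | section[4],
        "version_number": (section[5] >> 1) & 0x1F,
        "current_next_indicator": section[5] & 0x01,
        "section_number": section[6],
        "last_section_number": section[7],
        "PCR_PID": ((section[8] & 0x1F) << 8) | section[9],
        "program_info_length": ((section[10] & 0x0F) << 8) | section[11],
    }
```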
- the PMT may include program level descriptors and elementary stream level descriptors.
- the PMT may include, at the program level, descriptors that can describe a program capable of composing WCG video using base layer video data, which is legacy UHD video compatible with existing displays, and enhancement layer video data, which is the difference between the WCG video and the legacy UHD video (or the video whose color bit depth has been upscaled).
- a UD_program_descriptor immediately after the program_info_length field of the PMT may signal a program for composing WCG video.
- such a program composes scalable WCG video using base layer video data compatible with legacy UHD video and residual enhancement layer video data, as described above (hereinafter referred to as a WCG configuration program).
- the PMT may include a descriptor (WCG_sub_stream_descriptor ()) including stream information about a program constituting a scalable WCG video service in a stream level descriptor. This will be described in detail below.
- FIG. 9 is a diagram illustrating a case where a stream descriptor describing a scalable WCG video service is located in a PMT according to an embodiment of the present invention.
- the stream_type field is 0x24
- the elementary_PID field may have a value of 0x109A.
- when the HEVC_video_descriptor() is located in the PMT, it may indicate that the video stream is coded with HEVC, and a descriptor describing the HEVC video may be included.
- the stream_type field is 0xA1
- this may represent a video stream according to the HEVC scalable layer video codec.
- the elementary_PID may have 0x109B.
- WCG_sub_stream_descriptor () which is a descriptor for describing streams constituting the video, may be located at the stream level of the PMT.
- WCG_sub_stream_descriptor () may include information about an enhancement layer of the scalable WCG video service and configuration information of the scalable WCG video service.
- WCG_sub_stream_descriptor is a descriptor including information about a stream constituting the WCG video service.
- the descriptor_tag field represents a unique code value indicating that it is a WCG_sub_stream_descriptor.
- the descriptor_length field represents the total length of the WCG_sub_stream_descriptor.
- the EL_video_codec_type field indicates the codec of the video element constituting the scalable WCG video. For example, it may have the same value as stream_type of PMT.
- the EL_video_profile field indicates a profile for the video stream, that is, a basic specification for decoding the stream. Bit depth information (8-bit, 10-bit, etc.) of the video stream and requirement information about a coding tool may be included.
- the EL_video_level field defines the level of the corresponding video stream, that is, to what extent the description element defined in the profile is supported.
- the EL_video_level field may include resolution information, frame rate information or bit rate information.
- the EL_video_tier field may indicate tier information about a corresponding video stream.
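- a hypothetical sketch of reading these fields is shown below; the excerpt names the fields but does not fix their bit widths or the descriptor_tag value, so the one-byte-per-field layout and the sample bytes are assumptions for illustration only:

```python
from dataclasses import dataclass

@dataclass
class WCGSubStreamDescriptor:
    descriptor_tag: int
    descriptor_length: int
    EL_video_codec_type: int
    EL_video_profile: int
    EL_video_level: int
    EL_video_tier: int

def parse_wcg_sub_stream_descriptor(buf: bytes) -> WCGSubStreamDescriptor:
    # Assumed layout: one byte per field, in the order listed above.
    return WCGSubStreamDescriptor(*buf[:6])

print(parse_wcg_sub_stream_descriptor(bytes([0xB0, 0x04, 0xA1, 0x02, 0x99, 0x01])))
```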
- signaling information of video level of scalable WCG video is as follows.
- Information constituting the scalable WCG video may be included in the video level.
- the SEI message of the video data may include information about the scalable WCG video.
- FIG. 11 illustrates syntax for a payload of an SEI region of video data according to an embodiment of the present invention.
- the SEI message may include information (UDTV_scalable_color_gamut_service_info (payloadSize)) signaling the format of scalable WCG video data as illustrated. This signaling information represents metadata of scalable WCG video.
- An embodiment of parsing video data according to syntax illustrated by a decoder of a receiver is as follows.
- when the decoder decodes the video data, it parses the AVC or HEVC NAL unit from the video elementary stream. If the nal_unit_type value corresponds to SEI data and the payloadType in the SEI data is 52, information according to UDTV_scalable_color_gamut_service_info can be obtained.
- UDTV_scalable_color_gamut_service_info (payloadSize), which is information signaling the format of scalable WCG video data in the payload region of the SEI region, may include a field (UD_program_format_type) indicating format information of UHD video data.
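- a rough sketch of this parsing step: scanning the SEI payloads of a NAL unit for payloadType 52 and returning the body that would carry UDTV_scalable_color_gamut_service_info; the payloadType/payloadSize framing follows the usual AVC/HEVC convention, and decoding of the body itself is left abstract:

```python
def find_scalable_wcg_sei(sei_rbsp: bytes, target_payload_type: int = 52):
    """Return the payload bytes of the first SEI message with the given payloadType."""
    i = 0
    while i < len(sei_rbsp):
        payload_type = 0
        while sei_rbsp[i] == 0xFF:  # payloadType coded as runs of 0xFF plus a final byte
            payload_type += 255
            i += 1
        payload_type += sei_rbsp[i]; i += 1
        payload_size = 0
        while sei_rbsp[i] == 0xFF:  # payloadSize coded the same way
            payload_size += 255
            i += 1
        payload_size += sei_rbsp[i]; i += 1
        body = sei_rbsp[i:i + payload_size]
        i += payload_size
        if payload_type == target_payload_type:
            return body  # bytes of UDTV_scalable_color_gamut_service_info(payloadSize)
    return None

demo = bytes([52, 3, 0x01, 0x02, 0x03])   # payloadType=52, payloadSize=3, 3 payload bytes
print(find_scalable_wcg_sei(demo))        # b'\x01\x02\x03'
```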
- this information may include metadata of the scalable WCG video (WCG_substream_metadata).
- the metadata of the scalable WCG video (WCG_substream_metadata) is described in detail below.
- FIG. 12 is a diagram illustrating metadata of a scalable WCG video included in a payload of an SEI region disclosed according to an embodiment of the present invention.
- the metadata may include information about the color gamut of each of the base layer and enhancement layer video data, and information that can extend the color gamut of the base layer data using the sub stream.
- the metadata of the scalable WCG video may describe a method of extending the color gamut of the base layer by using a substream of the enhancement layer data. A detailed description of each item follows.
- the original_UD_video_type field is information about a UHD video format and indicates basic information of base layer video data such as a resolution and a frame rate of a video. Or, it may represent video information corresponding to a video having a higher quality than a base layer and a video based on the same. Detailed examples thereof will be described later.
- the BL_bitdepth field represents bit depth information of base layer video data.
- the EL_bitdepth_diff field is a value representing the bit depth of the scalable WCG video that will be finally obtained using the enhancement layer video data, expressed as a difference from the base layer bit depth.
- the BL_video_color_gamut_type field represents color gamut of base layer video data. Detailed examples thereof will be described later.
- the EL_video_color_gamut_type field represents color gamut of video generated by enhancement layer video data. Detailed examples thereof will be described later.
- the EL_gamut_mapping_type field represents information about a gamut mapping function used to obtain a final WCG image.
- RGBW_primaries() is information indicating the coordinates, on the color space, of the R, G, B, and W (white) colors that define the color gamut when the color gamut type of the base layer video data or enhancement layer video data uses an arbitrary value other than a predetermined value.
- it can be used when an arbitrary value is set for the color gamut of the video data described below. Detailed examples thereof will be described later.
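- an illustrative container for the metadata fields listed above; the bit widths and exact bitstream syntax are not reproduced in this excerpt, so the types and the sample values below are assumptions:

```python
from dataclasses import dataclass
from typing import Optional

@dataclass
class WCGSubstreamMetadata:
    original_UD_video_type: int
    BL_bitdepth: int
    EL_bitdepth_diff: int
    BL_video_color_gamut_type: int
    EL_video_color_gamut_type: int
    EL_gamut_mapping_type: int
    RGBW_primaries: Optional[dict] = None  # only present for user-defined gamut types

meta = WCGSubstreamMetadata(0x01, 8, 2, 1, 4, 2)
print(meta.BL_bitdepth + meta.EL_bitdepth_diff)  # 10: bit depth of the final WCG video
```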
- FIG. 13 is a diagram illustrating a method of arbitrarily representing color gamut information of base layer video data or enhancement layer video data in metadata of scalable WCG video according to an embodiment of the present invention.
- the color_primary_r_x field represents the x coordinate of the R color of the color space (for example, CIE 1931). It can be used to determine whether the viewer's display includes targeted color gamut information.
- the color_primary_r_x field may represent a binary value for a value between 0 and 1, or may indicate a difference from a reference value.
- the color_primary_r_y field represents the y coordinate of the R color of the color space (for example, CIE 1931). It can be used to determine whether the viewer's display includes targeted color gamut information.
- the color_primary_r_y field may represent a binary value for a value between 0 and 1, or may indicate a difference value from a reference value.
- the color_primary_g_x field indicates the x coordinate of the G color of the color space (for example, CIE 1931). It can be used to determine whether the viewer's display includes targeted color gamut information.
- the color_primary_g_x field may represent a binarized value for a value between 0 and 1, or may indicate a difference from a reference value.
- the color_primary_g_y field indicates the y coordinate of the G color of the color space (for example, CIE 1931). It can be used to determine whether the viewer's display includes targeted color gamut information.
- the color_primary_g_y field may represent a binary value for a value between 0 and 1, or may indicate a difference value from a reference value.
- the color_primary_b_x field represents the x coordinate of B color of the color space (for example, CIE 1931). It can be used to determine whether the viewer's display includes targeted color gamut information.
- the color_primary_b_x field may represent a binary value for a value between 0 and 1, or may indicate a difference value from a reference value.
- the color_primary_b_y field represents the y coordinate of the B color of the color space (for example, CIE 1931). It can be used to determine whether the viewer's display includes targeted color gamut information.
- the color_primary_b_y field may represent a binary value for a value between 0 and 1, or may indicate a difference value from a reference value.
- the white_primary_x field indicates the x coordinate value in the color space when an arbitrary color temperature is specified.
- the white_primary_x field may send a binary value for a value between 0 and 1, and may also be expressed as a difference between a reference color temperature and an arbitrary color temperature.
- the white_primary_y field indicates the y coordinate value in the color space when an arbitrary color temperature is specified.
- the white_primary_y field may send a binary value for a value between 0 and 1, and may also be expressed as a difference between a reference color temperature and an arbitrary color temperature.
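- as a small example of what these coordinates look like in practice, the following uses the well-known BT.2020 primaries and D65 white point (values not taken from this document) together with a rough point-in-triangle check of the kind a receiver might use to decide whether its display covers the signaled gamut:

```python
BT2020_PRIMARIES = {
    "color_primary_r": (0.708, 0.292),
    "color_primary_g": (0.170, 0.797),
    "color_primary_b": (0.131, 0.046),
    "white_primary":   (0.3127, 0.3290),  # D65
}

def gamut_contains_point(primaries: dict, x: float, y: float) -> bool:
    """Check whether CIE 1931 (x, y) lies inside the R-G-B triangle."""
    (rx, ry) = primaries["color_primary_r"]
    (gx, gy) = primaries["color_primary_g"]
    (bx, by) = primaries["color_primary_b"]
    def cross(ax, ay, bx_, by_, cx, cy):
        return (ax - cx) * (by_ - cy) - (bx_ - cx) * (ay - cy)
    d1 = cross(x, y, rx, ry, gx, gy)
    d2 = cross(x, y, gx, gy, bx, by)
    d3 = cross(x, y, bx, by, rx, ry)
    has_neg = d1 < 0 or d2 < 0 or d3 < 0
    has_pos = d1 > 0 or d2 > 0 or d3 > 0
    return not (has_neg and has_pos)

print(gamut_contains_point(BT2020_PRIMARIES, *BT2020_PRIMARIES["white_primary"]))  # True
```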
- FIG. 14 is a diagram illustrating information (original_UD_video_type) of an original UHD video format among metadata of scalable WCG video according to an embodiment of the present invention.
- the information on the original UHD video format among the metadata of the scalable WCG video is information on the UHD video format as described, and the original UHD video format such as the resolution information of the video, the frame rate information, and the like. It can represent information about.
- the information about the original UHD video format may indicate basic information about the base layer video data.
- the information on the UHD video format may indicate that the resolution and frame rate of the video are 3840 x 2160 (60p), 3840 x 2160 (120p), 4096 x 2160 (60p), 4096 x 2160 (120p), 7680 x 4320 (60p), 7680 x 4320 (120p), 8192 x 4320 (60p), or 8192 x 4320 (120p), where p represents progressive scanning.
- the color space information of the color gamut of the base layer video among the metadata of the scalable WCG video according to the embodiment of the present invention may indicate a specific color space, such as BT.601, BT.709, DCI-P3, BT.2020 (NCL), BT.2020 (CL), XYZ, or user defined.
- the color gamut information of the enhancement layer video among the metadata of the scalable WCG video according to the embodiment of the present invention may refer to a specific color gamut format, such as BT.601, BT.709, DCI-P3, BT.2020 (NCL), BT.2020 (CL), XYZ, or user defined.
- FIG. 17 is a diagram illustrating in detail color gamut mapping function information for obtaining a WCG video among metadata of a scalable WCG video according to an embodiment of the present invention.
- the color gamut mapping function information for obtaining the scalable WCG video may refer to a mapping function such as no mapping, gain-offset conversion, linear matrix conversion, or look-up table.
- the color gamut mapping function information in the metadata of the WCG video may provide information about the color gamut mapping function if the color gamut mapping function is used to obtain the final WCG video to be displayed.
- through this information, the receiver may know the video format or color gamut information of each of the base layer video data and the enhancement layer video data, and can output scalable WCG video based on them. Accordingly, a receiver having a display device capable of expressing existing colors may present legacy UHD video using the base layer video data, and a receiver having a display device capable of WCG service may present the WCG video service.
- the receiver may receive signaling information and combine substreams of scalable WCG video to output a WCG image.
- the signaling information decoder of the receiver uses the program descriptor (UD_program_descriptor) of the received PMT to determine whether there is a separate service or media to be additionally received in order to configure the original UHDTV broadcast.
- the scalable WCG video described in this embodiment corresponds to the case where UD_program_format_type is 0x08, and it can be understood that scalable WCG video can be composed using the enhancement layer video data and the additional information of an SEI message in the video data.
- the receiver's signaling information decoder recognizes codec information, profile information, level information, and tier information of a service stream through the stream descriptor WCG_sub_stream_descriptor. It may be determined whether the decoder can process the information.
- the video decoder of the receiver obtains color gamut and bit depth information (bit depth related information) of the scalable WCG video configured by the base layer and the enhancement layer from the UDTV_scalable_color_gamut_service_info SEI message in the video data.
- based on this, the device may determine whether the final output image can be produced.
- the video decoder of the receiver may decode only base layer video data.
- the receiver may cause the video decoder to configure scalable WCG video when the signaling information decoder determines that the video decoder can decode and process the WCG video.
- when the brightness information of the scalable WCG video obtained by decoding the SEI message of the video data cannot be output by the display of the receiver, the receiver may output only the base layer video, or may apply appropriate post-processing to the scalable WCG video based on that brightness information and output the WCG video.
- the receiver may allow the video decoder to decode the sub stream.
- the decoder of the receiver may configure scalable WCG video using the UDTV_scalable_color_gamut_service_info SEI message together with the enhancement layer video data.
- the decoder of the receiver obtains, from the metadata, the color bit depth information of the base layer (BL_bitdepth field) and the difference information (EL_bitdepth_diff field) between the color bit depths of the enhancement layer video data and the base layer video data, and may upscale the color bit depth of the base layer video data using them.
- the decoder of the receiver may compensate detailed data of the color space of the upscaled base layer video data using the residual data of the enhancement layer.
- the receiver may perform post processing of the WCG video before the final display of the video to display an image with improved brightness or to perform color conversion on an image that is difficult to display on the display device.
- in this case, the color gamut information (EL_video_color_gamut_type) transmitted in the SEI message, or the arbitrary color gamut information of the enhancement layer video data transmitted in the SEI message (color_primary_A_x and color_primary_A_y as color primary values of RGBW, where A is one of R, G, B, and W), may be used.
- a signaling method for configuring scalable WCG video using color gamut-mapped base layer video data and enhancement layer video data is illustrated.
- information for constructing scalable WCG video at the system level of a broadcast may be provided, and metadata may be provided to an SEI message at a video level to perform color gamut mapping of base layer video data.
- FIG. 18 is a diagram illustrating broadcast signaling information as an embodiment of the present invention that can implement such an embodiment.
- the PMT and the signaling information included therein among the broadcast signaling information are described below.
- the PMT may include program level descriptors and elementary stream level descriptors.
- the PMT may include, as a program level descriptor, a descriptor capable of describing a program capable of composing a WCG video by color gamut mapping of base layer video data compatible with legacy UHD video.
- the UD_program_descriptor may signal a program through which the scalable WCG video is transmitted.
- that is, it indicates a program that can configure WCG video by color gamut mapping of the base layer video data.
- the PMT may include a descriptor (WCG_sub_stream_descriptor ()) including stream information about a program constituting a scalable WCG video service in a stream level descriptor.
- the descriptor (WCG_sub_stream_descriptor) including stream information about a program may include information about a stream of base layer video data compatible with legacy UHD video.
- FIG. 19 illustrates another syntax for a payload of an SEI region of video data according to an embodiment of the present invention.
- when the payloadType is set to a specific value (52 in this example) in the SEI payload, it may include information (UDTV_scalable_color_gamut_service_info (payloadSize)) signaling the format of scalable WCG video data, as illustrated.
- An embodiment of parsing video data according to syntax illustrated by a decoder of a receiver is as follows.
- when the decoder decodes the video data, it parses the AVC or HEVC NAL unit from the video elementary stream. If the nal_unit_type value corresponds to SEI data and the payloadType in the SEI data is 52, information according to UDTV_scalable_color_gamut_service_info can be obtained.
- UDTV_scalable_color_gamut_service_info (payloadSize), which is information signaling the format of scalable WCG video data in the payload region of the SEI region, may include a field (UD_program_format_type) indicating format information of UHD video data.
- the format information of the UHD video data can indicate that the program can configure WCG video using the WCG enhancement layer video data and the color gamut mapped data of the base layer video data compatible with legacy UHD video.
- when the format information of the UHD video data indicates the format of the scalable WCG video, metadata of the scalable WCG video (WCG_substream_metadata) may be included. This will be described later in detail.
- FIG. 20 is a diagram illustrating another example of metadata of a scalable WCG video included in a payload of an SEI region disclosed according to an embodiment of the present invention.
- as in FIG. 12, the metadata of the scalable WCG video of this example includes information on the UHD video format (original_UD_video_type field), bit depth information of the base layer video data (BL_bitdepth field), information indicating the difference between the bit depth of the scalable WCG video finally obtained using the enhancement layer video data and the bit depth of the base layer video data (EL_bitdepth_diff field), information about the color gamut of the base layer video data (BL_video_color_gamut_type field), color gamut information of the video generated by the enhancement layer video data (EL_video_color_gamut_type field), information on the gamut mapping function used to obtain the final WCG image (EL_gamut_mapping_type field), and arbitrary color gamut type information of the base layer video data or enhancement layer video data (RGBW_primaries() field).
- the metadata of the scalable WCG video may further include color gamut mapping information (gamut_mapping_info ()).
- the color gamut mapping information included in the metadata of the scalable WCG video will be described in detail below.
- FIG. 21 is a diagram illustrating an example of color gamut mapping information included in metadata of a scalable WCG video according to an embodiment of the present invention.
- the color gamut mapping information included in the metadata of the scalable WCG video may represent a method for extending the color gamut of the base layer based on the enhancement layer.
- the color gamut mapping information may indicate a color gamut mapping type for obtaining a video having improved quality or color from the base layer video data.
- a color gamut mapping method may be signaled through the EL_gamut_mapping_type field, which is the color gamut mapping type, and the parameters transmitted with this information may vary according to the type of the mapping function.
- one color gamut mapping method maps colors by signaling the gain and offset of a function (gain-offset conversion), for example as sketched below.
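- a trivial illustration of this gain/offset case, with made-up gain and offset values; the actual parameters would be carried in the signaling described here:

```python
def gain_offset_map(value: float, gain: float = 1.1, offset: float = -0.02) -> float:
    """Remap one normalized channel value as out = gain * in + offset, clipped to [0, 1]."""
    return max(0.0, min(1.0, gain * value + offset))

print(gain_offset_map(0.5))  # ~0.53
```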
- when the color gamut mapping type information (EL_gamut_mapping_type) is 0010, the color gamut is mapped using a matrix.
- the method described in ITU-R BT.2250 can be used as a basis. In this case, color coordinates converted to YCbCr for encoding are converted back to RGB coordinates, and a primary transformation can be performed on the converted RGB coordinates to transform the gamut in terms of CIE colorimetry.
- the matrix_composition_type field represents a method of configuring a matrix for mapping color gamut information based on matrix conversion.
- the method of constructing the matrix that maps color gamut information is based on the normalized primary matrices (NPM) of the color gamuts of the source and the target.
- an example of mapping the color gamut used in HDTV to another target color gamut is disclosed in Equation 2.
- the signaling information may directly include a color gamut mapping matrix.
- the matrix_composition_type field may indicate methods of mapping with various color gamuts according to the value of this field.
- An example of color gamut mapping methods is illustrated in FIG. 22.
- the Number_of_coeff field represents the number of coefficients used for further color space conversion.
- The gamut_mapping_coeff[i] field represents a coefficient for color space conversion. If it is assumed that conversion to an arbitrary color space for optimal color expression is performed based on the color space expressed by the color_gamut syntax, the optimal color space can be obtained using a conversion equation. An example of a conversion equation is illustrated in FIG. 25. Alternatively, other transformations can be used according to the user's specification.
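- As a sketch of how the signaled coefficients might be applied, assuming Number_of_coeff is 9 and the coefficients form a row-major 3×3 conversion matrix; this grouping is an assumption, and FIG. 25 gives the actual equation.

```python
import numpy as np

def apply_gamut_mapping_coeff(samples, gamut_mapping_coeff):
    """Apply a signaled color space conversion, assuming 9 coefficients = 3x3 matrix."""
    m = np.asarray(gamut_mapping_coeff, dtype=np.float64).reshape(3, 3)  # row-major
    s = np.asarray(samples, dtype=np.float64).reshape(-1, 3)
    return (s @ m.T).reshape(np.shape(samples))   # each sample multiplied by the matrix
```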
- When the color gamut mapping type information (EL_gamut_mapping_type) is 0011, it may represent LUT-based color gamut mapping.
- the most widely used method for color gamut mapping is the Look Up Table (LUT) method, which uses a table that matches input and output values one-to-one.
- Instead of using all 3D coordinates for the LUT, matching may be performed independently on each channel, or a method of estimating the LUT components based on reference points may be used.
- the LUT_type field represents the type of look up table (LUT) to be used. This field value may represent an LUT that independently matches each channel, an LUT using all 3D coordinates, or a method of estimating an LUT component based on a reference point.
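- A minimal sketch of the per-channel (1D) LUT case with linear interpolation between table entries, assuming each channel has its own table defined on increasing reference points in [0, 1]; the names are illustrative.

```python
import numpy as np

def lut_1d_per_channel(rgb, ref_points, lut_r, lut_g, lut_b):
    """Map each channel independently through its own 1D LUT.

    ref_points        : increasing input positions shared by the three tables
    lut_r/lut_g/lut_b : output values at those positions for R, G and B
    Values between reference points are linearly interpolated (np.interp).
    """
    rgb = np.asarray(rgb, dtype=np.float64)
    out = np.empty_like(rgb)
    for ch, lut in enumerate((lut_r, lut_g, lut_b)):
        out[..., ch] = np.interp(rgb[..., ch], ref_points, lut)
    return out
```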
- Color gamut mapping matrix type information (matrix_composition_type) can be used to map color gamut information according to an embodiment of the present invention. As illustrated here, color gamut mapping may be performed according to the color gamut mapping matrix type information (matrix_composition_type).
- For example, when the color gamut mapping matrix type information (matrix_composition_type) field is 0000, it represents a normalized primary matrix according to BT.709. The matrix method is illustrated in FIG. 23.
- When the color gamut mapping matrix type information (matrix_composition_type) field is 0001, it indicates a normalized primary matrix according to DCI-P3.
- When the color gamut mapping matrix type information (matrix_composition_type) field is 0010, it represents a normalized primary matrix according to BT.2020.
- When the color gamut mapping matrix type information (matrix_composition_type) field is 0100, it indicates a normalized primary matrix based on the color primary values of the current image.
- This matrix and the mapping scheme are illustrated in FIG. 24.
- FIG. 23 is a table illustrating color gamut mapping matrix type information included in metadata of WCG video according to an embodiment of the present invention.
- the color gamut mapping matrix type information illustrated here indicates a matrix when the color gamut mapping follows the matrix of NPM_709 (normalized primary matrix according to BT.709).
- FIG. 24 illustrates an embodiment in which the color gamut mapping matrix type information included in metadata of a WCG video is normalized based on the color primary values of the current image, according to an embodiment of the present invention.
- Here, a method of converting the current color primary values into X, Y, and Z values and a color gamut mapping matrix constructed using them are illustrated.
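- A sketch of building a normalized primary matrix from the color primary values of the current image, following the usual derivation: convert the xy chromaticities of R, G, B and the white point to XYZ, then scale the primary columns so that the white point maps to RGB = (1, 1, 1). The helper names are assumptions, not the figure's notation.

```python
import numpy as np

def xy_to_XYZ(x, y, Y=1.0):
    """Convert a CIE xy chromaticity to an XYZ tristimulus with luminance Y."""
    return np.array([x * Y / y, Y, (1.0 - x - y) * Y / y])

def npm_from_primaries(rx, ry, gx, gy, bx, by, wx, wy):
    """Build a normalized primary matrix (RGB -> XYZ) from signaled primaries."""
    P = np.column_stack([xy_to_XYZ(rx, ry), xy_to_XYZ(gx, gy), xy_to_XYZ(bx, by)])
    W = xy_to_XYZ(wx, wy)           # white point tristimulus (Y normalized to 1)
    S = np.linalg.solve(P, W)       # per-primary scale so that NPM @ [1, 1, 1] = W
    return P * S                    # scale each primary column
```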
- FIG. 25 illustrates a conversion equation representing a coefficient (gamut_mapping_coeff [i]) for color space conversion among color gamut mapping information included in metadata of a WCG video according to an embodiment of the present invention.
- the coefficient for color space conversion among the color gamut mapping information may be a coefficient included in a component of the matrix.
- FIG. 26 is a diagram illustrating a type of look up table (LUT) according to a LUT_type field of color gamut mapping information included in metadata of a WCG video according to an embodiment of the present invention.
- A look up table such as an LUT, a 3D LUT, or a 3D LUT with linear interpolation may be referred to according to the LUT_type value.
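- A sketch of the 3D LUT with linear interpolation case, assuming a regular N×N×N table defined over [0, 1]^3 and inputs within that range; SciPy's RegularGridInterpolator performs the trilinear interpolation, one output channel at a time.

```python
import numpy as np
from scipy.interpolate import RegularGridInterpolator

def lut_3d_trilinear(rgb, lut):
    """Map RGB through a 3D LUT of shape (N, N, N, 3) with trilinear interpolation."""
    n = lut.shape[0]
    axis = np.linspace(0.0, 1.0, n)                      # grid positions of LUT nodes
    pts = np.asarray(rgb, dtype=np.float64).reshape(-1, 3)
    out = np.empty_like(pts)
    for ch in range(3):                                  # interpolate each output channel
        interp = RegularGridInterpolator((axis, axis, axis), lut[..., ch])
        out[:, ch] = interp(pts)
    return out.reshape(np.shape(rgb))
```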
- the receiver may receive signaling information and combine substreams of scalable WCG video to output a WCG image.
- the signaling information decoder of the receiver uses the program descriptor (UD_program_descriptor) of the received PMT to determine whether there is a separate service or media to be additionally received in order to configure the original UHDTV broadcast.
- The scalable WCG video described in this embodiment corresponds to the case where UD_program_format_type is 0x09, and it can be understood that the scalable WCG video can be composed using the enhancement layer video data and the additional information of an SEI message in the video data.
- When the UD_program_format_type field is 0x09 (i.e., a program constituting WCG video with WCG enhancement layer video data and data updated with the color bit depth of base layer video data compatible with legacy UHD video), the codec information, profile information, level information, and tier information of the service stream can be determined from the WCG_sub_stream_descriptor, so that it can be judged whether the stream can be processed by the decoder of the receiver.
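- A sketch of this receiver-side check, assuming the descriptor fields have already been parsed into plain integers and that the receiver keeps a simple table of its codec capabilities; the structure, names, and numeric comparisons (e.g., treating the tier as an ordered value) are illustrative assumptions.

```python
def can_decode_wcg_substream(desc, receiver_caps):
    """Decide whether the enhancement stream described by WCG_sub_stream_descriptor
    can be handled, by comparing codec/profile/level/tier against receiver limits."""
    codec_caps = receiver_caps.get(desc["EL_video_codec_type"])
    if codec_caps is None:                       # codec not supported at all
        return False
    return (desc["EL_video_profile"] in codec_caps["profiles"]
            and desc["EL_video_level"] <= codec_caps["max_level"]
            and desc["EL_video_tier"] <= codec_caps["max_tier"])
```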
- The video decoder of the receiver obtains the color gamut and color gamut mapping information of the scalable WCG video configured from the base layer and the enhancement layer from the UDTV_scalable_color_gamut_service_info SEI message in the video data, and can determine the image to be finally output on the display device of the receiver.
- The video decoder of the receiver may decode only the base layer video data. If the receiver determines, based on the signaling information decoder, that it can decode and process the video information, the receiver configures the scalable WCG video using the WCG enhancement layer video data and data updated with the color bit depth of the base layer video data compatible with legacy UHD video.
- When the brightness information of the scalable WCG video obtained by decoding the SEI message of the video data cannot be output on the display of the receiver, the receiver may output only the base layer video, or may apply appropriate post-processing to the brightness of the scalable WCG video so that the scalable WCG video can be output.
- the receiver may cause the video decoder to decode the substream.
- the decoder of the receiver may configure scalable WCG video using color gamut mapping information of the UDTV_scalable_color_gamut_service_info SEI message together with the enhancement layer data.
- the scalable WCG video is formed by using the WCG enhancement layer video data and data updated with color bit depths of base layer video data compatible with legacy UHD video.
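- A rough sketch of this composition step, assuming the base layer samples and the decoded enhancement residual are available as integer arrays, that the color bit depth is extended by a simple left shift, that the residual is added in the extended-bit-depth domain, and that the result fits in 16 bits; the actual prediction and update process is defined by the codec, so this only illustrates the general idea.

```python
import numpy as np

def compose_scalable_wcg(base_layer, residual, bl_bitdepth, el_bitdepth_diff):
    """Combine legacy-compatible base layer video with enhancement residual data."""
    wcg_bitdepth = bl_bitdepth + el_bitdepth_diff
    upscaled = base_layer.astype(np.int32) << el_bitdepth_diff   # extend color bit depth
    wcg = upscaled + residual                                    # add enhancement residual
    return np.clip(wcg, 0, (1 << wcg_bitdepth) - 1).astype(np.uint16)
```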
- Before the final display of the video, the receiver may, through WCG video post-processing, display an image with improved brightness or perform color conversion on an image that is difficult to display on the display device.
- Arbitrary color gamut information (EL_video_color_gamut_type) of the enhancement layer video data may be transmitted in the SEI message as color primary values of RGBW, color_primary_A_x and color_primary_A_y, where A is one of R, G, B, and W.
- the signaling information according to the above two embodiments may be included in the system level and the SEI message and transmitted together.
- FIG. 27 is a diagram illustrating broadcast signaling information as an embodiment of the present invention and may correspond to FIG. 8 or 18.
- The program level descriptor may include a descriptor (UD_program_descriptor) identifying a program capable of composing scalable WCG video by upscaling a base layer compatible with existing UHD video or by using color mapped data and the enhancement layer.
- The descriptor (UD_program_descriptor) for identifying a program capable of constructing scalable WCG video may include a field (UD_program_format_type) identifying a program/service format (e.g., 0x08 or 0x09) in which the WCG video can be composed either with enhancement layer video data and data updated with the color bit depth of base layer video data compatible with legacy UHD video, or with WCG enhancement layer video data and color gamut mapped data of base layer video data compatible with legacy UHD video.
- the stream level descriptor may include coding information of a stream constituting the scalable WCG video.
- FIG. 28 is a diagram for describing a specific example including a descriptor for signaling a scalable WCG video included in such broadcast signaling information as an embodiment of the present invention. This figure may correspond to FIG. 10 or 19 as disclosed.
- a descriptor for signaling scalable WCG video is a descriptor including information on a stream constituting a WCG video service.
- The EL_video_codec_type field, the EL_video_profile field, the EL_video_level field, and the EL_video_tier field are the same as described with reference to FIG. 10 or 19.
- the metadata WCG_substream_metadata () signaling a scalable WCG video stream according to an embodiment of the present invention is the metadata as illustrated in FIG. 12 or 20.
- FIG. 29 is a diagram illustrating another example in which signaling information for signaling scalable WCG video is included in broadcast signaling information according to an embodiment of the present invention.
- In this example, a Service Description Table (SDT) is illustrated as the broadcast signaling information.
- the table_id field represents an identifier of a table.
- section_syntax_indicator field is a 1-bit field set to 1 for an SDT table section (section_syntax_indicator: The section_syntax_indicator is a 1-bit field which shall be set to "1").
- section_length This is a 12-bit field, the first two bits of which shall be "00" .It specifies the number of bytes of the section, starting immediately following the section_length field and including the CRC. The section_length shall not exceed 1 021 so that the entire section has a maximum length of 1 024 bytes.
- transport_stream_id This is a 16-bit field which serves as a label for identification of the TS, about which the SDT informs, from any other multiplex within the delivery system.
- the version_number field indicates the version number of this subtable.
- version_number This 5-bit field is the version number of the sub_table.
- the version_number shall be incremented by 1 when a change in the information carried within the sub_table occurs.When it reaches value "31", it wraps around to "0" .
- When the current_next_indicator is set to "1", the version_number shall be that of the currently applicable sub_table.
- When the current_next_indicator is set to "0", the version_number shall be that of the next applicable sub_table.
- section_number This 8-bit field gives the number of the section.
- the section_number of the first section in the sub_table shall be "0x00" .
- the section_number shall be incremented by 1 with each additional section with the same table_id, transport_stream_id, and original_network_id.
- last_section_number This 8-bit field specifies the number of the last section (that is, the section with the highest section_number) of the sub_table of which this section is part.
- service_id This is a 16-bit field which serves as a label to identify this service from any other service within the TS.
- the service_id is the same as the program_number in the corresponding program_map_section.
- EIT_schedule_flag This is a 1-bit field which when set to "1" indicates that EIT schedule information for the service is present in the current TS (see TR 101 211 [i.2] for information on maximum time interval between occurrences of an EIT schedule sub_table). If the flag is set to "0" then the EIT schedule information for the service should not be present in the TS.
- The EIT_present_following_flag field may indicate whether EIT present/following information for the service is present (EIT_present_following_flag: This is a 1-bit field which when set to "1" indicates that EIT_present_following information for the service is present in the current TS; see TR 101 211 [i.2] for information on maximum time interval between occurrences of an EIT present/following sub_table. If the flag is set to "0" then the EIT present/following information for the service should not be present in the TS.)
- the running_status field may refer to the state of the service defined in Table 6 of the DVB-SI document.
- running_status This is a 3-bit field indicating the status of the service as defined in table 6.For an NVOD reference service the value of the running_status shall be set to "0".
- free_CA_mode This 1-bit field, when set to “0" indicates that all the component streams of the service are not scrambled.When set to "1" it indicates that access to one or more streams may be controlled by a CA system.
- descriptors_loop_length field indicates the length of the following descriptor (descriptors_loop_length: This 12-bit field gives the total length in bytes of the following descriptors).
- CRC_32 is a 32-bit field that contains a CRC value (CRC_32: This is a 32-bit field that contains the CRC value that gives a zero output of the registers in the decoder).
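- As a sketch of reading the fixed part of such a section from a byte buffer, assuming the DVB-SI field layout summarized above; only the fields discussed here are extracted, and the helper name is an assumption.

```python
def parse_sdt_header(buf):
    """Extract the leading SDT section fields (see ETSI EN 300 468 for the full syntax)."""
    return {
        "table_id": buf[0],
        "section_syntax_indicator": (buf[1] >> 7) & 0x1,
        "section_length": ((buf[1] & 0x0F) << 8) | buf[2],       # 12-bit length
        "transport_stream_id": (buf[3] << 8) | buf[4],
        "version_number": (buf[5] >> 1) & 0x1F,
        "current_next_indicator": buf[5] & 0x1,
        "section_number": buf[6],
        "last_section_number": buf[7],
    }
```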
- the descriptor of the SDT may include information capable of describing a scalable WCG video service.
- Metadata (WCG_substream_metadata) exemplified in FIG. 1 may be included.
- signaling information describing the scalable WCG video service may be included.
- FIG. 30 is a diagram illustrating another example in which signaling information for signaling scalable WCG video is included in broadcast signaling information as an embodiment of the present invention.
- an Event Information Table (EIT) is illustrated as broadcast signaling information.
- The EIT may be in accordance with ETSI EN 300 468, and each field is described based on it as follows.
- table_id Represents a table identifier.
- section_syntax_indicator field is a 1-bit field set to 1 for an EIT table section (section_syntax_indicator: The section_syntax_indicator is a 1-bit field which shall be set to "1").
- section_length This is a 12-bit field.It specifies the number of bytes of the section, starting immediately following the section_length field and including the CRC.The section_length shall not exceed 4 093 so that the entire section has a maximum length of 4 096 bytes.
- service_id This is a 16-bit field which serves as a label to identify this service from any other service within a TS.
- the service_id is the same as the program_number in the corresponding program_map_section.
- the version_number field indicates the version number of this subtable.
- version_number This 5-bit field is the version number of the sub_table.
- the version_number shall be incremented by 1 when a change in the information carried within the sub_table occurs.When it reaches value 31, it wraps around to 0.When the current_next_indicator is set to "1”, then the version_number shall be that of the currently applicable sub_table.When the current_next_indicator is set to "0”, then the version_number shall be that of the next applicable sub_table.
- section_number This 8-bit field gives the number of the section.
- the section_number of the first section in the sub_table shall be "0x00" .
- the section_number shall be incremented by 1 with each additional section with the same table_id, service_id, transport_stream_id, and original_network_id.
- the sub_table may be structured as a number of segments.With each segment the section_number shall increment by 1 with each additional section, but a gap in numbering is permitted between the last section of a segment and the first section of the adjacent segment.
- last_section_number This 8-bit field specifies the number of the last section (that is, the section with the highest section_number) of the sub_table of which this section is part.
- transport_stream_id This is a 16-bit field which serves as a label for identification of the TS, about which the EIT informs, from any other multiplex within the delivery system.
- segment_last_section_number This 8-bit field specifies the number of the last section of this segment of the sub_table.For sub_tables which are not segmented, this field shall be set to the same value as the last_section_number field.
- The last_table_id field identifies the last table_id used (last_table_id: This 8-bit field identifies the last table_id used (see table 2).)
- event_id This 16-bit field contains the identification number of the described event (uniquely allocated within a service definition).
- The start_time field contains the start time of the event (start_time: This 40-bit field contains the start time of the event in Universal Time, Co-ordinated (UTC) and Modified Julian Date (MJD) (see annex C). This field is coded as 16 bits giving the 16 LSBs of MJD followed by 24 bits coded as 6 digits in 4-bit Binary Coded Decimal (BCD). If the start time is undefined (e.g. for an event in an NVOD reference service) all bits of the field are set to "1".)
- running_status This is a 3-bit field indicating the status of the event as defined in table 6. For an NVOD reference event the value of the running_status shall be set to "0".
- free_CA_mode This 1-bit field, when set to “0" indicates that all the component streams of the event are not scrambled.When set to “1” it indicates that access to one or more streams is controlled by a CA system.
- descriptors_loop_length field indicates the length of the following descriptor. (descriptors_loop_length: This 12-bit field gives the total length in bytes of the following descriptors.)
- CRC_32 This is a 32-bit field that contains the CRC value that gives a zero output of the registers in the decoder
- A UHD_program_type_descriptor illustrated in FIG. 16 and a UHD_composition_descriptor illustrated in FIG. 18, 24, or 25 according to an embodiment of the present invention may be included at the descriptor position following the descriptors_loop_length field.
- the descriptor of the EIT may include information for describing a scalable WCG video service.
- Metadata (WCG_substream_metadata) exemplified in FIG. 1 may be included.
- signaling information describing the scalable WCG video service may be included.
- FIG. 31 is a diagram illustrating another example in which signaling information for signaling scalable WCG video is included in broadcast signaling information according to an embodiment of the present invention.
- a diagram illustrating a Virtual Channel Table (VCT) as broadcast signaling information is described as follows.
- The VCT may comply with the ATSC PSIP specification. According to ATSC PSIP, each field is described as follows.
- the table_id field indicates an 8-bit unsigned integer that indicates the type of the table section (table_id-An 8-bit unsigned integer number that indicates the type of table section being defined here.For the terrestrial_virtual_channel_table_section (), the table_id shall be 0xC8)
- the section_syntax_indicator field is a 1-bit field set to 1 for a VCT table section (section_syntax_indicator-The section_syntax_indicator is a one-bit field which shall be set to '1' for the terrestrial_virtual_channel_table_section ()).
- section_length field represents the length of the section in bytes. (section_length-This is a twelve bit field, the first two bits of which shall be ‘00’. It specifies the number of bytes of the section, starting immediately following the section_length field, and including the CRC.)
- the transport_stream_id field indicates an MPEG-TS ID as in a PAT that can identify TVCT (transport_stream_id-The 16-bit MPEG-2 Transport Stream ID, as it appears in the Program Association Table (PAT) identified by a PID value of zero for this multiplex.
- the transport_stream_id distinguishes this Terrestrial Virtual Channel Table from others that may be broadcast in different PTCs.
- the version_number field indicates the version number of the VCT (version_number-This 5 bit field is the version number of the Virtual Channel Table.
- version number shall be incremented by 1 whenever the definition of the current VCT changes. Upon reaching the value 31, it wraps around to 0.
- version number shall be one unit more than that of the current VCT (also in modulo 32 arithmetic) In any case, the value of the version_number shall be identical to that of the corresponding entries in the MGT)
- current_next_indicator-A one-bit indicator, which when set to '1' indicates that the Virtual Channel Table sent is currently applicable.When the bit is set to ' 0 ', it indicates that the table sent is not yet applicable and shall be the next table to become valid. This standard imposes no requirement that “next” tables (those with current_next_indicator set to' 0 ') must be sent.An update to the currently applicable table shall be signaled by incrementing the version_number field)
- section_number-This 8 bit field gives the number of this section.
- the section_number of the first section in the Terrestrial Virtual Channel Table shall be 0x00. It shall be incremented by one with each additional section in the Terrestrial Virtual Channel Table)
- last_section_number-This 8 bit field specifies the number of the last section (that is, the section with the highest section_number) of the complete Terrestrial Virtual Channel Table.)
- protocol_version field indicates the protocol version for a parameter to be defined differently from the current protocol (protocol_version-An 8-bit unsigned integer field whose function is to allow, in the future, this table type to carry parameters that may be structured differently than those defined in the current protocol.At present, the only valid value for protocol_version is zero.Non-zero values of protocol_version may be used by a future version of this standard to indicate structurally different tables)
- the num_channels_in_section-This 8 bit field specifies the number of virtual channels in this VCT section. The number is limited by the section length)
- the major_channel_number field indicates the number of major channels associated with a virtual channel (major_channel_number-A 10-bit number that represents the “major” channel number associated with the virtual channel being defined in this iteration of the “for” loop.Each virtual channel shall be associated with a major and a minor channel number. The major channel number, along with the minor channel number, act as the user's reference number for the virtual channel. The major_channel_number shall be between 1 and 99. The value of major_channel_number shall be set such that in no case is a major_channel_number / minor_channel_number pair duplicated within the TVCT.For major_channel_number assignments in the US, refer to Annex B.)
- the minor_channel_number field indicates the number of minor channels associated with the virtual channel (minor_channel_number-A 10-bit number in the range 0 to 999 that represents the "minor" or "sub"-channel number.This field, together with major_channel_number, performs as a two-part channel number, where minor_channel_number represents the second or right-hand part of the number.When the service_type is analog television, minor_channel_number shall be set to 0.
- Minor_channel_number shall be set such that in no case is a major_channel_number / minor_channel_number pair duplicated within the TVCT.For other types of services, such as data broadcasting, valid minor virtual channel numbers are between 1 and 999.
- The modulation_mode field indicates the modulation mode of the carrier associated with the virtual channel (modulation_mode-An 8-bit unsigned integer number that indicates the modulation mode for the transmitted carrier associated with this virtual channel. Values of modulation_mode shall be as defined in Table 6.5. For digital signals, the standard values for modulation mode (values below 0x80) indicate transport framing structure, channel coding, interleaving, channel modulation, forward error correction, symbol rate, and other transmission-related parameters, by means of a reference to an appropriate standard. The modulation_mode field shall be disregarded for inactive channels)
- carrier_frequency-The recommended value for these 32 bits is zero.Use of this field to identify carrier frequency is allowed, but is deprecated.
- channel_TSID field indicates the MPEG-2 TS ID associated with the TS carrying the MPEG-2 program referenced by this virtual channel (channel_TSID-A 16-bit unsigned integer field in the range 0x0000 to 0xFFFF that represents the MPEG-2 Transport Stream ID associated with the Transport Stream carrying the MPEG-2 program referenced by this virtual channel.
- channel_TSID shall represent the ID of the Transport Stream that will carry the service when it becomes active. The receiver is expected to use the channel_TSID to verify that any received Transport Stream is actually the desired multiplex.
- channel_TSID shall indicate the value of the analog TSID included in the VBI of the NTSC signal.Refer to Annex D Section 9 for a discussion on use of the analog TSID)
- the program_number field indicates an integer value defined in association with this virtual channel and PMT (program_number-A 16-bit unsigned integer number that associates the virtual channel being defined here with the MPEG-2 PROGRAM ASSOCIATION and TS PROGRAM MAP tables.
- For channels representing analog services, a value of 0xFFFF shall be specified for program_number.
- For inactive channels, program_number shall be set to zero. This number shall not be interpreted as pointing to a Program Map Table entry.
- the access_controlled field may refer to an event associated with an access controlled virtual channel (access_controlled-A 1-bit Boolean flag that indicates, when set, that the events associated with this virtual channel may be access controlled.When the flag is set to '0', event access is not restricted)
- The hidden field may indicate that the virtual channel is not accessed by the user's direct channel entry (hidden-A 1-bit Boolean flag that indicates, when set, that the virtual channel is not accessed by the user by direct entry of the virtual channel number. Hidden virtual channels are skipped when the user is channel surfing, and appear as if undefined, if accessed by direct channel entry. Typical applications for hidden channels are test signals and NVOD services. Whether a hidden channel and its events may appear in EPG display devices depends on the state of the hide_guide bit.)
- the hide_guide field may indicate whether a virtual channel and its events may be displayed in the EPG (hide_guide-A Boolean flag that indicates, when set to '0' for a hidden channel, that the virtual channel and its events may appear in EPG Display Devices.This bit shall be ignored for channels which do not have the hidden bit set, so that non-hidden channels and their events may always be included in EPG Display Devices regardless of the state of the hide_guide bit.
- Typical applications for hidden channels with the hide_guide bit set to '1' are test signals and services accessible through application-level pointers.
- service_type-This 6-bit field shall carry the Service Type identifier.Service Type and the associated service_type field are defined in A / 53 Part 1 [1] to identify the type of service carried in this virtual channel.Value 0x00 shall be reserved.Value 0x01 shall represent analog television programming.Other values are defined in A / 53 Part 3 [3], and other ATSC Standards may define other Service Types9)
- the source_id field is an identification number identifying a program source associated with a virtual channel (source_id-A 16-bit unsigned integer number that identifies the programming source associated with the virtual channel.
- Here, a source is one specific source of video, text, data, or audio programming (Source ID value zero is reserved. Source ID values in the range 0x0001 to 0x0FFF shall be unique within the Transport Stream that carries the VCT, while values 0x1000 to 0xFFFF shall be unique at the regional level. Values for source_ids 0x1000 and above shall be issued and administered by a Registration Authority designated by the ATSC.)
- descriptors_length-Total length (in bytes) of the descriptors for this virtual channel that follows
- descriptor ()-Zero or more descriptors, as appropriate, may be included.
- the service_type field of the VCT may include service type information for identifying a UHD service, a scalable UHD service, or a scalable WCG video service. For example, when the service_type field is 0x07, 0x09, or 0x10, it may signal that the corresponding service is such a service.
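- A sketch of how a receiver might branch on the service_type values mentioned above; the mapping of the values to service kinds follows the text, and everything else is an assumption.

```python
# service_type values signaling a UHD, scalable UHD, or scalable WCG video service
SCALABLE_OR_UHD_SERVICE_TYPES = {0x07, 0x09, 0x10}

def is_uhd_related_service(service_type):
    """Return True when the VCT service_type signals one of the services above."""
    return service_type in SCALABLE_OR_UHD_SERVICE_TYPES
```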
- the descriptor of the VCT may include information capable of describing a scalable WCG video service.
- Metadata (WCG_substream_metadata) exemplified in FIG. 1 may be included.
- signaling information describing the scalable WCG video service may be included.
- FIG. 32 is a diagram illustrating an embodiment of a signal transmission apparatus according to an embodiment of the present invention. An embodiment of a signal transmission apparatus will be described below with reference to this drawing.
- This embodiment may encode and transmit base layer video data and enhancement layer video data compatible with legacy UHD video to transmit scalable WCG video.
- An embodiment of the signal transmission apparatus includes a video encoder, which may include a color gamut mapping unit 510, a first color conversion unit 520, a second color conversion unit 530, a first encoder 540, an upscaling unit 550, a calculator 560, a second encoder 570, and a metadata generator 580.
- the color gamut mapping unit 510 may output the legacy UHD video by color gamut mapping the scalable WCG video.
- the color gamut mapping unit 510 may map the color gamut of the scalable WCG video to map the scalable WCG video to a color space that can be expressed on an existing display.
- the color gamut mapping unit 510 outputs UHD video that can be output from a conventional receiver by mapping the entire color expression range to a predetermined space.
- the transmitter outputs this information in the form of metadata.
- The first color converter 520 converts the image format so as to transmit the color gamut mapped video in the corresponding color space. For example, luma signals can be maintained according to visual characteristics during color video transmission, while sub-sampling can be performed on chroma signals. Through this color conversion, the transfer curve of the video is changed, and the transfer curve (EOTF) can be converted to match the existing receiver.
- the second color conversion unit 530 may perform color conversion that can be expressed in a legacy UHD display device even for YCbCr conversion.
- the first color converter 520 and the second color converter 530 may operate only when necessary for the corresponding video data.
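- A minimal sketch of the color conversion and chroma sub-sampling described above, assuming BT.709-style luma/chroma weights on normalized non-linear R'G'B', even image dimensions, and simple 2×2 averaging for 4:2:0 sub-sampling; the actual matrix and filter are signaled or standard-defined, so this is only illustrative.

```python
import numpy as np

def rgb_to_ycbcr_420(rgb):
    """Convert normalized R'G'B' to Y'CbCr and 4:2:0 sub-sample the chroma planes.

    rgb: (H, W, 3) array with values in [0, 1]; H and W are assumed even.
    """
    r, g, b = rgb[..., 0], rgb[..., 1], rgb[..., 2]
    y  = 0.2126 * r + 0.7152 * g + 0.0722 * b      # BT.709 luma weights
    cb = (b - y) / 1.8556                           # scaled color-difference signals
    cr = (r - y) / 1.5748
    # 4:2:0 sub-sampling: average each 2x2 block of the chroma planes
    sub = lambda c: c.reshape(c.shape[0] // 2, 2, c.shape[1] // 2, 2).mean(axis=(1, 3))
    return y, sub(cb), sub(cr)
```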
- The first encoder 540 encodes the video data output by the color gamut mapping unit 510, the first color conversion unit 520, or the second color converter 530 into base layer video data using a codec that can be processed by an existing UHD receiver, such as HEVC, in order to transmit UHD video that can be output on an existing display device, and outputs the encoded video data.
- The scaling unit 550 up-scales the bit depth of the bit-depth-down-sampled UHD video that can be output by the existing receiver, so that the video (SCG) before color conversion such as OETF has the same bit depth as the original scalable WCG video.
- the calculation unit 560 generates residual data between the original scalable WCG video and the video data output from the scaling unit 550.
- the second encoder 570 encodes the residual data and outputs the enhancement data.
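- A sketch of the residual generation on the transmitter side, assuming the legacy-compatible base layer has already been reconstructed and bit-depth down-sampled, and that the enhancement data is simply the difference between the original WCG samples and the bit-depth up-scaled base layer; rate control and the actual enhancement layer codec are omitted.

```python
import numpy as np

def make_enhancement_residual(wcg_original, base_layer, el_bitdepth_diff):
    """Residual = original scalable WCG samples - bit-depth up-scaled base layer."""
    upscaled = base_layer.astype(np.int32) << el_bitdepth_diff   # match the WCG bit depth
    return wcg_original.astype(np.int32) - upscaled              # fed to the second encoder
```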
- the metadata generator 580 generates metadata for legacy UHD video generated through color gamut mapping.
- The generated metadata includes information such as the color gamut conversion and the color conversion (EOTF conversion or YCbCr conversion matrix) performed by the color gamut mapping unit 510, the first color conversion unit 520, and the second color conversion unit 530.
- the metadata generator 580 may generate the information illustrated in FIGS. 11 to 17 and 19 to 26.
- the signal transmission apparatus may further include a signaling information encoder, a multiplexer, and a transmitter.
- the signaling information encoder may encode signaling information that may constitute scalable WCG video data.
- the information that the signaling information encoder can encode is illustrated in FIGS. 8, 10, 18, and 27-31.
- the multiplexer may multiplex and output video data output from the video encoder, base layer video data encoded by the video encoder, and enhancement video data.
- the transmitter may transmit the multiplexed stream by performing channel coding.
- FIG. 33 is a view showing another embodiment of a signal transmission apparatus according to an embodiment of the present invention. An embodiment of a signal transmission apparatus will be described below with reference to this drawing.
- This embodiment may encode and transmit a video to compose a WCG video with enhancement layer video data and color gamut mapped data of base layer video data compatible with legacy UHD video.
- An embodiment of a signal transmission apparatus may include a video encoder, a signaling information encoder, a multiplexer, and a transmitter.
- The video encoder may include a first color gamut mapping unit 610, a first color converter 620, a second color converter 630, a first encoder 640, an upscaling unit 650, a second color gamut mapping unit 655, a calculator 660, a second encoder 670, and a metadata generator 680.
- the first color gamut mapping unit 610 may output the legacy UHD video by color gamut mapping the scalable WCG video.
- the first color converter 620 and the second color converter 630 may perform color conversion similar to those described above. For example, luma signals can be maintained according to visual characteristics during color video transmission, while sub-sampling can be performed on chroma signals. Through this color conversion, the transfer curve of the video is changed. The transfer curve (EOTF) can be converted to fit the existing receiver.
- the second color conversion unit 630 may perform color conversion that can be expressed in a legacy UHD display device even for YCbCr conversion.
- the first color converter 620 and the second color converter 630 may operate only when a corresponding video is needed.
- The first encoder 640 uses a codec that can be processed by an existing UHD receiver, such as HEVC, to encode and output the UHD video that can be output by the existing receiver; the video output by the second color converter 630 is compressed into base layer video data and output.
- The scaling unit 650 up-scales the bit depth of the bit-depth-down-sampled UHD video that can be output by the existing receiver, so that the video (SCG) before color conversion such as OETF has the same bit depth as the original scalable WCG video.
- The second color gamut mapping unit 655 performs color gamut mapping on the up-scaled video output by the scaling unit 650, extending the color gamut of the base layer video data so that it is converted to be similar to the color gamut of the WCG video.
- the video scaled up by the scaling unit 650 may be mapped to a color space in which bit depth expansion occurs, and thus a quantization error may occur. Therefore, metadata capable of correcting the information may be generated by the metadata generator or included in the residual data and processed.
- the calculator 660 generates residual data between the original scalable WCG video and the video data output by the second color gamut mapping unit 655.
- the second encoder 670 encodes the residual data and outputs the enhancement data.
- the metadata generator 680 generates metadata for legacy UHD video generated through color gamut mapping.
- The metadata for the legacy UHD video includes information such as the color gamut conversion and the color conversion (EOTF conversion or YCbCr conversion matrix) performed by the color gamut mapping unit 510, the first color conversion unit 520, and the second color conversion unit 530.
- the metadata generator 680 transmits information for composing the enhancement layer video data in the form of metadata.
- the metadata may include not only information related to a gamut mapping function (gamut mapping type, parameter, etc.) but also information on a base layer video data, a configuration method, and the like.
- the metadata generator 680 may generate the information illustrated in FIGS. 11 to 17 and 19 to 26.
- FIG. 34 is a diagram illustrating an example of a signal receiving apparatus according to an embodiment of the present invention.
- An example of a signal receiving apparatus includes a receiver 710, a channel decoder 720, a demultiplexer 730, a signaling information decoder 740, and a video decoder.
- the video decoder includes a base layer decoder and an enhancement layer decoder.
- the base layer decoder 810 decodes the base layer video data output from the demultiplexer 730 and outputs legacy UHD video data 920.
- the enhancement layer decoder may include an upscaling unit 910, a color gamut mapping unit 920, a scalable decoder 930, and a WCG postprocessor 940.
- the receiver 710 may tune a broadcast signal and demodulate a signal frame included in the broadcast signal.
- the channel decoder 720 may channel decode data included in the signal frame.
- the demultiplexer 730 demultiplexes and outputs channel decoded data.
- the demultiplexer 730 may demultiplex broadcast signaling information and streams of base layer video data or enhancement layer video data, respectively.
- the signaling information decoder 740 may decode the demultiplexed signaling information.
- examples of information that the signaling information decoder can decode are illustrated in FIGS. 8, 10, 18, and 27-31, respectively.
- The signaling information decoder 740 may recognize that the corresponding service is a scalable WCG video service using the disclosed program level descriptor (UD_program_descriptor) or stream descriptor (WCG_sub_stream_descriptor), and may obtain the codec information, profile information, level information, tier information, and the like of the video stream.
- the video decoder may decode the demultiplexed base layer video data or enhancement layer video data.
- the signaling information included in the base layer video data or the enhancement layer video data may be referred to. Examples of the signaling information decoded by the video decoder are illustrated in FIGS. 11 to 17 and 19 to 26, respectively.
- Based on the signaling information demultiplexed by the signaling information decoder 740 and the signaling information included in the base layer video data or the enhancement layer video data, the video decoder can provide scalable WCG video or legacy UHD video according to the performance of the display device of the receiver.
- the video decoder may output legacy UHD video compatible with the existing display device.
- The video decoder may configure WCG video using the enhancement layer video data and data obtained by updating the color bit depth of the base layer video data compatible with legacy UHD video.
- the video decoder may configure WCG video with enhancement layer video data and color gamut mapped data of base layer video data compatible with legacy UHD video.
- the base layer decoder 810 may decode the base layer video data demultiplexed by the demultiplexer 730.
- the base layer video data decoded by the base layer decoder 810 may be video data 820 compatible with legacy UHD video.
- The base layer decoder 810 decodes the video data 820 compatible with legacy UHD video based on the signaling information decoded by the signaling information decoder 740 and the signaling information included in the base layer video data, and outputs it to the display device.
- the base layer video decoder 810 illustrated in this figure may correspond to the base layer decoder and color converter EOTF of FIG. 2.
- The base layer decoder 810 may decode the base layer video data demultiplexed by the demultiplexer 730, based on the signaling information demultiplexed by the signaling information decoder 740 and the signaling information included in the base layer video data or the enhancement layer video data.
- the enhancement layer decoder may include a base layer decoder, and may further include an upscaling unit 910, a color gamut mapping unit 920, a scalable decoder 930, and a post processing unit 940.
- The enhancement layer decoder may decode the enhancement layer video data demultiplexed by the demultiplexer 730, based on the signaling information demultiplexed by the signaling information decoder 740 and the signaling information included in the base layer video data or the enhancement layer video data.
- the upscaling unit 910 may upscale the color bit depth of the base layer video data decoded by the base layer decoder 810.
- In this case, the bit depth information (BL_bitdepth) of the base layer video data included in the metadata of the video data and the bit depth difference information (EL_bitdepth_diff) between the WCG video and the base layer video data may be used.
- the color gamut mapping unit 920 may map the color gamut of the base layer video data decoded by the base layer decoder 810. In this case, color primary information, gamut mapping function information, etc. for mapping color gamut included in metadata of video data may be used.
- the scalable video decoder 930 may output the WCG video using the enhancement layer video data and the data upscaling the color bit depth of the base layer video data.
- the scalable video decoder 930 may output the WCG video by using the enhancement layer video data and the color mapping data of the base layer video data.
- the post processing unit 940 may output the WCG UHD video 950 that post-processes the video data decoded by the scalable video decoder 930 using signaling information included in the video data.
- the enhancement layer decoder illustrated in this figure may correspond to the base layer decoder, the color converting unit (EOTF), the upscaler, the WCG video configuration unit, and the post processing unit of FIG. 2.
- legacy UHD video or WCG video may be output according to the display performance of the receiver.
- FIG. 35 is a diagram illustrating an embodiment of a signal receiving method according to the present invention.
- an embodiment of a signal receiving method according to the present invention will be described with reference to the drawings.
- a stream including base layer video data and enhancement video data capable of composing scalable WCG video data is received (S210).
- the received stream is demultiplexed to output signaling information, base layer video data, and enhancement video data (S220).
- the demultiplexed signaling information is decoded (S230).
- the base layer video data is decoded based on the decoded signaling information to output legacy UHD video, or the base layer video data and enhancement video data are decoded to output WCG video (S240).
- A video service based on a wide color gamut (wide color space) can be displayed regardless of the display device.
- the existing receiver can provide compatible WCG content.
- an image service based on a wide color space may be provided to be compatible with a plurality of display devices.
- the present invention has industrial applicability that is usable and repeatable in the field of broadcast and video signal processing.
Abstract
Description
Claims (14)
- A signal transmission method comprising: encoding base layer video data and enhancement video data capable of providing a scalable Wide Color Gamut (WCG) video service, respectively; generating signaling information capable of composing the scalable WCG video data; outputting a stream in which the generated signaling information, the encoded base layer video data, and the encoded enhancement video data are multiplexed; and transmitting the multiplexed stream.
- The signal transmission method of claim 1, wherein the signaling information includes information identifying the scalable Wide Color Gamut (WCG) video service.
- The signal transmission method of claim 1, wherein the encoded base layer video data or the encoded enhancement video data includes metadata having color gamut mapping information, color bit depth information, or color mapping information capable of rendering the scalable Wide Color Gamut (WCG) video service.
- A signal transmission apparatus comprising: an encoder configured to encode base layer video data and enhancement video data capable of providing a scalable Wide Color Gamut (WCG) video service, respectively; a signaling information encoder configured to encode signaling information capable of composing the scalable WCG video data; a multiplexer configured to output a stream in which the generated signaling information, the encoded base layer video data, and the encoded enhancement video data are multiplexed; and a transmitter configured to transmit the multiplexed stream.
- The signal transmission apparatus of claim 4, wherein the signaling information includes information identifying the scalable Wide Color Gamut (WCG) video service.
- The signal transmission apparatus of claim 4, wherein the encoded base layer video data or the encoded enhancement video data includes metadata having color gamut mapping information, color bit depth information, or color mapping information capable of rendering the scalable Wide Color Gamut (WCG) video service.
- A signal receiving method comprising: receiving a stream including base layer video data and enhancement video data capable of composing scalable Wide Color Gamut (WCG) video data; demultiplexing the received stream and outputting video data including the base layer video data and the enhancement video data and signaling information, respectively; decoding the demultiplexed signaling information; and decoding the base layer video data and/or the enhancement video data based on the decoded signaling information to output legacy UHD video or WCG video.
- The signal receiving method of claim 7, wherein the signaling information includes information identifying the scalable Wide Color Gamut (WCG) video service.
- The signal receiving method of claim 7, wherein the base layer video data or the enhancement video data includes metadata having color gamut mapping information, color bit depth information, or color mapping information capable of rendering the scalable Wide Color Gamut (WCG) video service.
- The signal receiving method of claim 9, wherein the WCG video is composed, based on the enhancement layer video data, by color gamut mapping the base layer video data using the color gamut mapping information, or by up-scaling the color bit depth of the base layer video data using the color bit depth information.
- A signal receiving apparatus comprising: a receiver configured to receive a stream including base layer video data and enhancement video data capable of composing scalable Wide Color Gamut (WCG) video data; a demultiplexer configured to demultiplex the received stream and output video data including the base layer video data and the enhancement video data and signaling information, respectively; a decoder configured to decode the demultiplexed signaling information; and a video decoder configured to decode the base layer video data and/or the enhancement video data based on the decoded signaling information and output legacy UHD video or WCG video.
- The signal receiving apparatus of claim 11, wherein the signaling information includes information identifying the scalable Wide Color Gamut (WCG) video service.
- The signal receiving apparatus of claim 11, wherein the base layer video data or the enhancement video data includes metadata having color gamut mapping information, color bit depth information, or color mapping information capable of rendering the scalable Wide Color Gamut (WCG) video service.
- The signal receiving apparatus of claim 13, wherein the WCG video is composed, based on the enhancement layer video data, by color gamut mapping the base layer video data using the color gamut mapping information, or by up-scaling the color bit depth of the base layer video data using the color bit depth information.
Priority Applications (4)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
KR1020167014563A KR101832779B1 (ko) | 2013-11-21 | 2014-11-21 | 신호 송수신 장치 및 신호 송수신 방법 |
US15/034,735 US20160295220A1 (en) | 2013-11-21 | 2014-11-21 | Signal transceiving apparatus and signal transceiving method |
EP14863879.4A EP3073742A4 (en) | 2013-11-21 | 2014-11-21 | Signal transceiving apparatus and signal transceiving method |
JP2016533135A JP2017500787A (ja) | 2013-11-21 | 2014-11-21 | 信号送受信装置及び信号送受信方法 |
Applications Claiming Priority (2)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
US201361906899P | 2013-11-21 | 2013-11-21 | |
US61/906,899 | 2013-11-21 |
Publications (1)
Publication Number | Publication Date |
---|---|
WO2015076616A1 true WO2015076616A1 (ko) | 2015-05-28 |
Family
ID=53179812
Family Applications (1)
Application Number | Title | Priority Date | Filing Date |
---|---|---|---|
PCT/KR2014/011262 WO2015076616A1 (ko) | 2013-11-21 | 2014-11-21 | 신호 송수신 장치 및 신호 송수신 방법 |
Country Status (5)
Country | Link |
---|---|
US (1) | US20160295220A1 (ko) |
EP (1) | EP3073742A4 (ko) |
JP (1) | JP2017500787A (ko) |
KR (1) | KR101832779B1 (ko) |
WO (1) | WO2015076616A1 (ko) |
Cited By (4)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
CN106341574A (zh) * | 2016-08-24 | 2017-01-18 | 北京小米移动软件有限公司 | 色域映射方法及装置 |
WO2017053863A1 (en) * | 2015-09-23 | 2017-03-30 | Arris Enterprises Llc | Signaling high dynamic range and wide color gamut content in transport streams |
EP3285474A1 (en) * | 2016-08-16 | 2018-02-21 | Beijing Xiaomi Mobile Software Co., Ltd. | Colour gamut mapping method and apparatus, computer program and recording medium |
WO2018037985A1 (ja) * | 2016-08-22 | 2018-03-01 | ソニー株式会社 | 送信装置、送信方法、受信装置および受信方法 |
Families Citing this family (16)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
JP6289900B2 (ja) * | 2013-12-27 | 2018-03-07 | 株式会社東芝 | 放送信号送信装置 |
US10645404B2 (en) | 2014-03-24 | 2020-05-05 | Qualcomm Incorporated | Generic use of HEVC SEI messages for multi-layer codecs |
US9729801B2 (en) * | 2014-10-02 | 2017-08-08 | Dolby Laboratories Licensing Corporation | Blending images using mismatched source and display electro-optical transfer functions |
US10306269B2 (en) * | 2014-10-10 | 2019-05-28 | Qualcomm Incorporated | Operation point for carriage of layered HEVC bitstream |
GB2534929A (en) * | 2015-02-06 | 2016-08-10 | British Broadcasting Corp | Method and apparatus for conversion of HDR signals |
KR102519209B1 (ko) * | 2015-06-17 | 2023-04-07 | 한국전자통신연구원 | 스테레오스코픽 비디오 데이터를 처리하기 위한 mmt 장치 및 방법 |
EP3119086A1 (en) * | 2015-07-17 | 2017-01-18 | Thomson Licensing | Methods and devices for encoding/decoding videos |
US9916638B2 (en) * | 2016-07-20 | 2018-03-13 | Dolby Laboratories Licensing Corporation | Transformation of dynamic metadata to support alternate tone rendering |
US10834153B2 (en) * | 2016-08-24 | 2020-11-10 | Qualcomm Incorporated | System level signaling of SEI tracks for media data streaming |
CN110915221B (zh) * | 2017-07-20 | 2022-06-07 | 索尼公司 | 发送装置、发送方法、接收装置、以及接收方法 |
CN111373760A (zh) * | 2017-11-30 | 2020-07-03 | 索尼公司 | 发送设备、发送方法、接收设备和接收方法 |
GB2583087B (en) * | 2019-04-11 | 2023-01-04 | V Nova Int Ltd | Decoding a video signal in a video decoder chipset |
BR112022019770A2 (pt) | 2020-03-30 | 2022-11-16 | Bytedance Inc | Método de processamento de vídeo, aparelho para processar dados de vídeo, meios de armazenamento e de gravação não transitórios legíveis por computador |
WO2022067805A1 (zh) * | 2020-09-30 | 2022-04-07 | Oppo广东移动通信有限公司 | 图像预测方法、编码器、解码器以及计算机存储介质 |
WO2023150074A1 (en) * | 2022-02-01 | 2023-08-10 | Dolby Laboratories Licensing Corporation | Beta scale dynamic display mapping |
CN115499078B (zh) * | 2022-08-05 | 2024-04-16 | 鹏城实验室 | 一种新型广播单频网组网方法、系统、介质及终端 |
Citations (4)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
KR20100097204A (ko) * | 2007-12-07 | 2010-09-02 | 에이티아이 테크놀로지스 유엘씨 | 와이드 컬러 개멋 디스플레이 시스템 |
KR101244576B1 (ko) * | 2008-07-10 | 2013-03-25 | 인텔 코포레이션 | 색역 스케일링 방법, 장치 및 시스템 |
US20130088644A1 (en) * | 2010-06-15 | 2013-04-11 | Dolby Laboratories Licensing Corporation | Encoding, Distributing and Displaying Video Data Containing Customized Video Content Versions |
US20130251027A1 (en) * | 2012-03-20 | 2013-09-26 | Dolby Laboratories Licensing Corporation | Complexity Scalable Multilayer Video Coding |
Family Cites Families (8)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
US20050254575A1 (en) * | 2004-05-12 | 2005-11-17 | Nokia Corporation | Multiple interoperability points for scalable media coding and transmission |
US8014445B2 (en) * | 2006-02-24 | 2011-09-06 | Sharp Laboratories Of America, Inc. | Methods and systems for high dynamic range video coding |
JP2008294870A (ja) * | 2007-05-25 | 2008-12-04 | Funai Electric Co Ltd | デジタル放送受信機 |
US9571856B2 (en) * | 2008-08-25 | 2017-02-14 | Microsoft Technology Licensing, Llc | Conversion operations in scalable video encoding and decoding |
US8731287B2 (en) * | 2011-04-14 | 2014-05-20 | Dolby Laboratories Licensing Corporation | Image prediction based on primary color grading model |
US9060180B2 (en) * | 2011-06-10 | 2015-06-16 | Dolby Laboratories Licensing Corporation | Drift-free, backwards compatible, layered VDR coding |
JP2013090296A (ja) * | 2011-10-21 | 2013-05-13 | Sharp Corp | 符号化装置、送信装置、符号化方法、復号装置、受信装置、復号方法、プログラム、および記録媒体 |
US9756353B2 (en) * | 2012-01-09 | 2017-09-05 | Dolby Laboratories Licensing Corporation | Hybrid reference picture reconstruction method for single and multiple layered video coding systems |
-
2014
- 2014-11-21 US US15/034,735 patent/US20160295220A1/en not_active Abandoned
- 2014-11-21 WO PCT/KR2014/011262 patent/WO2015076616A1/ko active Application Filing
- 2014-11-21 JP JP2016533135A patent/JP2017500787A/ja active Pending
- 2014-11-21 EP EP14863879.4A patent/EP3073742A4/en not_active Withdrawn
- 2014-11-21 KR KR1020167014563A patent/KR101832779B1/ko active IP Right Grant
Patent Citations (4)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
KR20100097204A (ko) * | 2007-12-07 | 2010-09-02 | 에이티아이 테크놀로지스 유엘씨 | 와이드 컬러 개멋 디스플레이 시스템 |
KR101244576B1 (ko) * | 2008-07-10 | 2013-03-25 | 인텔 코포레이션 | 색역 스케일링 방법, 장치 및 시스템 |
US20130088644A1 (en) * | 2010-06-15 | 2013-04-11 | Dolby Laboratories Licensing Corporation | Encoding, Distributing and Displaying Video Data Containing Customized Video Content Versions |
US20130251027A1 (en) * | 2012-03-20 | 2013-09-26 | Dolby Laboratories Licensing Corporation | Complexity Scalable Multilayer Video Coding |
Non-Patent Citations (1)
Title |
---|
LOUIS KEROFSKY ET AL.: "Color Gamut Scalable Video Coding", DATA COMPRESSION CONFERENCE (DCC), 2013, 22 March 2013 (2013-03-22), SNOWBIRD, UT, pages 211 - 220, XP055121596, Retrieved from the Internet <URL:http://ieeexpiore.ieee.org/xpl/login.jsp?tp=&arnumber=6543057&ur1=http%3A%2F%2Fieeexplore.ieee.mg%2Fxpls%2Fabs_all.jsp%3Famumber%3D6543057> * |
Cited By (27)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
KR102061707B1 (ko) * | 2015-09-23 | 2020-01-02 | 애리스 엔터프라이지즈 엘엘씨 | 전송 스트림들 내의 하이 다이내믹 레인지 및 와이드 컬러 가무트 콘텐츠의 시그널링 |
CN108141614B (zh) * | 2015-09-23 | 2021-06-18 | 艾锐势有限责任公司 | 信号传送传输流中的高动态范围和宽色域内容 |
KR20200003254A (ko) * | 2015-09-23 | 2020-01-08 | 애리스 엔터프라이지즈 엘엘씨 | 전송 스트림들 내의 하이 다이내믹 레인지 및 와이드 컬러 가무트 콘텐츠의 시그널링 |
JP7066786B2 (ja) | 2015-09-23 | 2022-05-13 | アリス エンタープライジズ エルエルシー | トランスポートストリームにおける高ダイナミックレンジおよび広色域コンテンツの伝達 |
GB2556810A (en) * | 2015-09-23 | 2018-06-06 | Arris Entpr Inc | Signaling high dynamic range and wide color gamut content in transport streams |
CN108141614A (zh) * | 2015-09-23 | 2018-06-08 | 艾锐势有限责任公司 | 信号传送传输流中的高动态范围和宽色域内容 |
JP2018530237A (ja) * | 2015-09-23 | 2018-10-11 | アリス エンタープライジズ エルエルシーArris Enterprises Llc | トランスポートストリームにおける高ダイナミックレンジおよび広色域コンテンツの伝達 |
KR102347034B1 (ko) | 2015-09-23 | 2022-01-04 | 애리스 엔터프라이지즈 엘엘씨 | 전송 스트림들 내의 하이 다이내믹 레인지 및 와이드 컬러 가무트 콘텐츠의 시그널링 |
US11146807B2 (en) | 2015-09-23 | 2021-10-12 | Arris Enterprises Llc | Signaling high dynamic range and wide color gamut content in transport streams |
GB2588736B (en) * | 2015-09-23 | 2021-09-08 | Arris Entpr Llc | Signaling high dynamic range and wide color gamut content in transport streams |
JP2020182233A (ja) * | 2015-09-23 | 2020-11-05 | アリス エンタープライジズ エルエルシーArris Enterprises Llc | トランスポートストリームにおける高ダイナミックレンジおよび広色域コンテンツの伝達 |
US10432959B2 (en) | 2015-09-23 | 2019-10-01 | Arris Enterprises Llc | Signaling high dynamic range and wide color gamut content in transport streams |
KR102271252B1 (ko) | 2015-09-23 | 2021-06-30 | 애리스 엔터프라이지즈 엘엘씨 | 전송 스트림들 내의 하이 다이내믹 레인지 및 와이드 컬러 가무트 콘텐츠의 시그널링 |
US11695947B2 (en) | 2015-09-23 | 2023-07-04 | Arris Enterprises Llc | Signaling high dynamic range and wide color gamut content in transport streams |
KR20210082266A (ko) * | 2015-09-23 | 2021-07-02 | 애리스 엔터프라이지즈 엘엘씨 | 전송 스트림들 내의 하이 다이내믹 레인지 및 와이드 컬러 가무트 콘텐츠의 시그널링 |
WO2017053863A1 (en) * | 2015-09-23 | 2017-03-30 | Arris Enterprises Llc | Signaling high dynamic range and wide color gamut content in transport streams |
US10869053B2 (en) | 2015-09-23 | 2020-12-15 | Arris Enterprises Llc | Signaling high dynamic range and wide color gamut content in transport streams |
GB2556810B (en) * | 2015-09-23 | 2021-03-24 | Arris Entpr Llc | Signaling high dynamic range and wide color gamut content in transport streams |
GB2588736A (en) * | 2015-09-23 | 2021-05-05 | Arris Entpr Llc | Signaling high dynamic range and wide color gamut content in transport streams |
KR102189189B1 (ko) * | 2016-08-16 | 2020-12-09 | 베이징 시아오미 모바일 소프트웨어 컴퍼니 리미티드 | 색 영역 매핑 방법 및 장치 |
EP3285474A1 (en) * | 2016-08-16 | 2018-02-21 | Beijing Xiaomi Mobile Software Co., Ltd. | Colour gamut mapping method and apparatus, computer program and recording medium |
US10325569B2 (en) | 2016-08-16 | 2019-06-18 | Beijing Xiaomi Mobile Software Co., Ltd. | Method and apparatus for coding image information for display |
JP2018537870A (ja) * | 2016-08-16 | 2018-12-20 | Beijing Xiaomi Mobile Software Co., Ltd. | Color gamut mapping method and apparatus |
KR20180132095A (ko) * | 2016-08-16 | 2018-12-11 | Beijing Xiaomi Mobile Software Co., Ltd. | Color gamut mapping method and apparatus |
RU2671763C1 (ru) * | 2016-08-16 | 2018-11-06 | Beijing Xiaomi Mobile Software Co., Ltd. | Method and device for mapping to a color space |
WO2018037985A1 (ja) * | 2016-08-22 | 2018-03-01 | Sony Corporation | Transmission device, transmission method, reception device, and reception method |
CN106341574A (zh) * | 2016-08-24 | 2017-01-18 | Beijing Xiaomi Mobile Software Co., Ltd. | Color gamut mapping method and apparatus |
Also Published As
Publication number | Publication date |
---|---|
US20160295220A1 (en) | 2016-10-06 |
EP3073742A1 (en) | 2016-09-28 |
JP2017500787A (ja) | 2017-01-05 |
KR101832779B1 (ko) | 2018-04-13 |
KR20160086349A (ko) | 2016-07-19 |
EP3073742A4 (en) | 2017-06-28 |
Similar Documents
Publication | Publication Date | Title |
---|---|---|
WO2015076616A1 (ko) | Signal transmitting and receiving apparatus and signal transmitting and receiving method | |
WO2015126144A1 (ko) | Method and apparatus for transmitting and receiving broadcast signals for a panorama service | |
WO2014073927A1 (ko) | Signal transmitting and receiving apparatus and signal transmitting and receiving method | |
WO2014073853A1 (ko) | Signal transmitting and receiving apparatus and signal transmitting and receiving method | |
WO2015072754A1 (ko) | Method and apparatus for transmitting and receiving broadcast signals for providing an HDR broadcast service | |
WO2014084564A1 (ko) | Signal transmitting and receiving apparatus and signal transmitting and receiving method | |
WO2015152635A1 (ko) | Signal transmitting and receiving apparatus and signal transmitting and receiving method | |
WO2018004291A1 (ko) | Broadcast signal transmission method, broadcast signal reception method, broadcast signal transmission apparatus, and broadcast signal reception apparatus | |
WO2015102449A1 (ko) | Method and apparatus for transmitting and receiving broadcast signals based on color gamut resampling | |
WO2014025213A1 (ko) | Signal transmitting and receiving apparatus and signal transmitting and receiving method | |
WO2015178598A1 (ko) | Video data processing method and apparatus for display-adaptive video playback | |
WO2012036532A2 (en) | Method and apparatus for processing a broadcast signal for 3d (3-dimensional) broadcast service | |
WO2012023789A2 (ko) | Apparatus and method for receiving a digital broadcast signal | |
WO2016024794A1 (ko) | Broadcast signal transmission method, broadcast signal reception method, broadcast signal transmission apparatus, and broadcast signal reception apparatus | |
WO2011021822A2 (en) | Method and apparatus for processing signal for three-dimensional reproduction of additional data | |
WO2015034306A1 (ko) | Method and apparatus for transmitting and receiving high-definition UHD broadcast content in a digital broadcasting system | |
WO2011059290A2 (en) | Method and apparatus for generating multimedia stream for adjusting depth of 3-dimensional additional video reproduction information, and method and apparatus for receiving multimedia stream for adjusting depth of 3-dimensional additional video reproduction information | |
WO2009151265A2 (ko) | Broadcast signal receiving method and receiving system | |
WO2016018066A1 (ko) | Broadcast signal transmission method, broadcast signal reception method, broadcast signal transmission apparatus, and broadcast signal reception apparatus | |
WO2010021525A2 (en) | A method for processing a web service in an nrt service and a broadcast receiver | |
WO2016204481A1 (ko) | Media data transmission apparatus, media data reception apparatus, media data transmission method, and media data reception method | |
WO2012030158A2 (en) | Method and apparatus for processing and receiving digital broadcast signal for 3-dimensional display | |
WO2015080414A1 (ko) | Method and apparatus for transmitting and receiving broadcast signals for providing a trick play service | |
WO2012077987A2 (ko) | Apparatus and method for receiving a digital broadcast signal | |
WO2015065037A1 (ko) | Method and apparatus for transmitting and receiving broadcast signals for providing an HEVC-based IP broadcast service | |
Legal Events
Date | Code | Title | Description |
---|---|---|---|
121 | EP: The EPO has been informed by WIPO that EP was designated in this application |
Ref document number: 14863879 Country of ref document: EP Kind code of ref document: A1 |
|
WWE | Wipo information: entry into national phase |
Ref document number: 15034735 Country of ref document: US |
|
ENP | Entry into the national phase |
Ref document number: 2016533135 Country of ref document: JP Kind code of ref document: A |
|
NENP | Non-entry into the national phase |
Ref country code: DE |
|
ENP | Entry into the national phase |
Ref document number: 20167014563 Country of ref document: KR Kind code of ref document: A |
|
REEP | Request for entry into the european phase |
Ref document number: 2014863879 Country of ref document: EP |
|
WWE | Wipo information: entry into national phase |
Ref document number: 2014863879 Country of ref document: EP |