KR20150006340A - Method and apparatus for providing three-dimensional video - Google Patents

Method and apparatus for providing three-dimensional video

Info

Publication number
KR20150006340A
KR20150006340A (application KR1020140057614A)
Authority
KR
South Korea
Prior art keywords
image
additional
receiver
network
reference image
Prior art date
Application number
KR1020140057614A
Other languages
Korean (ko)
Inventor
이진영
윤국진
정원식
이광순
허남호
Original Assignee
한국전자통신연구원
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by 한국전자통신연구원 filed Critical 한국전자통신연구원
Priority to US14/326,262 priority Critical patent/US20150009289A1/en
Publication of KR20150006340A publication Critical patent/KR20150006340A/en

Classifications

    • H ELECTRICITY
    • H04 ELECTRIC COMMUNICATION TECHNIQUE
    • H04H BROADCAST COMMUNICATION
    • H04H 20/00 Arrangements for broadcast or for distribution combined with broadcast
    • H04H 20/86 Arrangements characterised by the broadcast information itself
    • H04H 20/88 Stereophonic broadcast systems
    • H ELECTRICITY
    • H04 ELECTRIC COMMUNICATION TECHNIQUE
    • H04N PICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N 13/00 Stereoscopic video systems; Multi-view video systems; Details thereof
    • H04N 13/10 Processing, recording or transmission of stereoscopic or multi-view image signals
    • H04N 13/106 Processing image signals
    • H04N 13/161 Encoding, multiplexing or demultiplexing different image signal components
    • H ELECTRICITY
    • H04 ELECTRIC COMMUNICATION TECHNIQUE
    • H04N PICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N 13/00 Stereoscopic video systems; Multi-view video systems; Details thereof
    • H04N 13/10 Processing, recording or transmission of stereoscopic or multi-view image signals
    • H04N 13/106 Processing image signals
    • H04N 13/172 Processing image signals comprising non-image signal components, e.g. headers or format information
    • H04N 13/178 Metadata, e.g. disparity information
    • H ELECTRICITY
    • H04 ELECTRIC COMMUNICATION TECHNIQUE
    • H04N PICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N 13/00 Stereoscopic video systems; Multi-view video systems; Details thereof
    • H04N 13/10 Processing, recording or transmission of stereoscopic or multi-view image signals
    • H04N 13/194 Transmission of image signals

Abstract

A method and apparatus for providing a three-dimensional (3D) video are provided. A base view video of the 3D video is transmitted to a receiver via a first network, which is a terrestrial network; an additional view video of the 3D video is transmitted to the receiver via a second network; and the 3D video is provided based on the base view video and the additional view video.

Description

METHOD AND APPARATUS FOR PROVIDING THREE-DIMENSIONAL VIDEO

TECHNICAL FIELD

The present invention relates to a three-dimensional image providing technique, and more particularly, to a method and apparatus for providing a three-dimensional image using a broadcast network.

Through binocular parallax, a viewer perceives objects as near or far. Three-dimensional images exploit this principle, presenting each of the viewer's eyes with a slightly different view so that a stereoscopic effect is perceived.

The three-dimensional image can be provided using a plurality of two-dimensional images. For example, a three-dimensional image can be generated using a two-dimensional image corresponding to the left eye of the viewer and a two-dimensional image corresponding to the right eye.

Existing broadcasting environments may be suitable for transmitting two-dimensional images. The two-dimensional image in the existing broadcasting environment may be the reference image of the three-dimensional image. If an additional image is provided with the reference image, a three-dimensional image can be provided to the viewer.

Additional bandwidth may be required to transmit additional images in existing broadcast environments.

One embodiment may provide an apparatus and method for providing a three-dimensional image.

One embodiment can provide an apparatus and method for providing a three-dimensional image through a heterogeneous network.

According to an aspect of the present invention, a broadcasting apparatus for providing a three-dimensional image through a heterogeneous network includes a first processing unit for encoding a reference image of the three-dimensional image and multiplexing the encoded reference image into a transport stream (TS), a first transmitter for transmitting the reference image multiplexed into the TS to a receiver via a first network, a second processing unit for encoding an additional image of the three-dimensional image, and a second transmitter for transmitting the encoded additional image to the receiver via a second network, wherein the first network is a terrestrial network and the three-dimensional image is provided based on the reference image and the additional image.

The first network may be an Advanced Television System Committee (ATSC) terrestrial network.

The second network may be a broadband network.

When the reference image and the additional image are both transmitted to the receiver in real time, the 3D image can be provided as a real-time broadcast.

The second transmitter may transmit the entire data of the additional image to the receiver before the data of the reference image is transmitted to the receiver.

The 3D image may be provided in a non-real-time manner.

The second transmitter may transmit some data of the additional image to the receiver before the data of the reference image is transmitted to the receiver, and may transmit the remaining data of the additional image to the receiver while the reference image is being transmitted to the receiver.

The 3D image may be provided in a non-real-time manner.

The reference image multiplexed with the TS may include a first PTS (Presentation Time Stamp) indicating the reproduction time of the reference image.

The encoded additional image may include a second PTS indicating the reproduction time of the additional image.

The first PTS and the second PTS may be used for synchronization of the reference image and the additional image.

The first processing unit may multiplex the encoded reference image into the TS based on the encoded reference image and the metadata related to the 3D image.

The metadata may include pairing information for synchronizing the reference image and the additional image.

The pairing information may include a Uniform Resource Identifier (URI) that provides the additional image.

The additional image may be transmitted to the receiver through the URI.

The second processing unit can format the encoded additional image by converting the encoded additional image into MPEG-2 or MPEG-4 format.

The second transmitter may transmit the formatted additional image to the receiver.

According to another aspect, a broadcasting apparatus for providing a three-dimensional image includes a first processing unit for encoding a reference image of the three-dimensional image and multiplexing the encoded reference image into a transport stream (TS), a first transmitter for transmitting the multiplexed reference image in real time to a receiver via a first network, a second processing unit for encoding an additional image of the three-dimensional image and multiplexing the encoded additional image, and a second transmitter for transmitting the multiplexed additional image in non-real time to the receiver via a second network, wherein the first network is a terrestrial network and the three-dimensional image is provided based on the reference image and the additional image.

The first network may be an Advanced Television System Committee (ATSC) terrestrial network.

The second network may be an ATSC NRT network.

The reference image multiplexed with the TS may include a first PTS (Presentation Time Stamp) indicating the reproduction time of the reference image.

The multiplexed additional image may include a second PTS indicating the reproduction time of the additional image.

The first PTS and the second PTS may be used for synchronization of the reference image and the additional image.

Playback information for the 3D image may be provided based on the first PTS.

The first processing unit may multiplex the encoded reference image into the TS based on the encoded reference image and the metadata related to the three-dimensional image.

The second processing unit can format the encoded additional image by converting the encoded additional image into MPEG-2 or MPEG-4 format.

The second processing unit may multiplex the formatted additional images.

The second processing unit may signal the additional image so that the multiplexed additional image can be transmitted to the receiver.

The signaling may use a Service Signaling Channel (SSC) to transmit a Service Map Table (SMT) and a Non-Real-Time Information Table (NRT-IT).

The SMT may provide information about the service providing the 3D image, and the NRT-IT may provide the content items that make up the service.

According to another aspect, a method for providing a three-dimensional image through a heterogeneous network includes encoding a reference image of the three-dimensional image, multiplexing the encoded reference image into a transport stream (TS), transmitting the reference image multiplexed into the TS to a receiver via a first network, encoding an additional image of the three-dimensional image, and transmitting the encoded additional image to the receiver via a second network, wherein the first network is a terrestrial network and the three-dimensional image is provided based on the reference image and the additional image.

According to another aspect, a method of providing a three-dimensional image includes encoding a reference image of the three-dimensional image, multiplexing the encoded reference image into a transport stream (TS), transmitting the multiplexed reference image in real time to a receiver over a first network, encoding an additional image of the three-dimensional image, multiplexing the encoded additional image, and transmitting the multiplexed additional image in non-real time to the receiver through a second network, wherein the three-dimensional image is provided based on the multiplexed reference image and the multiplexed additional image.

The method and apparatus may provide a compatible broadcast service between two-dimensional and three-dimensional images.

The method and apparatus according to an exemplary embodiment may provide a three-dimensional image through a heterogeneous network.

FIG. 1 shows a conceptual diagram for providing a three-dimensional image according to an example.
FIG. 2 illustrates a first scenario for providing a three-dimensional image according to an example.
FIG. 3 illustrates a second scenario for providing a three-dimensional image according to an example.
FIG. 4 shows a third scenario for providing a three-dimensional image according to an example.
FIG. 5 illustrates a fourth scenario for providing a three-dimensional image according to an example.
FIG. 6 is a block diagram of a broadcasting apparatus according to an embodiment of the present invention.
FIG. 7 illustrates a flow diagram of a method for providing a three-dimensional image in accordance with one embodiment.
FIG. 8 shows a flow diagram of a method for transmitting a reference image to a receiver according to an example.
FIG. 9 illustrates a system for providing a three-dimensional image according to an example.
FIGS. 10A to 10D show a flow chart of a method in which a receiver that receives an additional image through a DASH outputs a three-dimensional image.
FIGS. 11A to 11D show a flowchart of a method of a receiver receiving an additional image through download and outputting a three-dimensional image.
FIG. 12 shows a flow diagram of a method for providing a three-dimensional image in accordance with an embodiment.
FIG. 13 illustrates a system for providing a three-dimensional image according to an example.
FIGS. 14A to 14D show a flow chart of a method in which a receiver receiving an additional image through an ATSC NRT network outputs a three-dimensional image.

In the following, embodiments will be described in detail with reference to the accompanying drawings. Like reference symbols in the drawings denote like elements.

Various modifications may be made to the embodiments described below. The following description is not intended to limit the embodiments, and should be understood to include all modifications, equivalents, and alternatives thereto.

The terms used in the examples are used only to describe specific embodiments and are not intended to limit the embodiments. Singular expressions include plural expressions unless the context clearly dictates otherwise. In this specification, terms such as "comprises" or "having" indicate the presence of stated features, integers, steps, operations, elements, components, or combinations thereof, but do not preclude the presence or addition of one or more other features, integers, steps, operations, elements, components, or combinations thereof.

Unless defined otherwise, all terms used herein, including technical and scientific terms, have the same meaning as commonly understood by one of ordinary skill in the art to which the embodiments belong. Terms such as those defined in commonly used dictionaries are to be interpreted as having a meaning consistent with their meaning in the context of the related art, and are not to be interpreted in an idealized or overly formal sense unless expressly so defined in the present application.

In the following description with reference to the accompanying drawings, the same components are denoted by the same reference numerals regardless of the figure in which they appear, and redundant descriptions thereof are omitted. In describing the embodiments, detailed descriptions of related known art are omitted when they would unnecessarily obscure the gist of the embodiments.

FIG. 1 shows a conceptual diagram for providing a three-dimensional image according to an example.

Compared to a broadcasting service providing a 2D image, a broadcasting service providing 3D images may require a larger bandwidth to provide a 3D image. In the case of a broadcast service compatible with a 2D image and a 3D image, a bandwidth twice as high as that of a broadcast service providing a 2D image may be required.

A compatible broadcast service may be a broadcast service in which, when the 3D image is composed of a plurality of compressed images, at least one of the plurality of images has the same resolution as the existing 2D broadcast image, so that the service remains usable as a 2D broadcast. The plurality of images may include a base view video and an additional view video.

Broadband networks and Advanced Television Systems Committee (ATSC) non-real-time (NRT) networks can be used to provide 3D images in broadcast systems that provide conventional 2D images, without requiring additional bandwidth.

A method of providing a 3D image using a broadband network may be referred to as SCHCBB (Service Compatible Hybrid Coded 3D using BroadBand).

A method of providing a 3D image using an ATSC NRT network may be referred to as SCHCNRT (Service Compatible Hybrid Coded 3D using ATSC NRT).

The 3D image may be provided based on the reference image and the additional image. The reference image may be an existing 2D image. The additional image may be an image added to the reference image to provide a 3D image. For example, the viewpoint of the additional image may be different from the viewpoint of the reference image.

The reference image of the 3D image may be transmitted to the receiver via the first network. For example, the first network may be a terrestrial network. The terrestrial network may be a broadcast network. The terrestrial network can be an ATSC terrestrial network.

The additional image of the 3D image can be transmitted to the receiver via the second network.

For example, the second network may be a broadband network. The broadband network may be a network over the Internet. That is, the broadband network may be a data communication network rather than a broadcast network.

As another example, the second network may be a terrestrial network. The terrestrial network may be an ATSC NRT network.

If the reference image is transmitted to the receiver over the ATSC terrestrial network and the additional image is transmitted to the receiver over the broadband network or the ATSC NRT network, additional bandwidth for transmitting the additional image may not be required in the system transmitting the reference image.

A service scenario for providing a 3D image will be described in detail below with reference to Figs. 2 to 5.

The following references may be consulted in describing the examples below.

[1] IEEE/ASTM: "Use of the International Systems of Units (SI): The Modern Metric System," Doc. SI 10-2002, Institute of Electrical and Electronics Engineers, New York, N.Y.

[2] ATSC: "ATSC Digital Television Standard, Part 3 - Service Multiplex and Transport Subsystem Characteristics," Doc. A/53, Part 3:201x, Advanced Television Systems Committee, Washington, D.C., [TBD].

[3] ATSC: "ATSC Digital Television Standard, Part 4 - MPEG-2 Video System Characteristics," Doc. A/53, Part 4:201x, Advanced Television Systems Committee, Washington, D.C., [TBD].

[4] ATSC: "Use of AVC in the ATSC Digital Television System, Part 1 - Video System Characteristics," Doc. A/72, Part 1, Advanced Television Systems Committee, Washington, D.C., 29 July 2008.

[5] ATSC: "Program and System Information Protocol for Terrestrial Broadcast and Cable," Doc. A/65:2009, Advanced Television Systems Committee, Washington, D.C., 3 August 2009.

[6] ATSC: "ATSC Parameterized Services Standard," Doc. A/71, Advanced Television Systems Committee, Washington, D.C., 26 March 2007.

[7] ITU-T Recommendation H.262 | ISO/IEC 13818-2: "Information technology - Generic coding of moving pictures and associated audio information: Video".

[8] ITU-T Recommendation H.264 | ISO/IEC 14496-10:2010: "Information technology - Coding of audio-visual objects - Part 10: Advanced Video Coding".

[9] ITU-T Recommendation H.222.0 (2012) | ISO/IEC 13818-1:2012: "Information technology - Generic coding of moving pictures and associated audio information: Systems".

[10] CEA: CEA-708.1, "Digital Television Closed Captioning: 3D Extensions," Consumer Electronics Association, Arlington, VA, 2012.

[11] ATSC: "Use of AVC in the ATSC Digital Television System, Part 2 - Transport Subsystem Characteristics," Doc. A/72, Part 2, Advanced Television Systems Committee, Washington, D.C., 29 July 2008.

[12] ATSC: "Non-Real-Time Content Delivery," Doc. A/103, Advanced Television Systems Committee, Washington, D.C., 9 May 2012.

[13] ISO/IEC 14496-14:2003: "Information technology - Coding of audio-visual objects - Part 14: MP4 file format".

[14] ISO/IEC 14496-12:2008: "Information technology - Coding of audio-visual objects - Part 12: ISO base media file format".

[15] ATSC: "3D-TV Terrestrial Broadcasting, Part 2 - Service Compatible Hybrid Coding Using Real-Time Delivery," Doc. A/104, Advanced Television Systems Committee, Washington, D.C., 26 December 2012.

[16] ATSC: "Non-Real-Time Content Delivery," Doc. A/103:2012, Advanced Television Systems Committee, Washington, D.C., 9 May 2012.

FIG. 2 illustrates a first scenario for providing a three-dimensional image according to an example.

The first scenario 200 may be a service scenario for providing live content.

Live content can be stored in the 3D content server. The 3D content server can provide a 3D image to a receiver using a terrestrial network and a broadband network.

The first scenario 200 may be referred to as the SCHCBB streaming scheme.

When the 3D image is live content, the method of providing the 3D image will be described in detail with reference to Figs. 6 to 10 below.

The technical contents described above with reference to FIG. 1 can be applied as they are, so a more detailed description is omitted below.

FIG. 3 illustrates a second scenario for providing a three-dimensional image according to an example.

The second scenario 300 may be a service scenario for providing prerecorded content.

The reference image and the additional image can be stored in the reference image server and the additional image server, respectively.

In the second scenario 300, part of the additional image may be transmitted to the receiver before the reference image is transmitted to the receiver. The additional image can be transmitted to the receiver over the broadband network. The second scenario 300 may be referred to as the SCHCBB downloading scheme.

For example, if the reference image is to be transmitted to the receiver from 09:00, the additional image may be transmitted from 08:55. Some data of the additional image may be transmitted to the receiver before the data of the reference image is transmitted to the receiver.

In the second scenario 300, once the reference image starts being transmitted to the receiver, the remaining data of the additional image may be transmitted to the receiver. For example, while a reference image is being transmitted to the receiver, additional-image data that will be output later than the reference image currently being transmitted may be sent to the receiver.

A method of providing a 3D image by transmitting some data of the additional image to the receiver over the broadband network, before the data of the reference image is transmitted to the receiver, will be described in detail with reference to FIGS. 6 to 9 and 11.

The technical contents described above with reference to Figs. 1 and 2 can be applied as they are, so that a more detailed description will be omitted below.

FIG. 4 shows a third scenario for providing a three-dimensional image according to an example.

The third scenario 400 may be a service scenario for providing prerecorded content.

The reference image and the additional image can be stored in the reference image server and the additional image server, respectively.

In the third scenario 400, the additional image may be transmitted to the receiver in advance, before the reference image is transmitted to the receiver. The additional image can be transmitted to the receiver over the broadband network. The third scenario 400 may be included in the SCHCBB downloading scheme.

For example, if the reference image is to be transmitted to the receiver from 09:00, the entire data of the additional image may be transmitted to the receiver before 09:00.

A method of providing a 3D image by transmitting the entire data of the additional image to the receiver over the broadband network, before the data of the reference image is transmitted to the receiver, will be described in detail with reference to FIGS. 6 to 9 and 11.

The technical contents described with reference to FIGS. 1 to 3 can be applied as they are, so a more detailed description is omitted below.

FIG. 5 illustrates a fourth scenario for providing a three-dimensional image according to an example.

The fourth scenario 500 may be a service scenario for providing prerecorded content.

The reference image and the additional image can be stored in the reference image server and the additional image server, respectively.

In the fourth scenario 500, the additional image may be transmitted to the receiver in advance, before the reference image is transmitted to the receiver. The additional image can be transmitted to the receiver via a terrestrial network. The terrestrial network may be an ATSC NRT network. The fourth scenario 500 may correspond to the SCHCNRT scheme.

For example, if the reference image is to be transmitted to the receiver from 09:00, the entire data of the additional image may be transmitted to the receiver before 09:00.

A method of providing a 3D image by transmitting the entire data of the additional image to the receiver over the terrestrial network, before the data of the reference image is transmitted to the receiver, will be described in detail with reference to FIGS. 12 to 14.

The technical contents described above with reference to Figs. 1 to 4 can be applied as they are, so that a more detailed description will be omitted below.

FIG. 6 is a block diagram of a broadcasting apparatus according to an embodiment of the present invention.

The broadcasting apparatus 600 may include a first processing unit 610, a first transmitting unit 620, a second processing unit 630, a second transmitting unit 640, and a storage unit 650.

The broadcast apparatus 600 may perform the first scenario 200, the second scenario 300, and the third scenario 400 described above.

The broadcasting apparatus 600 can provide the 3D image to the receiver through the heterogeneous network. The heterogeneous network may be a plurality of different networks. For example, the heterogeneous network may be a terrestrial network and a broadband network.

The first processing unit 610, the first transmitting unit 620, the second processing unit 630, the second transmitting unit 640, and the storage unit 650 will be described below in detail with reference to FIGS. 7 through 11.

The technical contents described above with reference to Figs. 1 to 5 may be applied as they are, so that a more detailed description will be omitted below.

FIG. 7 illustrates a flow diagram of a method for providing a three-dimensional image in accordance with one embodiment.

In step 710, the first processing unit 610 may encode the reference image of the 3D image. The encoding of the reference image may be compression of the reference image. The storage unit 650 may store the reference image. For example, the storage unit 650 may be the 3D content server of FIG. 2, the reference image server of FIG. 3, or the reference image server of FIG. 4.

The first processing unit 610 may encode the reference image using an encoder. For example, the encoder may be an MPEG-2 encoder. The first processing unit 610 can encode the reference image in compliance with the MPEG-2 video Main Profile @ High Level or Main Profile @ Main Level of Reference [7]. The first processing unit 610 can also encode the reference image in compliance with ATSC A/53 Part 4 and ATSC A/72 Part 1.

The encoding format of the encoded reference image may be one of the formats shown in [Table 1] below.

[Table 1]
Vertical size | Horizontal size | Display aspect ratio / Sample aspect ratio | Frame rate | Progressive / Interlaced
1080 | 1920 | 16:9 / square sample | 23.976, 24, 29.97, 30 | P
1080 | 1920 | 16:9 / square sample | 29.97, 30 | I
720 | 1280 | 16:9 / square sample | 23.976, 24, 29.97, 30, 59.94, 60 | P
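As an illustration only, the rows of [Table 1] can be expressed as a lookup used to check whether a candidate reference-image format is permitted. The helper name and tuple shape below are hypothetical, not part of the specification.

```python
# Allowed reference-image encoding formats from [Table 1].
# Each entry: (vertical size, horizontal size, frame rates, scan type),
# where scan type is "P" (progressive) or "I" (interlaced).
TABLE_1_FORMATS = [
    (1080, 1920, {23.976, 24, 29.97, 30}, "P"),
    (1080, 1920, {29.97, 30}, "I"),
    (720, 1280, {23.976, 24, 29.97, 30, 59.94, 60}, "P"),
]

def is_valid_reference_format(vertical, horizontal, frame_rate, scan):
    """Return True if the candidate format matches a row of [Table 1]."""
    return any(
        vertical == v and horizontal == h and frame_rate in rates and scan == s
        for v, h, rates, s in TABLE_1_FORMATS
    )
```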

According to one aspect, the first processing unit 610 may encode the reference image together with ancillary data. For example, the ancillary data may be closed captioning data. Caption broadcast data can be included in the reference video in compliance with ATSC A/53 Part 4 of [3]. Caption broadcast commands (e.g., parallax information) supporting the z-axis position of the caption window can comply with CEA-708.1 of [10]. Caption broadcast commands may be carried in cc_data() specified in Section 6.2.3.1 of ATSC A/53 Part 4 of [3].

The format of the encoded reference image may be an elementary stream (ES).

In step 720, the first processing unit 610 may multiplex the encoded reference image. The first processing unit 610 may multiplex the encoded reference image together with the encoded audio.

In one aspect, the first processing unit 610 can multiplex the encoded reference image into a transport stream (TS).

The first processing unit 610 may multiplex the encoded reference image using a multiplexer (MuX). For example, multiplexing may be program multiplexing.

The multiplexing of the reference image can comply with ATSC A/53 Part 3 of [2].

According to an aspect, the first processing unit 610 can multiplex the encoded reference image into the TS based on the encoded reference image and the metadata. The metadata may include information used to generate the 3D image.

The metadata may include Program Specific Information (PSI).

The PSI may include a PAT (Program Association Table) holding the list of programs in a channel, a CAT (Conditional Access Table) containing access control information such as scrambling information, a PMT (Program Map Table) describing each program, and an NIT (Network Information Table) containing information on the network used to transmit the MPEG information.

The PMT can provide program_number and information about each program present in the TS. The PMT can list the elementary streams that make up the MPEG-2 program. The PMT may provide location information for optional descriptors for each elementary stream as well as for optional descriptors that describe a complete MPEG-2 stream. Each elementary stream can be identified by a stream_type value.

For example, the reference image may be signaled using the stream_type value 0x02. As another example, the image frame synchronization metadata of the SCHCBB, which will be described below, can be signaled using the stream_type value 0x06.
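As a sketch of how a receiver might act on these stream_type values, the following assumes the PMT's elementary-stream loop has already been parsed into (stream_type, PID) tuples; the parsed shape and helper name are assumptions, not part of the specification.

```python
# stream_type values named in the description.
STREAM_TYPE_MPEG2_VIDEO = 0x02  # reference (base view) image
STREAM_TYPE_PES_PRIVATE = 0x06  # SCHCBB image frame synchronization metadata

def find_schcbb_pids(pmt_es_loop):
    """Scan a parsed PMT elementary-stream loop, given as a list of
    (stream_type, elementary_PID) tuples, and return the PIDs of the
    reference video stream(s) and the synchronization metadata stream(s)."""
    video_pids = [pid for st, pid in pmt_es_loop if st == STREAM_TYPE_MPEG2_VIDEO]
    metadata_pids = [pid for st, pid in pmt_es_loop if st == STREAM_TYPE_PES_PRIVATE]
    return video_pids, metadata_pids
```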

The stereoscopic_program_info_descriptor() and stereoscopic_video_info_descriptor() specified in Reference [9] can be used to signal a program provided via SCHCBB (hereinafter, an SCHCBB program). In the following, a name written in the form a_b_c() or a_b_c denotes a syntax element or field of the relevant specification.

The stereoscopic_program_info_descriptor () specified in reference [9] can be present in the loop after the program_info_length field of the PMT to inform the receiver of the existence of the SCHCBB program. For SCHCBB, stereoscopic_service_type can be specified as '011'.

stereoscopic_service_type is described with reference to [Table 2] below.

[Table 2]
Value | Description
000 | Not specified
001 | 2D (monoscopic) service
010 | Frame-compatible stereoscopic 3D service
011 | Service-compatible stereoscopic 3D service
100-111 | Reserved (Rec. ITU-T H.222.0 | ISO/IEC 13818-1)
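A minimal lookup over [Table 2] can make the value assignments concrete; the helper name is illustrative only.

```python
# stereoscopic_service_type values from [Table 2].
STEREOSCOPIC_SERVICE_TYPES = {
    0b000: "Not specified",
    0b001: "2D (monoscopic) service",
    0b010: "Frame-compatible stereoscopic 3D service",
    0b011: "Service-compatible stereoscopic 3D service",
}

def describe_service_type(value):
    """Map a 3-bit stereoscopic_service_type value to its [Table 2] meaning."""
    if 0b100 <= value <= 0b111:
        return "Reserved"
    return STEREOSCOPIC_SERVICE_TYPES.get(value, "Invalid")
```

For SCHCBB, as noted above, the value '011' (service-compatible stereoscopic 3D service) would apply.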

The stereoscopic_service_type may be designated as '001' to indicate that the reference video stream and the additional video stream of the SCHCBB program transmit the same video (program or content).

The stereoscopic_video_info_descriptor() specified in Reference [9] can be present in the loop following the ES_info_length field of the PMT to identify the view elements of the SCHCBB program (i.e., the reference video stream and the additional video stream).

The values of horizontal_upsampling_factor and vertical_upsampling_factor may be used to signal the upsampling factor of the additional image.

The reference image may include a first PTS (Presentation Time Stamp) indicating the reproduction time of the reference image. Playback information for the 3D image may be provided based on the first PTS.

The additional image may include a second PTS indicating the reproduction time of the additional image. The first PTS and the second PTS may be used for synchronization of the reference image and the additional image.

If the first PTS and the second PTS are different, data may be required to pair the frames of the reference image and the additional image. The frames of the reference image and the additional image are paired so that the 3D image can be synchronized.
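As a rough illustration of this synchronization, the sketch below pairs reference and additional frames by matching PTS values, which in MPEG-2 systems are expressed in 90 kHz clock ticks. The (pts, payload) frame representation and function name are assumptions for illustration, not part of the specification.

```python
def pair_frames(reference_frames, additional_frames, tolerance=0):
    """Pair reference and additional frames whose PTS values match within
    `tolerance` ticks of the 90 kHz clock; unmatched reference frames are
    skipped. Frames are given as (pts, payload) tuples."""
    by_pts = {pts: payload for pts, payload in additional_frames}
    pairs = []
    for pts, ref_payload in reference_frames:
        if tolerance == 0:
            add_payload = by_pts.get(pts)
        else:
            add_payload = next(
                (p for a_pts, p in additional_frames if abs(a_pts - pts) <= tolerance),
                None,
            )
        if add_payload is not None:
            pairs.append((pts, ref_payload, add_payload))
    return pairs
```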

The metadata may include pairing information for synchronizing the reference image and the additional image. For example, metadata may include data for pairing frames of a reference image and an additional image in media_pairing_information (). The media_pairing_information () can be multiplexed with a reference image by being loaded in a PES (Packetized Elementary Stream) packet.

When the additional image is transmitted by streaming, media_pairing_information () for the additional image may be multiplexed with the reference image.

The PES packet will be described with reference to Table 3 below.

Field                      Value / Description
stream_id                  0xBD
data_alignment_indicator   '1' (Note: '1' means that the start of media_pairing_information () coincides with the start of the PES payload)
PES_packet_data_byte       Contiguous bytes of data from the elementary stream indicated by the stream_id or PID (Program IDentifier) of the packet
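For illustration, a minimal PES packet satisfying the constraints above could be assembled as in the Python sketch below. This is a non-normative example: it assumes the simplest optional PES header (no PTS/DTS), and only fixes the values that [Table 3] constrains (stream_id 0xBD, data_alignment_indicator '1').

```python
def build_private_pes(payload: bytes) -> bytes:
    """Wrap media_pairing_information() bytes in a minimal PES packet."""
    pes_packet_length = 3 + len(payload)   # flag bytes (2) + header length byte (1) + payload
    return (b"\x00\x00\x01"                # packet_start_code_prefix
            + b"\xBD"                      # stream_id: private_stream_1
            + pes_packet_length.to_bytes(2, "big")
            + bytes([0x84])                # '10' marker bits, data_alignment_indicator = 1
            + bytes([0x00])                # no PTS/DTS or other optional fields
            + bytes([0x00])                # PES_header_data_length = 0
            + payload)
```

Because PES_header_data_length is 0, the payload begins immediately after the 9-byte header, matching the alignment requirement of [Table 3].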

The PES_data_field () for PES_packet_data_byte is described in detail in Table 4 below.

Syntax                           Number of bits   Format
PES_data_field () {
    data_identifier                     8         uimsbf
    media_pairing_information ()       var
}

uimsbf denotes 'unsigned integer, most significant bit first', that is, an unsigned integer transmitted with its most significant bit first.

The data_identifier field in Table 4 may identify the private stream PES for the SCHCBB. For example, the value of the field of data_identifier may be 0x33.

The media_pairing_information () in Table 4 can provide media pairing information used for synchronization of the reference image and the additional image.

The syntax for media_pairing_information () is described with reference to Table 5 below.

Syntax                                                    Number of bits   Format
media_pairing_information () {
    referenced_media_filename_length                             8         uimsbf
    for (i = 0; i < referenced_media_filename_length; i++) {
        referenced_media_filename_byte                           8         uimsbf
    }
    reserved                                                     7         uimsbf
    frame_number                                                25         uimsbf
}

The referenced_media_filename_length field in [Table 5] can provide the length, in bytes, of the referenced_media_filename_byte field.

The referenced_media_filename_byte field in [Table 5] may provide a Uniform Resource Identifier (URI) of the referenced media. For example, the referenced media may be an additional video stream or an additional video file. That is, the pairing information may include a URI at which the additional image is provided.

By identifying the URI at which the additional image is provided, the receiver can obtain the additional image through the URI.

The frame_number field in [Table 5] may indicate the frame number of the streams of the SCHCBB. The frame number may start at 0 and increase sequentially.
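The byte layout of [Table 5] can be decoded with a short routine. The Python sketch below is a non-normative illustration; it assumes the filename bytes are ASCII-encoded, and the field names follow [Table 5].

```python
def parse_media_pairing_information(buf: bytes):
    """Decode media_pairing_information() as laid out in [Table 5]."""
    n = buf[0]                                  # referenced_media_filename_length (8 bits)
    uri = buf[1:1 + n].decode("ascii")          # referenced_media_filename_byte * n
    # 7 reserved bits followed by the 25-bit frame_number, packed into 4 bytes
    tail = int.from_bytes(buf[1 + n:1 + n + 4], "big")
    frame_number = tail & 0x1FFFFFF             # keep the low 25 bits
    return uri, frame_number
```

A receiver could then match frame_number against the corresponding frame of the additional stream named by the URI to pair the two views.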

The metadata may include the referenced media information.

The referenced media information may be listed in referenced_media_information (). The referenced_media_information () may provide access, synchronization and playback information of the additional image associated with the reference image. referenced_media_information () may be provided with stream_type 0x05 specified in reference [9] for additional video stream information.

The structure of private_section () containing referenced_media_information () is described in detail in Table 6 below. The restrictions of private_section () are described in detail in Table 7 below.

Syntax                                     Number of bits   Format / Value
private_section () {
    table_id                                      8         0x41
    section_syntax_indicator                      1         bslbf
    private_indicator                             1         bslbf
    reserved                                      2         bslbf
    private_section_length                       12         uimsbf
    if (section_syntax_indicator == '0') {
        for (i = 0; i < N; i++) {
            private_data_byte                     8         bslbf
        }
    }
}


bslbf denotes 'bit string, left bit first', that is, a bit string transmitted with its leftmost bit first.

Field                      Restriction
table_id                   0x41 (user private)
section_syntax_indicator   '0' (referenced_media_information () follows private_section_length)
private_indicator          '1'
private_data_byte          referenced_media_information () of [Table 8] below

Syntax                                                            Number of bits   Format
referenced_media_information () {
    version_number                                                       8         uimsbf
    num_hybrid_service_programs                                          8         uimsbf
    for (i = 0; i < num_hybrid_service_programs; i++) {
        hybrid_delivery_protocol                                         2         uimsbf
        additionalview_availability_indicator                            2         bslbf
        hybrid_service_sync_type                                         2         uimsbf
        reserved                                                         2         '11'
        num_reference_media_files                                        8         uimsbf
        for (i = 0; i < num_reference_media_files; i++) {
            referenced_media_play_start_time                            32         uimsbf
            referenced_media_expiration_time                            32         uimsbf
            referenced_media_filesize                                   32         uimsbf
            referenced_media_type                                        4         uimsbf
            referenced_media_codec_info                                  4         uimsbf
            referenced_media_files_URI_length                            8         uimsbf
            for (i = 0; i < referenced_media_files_URI_length; i++) {
                referenced_media_files_URI_byte                          8         uimsbf
            }
        }
    }
}


The field of version_number in [Table 8] can provide the version number of referenced_media_information (). The value of the field version_number may increase monotonically as the information of referenced_media_information () changes. For example, the value of the field of version_number may be incremented by one.

The num_hybrid_service_programs field in Table 8 can represent the number of SCHCBB programs provided.

The field of hybrid_delivery_protocol in Table 8 can provide the transmission protocol of the SCHCBB additional image as defined in [Table 9] below.

value          Description
0x00           Forbidden
0x01           DASH
0x02           HTTP without DASH
0x03           FTP
0x04 to 0xFF   Reserved for future use

The type of method by which additional images are sent to the receiver can be provided by hybrid_delivery_protocol. For example, the additional image may be provided to the receiver using a Dynamic Adaptive Streaming over Http (DASH) protocol, an HTTP protocol that does not use DASH, or an FTP protocol.

The values of the fields of additionalview_availability_indicator in [Table 8] are described in detail in Table 10 below.

value   Description
00      Additional video available by streaming at the program start time
01      Additional image available by download and streaming before the program start time (partial download)
10      All additional images downloaded before the program start time (full download)
11      Reserved for future use

The additionalview_availability_indicator may indicate the availability of the additional image. The value '00' of the additionalview_availability_indicator field may indicate that the additional image is provided at the same time as the reference image. If there is a delay in the network transmitting the additional image, the reference image may need to be buffered in the receiver to be synchronized with the additional image.

The value '01' of the additionalview_availability_indicator field may indicate that the additional image is provided before the reference image. The value '01' may indicate that part of the data of the additional image is transmitted to the receiver before the start time of the program (for example, the 3D image), but that the entire data of the additional image is not downloaded before the program start time. The partially downloaded data of the additional image may be buffered in the receiver to be synchronized with the reference image.

The value '10' of the field of additionalview_availability_indicator may indicate that the entire data of the additional image is downloaded to the receiver to be synchronized with the reference image before the program start time.
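The three usable indicator values can be mirrored in receiver logic roughly as follows. This is a hypothetical Python sketch; the enum and function names are illustrative, not taken from the signaling itself.

```python
from enum import IntEnum

class AdditionalViewAvailability(IntEnum):
    """Usable values of additionalview_availability_indicator ([Table 10])."""
    STREAMING_AT_START = 0b00   # additional view streamed from program start
    PARTIAL_DOWNLOAD   = 0b01   # part downloaded beforehand, rest streamed
    FULL_DOWNLOAD      = 0b10   # whole additional view downloaded beforehand

def must_buffer_reference(indicator: int, network_delayed: bool) -> bool:
    # With '00', a delayed broadband path forces the receiver to buffer the
    # reference image until the additional image catches up.
    return (indicator == AdditionalViewAvailability.STREAMING_AT_START
            and network_delayed)
```

With '01' or '10' the already-downloaded portion of the additional view absorbs the broadband delay instead, so no such buffering decision is needed.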

The values of the fields of the hybrid_service_sync_type in Table 8 are described in Table 11 below.

value   Description
00      Forbidden
01      SMPTE_timecode in the ES
10      media_pairing_information in a PES packet
11      Reserved for future use

The field of hybrid_service_sync_type may indicate information about the synchronization method of the reference image and the additional image.

Synchronization of the reference image and the additional image can be performed at the ES level or the PES level. For ES level synchronization, the value of the hybrid_service_sync_type field may be '01'. For PES level synchronization, the value of the hybrid_service_sync_type field may be '10'. In the case of PES level synchronization, the media_pairing_information field described above can be used. In the case of ES level synchronization, SMPTE_timecode information may be present in the Group Of Pictures (GOP) header of MPEG-2 and the Picture timing Supplemental Enhancement Information (SEI) of Advanced Video Coding (AVC).

SMPTE_timecode will be described in detail with reference to Table 12 below.

Time_code field        Range of values   Number of bits   Format
Drop_frame_flag              -                  1         bslbf
Time_code_hours           0 to 23               5         uimsbf
Time_code_minutes         0 to 59               6         uimsbf
Marker_bit                   1                  1         bslbf
Time_code_seconds         0 to 59               6         uimsbf
Time_code_pictures        0 to 59               6         uimsbf

For example, Drop_frame_flag may be set to '1' only in the case of NTSC (National Television System Committee).
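For ES level synchronization the receiver ultimately needs a linear frame index from the timecode fields of [Table 12]. The Python sketch below is a non-normative illustration; the 29.97 Hz drop-frame arithmetic is standard SMPTE practice rather than something specified in this document.

```python
def smpte_to_frame_index(hours, minutes, seconds, pictures,
                         drop_frame=False, fps=30):
    """Convert the SMPTE_timecode fields of [Table 12] to a frame count."""
    frames = (3600 * hours + 60 * minutes + seconds) * fps + pictures
    if drop_frame:
        # NTSC drop-frame: 2 frame numbers are skipped every minute,
        # except for every tenth minute.
        total_minutes = 60 * hours + minutes
        frames -= 2 * (total_minutes - total_minutes // 10)
    return frames
```

Two streams whose timecodes map to the same frame index can then be paired for 3D formatting.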

The num_reference_media_files field in Table 8 may indicate the number of reference media files constituting the additional image. In the case of the streaming service, since only one MPD (Media Presentation Description) file exists, this field can be designated as '1'.

The referenced_media_play_start_time field in Table 8 may indicate the start time of the additional video stream in the case of a streaming service.

In the case of the streaming service, the value of the referenced_media_play_start_time field may indicate the start time of providing the MPD file. The start time of the MPD file can be the same as the start time of the program.

In the case of the download service, the value of the referenced_media_play_start_time field may indicate the start time of each referenced media file constituting the additional image. The value of the referenced_media_play_start_time field may be provided in UTC.

The referenced_media_expiration_time field in [Table 8] can indicate the expiration time of each referenced media file. In the case of a streaming service, the value of the referenced_media_expiration_time field may match the end time of the program. The value of the referenced_media_expiration_time field may be provided in UTC.

The referenced_media_filesize field in Table 8 can indicate the size of each referenced media file in bytes. In the case of a streaming service, the value of the referenced_media_filesize field may be '0'.

The field of referenced_media_type in Table 8 may indicate the type of additional video file (e.g., MP4 or ISOBMFF) or stream (e.g., MPEG-2 TS).

The fields of referenced_media_type are described in detail in Table 13 below.

value   Description
00      Reserved
01      MPEG-2 TS
10      ISOBMFF
11      MP4

The referenced_media_codec_info field in Table 8 can provide the codec information of the additional video.

The fields of referenced_media_codec_info are described in detail in Table 14 below.

value      Description
00         AVC/H.264 Main Profile @ Level 4.0
01         AVC/H.264 High Profile @ Level 4.0
10 to 11   Reserved for future use

The referenced_media_files_URI_length field in Table 8 can provide the URI length of the reference media or MPD file.

The referenced_media_files_URI_byte field in [Table 8] can provide the URI information of the reference media or MPD file.
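To make the bit layout of [Table 8] concrete, the per-program header fields (2 + 2 + 2 bits, 2 reserved bits fixed to '11', then the 8-bit file count) can be packed as below. This is an illustrative, non-normative Python sketch.

```python
def pack_program_header(hybrid_delivery_protocol: int,
                        additionalview_availability_indicator: int,
                        hybrid_service_sync_type: int,
                        num_reference_media_files: int) -> bytes:
    """Pack the per-program header of referenced_media_information() ([Table 8])."""
    first = ((hybrid_delivery_protocol << 6)
             | (additionalview_availability_indicator << 4)
             | (hybrid_service_sync_type << 2)
             | 0b11)                      # reserved bits, fixed to '11'
    return bytes([first, num_reference_media_files])
```

For example, DASH delivery ('01' per [Table 9]), full download ('10' per [Table 10]), PES level synchronization ('10' per [Table 11]) and a single MPD file would pack to the two bytes 0x6B 0x01.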

In step 730, the first processing unit 610 can perform channel multiplexing using the multiplexed reference image and the channel multiplexing information (channel MUX info). The first processing unit 610 can generate a channel-multiplexed TS by the channel multiplexing. For example, the channel multiplexing information may include PSIP (Program and System Information Protocol) data.

According to one aspect, the first processing unit 610 can channel multiplex the reference image using the PSIP data.

The PSIP data may include an STT (System Time Table) indicating current time information, an MGT (Master Guide Table) serving as a pointer to other PSIP tables, a VCT (Virtual Channel Table) allocating a number to each channel, an RRT (Rating Region Table) indicating program ratings, an EIT (Event Information Table) providing program titles and guide data, and an ETT (Extended Text Table) providing detailed information on channels and programs.

The VCT may be a TVCT (Terrestrial VCT). The TVCT may include information about the various channels delivered in the physical terrestrial broadcast channel.

The virtual channel providing the SCHCBB may be identified by the service_type specified as 0x09 in the TVCT. The following descriptors may be described in the descriptor loop following the descriptors_length field of terrestrial_virtual_channel_table_section () or cable_virtual_channel_table_section (). The following descriptors may be at least one of the Service Location Descriptor and the Parameterized Service Descriptor (PSD) of reference [6].

The PSD is described in detail in [Table 15] to [Table 17] below. The PSD may be a field of parameterized_service_descriptor ().

Parameterized_service_descriptor () may be provided in a virtual channel with the service_type value 0x09 to convey specific information that the receiver can use to determine whether it can create a meaningful presentation of the services in the channel. The PSD can carry a payload. The payload syntax and semantics may be application-specific. A field called application_tag can identify the application to which the payload applies.

Syntax                                 Number of bits   Format
parameterized_service_descriptor () {
    descriptor_tag                            8         uimsbf
    descriptor_length                         8         uimsbf
    application_tag                           8         uimsbf
    application_data ()                      var        bslbf
}

The unsigned 8-bit integer of the field of descriptor_tag in Table 15 may have a value of 0x8D identifying that the descriptor is parameterized_service_descriptor ().

The unsigned 8-bit integer of the descriptor_length field in Table 15 may specify the length, in bytes, of the portion of the descriptor immediately following this field up to the end of the descriptor. The maximum length may be 255.

An unsigned 8-bit integer in the field of application_tag in Table 15 can identify the application associated with application_data () below. The values of application_tag may be specified in the present invention and other ATSC standards.

The syntax and semantics of the fields of application_data () in [Table 15] can be specified in the ATSC standard established in association with the application_tag value.

The parameterized_service_descriptor () defined above can be used for delivery of specific parameters to a specific application. For channels containing 3D content, the value of application_tag may be 0x01. The application_data () for the application_tag value 0x01 is described in detail in Table 16 below.

Syntax                         Number of bits   Format
application_data (0x01) {
    reserved                          3         uimsbf
    3D_channel_type                   5         uimsbf
    for (i = 0; i < N; i++) {
        reserved                      8         bslbf
    }
}

The 5-bit unsigned integer of the field of 3D_channel_type in Table 16 can indicate the type of 3D service carried in the virtual channel associated with parameterized_service_descriptor (). The coding for 3D_channel_type is described in Table 17 below. SCHCBB can use 0x04. The SCHCNRT to be described later can use 0x05.

3D_channel_type   Description
0x00              Frame-compatible stereoscopic 3D service - side by side
0x01              Frame-compatible stereoscopic 3D service - top and bottom
0x02              Reserved
0x03              Full-frame stereoscopic 3D service - reference and additional video streams; additional view in-band
0x04              Full-frame stereoscopic 3D service - broadcast and broadband hybrid
0x05              Full-frame stereoscopic 3D service - broadcast and NRT
0x06 - 0x1F       Reserved
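Since application_data (0x01) in [Table 16] is three reserved bits followed by the 5-bit 3D_channel_type, a receiver can extract and classify the service type with a simple mask. The Python sketch below is a hypothetical illustration.

```python
SCHCBB_TYPE = 0x04   # broadcast and broadband hybrid ([Table 17])
SCHCNRT_TYPE = 0x05  # broadcast and NRT ([Table 17])

def parse_3d_channel_type(application_data: bytes) -> int:
    """Return the 5-bit 3D_channel_type from application_data(0x01) ([Table 16])."""
    return application_data[0] & 0x1F    # drop the 3 leading reserved bits
```

A receiver would compare the result against 0x04 or 0x05 to decide whether the additional view arrives over broadband or over NRT.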

Table 18 below is an example of TVCT for SCHCBB.

TVCT
...
for (i <num_channels_in_section) {

major_channel_number = 0x003
minor_channel_number = 0x002

program_number = 0x0002

service_type = 0x09 (extended parameterized service)

service_location_descriptor ()
parameterized_service_descriptor ()

}

The field of service_location_descriptor () in Table 18 may indicate the PID of the supplemental image elementary stream of the SCHCBB. The parameterized_service_descriptor () having the application_tag 0x01 value can provide the information of the 3D service type transmitted to the receiver. This information can facilitate the operation of a 3DTV receiver that reproduces stereoscopic 3D images.

The stereoscopic_program_info_descriptor () specified in reference [9] can be located in the 3D event loop of the EIT to indicate that a future event is 3D.

The 3D event loop in the EIT is shown in Table 19 below.

EIT
...
for (j <num_events_in_section) {
event_id
start_time

length_in_seconds

stereoscopic_program_info_descriptor ()
linkage_info_descriptor ()

}

The linkage_info_descriptor () in Table 19 can be located after the stereoscopic_program_info_descriptor () in the EIT to provide information about reference media files.

linkage_info_descriptor () will be described with reference to Table 20 below.

Syntax                                                        Number of bits   Format
linkage_info_descriptor () {
    descriptor_tag                                                   8         uimsbf
    descriptor_length                                                8         uimsbf
    hybrid_delivery_protocol                                         2         uimsbf
    additionalview_availability_indicator                            1         bslbf
    reserved                                                         5         '11111'
    num_referenced_media_files                                       8         uimsbf
    for (i = 0; i < num_referenced_media_files; i++) {
        referenced_media_files_URI_length                            8         uimsbf
        for (i = 0; i < referenced_media_files_URI_length; i++) {
            referenced_media_files_URI_byte                          8         uimsbf
        }
    }
}

The field of descriptor_tag in Table 20 may be 8 bits. The field of descriptor_tag can identify each descriptor.

The descriptor_length field in Table 20 may be 8 bits. The descriptor_length field may specify the length, in bytes, of the descriptor data immediately following the descriptor_length field.

The field of hybrid_delivery_protocol in Table 20 can provide the type of SCHCBB defined in [Table 9] above.

The field of additionalview_availability_indicator in [Table 20] can provide the availability of additional images defined in [Table 10].

The num_referenced_media_files field in Table 20 may provide the number of reference media files or the number of MPD files.

The referenced_media_files_URI_length field in Table 20 can provide the length of the URI of the reference media file or MPD file in bytes.

The referenced_media_files_URI_byte field in [Table 20] can provide the URI information of each reference media file or MPD file.
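Putting the [Table 20] fields together, the descriptor body can be walked as in the Python sketch below. It is non-normative: ASCII URIs and the example descriptor_tag value used in the usage note are assumptions.

```python
def parse_linkage_info_descriptor(buf: bytes) -> dict:
    """Decode linkage_info_descriptor() as laid out in [Table 20]."""
    tag, length = buf[0], buf[1]
    b = buf[2]
    protocol = b >> 6                 # hybrid_delivery_protocol (2 bits)
    available = (b >> 5) & 0x1        # additionalview_availability_indicator
    num_files = buf[3]                # num_referenced_media_files
    uris, pos = [], 4
    for _ in range(num_files):
        n = buf[pos]                  # referenced_media_files_URI_length
        uris.append(buf[pos + 1:pos + 1 + n].decode("ascii"))
        pos += 1 + n
    return {"descriptor_tag": tag, "descriptor_length": length,
            "hybrid_delivery_protocol": protocol,
            "additionalview_availability_indicator": available,
            "referenced_media_files_URI": uris}
```

For instance, a body announcing one DASH MPD would carry protocol '01' and a single URI entry.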

2D/3D boundary signaling can comply with Section 4.6.3 of ATSC A/104 Part 2 of reference [15].

According to one aspect, channel multiplexing can comply with ATSC A / 53 Part 3 of [2].

In step 740, the first transmitter 620 may transmit the reference image to the receiver over the first network. For example, the first transmitter 620 may transmit the reference image multiplexed to the TS to the receiver through the first network.

According to one aspect, the first network may be a terrestrial network. The terrestrial network can be an ATSC terrestrial network. The first network can comply with the ATSC A / 53 scheme.

Step 740 is described in detail below with reference to FIG.

In step 750, the second processing unit 630 may encode an additional image of the 3D image. The encoding of the additional image may be compression of the additional image. The storage unit 650 may store additional images. For example, the storage unit 650 may be the 3D content server of FIG. 2, the additional video server of FIG. 3, or the additional video server of FIG.

For example, the second processing unit 630 can encode the additional image according to AVC/H.264 Main Profile @ Level 4.0 or High Profile @ Level 4.0 of reference [8]. The second processing unit 630 can encode the additional image in accordance with ATSC A/53 Part 4 and ATSC A/72 Part 1.

The encoding format of the encoded additional image may be one of the formats shown in [Table 1] above.

For example, the second processing unit 630 may generate an additional video stream for streaming by encoding the additional video. The additional video stream may be a TS.

As another example, the second processing unit 630 may generate an additional image file for downloading by encoding the additional image. The additional video file may be in the format of MP4, ISOBMFF or MPEG-2 TS.

According to an aspect, an encoding format of a reference image and an encoding format of an additional image for providing a 3D image may be the same.

According to one aspect, the additional image may not be multiplexed since only the image is transmitted.

In step 760, the second processing unit 630 may format the encoded additional image.

Regardless of the protocol of the broadband network used for the transmission of the additional video, the additional video can be formatted into the MPEG-2 TS with the restrictions defined in ATSC A/53, or into the file formats of reference [13] or reference [14].

For example, the second processing unit 630 can format the additional image by converting the encoded additional image into the MPEG-2 or MPEG-4 format.

When formatted into an MPEG-2 TS, the additional image of SCHCBB can be signaled using the stream_type value 0x23 as defined in reference [9]. The referencing information of the additional image can be signaled using stream_type 0x05 defined in [9].

In step 770, the second transmitter 640 may transmit the encoded additional image to the receiver over the second network.

In one aspect, the second transmitting unit 640 may transmit the additional video stream to the receiver through the second network. When transmitting an additional video stream, the second network may be a broadband network. When the second transmitting unit 640 transmits the additional video stream, the reference video and the additional video are transmitted to the receiver in real time, so that the 3D video can be provided in real time.

In another aspect, the second transmitting unit 640 may transmit the additional image file to the receiver through the second network. When transmitting the additional image file, the second network may be a broadband network. When the second transmitting unit 640 transmits the additional image file, the 3D image may be provided to the receiver in a non-real time manner.

For example, the second transmitter 640 may transmit the entire data of the additional image to the receiver before the data of the reference image is transmitted to the receiver. In this case, the 3D image can be provided in non-real time.

As another example, the second transmitter 640 may transmit some data of the additional image to the receiver before the data of the reference image is transmitted to the receiver. The second transmitter 640 may transmit the remaining data of the additional image to the receiver when the reference image is transmitted to the receiver. In this case, the 3D image can be provided in non-real time.

The broadcasting apparatus 600 may transmit the 3D image to the receiver by transmitting the reference image and the additional image to the receiver in steps 740 and 770. The receiver can generate the 3D image using the received reference image and additional image.

If the encoded additional image is formatted, the second transmitter 640 can transmit the formatted additional image to the receiver.

The technical contents described above with reference to Figs. 1 to 6 can be applied as they are, so that a more detailed description will be omitted below.

FIG. 8 shows a flow diagram of a method for transmitting a reference image to a receiver according to an example.

The above-described step 740 may include the following steps 810 to 820.

In step 810, the first transmitter 620 may transport the multiplexed reference image.

In step 820, the first transmitter 620 may modulate the transported reference image.

The first transmitter 620 may transmit the modulated reference image to the receiver. For example, the first transmitter 620 may transmit the modulated reference TS to the receiver.

The technical contents described with reference to Figs. 1 to 7 can be applied as they are, so that a more detailed description will be omitted below.

FIG. 9 illustrates a system for providing a three-dimensional image according to an example.

The system of FIG. 9 may provide the first scenario 200 to the third scenario 400 described above.

The 3D content server of FIG. 9 may correspond to the storage unit 650.

The first processing unit 610 may include an encoder for encoding audio, an encoder for encoding the reference image, a program multiplexer, and a channel multiplexer.

The first transmitter 620 can transport and modulate the TS generated by the first processor 610. The first transmitter 620 may transmit the transported and modulated TS to the receiver.

The second processing unit 630 may include at least one of an encoder (AVC) used for the streaming service and an encoder (AVC) used for the download service.

The second transmitting unit 640 may include at least one of a streaming web server and a download server.

In the case of the SCHCBB streaming method of the first scenario 200, the second processing unit 630 can encode the additional video using the encoder (AVC), and the second transmitting unit 640 can transmit the additional video stream to the receiver using the streaming web server.

In the case of the SCHCBB download method of the second scenario 300 and the third scenario 400, the second processing unit 630 can encode the additional image using the encoder (AVC), and the second transmitting unit 640 can transmit the additional image file to the receiver using the download server.

The technical contents described with reference to Figs. 1 to 8 can be applied as they are, so that a more detailed description will be omitted below.

FIGS. 10A to 10D show a flowchart of a method in which a receiver that receives an additional image through DASH outputs a three-dimensional image.

The following steps 1002 to 1034 may be a method by which a receiver generates a 3D image using the reference image and the additional image provided by the first scenario 200.

In step 1002, the receiver may obtain a PAT using a PSI parser. The receiver can obtain the PMT_ID by parsing the PAT.

In step 1004, the receiver may use the PSI parser to obtain a table with PID = PMT_PID. The receiver can obtain the SCHCBB scheme provided by stereoscopic_program_info_descriptor () and stereoscopic_video_info_descriptor () by parsing the PMT. The receiver can obtain the PID 1010 of the tables that carry the frame synchronization information (0x05 - referenced_media_information () and 0x06 - media_pairing_information ()) by parsing the PMT.

In Fig. 10B, the receiver can acquire TVCT and EIT using the PSIP parser.

In step 1012, the receiver can identify the virtual channel providing the SCHCBB by parsing the TVCT for the service_type. The receiver can obtain terrestrial_virtual_channel_table_section () by parsing the TVCT. terrestrial_virtual_channel_table_section () can provide the service_location_descriptor and the parameterized_service_descriptor (PSD).

A PID 1020 may be obtained from the service_location_descriptor.

At step 1014, the receiver can obtain the stereoscopic_program_info_descriptor and linkage_info_descriptor by parsing the EIT. linkage_info_descriptor can provide information about reference media files. For example, when an additional image is transmitted using DASH, the value of hybrid_delivery_protocol may be 0x01 indicating DASH.

The following steps 1022-1024 may be performed on the additional image.

In step 1022, the receiver may parse the MPD using the MPD parser. The receiver can request the MPD, together with the MPEG-TS carrying the PAT and PMT, using the information provided by linkage_info_descriptor.

In steps 1024 and 1026, the receiver can obtain information about the additional image by parsing the PAT and the PMT, respectively.

In step 1028, the receiver may request a TS with PID_V2 and a TS with PID 0x06 to obtain media_pairing_information.

The additional image can be decoded through the AVC decoder.

In step 1030, the receiver may provide the decoded supplementary image to a 3D image formatter.

In FIG. 10D, the receiver may receive PID 1010 and PID 1020.

At step 1032, the receiver may process each PID using a TS filter. The receiver can filter the TS with PID = PID_V1 for the reference image. The receiver can filter the TS with PID = PID_PS for the referenced media information. The receiver can filter the TS with PID = PID_PD for the media pairing information. The receiver can filter the TS with PID = PID_A for the AC-3 audio information.
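The PID filtering in step 1032 amounts to selecting 188-byte TS packets by the 13-bit PID in the MPEG-2 TS header. The Python sketch below is a minimal, non-normative illustration (packet layout per ISO/IEC 13818-1; PID constants such as PID_V1 are placeholders defined by the signaling).

```python
def filter_ts_packets(ts: bytes, pid: int):
    """Yield the 188-byte TS packets whose 13-bit PID matches."""
    for i in range(0, len(ts) - 187, 188):
        pkt = ts[i:i + 188]
        if pkt[0] != 0x47:                        # sync_byte check
            continue
        if (((pkt[1] & 0x1F) << 8) | pkt[2]) == pid:
            yield pkt
```

Each filtered stream is then handed to the appropriate decoder (MPEG-2 video, AC-3 audio, or the private-section parsers described above).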

In step 1034, the receiver may generate the 3D image using the 3D image formatter on the decoded reference image and the decoded supplementary image through the MPEG-2 decoders.

The technical contents described with reference to FIGS. 1 to 9 can be applied as they are, so that a more detailed description will be omitted below.

FIGS. 11A to 11D show a flowchart of a method of a receiver receiving an additional image through download and outputting a three-dimensional image.

The following steps 1102 to 1134 may be a method by which the receiver generates a 3D image using the reference image and the additional image provided by the second scenario 300 or the third scenario 400.

Steps 1102 and 1104 may correspond to steps 1002 and 1004 described above, respectively. That is, the description of step 1102 and step 1104 may be replaced with the description of step 1002 and step 1004, respectively.

Step 1112 may correspond to step 1012 described above. That is, the description of step 1112 may be replaced with the description of step 1012.

In step 1114, the receiver may obtain the stereoscopic_program_info_descriptor and linkage_info_descriptor by parsing the EIT. linkage_info_descriptor can provide information about reference media files. For example, when the additional image is transmitted using the download method, the value of hybrid_delivery_protocol may be 0x02 or 0x03 indicating the download method. In the case of the second scenario 300, the value of hybrid_delivery_protocol is 0x02. In the case of the third scenario 400, the value of hybrid_delivery_protocol may be 0x03.

The following step 1122 may be performed on the additional image.

In step 1122, the receiver may decode the additional image. The decoded additional image can be transmitted to the 3D image formatter.

In FIG. 11D, the receiver may receive PID 1110 and PID 1120.

Steps 1132 through 1134 may correspond to steps 1032 through 1034 described above. That is, the description of steps 1132 through 1134 may be replaced with the description of steps 1032 through 1034.

The technical contents described above with reference to Figs. 1 to 10 can be applied as they are, so that a more detailed description will be omitted below.

FIG. 12 illustrates a flow diagram of a method for providing a three-dimensional image in accordance with one embodiment.

The following steps 1210 to 1270 may perform the fourth scenario 500 described above. That is, steps 1210 through 1270 may provide SCHCNRT.

Step 1210 may correspond to step 710 described above. That is, the description of step 1210 may be replaced with the description of step 710.

Step 1220 may correspond to step 720 described above. That is, the description of step 1220 may be replaced with the description of step 720.

For example, in the case of SCHCNRT, the value of the stream_type of the reference image may be 0x02, and the value of the stream_type of the additional image may be 0x23. The reference image and the additional image can be signaled using the value of each stream_type.

As another example, in the case of SCHCNRT, the additional image may be transmitted in the DSM-CC addressable sections specified in reference [16].

As another example, in the case of SCHCNRT, the video frame synchronization information (media_pairing_information ()) can be signaled using the stream_type value 0x06 specified in reference [9].

Step 1230 may correspond to step 730 described above. That is, the description of step 1230 may be replaced with the description of step 730.

In step 1230, the value of hybrid_delivery_protocol of the linkage_info_descriptor for SCHCNRT may be set to 0x00.

The value of additionalview_availability_indicator of the linkage_info_descriptor for SCHCNRT may be set to '10'.

The signaling at 2D and 3D boundaries can comply with Section 4.6.3 of ATSC A/104 Part 2 of reference [15].

In step 1240, the first transmitter 620 may transmit the reference image multiplexed in the TS to the receiver through the first network in real time.

Step 1240 may correspond to step 740 described above. That is, the description of step 1240 may be replaced with the description of step 740.

Step 1250 may correspond to step 750 described above. That is, the description of step 1250 may be replaced with the description of step 750.

In step 1260, the second processing unit 630 may format the encoded additional image.

The second processing unit 630 can format the encoded additional image as an MPEG-TS file, the file format specified in reference [16]. For example, the second processing unit 630 can format the encoded additional image by converting it into the MPEG-2 or MPEG-4 format.

Step 1260 may correspond to step 760 described above. That is, the description of step 1260 may be replaced with the description of step 760.

In step 1265, the second processing unit 630 may multiplex the encoded or formatted additional images. Multiplexing may be channel multiplexing.

The second processing unit 630 can multiplex the additional image in compliance with ATSC A/103:2012 of reference [16].

In step 1265, the second processing unit 630 may signal the additional image in order to transmit the multiplexed additional image to the receiver. The signaling may use a Service Signaling Channel (SSC) to transmit the Service Map Table (SMT) and the Non-Real-Time Information Table (NRT-IT).

The SMT can provide information about the service providing the 3D image.

The NRT-IT can provide information on the content items constituting the above-mentioned service.

The SSC, SMT, and NRT-IT may comply with ATSC A/103:2012 of reference [16].
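How a receiver might combine the SMT and the NRT-IT can be illustrated with a minimal sketch: the SMT identifies the service that provides the 3D image, and the NRT-IT lists the content items (here, the additional-image file) belonging to that service. The table structures below are simplified stand-ins, not the A/103:2012 formats, and all names and sample values are hypothetical.

```python
# Simplified stand-ins for SMT and NRT-IT entries (not the A/103 formats).
def find_additional_image_items(smt, nrt_it, service_name):
    """Return the NRT-IT content items belonging to the named SMT service."""
    service_ids = [s["service_id"] for s in smt if s["name"] == service_name]
    return [item for item in nrt_it if item["service_id"] in service_ids]

# Illustrative sample tables.
smt = [{"service_id": 1, "name": "3D-service"}]
nrt_it = [{"service_id": 1, "content_item": "additional_view.ts"}]
```

With these sample tables, looking up `"3D-service"` yields the single content item `additional_view.ts`, which the receiver would then download over the NRT channel.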

The additional image can be multiplexed for transmission over a terrestrial network.

In step 1270, the second transmitter 640 may transmit the multiplexed additional image to the receiver on the second network in non-real time. The second network may be an ATSC NRT network.

The technical contents described above with reference to FIGS. 1 to 11 apply here as they are, so a more detailed description is omitted below.

FIG. 13 illustrates a system for providing a three-dimensional image according to an example.

The 3D content server of FIG. 13 may correspond to the storage unit 650.

The first processing unit 610 may include an encoder for encoding audio, an encoder for encoding the reference image, a program multiplexer, and a channel multiplexer.

The first transmitter 620 can transport and modulate the TS generated by the first processor 610. The first transmitter 620 may transmit the transported and modulated TS to the receiver.

The second processing unit 630 may include an encoder (AVC) and an NRT encoder.

In one aspect, the second processing unit 630 may further include a channel multiplexer.

The technical contents described above with reference to FIGS. 1 to 12 apply here as they are, so a more detailed description is omitted below.

FIGS. 14A to 14D show a flow chart of a method in which a receiver receiving an additional image through an ATSC NRT network outputs a three-dimensional image.

The following steps 1402 through 1434 may be a method by which the receiver generates a 3D image using the reference image and the additional image provided by the fourth scenario 500.

Steps 1402 and 1404 may correspond to steps 1002 and 1004 described above, respectively. That is, the descriptions of steps 1402 and 1404 may be replaced with the descriptions of steps 1002 and 1004, respectively.

Step 1412 may correspond to step 1012 described above. That is, the description of step 1412 may be replaced with the description of step 1012.

In step 1412, unlike the SCHCBB case, the service_location_descriptor for SCHCNRT may not be assigned a field for PID_PS.

For SCHCNRT, the value of 3D_channel_type may be 0x05.

For SCHCNRT, the value of hybrid_delivery_protocol may be 0x00.

For SCHCNRT, the value of additionalview_availability_indicator may be 10.

The following steps 1424-1428 may be steps performed on the additional image.

In step 1424, the receiver may parse the PAT obtained via the PSI parser. The PMT_PID can be obtained by parsing the PAT.

In step 1426, the receiver may parse the PMT. SMT and NRT-IT can be obtained by parsing the PMT.

The receiver can download the additional image file via the ATSC NRT network based on the acquired SMT and NRT-IT.
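The PAT parsing in step 1424 can be illustrated with a minimal sketch. Only the 4-byte program loop of a PAT section is decoded here (a 16-bit program_number followed by a 13-bit PID, which is the PMT_PID when program_number is nonzero); section header and CRC handling are omitted, and the function name is hypothetical.

```python
def parse_pat_program_loop(loop_bytes):
    """Decode (program_number, PMT_PID) pairs from a PAT program loop.

    Each entry is 4 bytes: a 16-bit program_number, 3 reserved bits,
    and a 13-bit PID (the PMT_PID when program_number != 0).
    """
    programs = []
    for i in range(0, len(loop_bytes), 4):
        b = loop_bytes[i:i + 4]
        program_number = (b[0] << 8) | b[1]
        pid = ((b[2] & 0x1F) << 8) | b[3]  # mask off the 3 reserved bits
        programs.append((program_number, pid))
    return programs
```

For instance, the loop bytes `00 01 E1 00` decode to program_number 1 with PMT_PID 0x0100; the receiver would then read the PMT on that PID.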

At step 1428, the receiver may download and extract the TS file of the additional image.

The receiver can decode the TS of the extracted additional image.

In step 1430, the receiver may transmit the decoded additional image to the 3D image formatter.
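Before the 3D image formatter combines the two views, decoded reference and additional frames are typically matched by their presentation time stamps (PTS), as the claims below also describe. The following is a hypothetical pairing sketch, not the patent's method: it matches frames whose PTS values are exactly equal, whereas a real receiver would tolerate clock skew and buffering delays.

```python
def pair_frames_by_pts(reference_frames, additional_frames):
    """Pair reference and additional frames whose PTS values match exactly.

    Frames are (pts, payload) tuples; unmatched frames are dropped.
    Simplification: exact PTS equality is required.
    """
    additional_by_pts = {pts: frame for pts, frame in additional_frames}
    pairs = []
    for pts, frame in reference_frames:
        if pts in additional_by_pts:
            pairs.append((pts, frame, additional_by_pts[pts]))
    return pairs
```

Each resulting triple (PTS, reference frame, additional frame) is what the 3D image formatter would consume to produce one stereoscopic output picture.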

In FIG. 14D, the receiver may receive PID 1410 and PID 1420.

Steps 1432 through 1434 may correspond to steps 1032 through 1034 described above. That is, the description of steps 1432 through 1434 may be replaced with the description of steps 1032 through 1034.

The technical contents described above with reference to FIGS. 1 to 13 apply here as they are, so a more detailed description is omitted below.

The apparatus described above may be implemented as a hardware component, a software component, and/or a combination of hardware and software components. For example, the apparatus and components described in the embodiments may be implemented using one or more general-purpose or special-purpose computers, such as a processor, a controller, an arithmetic logic unit (ALU), a digital signal processor, a microcomputer, a field programmable gate array (FPGA), a programmable logic unit (PLU), a microprocessor, or any other device capable of executing and responding to instructions. The processing device may run an operating system (OS) and one or more software applications executed on the operating system. The processing device may also access, store, manipulate, process, and generate data in response to the execution of the software. For ease of understanding, the processing device is sometimes described as being used singly, but those skilled in the art will recognize that the processing device may include a plurality of processing elements and/or a plurality of types of processing elements. For example, the processing device may include a plurality of processors, or one processor and one controller. Other processing configurations, such as parallel processors, are also possible.

The method according to an embodiment may be implemented in the form of program instructions that can be executed through various computer means and recorded on a computer-readable medium. The computer-readable medium may include program instructions, data files, data structures, and the like, alone or in combination. The program instructions recorded on the medium may be specially designed and configured for the embodiments, or may be known and available to those skilled in the art of computer software. Examples of computer-readable media include magnetic media such as hard disks, floppy disks, and magnetic tape; optical media such as CD-ROMs and DVDs; magneto-optical media such as floptical disks; and hardware devices specially configured to store and execute program instructions, such as ROM, RAM, and flash memory. Examples of program instructions include machine code, such as that produced by a compiler, as well as high-level language code that can be executed by a computer using an interpreter. The hardware devices described above may be configured to operate as one or more software modules to perform the operations of the embodiments, and vice versa.

While the present invention has been particularly shown and described with reference to exemplary embodiments thereof, the invention is not limited to the disclosed embodiments. For example, appropriate results may be achieved even if the described techniques are performed in an order different from the described methods, and/or components of the described systems, structures, devices, or circuits are combined or coupled in a form different from that described, or are replaced or substituted by other components or equivalents.

600: Broadcasting device
610: first processing unit
620: first transmitter
630: second processing unit
640: second transmitter
650: storage unit

Claims (20)

A broadcast apparatus for providing a three-dimensional image through a heterogeneous network,
A first processor for encoding the reference image of the three-dimensional image and multiplexing the encoded reference image into a transport stream (TS);
A first transmitter for transmitting a reference image multiplexed with the TS to a receiver through a first network;
A second processor for encoding an additional image of the 3D image; And
A second transmitter for transmitting the encoded additional image to the receiver via a second network,
wherein the first network is a terrestrial network, and
wherein the three-dimensional image is provided based on the reference image and the additional image.
The broadcast apparatus according to claim 1,
Wherein the first network is an Advanced Television System Committee (ATSC) terrestrial network.
The broadcast apparatus according to claim 1,
Wherein the second network is a broadband network.
The broadcast apparatus according to claim 1,
Wherein the reference image and the additional image are transmitted to the receiver in real time so that the 3D image is provided in real time broadcasting.
The broadcast apparatus according to claim 1,
Wherein the second transmitting unit transmits the entire data of the additional image to the receiver before the data of the reference image is transmitted to the receiver,
Wherein the 3D image is provided in non-real time.
The broadcast apparatus according to claim 1,
Wherein the second transmitter transmits some data of the additional image to the receiver before the data of the reference image is transmitted to the receiver, and transmits the remaining data of the additional image to the receiver while the reference image is transmitted to the receiver,
Wherein the 3D image is provided in non-real time.
The broadcast apparatus according to claim 1,
A reference image multiplexed with the TS includes a first PTS (Presentation Time Stamp) indicating a reproduction time of the reference image,
Wherein the encoded additional image includes a second PTS indicating a reproduction time of the additional image,
Wherein the first PTS and the second PTS are used for synchronization of the reference image and the additional image.
The broadcast apparatus according to claim 1,
Wherein the first processing unit multiplexes the encoded reference image into the TS based on the encoded reference image and the metadata related to the 3D image.
The broadcast apparatus according to claim 8,
Wherein the meta data includes pairing information for synchronizing the reference image and the additional image.
The broadcast apparatus according to claim 9,
Wherein the pairing information includes a Uniform Resource Identifier (URI) for providing the additional video,
And the additional image is transmitted to the receiver via the URI.
The broadcast apparatus according to claim 1,
The second processing unit converts the encoded additional image into an MPEG-2 or MPEG-4 format,
And the second transmitting unit transmits the formatted additional image to the receiver.
A broadcast apparatus for providing a three-dimensional image,
A first processor for encoding the reference image of the three-dimensional image and multiplexing the encoded reference image into a transport stream (TS);
A first transmitter for transmitting a reference image multiplexed with the TS to a receiver through a first network in real time;
A second processor for encoding the additional image of the 3D image and multiplexing the encoded additional image; And
And a second transmitter for transmitting the multiplexed additional image to the receiver through a second network in non-real time,
wherein the first network is a terrestrial network, and
wherein the three-dimensional image is provided based on the reference image and the additional image.
The broadcast apparatus according to claim 12,
The first network is an Advanced Television System Committee (ATSC) terrestrial network,
And the second network is an ATSC NRT network.
The broadcast apparatus according to claim 12,
A reference image multiplexed with the TS includes a first PTS (Presentation Time Stamp) indicating a reproduction time of the reference image,
Wherein the multiplexed additional image includes a second PTS indicating a reproduction time of the additional image,
Wherein the first PTS and the second PTS are used for synchronization of the reference image and the additional image.
The broadcast apparatus according to claim 14,
Wherein playback information for the 3D image is provided based on the first PTS.
The broadcast apparatus according to claim 12,
Wherein the first processing unit multiplexes the encoded reference image into a TS based on the encoded reference image and the metadata related to the 3D image.
The broadcast apparatus according to claim 12,
The second processing unit converts the encoded additional image into an MPEG-2 or MPEG-4 format,
And the second processing unit multiplexes the formatted additional images.
The broadcast apparatus according to claim 12,
The second processing unit may signal the additional image to transmit the multiplexed additional image to the receiver,
The signaling uses an SSC (Service Signaling Channel) to transmit a Service Map Table (SMT) and a Non-Real-Time Information Table (NRT-IT)
Wherein the SMT provides information about a service providing the 3D image,
Wherein the NRT-IT provides content item information for configuring the service.
A method for providing a three-dimensional image through a heterogeneous network,
Encoding a reference image of the 3D image;
Multiplexing the encoded reference image into a transport stream (TS);
Transmitting a reference image multiplexed with the TS to a receiver via a first network, the first network being a terrestrial network;
Encoding an additional image of the 3D image; And
Transmitting the encoded additional image to the receiver over a second network,
wherein the three-dimensional image is provided based on the reference image and the additional image.
A method for providing a three-dimensional image,
Encoding a reference image of the 3D image;
Multiplexing the encoded reference image into a transport stream (TS);
Transmitting a reference image multiplexed with the TS to a receiver through a first network in real time;
Encoding an additional image of the 3D image;
Multiplexing the encoded additional image; And
Transmitting the multiplexed additional image in non-real time to the receiver over a second network,
wherein the three-dimensional image is provided based on the multiplexed reference image and the multiplexed additional image.
KR1020140057614A 2013-07-08 2014-05-14 Method and apparatus for providing three-dimensional video KR20150006340A (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
US14/326,262 US20150009289A1 (en) 2013-07-08 2014-07-08 Method and apparatus for providing three-dimensional (3d) video

Applications Claiming Priority (4)

Application Number Priority Date Filing Date Title
US201361843633P 2013-07-08 2013-07-08
US61/843,633 2013-07-08
US201361844676P 2013-07-10 2013-07-10
US61/844,676 2013-07-10

Publications (1)

Publication Number Publication Date
KR20150006340A true KR20150006340A (en) 2015-01-16

Family

ID=52569768

Family Applications (1)

Application Number Title Priority Date Filing Date
KR1020140057614A KR20150006340A (en) 2013-07-08 2014-05-14 Method and apparatus for providing three-dimensional video

Country Status (1)

Country Link
KR (1) KR20150006340A (en)

Cited By (1)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
KR20160094000A (en) * 2015-01-30 2016-08-09 (주)아이피티브이코리아 Apparatus for receiving network independent type convergence broadcast

Similar Documents

Publication Publication Date Title
US9392256B2 (en) Method and apparatus for generating 3-dimensional image datastream including additional information for reproducing 3-dimensional image, and method and apparatus for receiving the 3-dimensional image datastream
KR101683119B1 (en) Broadcast transmitter, Broadcast receiver and 3D video processing method thereof
US8289998B2 (en) Method and apparatus for generating three (3)-dimensional image data stream, and method and apparatus for receiving three (3)-dimensional image data stream
US9019343B2 (en) Image data transmission device, image data transmission method, and image data reception device
EP2744214A2 (en) Transmitting device, receiving device, and transceiving method thereof
US20130258054A1 (en) Transmitter and receiver for transmitting and receiving multimedia content, and reproduction method therefor
US9516086B2 (en) Transmitting device, receiving device, and transceiving method thereof
KR101781887B1 (en) Method and apparatus for transceiving broadcast signal
MX2012008818A (en) Method and apparatus for transmitting digital broadcasting stream using linking information about multi-view video stream, and method and apparatus for receiving the same.
US20130276046A1 (en) Receiving apparatus for receiving a plurality of signals through different paths and method for processing signals thereof
KR20110101099A (en) Method and appatus for providing 3 dimension tv service relating plural transmitting layer
KR20120103510A (en) Transmission apparatus and method, and reception apparatus and method for providing program associated stereoscopic broadcasting service
US20150009289A1 (en) Method and apparatus for providing three-dimensional (3d) video
EP2822282A1 (en) Signal processing device and method for 3d service
US9270972B2 (en) Method for 3DTV multiplexing and apparatus thereof
KR101844236B1 (en) Method and apparatus for transmitting/receiving broadcast signal for 3-dimentional (3d) broadcast service
KR20150006340A (en) Method and apparatus for providing three-dimensional video
EP3280147A1 (en) Method and apparatus for transmitting and receiving broadcast signal
EP2648407A2 (en) Method and apparatus for transmitting stereoscopic video information
KR102204830B1 (en) Method and apparatus for providing three-dimensional territorial brordcasting based on non real time service
Lee et al. Delivery system and receiver for service-compatible 3DTV broadcasting
KR101277267B1 (en) Coding method and apparatus for 3D broadcasting
KR20120036255A (en) Method for transmitting information relating 3d video service in digital broadcast
KR20130116154A (en) Receiving device for a plurality of signals through different paths and method for processing the signals thereof

Legal Events

Date Code Title Description
WITN Withdrawal due to no request for examination