KR20150006340A - Method and apparatus for providing three-dimensional video - Google Patents
- Publication number
- KR20150006340A (Application KR1020140057614A)
- Authority
- KR
- South Korea
- Prior art keywords
- image
- additional
- receiver
- network
- reference image
- Prior art date
Classifications
- H—ELECTRICITY › H04—ELECTRIC COMMUNICATION TECHNIQUE › H04H—BROADCAST COMMUNICATION › H04H20/00—Arrangements for broadcast or for distribution combined with broadcast › H04H20/86—Arrangements characterised by the broadcast information itself › H04H20/88—Stereophonic broadcast systems
- H—ELECTRICITY › H04—ELECTRIC COMMUNICATION TECHNIQUE › H04N—PICTORIAL COMMUNICATION, e.g. TELEVISION › H04N13/00—Stereoscopic video systems; Multi-view video systems; Details thereof › H04N13/10—Processing, recording or transmission of stereoscopic or multi-view image signals › H04N13/106—Processing image signals › H04N13/161—Encoding, multiplexing or demultiplexing different image signal components
- H—ELECTRICITY › H04—ELECTRIC COMMUNICATION TECHNIQUE › H04N—PICTORIAL COMMUNICATION, e.g. TELEVISION › H04N13/00—Stereoscopic video systems; Multi-view video systems; Details thereof › H04N13/10—Processing, recording or transmission of stereoscopic or multi-view image signals › H04N13/106—Processing image signals › H04N13/172—Processing image signals comprising non-image signal components, e.g. headers or format information › H04N13/178—Metadata, e.g. disparity information
- H—ELECTRICITY › H04—ELECTRIC COMMUNICATION TECHNIQUE › H04N—PICTORIAL COMMUNICATION, e.g. TELEVISION › H04N13/00—Stereoscopic video systems; Multi-view video systems; Details thereof › H04N13/10—Processing, recording or transmission of stereoscopic or multi-view image signals › H04N13/194—Transmission of image signals
Abstract
Description
TECHNICAL FIELD The present invention relates to a three-dimensional image providing technique, and more particularly, to a three-dimensional image providing method and apparatus using a broadcast network.
Through binocular parallax, a person perceives objects as near or far. Three-dimensional images provide a stereoscopic effect to the viewer's eyes based on this principle of visual depth perception.
The three-dimensional image can be provided using a plurality of two-dimensional images. For example, a three-dimensional image can be generated using a two-dimensional image corresponding to the left eye of the viewer and a two-dimensional image corresponding to the right eye.
Existing broadcasting environments may be suitable for transmitting two-dimensional images. The two-dimensional image in the existing broadcasting environment may be the reference image of the three-dimensional image. If an additional image is provided with the reference image, a three-dimensional image can be provided to the viewer.
Additional bandwidth may be required to transmit additional images in existing broadcast environments.
One embodiment may provide an apparatus and method for providing a three-dimensional image.
One embodiment can provide an apparatus and method for providing a three-dimensional image through a heterogeneous network.
According to an aspect of the present invention, a broadcasting apparatus for providing a three-dimensional image through a heterogeneous network includes a first processing unit for encoding a reference image of the three-dimensional image and multiplexing the encoded reference image into a transport stream (TS), a first transmitter for transmitting the reference image multiplexed into the TS to a receiver via a first network, a second processing unit for encoding an additional image of the three-dimensional image, and a second transmitter for transmitting the encoded additional image to the receiver via a second network, wherein the first network is a terrestrial network and the three-dimensional image is provided based on the reference image and the additional image.
The first network may be an Advanced Television System Committee (ATSC) terrestrial network.
The second network may be a broadband network.
The reference image and the additional image are transmitted to the receiver in real time, so that the 3D image can be provided in real time broadcasting.
The second transmitter may transmit the entire data of the additional image to the receiver before the data of the reference image is transmitted to the receiver.
The 3D image may be provided in a non-real-time manner.
The second transmitter may transmit some data of the additional image to the receiver before the data of the reference image is transmitted, and may transmit the remaining data of the additional image while the reference image is being transmitted to the receiver.
The 3D image may be provided in a non-real-time manner.
The reference image multiplexed with the TS may include a first PTS (Presentation Time Stamp) indicating the reproduction time of the reference image.
The encoded additional image may include a second PTS indicating the reproduction time of the additional image.
The first PTS and the second PTS may be used for synchronization of the reference image and the additional image.
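As a sketch of how the first and second PTS values might drive synchronization, the following Python fragment (an illustration only, not part of any standard; the function and variable names are assumptions) pairs frames from the two streams whose PTS values match within a tolerance:

```python
from collections import deque

def synchronize(reference_frames, additional_frames, tolerance=0):
    """Pair reference and additional frames whose PTS values match.

    Each input is an iterable of (pts, frame) tuples, assumed sorted by
    PTS. Frames whose PTS has no partner on the other stream are dropped.
    """
    ref_buf, add_buf = deque(reference_frames), deque(additional_frames)
    pairs = []
    while ref_buf and add_buf:
        ref_pts, _ = ref_buf[0]
        add_pts, _ = add_buf[0]
        if abs(ref_pts - add_pts) <= tolerance:
            pairs.append((ref_buf.popleft()[1], add_buf.popleft()[1]))
        elif ref_pts < add_pts:
            ref_buf.popleft()   # reference frame has no partner yet
        else:
            add_buf.popleft()   # additional frame has no partner yet
    return pairs
```

A receiver implementation would in practice buffer rather than drop unmatched frames; this sketch only shows the PTS comparison itself.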
The first processing unit may multiplex the encoded reference image into the TS based on the encoded reference image and the metadata related to the 3D image.
The metadata may include pairing information for synchronizing the reference image and the additional image.
The pairing information may include a Uniform Resource Identifier (URI) that provides the additional image.
And the additional image may be transmitted to the receiver through the URI.
The second processing unit can format the encoded additional image by converting the encoded additional image into MPEG-2 or MPEG-4 format.
And the second transmitting unit may transmit the formatted additional image to the receiver.
According to another aspect, a broadcasting apparatus for providing a three-dimensional image includes a first processing unit for encoding a reference image of the three-dimensional image and multiplexing the encoded reference image into a transport stream (TS), a first transmitter for transmitting the multiplexed reference image in real time to a receiver via a first network, a second processing unit for encoding the additional image of the three-dimensional image and multiplexing the encoded additional image, and a second transmitter for transmitting the multiplexed additional image in non-real time to the receiver via a second network, wherein the first network is a terrestrial network, and the three-dimensional image is provided based on the reference image and the additional image.
The first network may be an Advanced Television System Committee (ATSC) terrestrial network.
The second network may be an ATSC NRT network.
The reference image multiplexed with the TS may include a first PTS (Presentation Time Stamp) indicating the reproduction time of the reference image.
The multiplexed additional image may include a second PTS indicating the reproduction time of the additional image.
The first PTS and the second PTS may be used for synchronization of the reference image and the additional image.
Playback information for the 3D image may be provided based on the first PTS.
The first processing unit may multiplex the encoded reference image to TS based on the encoded reference image and the metadata related to the three-dimensional image.
The second processing unit can format the encoded additional image by converting the encoded additional image into MPEG-2 or MPEG-4 format.
The second processing unit may multiplex the formatted additional images.
And the second processing unit may signal the additional image to transmit the multiplexed additional image to the receiver.
The signaling may use a Service Signaling Channel (SSC) to transmit a Service Map Table (SMT) and a Non-Real-Time Information Table (NRT-IT).
The SMT may provide information about the service providing the 3D image, and the NRT-IT may provide the content items that constitute the service.
According to another aspect, a method for providing a three-dimensional image through a heterogeneous network includes encoding a reference image of the three-dimensional image, multiplexing the encoded reference image into a transport stream (TS), transmitting the reference image multiplexed into the TS to a receiver via a first network, encoding an additional image of the 3D image, and transmitting the encoded additional image to the receiver via a second network, wherein the first network is a terrestrial network, and the three-dimensional image is provided based on the reference image and the additional image.
According to another aspect, a method of providing a three-dimensional image includes encoding a reference image of the three-dimensional image, multiplexing the encoded reference image into a transport stream (TS), transmitting the multiplexed reference image in real time to a receiver over a first network, encoding an additional image of the three-dimensional image, multiplexing the encoded additional image, and transmitting the multiplexed additional image in non-real time to the receiver through a second network, wherein the three-dimensional image is provided based on the multiplexed reference image and the multiplexed additional image.
The method and apparatus may provide a compatible broadcast service between two-dimensional and three-dimensional images.
The method and apparatus according to an exemplary embodiment may provide a three-dimensional image through a heterogeneous network.
FIG. 1 shows a conceptual diagram for providing a three-dimensional image according to an example.
FIG. 2 illustrates a first scenario for providing a three-dimensional image according to an example.
FIG. 3 illustrates a second scenario for providing a three-dimensional image according to an example.
FIG. 4 shows a third scenario for providing a three-dimensional image according to an example.
FIG. 5 illustrates a fourth scenario for providing a three-dimensional image according to an example.
FIG. 6 is a block diagram of a broadcasting apparatus according to an embodiment of the present invention.
FIG. 7 illustrates a flow diagram of a method for providing a three-dimensional image in accordance with one embodiment.
FIG. 8 shows a flow diagram of a method for transmitting a reference image to a receiver according to an example.
FIG. 9 illustrates a system for providing a three-dimensional image according to an example.
FIGS. 10A to 10D show a flow chart of a method in which a receiver that receives an additional image through a DASH outputs a three-dimensional image.
FIGS. 11A to 11D show a flowchart of a method of a receiver receiving an additional image through download and outputting a three-dimensional image.
Figure 12 shows a flow diagram of a method for providing a three-dimensional image in accordance with an embodiment.
FIG. 13 illustrates a system for providing a three-dimensional image according to an example.
FIGS. 14A to 14D show a flowchart of a method in which a receiver receiving an additional image through an ATSC NRT network outputs a three-dimensional image.
In the following, embodiments will be described in detail with reference to the accompanying drawings. Like reference symbols in the drawings denote like elements.
Various modifications may be made to the embodiments described below. The embodiments described below are not intended to be limiting, and cover all modifications, equivalents, and alternatives thereto.
The terms used in the examples are used only to illustrate specific embodiments and are not intended to limit the embodiments. Singular expressions include plural expressions unless the context clearly dictates otherwise. In this specification, terms such as "comprises" or "having" specify the presence of stated features, integers, steps, operations, elements, components, or combinations thereof, but do not preclude the presence or addition of one or more other features, integers, steps, operations, elements, components, or combinations thereof.
Unless defined otherwise, all terms used herein, including technical or scientific terms, have the same meaning as commonly understood by one of ordinary skill in the art to which this embodiment belongs. Terms such as those defined in commonly used dictionaries are to be interpreted as having a meaning consistent with their meaning in the context of the related art, and are not to be interpreted in an idealized or overly formal sense unless expressly so defined in the present application.
In the following description of the present invention with reference to the accompanying drawings, the same components are denoted by the same reference numerals regardless of the figure number, and redundant explanations thereof will be omitted. In describing the embodiments, a detailed description of related art will be omitted where it could unnecessarily obscure the gist of the embodiments.
FIG. 1 shows a conceptual diagram for providing a three-dimensional image according to an example.
Compared to a broadcast service providing a 2D image, a broadcast service providing 3D images may require greater bandwidth. A broadcast service compatible with both 2D and 3D images may require twice the bandwidth of a broadcast service providing only a 2D image.
When the 3D image is composed of a plurality of compressed images, a compatible broadcast service may be one in which at least one of the plurality of images has the same resolution as the existing 2D image. The plurality of images may include a base view video and an additional view video.
A broadband network or an Advanced Television Systems Committee (ATSC) non-real-time (NRT) network can be used to provide 3D images in broadcast systems that provide conventional 2D images, without requiring additional bandwidth.
A method of providing a 3D image using a broadband network can be named as SCHCBB (Service Compatible Hybrid Coded 3D using BroadBand).
A method of providing a 3D image using an ATSC NRT network may be named as SCHCNRT (Service Compatible Hybrid Coded 3D using ATSC NRT).
The 3D image may be provided based on the reference image and the additional image. The reference image may be an existing 2D image. The additional image may be an image added to the reference image to provide a 3D image. For example, the viewpoint of the additional image may be different from the viewpoint of the reference image.
The reference image of the 3D image may be transmitted to the receiver via the first network. For example, the first network may be a terrestrial network. The terrestrial network may be a broadcast network. The terrestrial network can be an ATSC terrestrial network.
The additional image of the 3D image can be transmitted to the receiver via the second network.
For example, the second network may be a broadband network. The broadband network may be a network over the Internet. That is, the broadband network may be a data communication network rather than a broadcast network.
As another example, the second network may be a terrestrial network. The terrestrial network may be an ATSC NRT network.
If the reference image is transmitted to the receiver over the ATSC terrestrial network and the additional image is transmitted to the receiver over the broadband network or the ATSC NRT network, additional bandwidth for transmitting the additional image may not be required in the system transmitting the reference image.
A service scenario for providing a 3D image will be described in detail below with reference to Figs. 2 to 5.
The following references may be consulted in describing the examples below.
[1] IEEE/ASTM: "Use of the International Systems of Units (SI): The Modern Metric System," Doc. SI 10-2002, Institute of Electrical and Electronics Engineers, New York, N.Y.
[2] ATSC: "ATSC Digital Television Standard, Part 3 - Service Multiplex and Transport Subsystem Characteristics," Doc. A/53, Part 3:201x, Advanced Television Systems Committee, Washington, D.C., [TBD].
[3] ATSC: "ATSC Digital Television Standard, Part 4 - MPEG-2 Video System Characteristics," Doc. A/53, Part 4:201x, Advanced Television Systems Committee, Washington, D.C., [TBD].
[4] ATSC: "Use of AVC in the ATSC Digital Television System, Part 1 - Video System Characteristics," Doc. A/72, Part 1, Advanced Television Systems Committee, Washington, D.C., 29 July 2008.
[5] ATSC: "Program and System Information Protocol for Terrestrial Broadcast and Cable," Doc. A/65:2009, Advanced Television Systems Committee, Washington, D.C., 3 August 2009.
[6] ATSC: "ATSC Parameterized Services Standard," Doc. A/71, Advanced Television Systems Committee, Washington, D.C., 26 March 2007.
[7] ITU-T Recommendation H.262 | ISO/IEC 13818-2: "Information technology - Generic coding of moving pictures and associated audio information: Video".
[8] ITU-T Recommendation H.264 | ISO/IEC 14496-10:2010: "Information technology - Coding of audio-visual objects - Part 10: Advanced Video Coding".
[9] ITU-T Recommendation H.222.0 (2012) | ISO/IEC 13818-1:2012: "Information technology - Generic coding of moving pictures and associated audio information: Systems".
[10] CEA: CEA-708.1, "Digital Television Closed Captioning: 3D Extensions," Consumer Electronics Association, Arlington, VA, 2012.
[11] ATSC: "Use of AVC in the ATSC Digital Television System, Part 2 - Transport Subsystem Characteristics," Doc. A/72, Part 2.
[12] ATSC: "Non-Real-Time Content Delivery," Doc. A/103, Advanced Television Systems Committee, Washington, D.C., 9 May 2012.
[13] ISO/IEC 14496-14:2003: "Information technology - Coding of audio-visual objects - Part 14: MP4 file format".
[14] ISO/IEC 14496-12:2008: "Information technology - Coding of audio-visual objects - Part 12: ISO base media file format".
[15] ATSC: "3D-TV Terrestrial Broadcasting, Part 2 - Service Compatible Hybrid Coding Using Real-Time Delivery," Doc. A/104, Advanced Television Systems Committee, Washington, D.C., 26 December 2012.
[16] ATSC: "Non-Real-Time Content Delivery," Doc. A/103:2012, Advanced Television Systems Committee, Washington, D.C., 9 May 2012.
FIG. 2 illustrates a first scenario for providing a three-dimensional image according to an example.
Live content can be stored in the 3D content server. The 3D content server can provide a 3D image to a receiver using a terrestrial network and a broadband network.
When the 3D image is live content, the method of providing the 3D image will be described in detail with reference to Figs. 6 to 10 below.
The technical contents described above with reference to FIG. 1 can be applied as it is, so a detailed description will be omitted below.
FIG. 3 illustrates a second scenario for providing a three-dimensional image according to an example.
The reference image and the additional image can be stored in the reference image server and the additional image server, respectively.
For example, if the reference image is to be transmitted to the receiver from 09:00, the additional image may be transmitted from 08:55. Some data of the additional image may be transmitted to the receiver before the data of the reference image is transmitted to the receiver.
A method of providing a 3D image by transmitting some data of the additional image to the receiver using a broadband network before the data of the reference image is transmitted will be described in detail with reference to FIGS. 6 to 9 and 11.
The technical contents described above with reference to Figs. 1 and 2 can be applied as they are, so that a more detailed description will be omitted below.
FIG. 4 shows a third scenario for providing a three-dimensional image according to an example.
The reference image and the additional image can be stored in the reference image server and the additional image server, respectively.
For example, if the reference image is to be transmitted to the receiver from 09:00, the entire data of the additional image may be transmitted to the receiver before 09:00.
A method of providing a 3D image by transmitting the entire data of the additional image to the receiver using the broadband network before the data of the reference image is transmitted will be described in detail with reference to FIGS. 6 to 9 and 11.
The technical contents described with reference to Figs. 1 to 3 can be applied as it is, and a detailed description will be omitted below.
FIG. 5 illustrates a fourth scenario for providing a three-dimensional image according to an example.
The reference image and the additional image can be stored in the reference image server and the additional image server, respectively.
For example, if the reference image is to be transmitted to the receiver from 09:00, the entire data of the additional image may be transmitted to the receiver before 09:00.
A method of providing a 3D image by transmitting the entire data of the additional image to the receiver using the terrestrial network before the data of the reference image is transmitted will be described in detail with reference to FIGS. 12 to 14.
The technical contents described above with reference to Figs. 1 to 4 can be applied as they are, so that a more detailed description will be omitted below.
FIG. 6 is a block diagram of a broadcasting apparatus according to an embodiment of the present invention.
The technical contents described above with reference to Figs. 1 to 5 may be applied as they are, so that a more detailed description will be omitted below.
FIG. 7 illustrates a flow diagram of a method for providing a three-dimensional image in accordance with one embodiment.
The encoding format of the encoded reference image may be one of the formats shown in [Table 1] below.
The format of the encoded reference image may be an elementary stream (ES).
The multiplexing of the reference image can comply with ATSC A/53.
The metadata may include Program Specific Information (PSI).
The PSI includes a Program Association Table (PAT) holding the list of programs in a channel, a Conditional Access Table (CAT) containing access-control information such as scrambling, a Program Map Table (PMT), and a Network Information Table (NIT) containing information on the network used for transmitting the MPEG information.
The PMT can provide program_number and information about each program present in the TS. The PMT can list the elementary streams that make up the MPEG-2 program. The PMT may provide location information for optional descriptors for each elementary stream as well as for optional descriptors that describe a complete MPEG-2 stream. Each elementary stream can be identified by a stream_type value.
For example, the reference image may be signaled using the stream_type value 0x02. As another example, the image frame synchronization metadata of the SCHCBB, which will be described below, can be signaled using the stream_type value 0x06.
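The stream_type signaling above can be illustrated with a small sketch (Python; the function name and the tuple layout of the elementary-stream loop entries are assumptions for illustration):

```python
# stream_type values as described in the text:
REFERENCE_VIDEO = 0x02    # MPEG-2 video carrying the reference image
PAIRING_METADATA = 0x06   # private data carrying frame-synchronization metadata

def classify_pmt_entries(es_loop):
    """Split a PMT elementary-stream loop, given as (stream_type, PID)
    tuples, into reference-video PIDs and pairing-metadata PIDs."""
    reference_pids, metadata_pids = [], []
    for stream_type, pid in es_loop:
        if stream_type == REFERENCE_VIDEO:
            reference_pids.append(pid)
        elif stream_type == PAIRING_METADATA:
            metadata_pids.append(pid)
    return reference_pids, metadata_pids
```

A receiver scanning the PMT in this way can locate both the reference video and the SCHCBB synchronization metadata without parsing every elementary stream.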
The stereoscopic_program_info_descriptor() and stereoscopic_video_info_descriptor() specified in reference [9] can be used to signal the program provided via SCHCBB (referred to below as the SCHCBB program). Hereinafter, notation of the form a_b_c() or a_b_c denotes a syntax element (field) of the bitstream.
The stereoscopic_program_info_descriptor () specified in reference [9] can be present in the loop after the program_info_length field of the PMT to inform the receiver of the existence of the SCHCBB program. For SCHCBB, stereoscopic_service_type can be specified as '011'.
stereoscopic_service_type is described with reference to [Table 2] below.
The stereoscopic_service_type may be designated as '001' to indicate that the reference video stream and the additional video stream of the SCHCBB program transmit the same video (program or content).
The stereoscopic_video_info_descriptor() specified in reference [9] can be present in the loop following the ES_info_length field of the PMT to identify the view elements of the SCHCBB program (i.e., the primary video stream and the additional video stream).
The values of horizontal_upsampling_factor and vertical_upsampling_factor may be used to signal the upsampling factor of the additional image.
The reference image may include a first PTS (Presentation Time Stamp) indicating the reproduction time of the reference image. Playback information for the 3D image may be provided based on the first PTS.
And the additional image may include a second PTS indicating the reproduction time of the additional image. The first PTS and the second PTS may be used for synchronization of the reference image and the additional image.
If the first PTS and the second PTS are different, data may be required to pair the frames of the reference image and the additional image. The frames of the reference image and the additional image are paired so that the 3D image can be synchronized.
The metadata may include pairing information for synchronizing the reference image and the additional image. For example, metadata may include data for pairing frames of a reference image and an additional image in media_pairing_information (). The media_pairing_information () can be multiplexed with a reference image by being loaded in a PES (Packetized Elementary Stream) packet.
When the additional image is transmitted by streaming, the media_pairing_information() for the additional image may be multiplexed with the reference image.
The PES packet will be described with reference to Table 3 below.
Note: the '1' that marks the start of the media pairing information should coincide with the start of the PES payload.
The PES_data_field () for PES_packet_data_byte is described in detail in Table 4 below.
  Syntax                          No. of bits   Format
  PES_data_field() {
      data_identifier                  8        uimsbf
      media_pairing_information()     var
  }
uimsbf denotes an unsigned integer, most significant bit first; that is, the most significant bit of the unsigned integer is transmitted first.
The data_identifier field in Table 4 may identify the private stream PES for the SCHCBB. For example, the value of the field of data_identifier may be 0x33.
The media_pairing_information () in Table 4 can provide media pairing information used for synchronization of the reference image and the additional image.
The syntax for media_pairing_information () is described with reference to Table 5 below.
  Syntax                                                       No. of bits   Format
  media_pairing_information() {
      referenced_media_filename_length                              8        uimsbf
      for (i = 0; i < referenced_media_filename_length; i++) {
          referenced_media_filename_byte                            8        uimsbf
      }
      reserved                                                      7        uimsbf
      frame_number                                                 25        uimsbf
  }
The field of referenced_media_filename_length in [Table 5] can provide the length of the bytes in the field of referenced_media_filename_byte.
The referenced_media_filename_byte field in Table 5 may provide a Uniform Resource Identifier (URI) of the referenced media. For example, the referenced media may be an additional video stream or an additional video file. That is, the pairing information may include a URI through which the additional image is provided.
By identifying this URI, the receiver can obtain the additional image through it.
The frame_number field in Table 5 may indicate the frame number of the SCHCBB streams. The frame number starts at 0 and may increase sequentially.
The metadata may include the referenced media information.
The referenced media information may be listed in referenced_media_information (). The referenced_media_information () may provide access, synchronization and playback information of the additional image associated with the reference image. referenced_media_information () may be provided with stream_type 0x05 specified in reference [9] for additional video stream information.
The structure of private_section () containing referenced_media_information () is described in detail in Table 6 below. The restrictions of private_section () are described in detail in Table 7 below.
  Syntax                                      No. of bits   Value    Format
  private_section() {
      table_id                                     8        '0x41'   uimsbf
      section_syntax_indicator                     1                 bslbf
      private_indicator                            1                 bslbf
      reserved                                     2                 bslbf
      private_section_length                      12                 uimsbf
      if (section_syntax_indicator == '0') {
          for (i = 0; i < N; i++) {
              private_data_byte                    8                 bslbf
          }
      }
  }
bslbf denotes a bit string, left bit first; that is, the leftmost bit of the bit string is transmitted first.
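The fixed header fields of private_section() described above can be extracted with straightforward bit operations. The following Python sketch is an illustration (the function name is an assumption), parsing only the first three header bytes:

```python
def parse_private_section_header(data: bytes):
    """Parse the private_section() header per [Table 6]:
    table_id (8 bits), section_syntax_indicator (1),
    private_indicator (1), reserved (2), private_section_length (12)."""
    table_id = data[0]
    assert table_id == 0x41, "referenced_media_information uses table_id 0x41"
    b1, b2 = data[1], data[2]
    section_syntax_indicator = b1 >> 7            # top bit of byte 1
    private_indicator = (b1 >> 6) & 1             # next bit
    private_section_length = ((b1 & 0x0F) << 8) | b2   # low 12 bits
    return (table_id, section_syntax_indicator,
            private_indicator, private_section_length)
```

The 12-bit section length spans the low nibble of the second byte and all of the third byte, which is why the parse masks with 0x0F before shifting.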
Per the restrictions in [Table 7], the table_id field shall be '0x41', and referenced_media_information() follows the private_section_length field.
  Syntax                                                         No. of bits   Format
  referenced_media_information() {
      version_number                                                  8        uimsbf
      num_hybrid_service_programs                                     8        uimsbf
      for (i = 0; i < num_hybrid_service_programs; i++) {
          hybrid_delivery_protocol                                    2        uimsbf
          additionalview_availability_indicator                       2        uimsbf
          hybrid_service_sync_type                                    2        uimsbf
          reserved ('11')                                             2        bslbf
          num_reference_media_files                                   8        uimsbf
          for (i = 0; i < num_reference_media_files; i++) {
              referenced_media_play_start_time                       32        uimsbf
              referenced_media_expiration_time                       32        uimsbf
              referenced_media_filesize                              32        uimsbf
              referenced_media_type                                   4        uimsbf
              referenced_media_codec_info                             4        uimsbf
              referenced_media_files_URI_length                       8        uimsbf
              for (i = 0; i < referenced_media_files_URI_length; i++) {
                  referenced_media_files_URI_byte                     8        uimsbf
              }
          }
      }
  }
The field of version_number in [Table 8] can provide the version number of referenced_media_information (). The value of the field version_number may increase monotonically as the information of referenced_media_information () changes. For example, the value of the field of version_number may be incremented by one.
The num_hybrid_service_programs field in Table 8 can represent the number of SCHCBB programs provided.
The field of hybrid_delivery_protocol in Table 8 can provide the transmission protocol of the SCHCBB additional image as defined in [Table 9] below.
The type of method by which additional images are sent to the receiver can be provided by hybrid_delivery_protocol. For example, the additional image may be provided to the receiver using a Dynamic Adaptive Streaming over Http (DASH) protocol, an HTTP protocol that does not use DASH, or an FTP protocol.
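A receiver might dispatch on hybrid_delivery_protocol as sketched below. Note that the concrete 2-bit codes are defined in [Table 9], which is not reproduced in this text, so the mapping used here is a placeholder assumption for illustration only:

```python
# WARNING: the code-to-protocol mapping below is hypothetical; the
# authoritative values are those of [Table 9], not reproduced here.
HYBRID_DELIVERY_PROTOCOL = {
    0b01: "DASH",   # Dynamic Adaptive Streaming over HTTP
    0b10: "HTTP",   # plain HTTP download, without DASH
    0b11: "FTP",
}

def delivery_protocol(code: int) -> str:
    """Return the transport protocol name for a 2-bit field value."""
    return HYBRID_DELIVERY_PROTOCOL.get(code, "reserved")
```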
The values of the fields of additionalview_availability_indicator in [Table 8] are described in detail in Table 10 below.
The additionalview_availability_indicator may indicate the effectiveness of the additional image. The value '00' of the field of additionalview_availability_indicator may indicate that the additional image is provided at the same time as the reference image. If there is a delay in the network transmitting the additional image, the reference image may need to be buffered in the receiver to be synchronized with the additional image.
The value '01' of the additionalview_availability_indicator field may indicate that the additional image is provided before the reference image. The value '01' may indicate that part of the data of the additional image is transmitted to the receiver before the start time of the program (for example, a 3D image), but that the entire data of the additional image is not downloaded before the program start time. The partially downloaded data of the additional image may be buffered in the receiver to be synchronized with the reference image.
The value '10' of the field of additionalview_availability_indicator may indicate that the entire data of the additional image is downloaded to the receiver to be synchronized with the reference image before the program start time.
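The receiver-side handling implied by the three indicator values above can be summarized in a small dispatch sketch (the function name and the returned strings are illustrative, not normative):

```python
def buffering_strategy(indicator: str) -> str:
    """Map additionalview_availability_indicator ('00', '01', '10')
    to the receiver-side handling described in the text."""
    if indicator == "00":   # additional image arrives alongside the reference
        return "buffer reference image against network delay"
    if indicator == "01":   # part of the additional image is pre-delivered
        return "buffer partially downloaded additional image"
    if indicator == "10":   # whole additional image downloaded before start
        return "play pre-downloaded additional image"
    return "reserved"
```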
The values of the fields of the hybrid_service_sync_type in Table 8 are described in Table 11 below.
The field of hybrid_service_sync_type may indicate information about the synchronization method of the reference image and the additional image.
Synchronization of the reference image and the additional image can be performed at the ES level or the PES level. For ES level synchronization, the value of the hybrid_service_sync_type field may be '01'. For PES level synchronization, the value of the hybrid_service_sync_type field may be '10'. In the case of PES level synchronization, the field of media_pairing_information described above can be used. In the case of ES level synchronization, SMPTE_timecode information may be present in the Group Of Pictures (GOP) header of MPEG-2 and in the Picture Sequencing Supplemental Enhancement Information (SEI) of Advanced Video Coding (AVC).
SMPTE_timecode will be described in detail with reference to Table 12 below.
Field | Range of value | No. of bits | Format
---|---|---|---
Time_code_hours | 0 to 23 | 5 | uimsbf
Time_code_minutes | 0 to 59 | 6 | uimsbf
Marker_bit | 1 | 1 | bslbf
Time_code_seconds | 0 to 59 | 6 | uimsbf
Time_code_pictures | 0 to 59 | 6 | uimsbf
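The SMPTE_timecode bit layout of [Table 12] can be sketched as a pack/unpack pair. This is an illustrative sketch assuming the field order and widths shown there (hours 5 bits, minutes 6, marker bit 1, seconds 6, pictures 6, MSB first, 24 bits total); surrounding GOP-header fields such as Drop_frame_flag are omitted.

```python
def unpack_smpte_timecode(value: int) -> dict:
    """Split a 24-bit SMPTE_timecode into the fields of [Table 12]."""
    fields = {}
    fields["time_code_pictures"] = value & 0x3F   # low 6 bits
    value >>= 6
    fields["time_code_seconds"] = value & 0x3F    # next 6 bits
    value >>= 6
    fields["marker_bit"] = value & 0x1            # single marker bit
    value >>= 1
    fields["time_code_minutes"] = value & 0x3F    # next 6 bits
    value >>= 6
    fields["time_code_hours"] = value & 0x1F      # top 5 bits
    return fields

def pack_smpte_timecode(hours: int, minutes: int,
                        seconds: int, pictures: int) -> int:
    """Build the 24-bit value; marker bit is always 1."""
    assert 0 <= hours <= 23 and 0 <= minutes <= 59
    assert 0 <= seconds <= 59 and 0 <= pictures <= 59
    return ((hours << 19) | (minutes << 13) | (1 << 12)
            | (seconds << 6) | pictures)
```
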
For example, the Drop_frame_flag value '1' may be used only for National Television System Committee (NTSC) systems.
The num_reference_media_files field in Table 8 may indicate the number of reference media files constituting the additional image. In the case of a streaming service, since only one Media Presentation Description (MPD) file exists, this field can be set to '1'.
The field of reference_media_play_start_time in Table 8 may indicate the start time of the additional video stream in the case of a streaming service.
In the case of the streaming service, the value of the field of reference_media_play_start_time may indicate the start time of providing the MPD file. The start time of the MPD file can be the same as the start time of the program.
In the case of the download service, the value of the field of reference_media_play_start_time may indicate the start time of each reference media file constituting the additional image. The value of the field of reference_media_play_start_time may be provided in UTC.
The field of reference_media_expiration_time in [Table 8] can indicate the expiration time of each reference media file. In the case of a streaming service, the value of the field of reference_media_expiration_time may match the end time of the program. The value of the field of reference_media_expiration_time may be provided in UTC.
The fields of reference_media_filesize in Table 8 can indicate the size of each reference media file in bytes. In the case of streaming service, the value of the field of reference_media_filesize may be '0'.
The field of referenced_media_type in Table 8 may indicate the type of additional video file (e.g., MP4 or ISOBMFF) or stream (e.g., MPEG-2 TS).
The fields of referenced_media_type are described in detail in Table 13 below.
The referenced_media_codec_info field in Table 8 can provide the codec information of the additional video.
The fields of referenced_media_codec_info are described in detail in Table 14 below.
The referenced_media_files_URI_length field in Table 8 can provide the URI length of the reference media or MPD file.
The referenced_media_files_URI_byte field in [Table 8] can provide the URI information of the reference media or MPD file.
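As a hedged illustration, the per-file fields described above can be gathered into one container. The class and helper names are our own; each attribute mirrors a field of referenced_media_information () as described for [Table 8].

```python
from dataclasses import dataclass

@dataclass
class ReferencedMediaFile:
    """One reference media (or MPD) file entry, per the [Table 8] fields."""
    play_start_time_utc: str   # reference_media_play_start_time (UTC)
    expiration_time_utc: str   # reference_media_expiration_time (UTC)
    filesize_bytes: int        # reference_media_filesize ('0' for streaming)
    media_type: str            # referenced_media_type, e.g. "MPEG-2 TS"
    codec_info: str            # referenced_media_codec_info
    uri: str                   # referenced_media_files_URI_byte

def is_streaming_entry(f: ReferencedMediaFile) -> bool:
    # Per the text above, a streaming (DASH) service carries a single MPD
    # file and sets reference_media_filesize to '0'.
    return f.filesize_bytes == 0
```
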
In
According to one aspect, the
The PSIP data may include a System Time Table (STT) indicating current time information, a Master Guide Table (MGT) serving as a pointer to other PSIP tables, a Virtual Channel Table (VCT) allocating a number to each channel, a Region Rating Table (RRT) indicating program ratings, an Event Information Table (EIT) carrying guide data such as program titles, and an Extended Text Table (ETT) carrying detailed information on channels and programs.
The VCT may be a Terrestrial VCT (TVCT). The TVCT may include information about the various channels delivered in the physical terrestrial broadcast channel.
The virtual channel providing SCHCBB may be identified by a service_type value of 0x09 in the TVCT. The following descriptors may be placed after the descriptors_length field in the descriptor loop of terrestrial_virtual_channel_table_section () or cable_virtual_channel_table_section (). The following descriptors may be at least one of the Service Location Descriptor and the Parameterized Service Descriptor (PSD) of reference [6].
The PSD is described in detail in [Table 15] to [Table 17] below. The PSD may be a field of parameterized_service_descriptor ().
Parameterized_service_descriptor () may be provided in a virtual channel with a service_type value of 0x09 to convey specific information that the receiver can use to determine whether it can create a meaningful presentation of the services in the channel. The PSD can carry a payload. The payload syntax and semantics may be application-specific. A field called application_tag can identify the application to which the payload applies.
Syntax | No. of bits | Format
---|---|---
parameterized_service_descriptor () { | |
descriptor_tag | 8 | uimsbf
descriptor_length | 8 | uimsbf
application_tag | 8 | uimsbf
application_data () | var | bslbf
} | |
The unsigned 8-bit integer of the field of descriptor_tag in Table 15 may have the value 0x8D, identifying the descriptor as parameterized_service_descriptor ().
The unsigned 8-bit integer of the descriptor_length field in Table 15 may specify the length, in bytes, from the end of this field to the end of the descriptor. The maximum length may be 255.
An unsigned 8-bit integer in the field of application_tag in Table 15 can identify the application associated with application_data () below. The values of application_tag may be specified in the present invention and other ATSC standards.
The syntax and semantics of the fields of application_data () in [Table 15] can be specified in the ATSC standard established in association with the application_tag value.
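A minimal parser for the descriptor layout of [Table 15] might look as follows. This is a sketch under the byte layout described above (tag 0x8D, one length byte counting the bytes after it, one application_tag byte, then application-specific payload); the function name is hypothetical.

```python
def parse_parameterized_service_descriptor(buf: bytes) -> dict:
    """Parse a parameterized_service_descriptor per [Table 15]."""
    if len(buf) < 3 or buf[0] != 0x8D:
        raise ValueError("not a parameterized_service_descriptor")
    length = buf[1]  # bytes remaining after this field
    if len(buf) < 2 + length:
        raise ValueError("truncated descriptor")
    return {
        "application_tag": buf[2],
        "application_data": buf[3:2 + length],  # application_data() payload
    }
```
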
The parameterized_service_descriptor () defined above can be used to deliver specific parameters to a specific application. For channels containing 3D content, the value of application_tag may be 0x01. The application_data () for the application_tag value 0x01 is described in detail in Table 16 below.
Syntax | No. of bits | Format
---|---|---
reserved | 3 | uimsbf
3D_channel_type | 5 | uimsbf
for (i = 0; i < N; i++) { | |
reserved | 8 | bslbf
} | |
} | |
The 5-bit unsigned integer of the field of 3D_channel_type in Table 16 can indicate the type of 3D service carried in the virtual channel associated with parameterized_service_descriptor (). The coding for 3D_channel_type is described in Table 17 below. SCHCBB can use 0x04. The SCHCNRT to be described later can use 0x05.
Table 18 below is an example of TVCT for SCHCBB.
...
for (i <num_channels_in_section) {
major_channel_number = 0x003
minor_channel_number = 0x002
program_number = 0x0002
service_type = 0x09 (extended parameterized service)
service_location_descriptor ()
parameterized_service_descriptor ()
}
The field of service_location_descriptor () in Table 18 may indicate the PID of the supplemental image elementary stream of the SCHCBB. The parameterized_service_descriptor () having the application_tag 0x01 value can provide the information of the 3D service type transmitted to the receiver. This information can facilitate the operation of a 3DTV receiver that reproduces stereoscopic 3D images.
The stereoscopic_program_info_descriptor () specified in reference [9] can be located in the 3D event loop in the EIT to indicate that a future event is 3D.
The 3D event loop in the EIT is shown in Table 19 below.
...
for (j <num_events_in_section) {
event_id
start_time
length_in_seconds
stereoscopic_program_info_descriptor ()
linkage_info_descriptor ()
}
The linkage_info_descriptor () in Table 19 can be located after the stereoscopic_program_info_descriptor () in the EIT to provide information about reference media files.
linkage_info_descriptor () will be described with reference to Table 20 below.
Syntax | No. of bits | Format
---|---|---
descriptor_tag | 8 | uimsbf
descriptor_length | 8 | uimsbf
hybrid_delivery_protocol | 2 | uimsbf
additionalview_availability_indicator | 1 | bslbf
reserved ('11111') | 5 | bslbf
num_referenced_media_files | 8 | uimsbf
for (i = 0; i < num_referenced_media_files; i++) { | |
referenced_media_files_URI_length | 8 | uimsbf
for (i = 0; i < referenced_media_files_URI_length; i++) { | |
referenced_media_files_URI_byte | var | uimsbf
} | |
} | |
} | |
The field of descriptor_tag in Table 20 may be 8 bits. The field of descriptor_tag can identify each descriptor.
The descriptor_length field in Table 20 may be 8 bits. The descriptor_length field may specify the length, in bytes, of the descriptor data immediately following the descriptor_length field.
The field of hybrid_delivery_protocol in Table 20 can provide the type of SCHCBB defined in [Table 9] above.
The field of additionalview_availability_indicator in [Table 20] can provide the availability of additional images defined in [Table 10].
The num_referenced_media_files field in Table 20 may provide the number of reference media files or the number of MPD files.
The referenced_media_files_URI_length field in Table 20 can provide the length of the URI of the reference media file or MPD file in bytes.
The referenced_media_files_URI_byte field in [Table 20] can provide the URI information of each reference media file or MPD file.
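The byte-aligned URI loop of [Table 20] can be parsed as sketched below. The exact bit packing of the leading protocol/indicator/reserved byte is left open here, so the sketch assumes that byte has already been consumed and `payload` starts at num_referenced_media_files; the function name is hypothetical.

```python
def parse_linkage_info_uris(payload: bytes) -> list:
    """Extract reference media / MPD file URIs from a linkage_info_descriptor
    payload, following the loop structure of [Table 20]."""
    n = payload[0]            # num_referenced_media_files
    uris, pos = [], 1
    for _ in range(n):
        length = payload[pos]  # referenced_media_files_URI_length (bytes)
        pos += 1
        # referenced_media_files_URI_byte repeated URI_length times
        uris.append(payload[pos:pos + length].decode("utf-8"))
        pos += length
    return uris
```
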
2D / 3D boundary signaling can comply with section 4.6.3 of Part A / 104 of [15].
According to one aspect, channel multiplexing can comply with ATSC A / 53
In
According to one aspect, the first network may be a terrestrial network. The terrestrial network can be an ATSC terrestrial network. The first network can comply with the ATSC A / 53 scheme.
Step 740 is described in detail below with reference to FIG.
In
For example, the
The encoding format of the encoded additional image may be one of the formats shown in [Table 1] above.
For example, the
As another example, the
According to an aspect, an encoding format of a reference image and an encoding format of an additional image for providing a 3D image may be the same.
According to one aspect, the additional image may not be multiplexed since only the image is transmitted.
In
Regardless of the protocol of the broadband network used for the transmission of the additional video, the additional video can be formatted as an MPEG-2 TS under the constraints defined in ATSC A/53, or in the file formats of reference [13] or reference [14].
For example, the
When formatted into an MPEG-2 TS, the additional image of SCHCBB can be signaled using the stream_type value 0x23 as defined in reference [9]. The referencing information of the additional image can be signaled using stream_type 0x05 defined in [9].
In
In one aspect, the
In another aspect, the
For example, the
As another example, the
The
If the encoded additional image is formatted, the
The technical contents described above with reference to Figs. 1 to 6 can be applied as they are, so that a more detailed description will be omitted below.
FIG. 8 shows a flow diagram of a method for transmitting a reference image to a receiver according to an example.
The above-described
In
In
The
The technical contents described with reference to Figs. 1 to 7 can be applied as they are, so that a more detailed description will be omitted below.
FIG. 9 illustrates a system for providing a three-dimensional image according to an example.
The system of FIG. 9 may provide the
The 3D content server of FIG. 9 may correspond to the
The
The
The
The
In the case of the SCHCBB streaming method of the
The
The technical contents described with reference to Figs. 1 to 8 can be applied as they are, so that a more detailed description will be omitted below.
FIGS. 10A to 10D show a flow chart of a method in which a receiver that receives an additional image through a DASH outputs a three-dimensional image.
The following
In
In
In Fig. 10B, the receiver can acquire TVCT and EIT using the PSIP parser.
In
A
At
The following steps 1022-1024 may be performed on the additional image.
In
In
In
The additional image can be decoded through the AVC decoder.
In
In FIG. 10D, the receiver may receive
At
In
The technical contents described with reference to FIGS. 1 to 9 can be applied as they are, so that a more detailed description will be omitted below.
FIGS. 11A to 11D show a flowchart of a method of a receiver receiving an additional image through download and outputting a three-dimensional image.
The following
In
The following
In
In FIG. 11D, the receiver may receive
The technical contents described above with reference to Figs. 1 to 10 can be applied as they are, so that a more detailed description will be omitted below.
12 illustrates a flow diagram of a method for providing a three-dimensional image in accordance with one embodiment.
The following
For example, in the case of SCHCNRT, the value of the stream_type of the PSI may be 0x02, and the value of the stream_type of the additional image may be 0x23. The reference image and the additional image can be signaled using the value of each stream_type.
As another example, in the case of SCHCNRT, the additional image may be transmitted in the addressable sections (DSM-CC addressable sections) specified in reference [16].
As another example, in the case of SCHCNRT, the video frame synchronization information (media_pairing_information ()) can be signaled using the stream_type value 0x06 specified in reference [9].
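As an illustration of the stream_type signaling above, a receiver-side lookup might be sketched as follows. The helper is hypothetical; the values 0x02, 0x23, and 0x06 are those quoted above from the PSI and reference [9].

```python
# stream_type values for SCHCNRT elementary streams, as quoted in the text
STREAM_TYPE_ROLES = {
    0x02: "reference image (MPEG-2 video)",
    0x23: "additional image",
    0x06: "media_pairing_information (video frame synchronization)",
}

def classify_stream(stream_type: int) -> str:
    """Return the role a PMT stream_type plays in the SCHCNRT 3D service."""
    return STREAM_TYPE_ROLES.get(stream_type, "not part of the 3D service")
```
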
In
The value of additionalview_availability_indicator of linkage_info_descriptor for SCHCNRT may be set to 10.
The signaling at 2D and 3D boundaries can comply with A / 104
In
In
The
In
The
In
The SMT can provide information about the service providing the 3D image.
The NRT-IT can provide the item of content item for constituting the above-mentioned service.
For SSC, SMT and NRT-IT, it can comply with ATSC A / 103: 2012 of reference [16].
Additional images can be multiplexed to use terrestrial networks.
In
The technical contents described above with reference to Figs. 1 to 11 can be applied as they are, so that a more detailed description will be omitted below.
FIG. 13 illustrates a system for providing a three-dimensional image according to an example.
The 3D content server of FIG. 13 may correspond to the
The
The
The
In one aspect, the
The technical contents described above with reference to Figs. 1 to 12 can be applied as they are, so a detailed description will be omitted below.
14A to 14D show a flow chart of a method in which a receiver receiving an additional image through an ATSC NRT network outputs a three-dimensional image.
The following
In
For SCHCNRT, the value of 3D_channel_type may be 0x05.
For SCHCNRT, the value of hybrid_delivery_protocol may be 0x00.
For SCHCNRT, the value of additionalview_availability_indicator may be 10.
The following steps 1424-1428 may be steps performed on the additional image.
In
In
The receiver can download the additional image file via the ATSC NRT network based on the acquired SMT and NRT-IT.
At
The receiver can decode the TS of the extracted additional image.
In
In FIG. 14D, the receiver may receive
Steps 1432 through 1434 may correspond to
The technical contents described above with reference to FIGS. 1 to 13 can be applied as they are, so that a more detailed description will be omitted below.
The apparatus described above may be implemented as a hardware component, a software component, and/or a combination of hardware and software components. For example, the apparatus and components described in the embodiments may be implemented within a computer system using, for example, a processor, a controller, an arithmetic logic unit (ALU), a digital signal processor, a microcomputer, a field programmable gate array (FPGA), a programmable logic unit (PLU), a microprocessor, or any other device capable of executing and responding to instructions. The processing device may execute an operating system (OS) and one or more software applications running on the operating system. The processing device may also access, store, manipulate, process, and generate data in response to execution of the software. For ease of understanding, the processing device may be described as being used singly, but those skilled in the art will recognize that it may comprise a plurality of processing elements and/or a plurality of types of processing elements. For example, the processing device may comprise a plurality of processors, or one processor and one controller. Other processing configurations, such as a parallel processor, are also possible.
The method according to an embodiment may be implemented in the form of program instructions that can be executed through various computer means and recorded in a computer-readable medium. The computer-readable medium may include program instructions, data files, data structures, and the like, alone or in combination. The program instructions recorded on the medium may be those specially designed and configured for the embodiments, or those known and available to those skilled in the art of computer software. Examples of computer-readable media include magnetic media such as hard disks, floppy disks, and magnetic tape; optical media such as CD-ROMs and DVDs; magneto-optical media such as floptical disks; and hardware devices specifically configured to store and execute program instructions, such as ROM, RAM, and flash memory. Examples of program instructions include machine language code, such as that produced by a compiler, as well as high-level language code that can be executed by a computer using an interpreter. The hardware devices described above may be configured to operate as one or more software modules to perform the operations of the embodiments, and vice versa.
While the present invention has been particularly shown and described with reference to exemplary embodiments thereof, the invention is not limited to the disclosed embodiments. For example, suitable results may be achieved even if the described techniques are performed in a different order than described, and/or if components of the described systems, structures, devices, and circuits are combined in a different form or replaced by other components or their equivalents.
600: Broadcasting device
610:
620: first transmission section
630:
640:
650:
Claims (20)
A first processor for encoding the reference image of the three-dimensional image and multiplexing the encoded reference image into a transport stream (TS);
A first transmitter for transmitting a reference image multiplexed with the TS to a receiver through a first network;
A second processor for encoding an additional image of the 3D image; And
A second transmitter for transmitting the encoded additional image to the receiver via a second network,
Wherein the first network is a terrestrial network,
Wherein the three-dimensional image is provided based on the reference image and the additional image.
Wherein the first network is an Advanced Television System Committee (ATSC) terrestrial network.
Wherein the second network is a broadband network.
Wherein the reference image and the additional image are transmitted to the receiver in real time so that the 3D image is provided in real time broadcasting.
Wherein the second transmitting unit transmits the entire data of the additional image to the receiver before the data of the reference image is transmitted to the receiver,
Wherein the 3D image is provided in non-real time.
Wherein the second transmission unit transmits some data of the additional image to the receiver before the data of the reference image is transmitted to the receiver, and transmits the remaining data of the additional image to the receiver while the reference image is transmitted to the receiver,
Wherein the 3D image is provided in non-real time.
A reference image multiplexed with the TS includes a first PTS (Presentation Time Stamp) indicating a reproduction time of the reference image,
Wherein the encoded additional image includes a second PTS indicating a reproduction time of the additional image,
Wherein the first PTS and the second PTS are used for synchronization of the reference image and the additional image.
Wherein the first processing unit multiplexes the encoded reference image into the TS based on the encoded reference image and the metadata related to the 3D image.
Wherein the meta data includes pairing information for synchronizing the reference image and the additional image.
Wherein the pairing information includes a Uniform Resource Identifier (URI) for providing the additional video,
And the additional image is transmitted to the receiver via the URI.
The second processing unit converts the encoded additional image into an MPEG-2 or MPEG-4 format,
And the second transmitting unit transmits the formatted additional image to the receiver.
A first processor for encoding the reference image of the three-dimensional image and multiplexing the encoded reference image into a transport stream (TS);
A first transmitter for transmitting a reference image multiplexed with the TS to a receiver through a first network in real time;
A second processor for encoding the additional image of the 3D image and multiplexing the encoded additional image; And
And a second transmitter for transmitting the multiplexed additional image to the receiver through a second network in non-real time,
Wherein the first network is a terrestrial network,
Wherein the three-dimensional image is provided based on the reference image and the additional image.
The first network is an Advanced Television System Committee (ATSC) terrestrial network,
And the second network is an ATSC NRT network.
A reference image multiplexed with the TS includes a first PTS (Presentation Time Stamp) indicating a reproduction time of the reference image,
Wherein the multiplexed additional image includes a second PTS indicating a reproduction time of the additional image,
Wherein the first PTS and the second PTS are used for synchronization of the reference image and the additional image.
Wherein playback information for the 3D image is provided based on the first PTS.
Wherein the first processing unit multiplexes the encoded reference image into a TS based on the encoded reference image and the metadata related to the 3D image.
The second processing unit converts the encoded additional image into an MPEG-2 or MPEG-4 format,
And the second processing unit multiplexes the formatted additional images.
The second processing unit may signal the additional image to transmit the multiplexed additional image to the receiver,
The signaling uses an SSC (Service Signaling Channel) to transmit a Service Map Table (SMT) and a Non-Real-Time Information Table (NRT-IT)
Wherein the SMT provides information about a service providing the 3D image,
Wherein the NRT-IT provides content item information for configuring the service.
Encoding a reference image of the 3D image;
Multiplexing the encoded reference image into a transport stream (TS);
Transmitting a reference image multiplexed with the TS to a receiver via a first network, the first network being a terrestrial network;
Encoding an additional image of the 3D image; And
Transmitting the encoded additional image to the receiver over a second network,
Wherein the three-dimensional image is provided based on the reference image and the additional image.
Encoding a reference image of the 3D image;
Multiplexing the encoded reference image into a transport stream (TS);
Transmitting a reference image multiplexed with the TS to a receiver through a first network in real time;
Encoding an additional image of the 3D image;
Multiplexing the encoded additional image; And
Transmitting the encoded additional image in non-real time to the receiver over a second network,
Wherein the three-dimensional image is provided based on the multiplexed reference image and the multiplexed additional image.
Priority Applications (1)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
US14/326,262 US20150009289A1 (en) | 2013-07-08 | 2014-07-08 | Method and apparatus for providing three-dimensional (3d) video |
Applications Claiming Priority (4)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
US201361843633P | 2013-07-08 | 2013-07-08 | |
US61/843,633 | 2013-07-08 | ||
US201361844676P | 2013-07-10 | 2013-07-10 | |
US61/844,676 | 2013-07-10 |
Publications (1)
Publication Number | Publication Date |
---|---|
KR20150006340A true KR20150006340A (en) | 2015-01-16 |
Family
ID=52569768
Family Applications (1)
Application Number | Title | Priority Date | Filing Date |
---|---|---|---|
KR1020140057614A KR20150006340A (en) | 2013-07-08 | 2014-05-14 | Method and apparatus for providing three-dimensional video |
Country Status (1)
Country | Link |
---|---|
KR (1) | KR20150006340A (en) |
Cited By (1)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
KR20160094000A (en) * | 2015-01-30 | 2016-08-09 | (주)아이피티브이코리아 | Apparatus for receiving network independent type convergence broadcast |
Legal Events
Date | Code | Title | Description |
---|---|---|---|
WITN | Withdrawal due to no request for examination |