KR20130056829A - Transmitter/receiver for 3DTV broadcasting, and method for controlling the same

Info

Publication number
KR20130056829A
Authority
KR
South Korea
Prior art keywords
additional
video
information
service
metadata
Prior art date
Application number
KR1020120132306A
Other languages
Korean (ko)
Inventor
권형진
윤국진
임현정
이진영
이광순
정원식
허남호
Original Assignee
한국전자통신연구원
Priority date
Filing date
Publication date
Application filed by 한국전자통신연구원
Priority to PCT/KR2012/009872 (WO2013077629A1)
Publication of KR20130056829A

Classifications

    • H: ELECTRICITY
    • H04: ELECTRIC COMMUNICATION TECHNIQUE
    • H04N: PICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N13/00: Stereoscopic video systems; Multi-view video systems; Details thereof
    • H04N13/10: Processing, recording or transmission of stereoscopic or multi-view image signals
    • H04N13/194: Transmission of image signals
    • H04N21/00: Selective content distribution, e.g. interactive television or video on demand [VOD]
    • H04N21/20: Servers specifically adapted for the distribution of content, e.g. VOD servers; Operations thereof
    • H04N21/23: Processing of content or additional data; Elementary server operations; Server middleware
    • H04N21/236: Assembling of a multiplex stream, e.g. transport stream, by combining a video stream with other content or additional data, e.g. inserting a URL [Uniform Resource Locator] into a video stream, multiplexing software data into a video stream; Remultiplexing of multiplex streams; Insertion of stuffing bits into the multiplex stream, e.g. to obtain a constant bit-rate; Assembling of a packetised elementary stream
    • H04N21/23614: Multiplexing of additional data and video streams
    • H04N21/60: Network structure or processes for video distribution between server and client or between remote clients; Control signalling between clients, server and network components; Transmission of management data between server and client, e.g. sending from server to client commands for recording incoming content stream; Communication details between server and client
    • H04N21/63: Control signaling related to video distribution between client, server and network components; Network processes for video distribution between server and clients or between remote clients, e.g. transmitting basic layer and enhancement layers over different transmission paths, setting up a peer-to-peer communication via Internet between remote STB's; Communication protocols; Addressing

Landscapes

  • Engineering & Computer Science (AREA)
  • Multimedia (AREA)
  • Signal Processing (AREA)
  • Testing, Inspecting, Measuring Of Stereoscopic Televisions And Televisions (AREA)

Abstract

Disclosed is a broadcast-program-linked 3D video service for digital broadcasting. In particular, a signaling method and a synchronization method for providing a 3D service by interworking the broadcast basic content with additional content transmitted through a network capable of bidirectional communication are specifically defined.


Description

Transceiver and control method for 3DTV broadcasting {TRANSMITTER / RECEIVER FOR 3DTV BROADCASTING, AND METHOD FOR CONTROLLING THE SAME}

The present invention relates to a 3DTV service, and more particularly, to a transmission and reception apparatus for 3DTV broadcasting and a control method thereof.

Regarding the current state of 3D technology, there is increasing demand for video objects that convey a sense of depth, or for services that give the viewer freedom of viewpoint so that scenes appear more lifelike.

Meanwhile, transmitting 3D content over a broadcasting network requires more bandwidth than a conventional 2D AV program. As one method for transmitting such a large amount of 3D content, a technique has been discussed in which the 3D data to be added to an existing two-dimensional AV program is transmitted in advance. For example, the 3D data may be transmitted over extra bandwidth on a channel different from the existing transport channel, and the receiver may store the transmitted 3D data and then provide the 3D service in synchronization with the real-time broadcast stream.

However, the above-mentioned method currently being discussed has the following problems.

First, in order to link the 3D additional content with a real-time AV program, the 3D additional content must be transmitted in advance, and the receiver has to store the entire files.

Furthermore, even when only the portion of the 3D additional content after the current point in time needs to be downloaded in order to receive the 3D linked service after the real-time program has started, the additional 3D content is a single file, so the entire file must be downloaded.

In view of the above-described problems, the present invention proposes a solution for providing 3D service in conjunction with an existing real-time broadcast stream even in a scenario in which additional content is transmitted in a streaming format as well as a file format. Furthermore, a signaling method for interworking two contents transmitted through different channels and a synchronization method for synchronizing the two contents on a frame basis will be described.

One embodiment of the present invention provides the additional content that requires substantial bandwidth for the 3DTV service by transmitting the data needed in addition to the existing AV program through a network capable of bidirectional communication rather than the broadcasting network, so that the 3D service can be provided upon request.

In addition, another embodiment of the present invention provides the additional data for the 3D service through a bidirectional communication network not only as a file but also in the form of streaming content. For example, it is intended to provide a 3D broadcast-linked service by receiving only the necessary data through random access into the additional content. As one way to achieve this, the interworking relationship between the two video contents is additionally specified in order to link two contents that are transmitted through different channels and whose reception times differ.

According to yet another embodiment of the present invention, when a viewer joins after a real-time program has started, the receiver detects which part of the whole program is currently being presented, finds the corresponding position in the additional streaming content, and synchronizes the frames of the two contents, thereby providing a linked service.

According to an embodiment of the present invention, a digital transmission apparatus for providing a 3D service using a plurality of different networks includes: a first encoder for encoding a reference video into a preset first format; a multiplexer for multiplexing metadata and the encoded reference video; a first transmission module for transmitting the multiplexed data to a digital receiver through a unidirectional first network; a second encoder for encoding an additional video into a preset second format; and a second transmission module for transmitting the encoded additional video to the digital receiver through a bidirectional second network, wherein the metadata corresponds to the information needed to process the additional video transmitted through the bidirectional second network.

According to another embodiment of the present invention, a digital transmission apparatus for providing a 3D service using a plurality of different networks includes: a first encoder for encoding a reference video into a preset first format; a multiplexer for multiplexing the encoded reference video and media pairing information for synchronization of an additional video with the reference video (the media pairing information selectively defining a time code or a frame number); a first transmission module for transmitting the multiplexed data to a digital receiver through a unidirectional first network; a second encoder for encoding the additional video into a preset second format; and a second transmission module for transmitting the encoded additional video to the digital receiver through a bidirectional second network.

A digital receiving apparatus for providing a 3D service using a plurality of different networks according to an embodiment of the present invention may include: a first communication interface module for receiving, through a unidirectional first network, metadata for an additional video and an encoded reference video (the reference video being encoded in a first format); a demultiplexer for demultiplexing the received metadata and the encoded reference video; a decoder for decoding the encoded reference video; a second communication interface module for receiving the additional video using the metadata (the metadata including information necessary for processing the additional video transmitted through a bidirectional second network); and a controller for providing the 3D service based on the processed reference video and additional video.

A digital receiving apparatus for providing a 3D service using a plurality of different networks according to another embodiment of the present invention may include: a first communication interface module for receiving, through a unidirectional first network, an encoded reference video and media pairing information for synchronizing an additional video with the reference video (the media pairing information being included in a packetized elementary stream (PES) and selectively defining a time code or a frame number); a demultiplexer for demultiplexing the media pairing information and the encoded reference video; a PES parsing module for depacketizing the PES including the media pairing information; a decoder for decoding the encoded reference video; a second communication interface module for receiving the additional video through a bidirectional second network; and a controller for providing a 3D service based on the obtained media pairing information, the decoded reference video, and the additional video.

According to an embodiment of the present invention, the additional content that requires substantial bandwidth for the 3DTV service is provided by transmitting the data additionally required beyond the existing AV program through a network capable of bidirectional communication rather than the broadcasting network, so that a 3D service can be provided when the user requests it.

In addition, according to another embodiment of the present invention, transmitting the additional content for the 3D service by streaming eliminates the delay caused by download time compared to the download method, which is advantageous for delay-sensitive services such as AV services. Furthermore, since it is sufficient to stream and play only the part of the content after the time at which the viewer accesses the 3D service, unnecessary parts need not be downloaded.

Furthermore, according to another embodiment of the present invention, in a hybrid network environment using both a terrestrial broadcasting network and a two-way communication network (e.g., the Internet), a reference relationship and synchronization information between the two contents are added, so that instead of two stand-alone services, both networks are used to provide a linked service in which the two contents are synchronized frame to frame. Accordingly, to configure the 3D video, a specific signaling technique and a synchronization technique are defined for providing a 3D service by interworking the base view video of a real-time broadcast with an additional view video stream transmitted on a channel different from that of the base view.

1 is a block diagram illustrating a method of interlocking contents of a 3DTV service according to an exemplary embodiment of the present invention.
2 is a block diagram of a transmitting device for a 3DTV hybrid linked service according to an embodiment of the present invention.
3 defines the syntax of a linkage information descriptor according to an embodiment of the present invention.
4 defines syntax of pairing information according to an embodiment of the present invention.
5 is a block diagram of a receiving apparatus for a 3DTV hybrid linked service according to an embodiment of the present invention.
6 is a diagram illustrating the transmission apparatus shown in FIG. 2 in more detail.
FIG. 7 illustrates the transmission apparatus illustrated in FIG. 2 or 6 when a real-time broadcasting service is provided among 3DTV hybrid services.
FIG. 8 illustrates the transmission apparatus illustrated in FIG. 2 or 6 when a download broadcast service is provided among 3DTV hybrid services.
FIG. 9 illustrates the transmission apparatus illustrated in FIG. 2 or 6 when a VOD service is provided among 3DTV hybrid services.
10 is a diagram illustrating the receiving apparatus illustrated in FIG. 5 more briefly.

1 is a block diagram illustrating a method of interlocking contents of a 3DTV service according to an exemplary embodiment of the present invention.

As shown in FIG. 1, the reference video is transmitted as a real-time broadcast program while, due to bandwidth limitations, the additional video is transmitted in streaming form through a separate broadcast network or communication network; therefore, synchronization is required in order to link the two contents (base video and additional video).

Therefore, in order to synchronize the two contents, synchronization information that applies commonly to the base video and the additional video should be included. For example, if there is additional content (additional video, additional view) linked to the base content (base video, base view), the synchronization information is generated by the metadata generator 102 starting from the encoding start point of the reference content. Any synchronization information sufficient to synchronize the two contents may be used. Further, as shown in FIG. 1, the basic video (basic content) is encoded by the basic content encoder 101, and the broadcast stream multiplexer 103 multiplexes the encoded elementary stream and the metadata stream.

For example, the synchronization information may be a frame number indicating a specific frame position within the content, or a time code based on a certain time unit. Furthermore, the synchronization information may be included in, for example, the MPEG-2 video stream, or in a private data section of the MPEG-2 PES header, or it may be defined as a separate new stream and transmitted as packets with their own PID.

In addition, the period at which the synchronization information is generated is an integer multiple of the coding unit (access unit, AU) of the reference content (basic content). Furthermore, regardless of which of the three transmission methods exemplified above carries the synchronization information, the PES packet containing the synchronization information is assigned the same PTS (presentation time stamp) value as the PES of the AU that encodes the frame being referenced.
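
As a non-limiting illustration of this PTS alignment, the following Python sketch builds a sync-metadata record that reuses the PTS of the referenced video access unit; the record layout and field names are assumptions made for illustration, not the syntax defined by this specification.

    from dataclasses import dataclass

    @dataclass
    class SyncMetadataRecord:
        """Hypothetical sync-metadata record carried in its own PES packet."""
        pts: int              # 33-bit PTS, copied from the referenced video AU's PES
        content_index_id: int
        frame_number: int     # or a time code, depending on the sync_information type

    def build_sync_record(video_au_pts, content_index_id, frame_number):
        # Reusing the PTS of the video access unit lets the receiver pair the
        # metadata with the reference frame simply by matching PTS values.
        return SyncMetadataRecord(pts=video_au_pts & 0x1FFFFFFFF,
                                  content_index_id=content_index_id,
                                  frame_number=frame_number)

    # Example: the AU for frame 1500 of the base video has PTS 900000 (90 kHz units).
    record = build_sync_record(video_au_pts=900000, content_index_id=1, frame_number=1500)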

Meanwhile, in the transmission process of the additional content illustrated in FIG. 1, synchronization information is generated similarly to the above description. Of course, the value of the synchronization information should be the same as the value generated for the corresponding frame of the reference content (basic content). Further, as shown in FIG. 1, the additional video (additional content) is encoded by the additional content encoder 104, the synchronization information is generated by the metadata generator 105, and the multiplexing and streaming generator 106 multiplexes the encoded elementary stream and the metadata stream.

The synchronization information may not be necessary when the additional content is transmitted and stored as a file in advance of the real-time reference content, but it is required when the additional content is transmitted in streaming form. This is because, when the streaming content is broadcast or multicast through a communication network, the reception time of the linked content differs from that of the broadcast stream containing the reference content. In particular, as shown in FIG. 1, synchronization information is necessary to find the starting position within the associated additional content when several programs are streamed continuously rather than a single program.

This design has the added advantage of not requiring unicast on the network side. That is, compared to unicasting the streaming content so that each terminal (receiver) has its own starting position, the initial entry delay can be reduced, and overall network traffic is reduced because streams need not be transmitted point-to-point.

In addition, when the elementary stream and the metadata stream of the additional content are multiplexed using MPEG-2 TS, the PTS is referenced in the same manner as in the multiplexing of the reference content. Of course, the scope of the present invention is not limited to the above.

The multiplexing illustrated above may vary depending on the streaming protocol used.

For example, when the RTP protocol is used, the time stamp that the protocol uses to synchronize audio and video can also be used to synchronize the metadata and the additional content. However, multiplexing may provide the synchronization of metadata and additional content in a streaming service, or the metadata stream may instead be transmitted at a URL separate from the additional content stream.

2 is a block diagram of a transmitting device for a 3DTV hybrid linked service according to an embodiment of the present invention.

As shown in FIG. 2, the broadcast content is composed of basic content delivered through a broadcasting network and additional content delivered through a different network from a channel through which the basic content is transmitted.

The basic content is generated in the content server 201 and encoded by the basic content encoder 202. The signaling information and metadata generated by the signaling information and metadata generator 203 are transmitted to the broadcast stream multiplexer 204. The broadcast stream multiplexer 204 multiplexes at least one of the encoded basic content, the signaling information, and the metadata. The multiplexed data is broadcast over the broadcasting network 206 through the transmitter 205.

At this time, reference information (e.g., a streaming content access URI) for the additional content linked to the program containing the basic content, and synchronization information for linking the two contents, are also broadcast to the receiver as part of the program metadata.

The additional content generated by the content server 201 is encoded by the additional content encoder 207, and the metadata generator 208 generates metadata. The multiplexing and streaming generator 209 multiplexes the encoded additional content and metadata, and converts the encoded additional content and metadata into information for streaming. The protocol processor 210 packetizes the converted information in accordance with a preset protocol and transmits the packetized information to the bidirectional communication network 211.

The metadata includes, for example, information for the coordinated service (access information on the basic content from within the additional content, synchronization information identical to that of the basic content, and the like). In addition, the additional content may be delivered to the digital receiver in a push or pull manner according to the service type. For example, additional content synchronized to a real-time broadcasting service may be delivered in push form, as a multicast, to receivers subscribed to the 3DTV linked service. On the other hand, in a service that provides different content to each receiver, the additional content may be provided in pull form in response to a viewer's request. The push or pull method described above is intended to help the understanding of the present invention and does not limit the scope of the rights.

Meanwhile, the base content (reference content, base video, base view, base stream) used in the present specification may correspond to content transmitted through a broadcasting network and reproduced by a legacy broadcast receiver.

Furthermore, additional content (additional video, additional view, additional stream) may correspond to content transmitted in conjunction with the basic content and transmitted through another channel or network. Of course, the meaning should be construed to the extent understood by those skilled in the art throughout the specification and claims.

The basic content includes, for example, multimedia data such as still images, moving images, and CG, texts, and the like, and each data is prepared in the content server 201 and delivered to the basic content encoder 202 in accordance with a delivery schedule. Each data is encoded in a predetermined manner and output in the form of an elementary stream. Such elementary streams form one program and a plurality of programs are formed and transmitted as one transport stream through a multiplexer.

In order to find the streams related to a program desired by a viewer in the transport stream, data containing the type and configuration information of each program provided by each service must be transmitted. Such data is transmitted to the receiver in the form of tables separate from the above-described transport stream. In the present invention, such data is generated by the signaling information and metadata generating unit 203, and newly defined metadata about the program is included therein, containing access information for the additional content for the 3DTV service and interworking reproduction information (synchronization information, etc.) with the additional content.

First, the access information for the additional content means, assuming an IP network, a URL (Uniform Resource Locator) or URI (Uniform Resource Identifier) serving as the initial access information required for the receiver to receive the additional content. The access information may be the location of the streaming server and content directly, or the location of a meta file containing the information for setting the parameters for content streaming.

Further, the interworking reproduction information between the basic content and the additional content is the information needed to synchronize and play the two contents on a frame-by-frame basis; it is configured to specify the position of the access unit (AU) to be played. To implement this, at least one of a frame number or a time code from which the frame number can be calculated may be used. As another embodiment, instead of being carried in the output of the signaling information and metadata generator 203, the coordinated reproduction information may be inserted into the header of the packetized elementary stream when the elementary stream is multiplexed.
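
By way of illustration, a frame number can be derived from a time code when the frame rate is known. The following minimal Python sketch assumes a fixed (non-drop-frame) frame rate; the function name and parameters are illustrative only.

    # Derive a frame number from an SMPTE-style time code, assuming a fixed
    # (non-drop-frame) frame rate.
    def frame_number_from_timecode(hours, minutes, seconds, frames, fps=30):
        return ((hours * 3600 + minutes * 60 + seconds) * fps) + frames

    # Example: time code 00:10:05:12 at 30 fps identifies frame 18162.
    assert frame_number_from_timecode(0, 10, 5, 12, fps=30) == 18162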

The broadcast stream multiplexer 204 multiplexes encoded content data streams and tables containing service related information. In this case, when metadata including synchronization information is input as a separate elementary stream, the same PTS as the PES encoding the AU of the referenced basic content should be added. Finally, the transmission unit 205 transmits through a broadcast channel through channel encoding, modulation, and the like.

Unlike the basic content transmitted through the broadcasting network 206, the additional content is delivered in a file or streaming form. The additional content is also transmitted from the content server 201 to the additional content encoder 207 according to a service schedule, encoded, and output to the multiplexing and streaming generator 209. The output is in the form of a stream such as TS or wrapped in a specific file format. In addition, the encoder 207 may output a plurality of streams according to an encoding rate in order to improve reproduction stability of a receiver connected to the bidirectional communication network 211.

The metadata generator 208 defines, as new metadata, the access information and the coordinated reproduction information needed to access the broadcast program linked from the additional content. The multiplexing and streaming generator 209 receives the encoded additional content together with the metadata for interworking and, as described above with reference to FIG. 1, adds a PTS (or a timestamp serving the same function) so that the two streams can be synchronized, multiplexes them, converts the result into information for streaming, and generates the payloads of the streaming packets.

The protocol processor 210 is in charge of processing a protocol used for streaming, that is, generates and transmits a packet according to a protocol stack, or receives and interprets a packet.

On the other hand, when the protocol processor 210 receives a streaming request from a client (receiver), it receives stream data from the multiplexing and streaming generator 209 and generates and transmits packets. In addition, when a control packet indicating an error in a transmitted packet is received, or such an error is detected automatically, the corresponding packet is retransmitted.

Furthermore, the main transmission operation of the streaming server will be described as follows.

First, in response to a streaming request from a client, data generated by the streaming generator 209 is transmitted to set up the streaming environment. Depending on the protocol, if separate logical channels are set up for data and control, this exchange takes place on the control channel.

Second, once the environment parameters and encoding rate are set, streaming packets are transmitted. At this time, the metadata, which is the related information, is transmitted before the encoded data of the additional content, and then the media data is transmitted. The metadata includes not only the information necessary for playing the content at the receiver but also the related information for the interworking service added by the present invention. The encoded data stream is transmitted from the point at which the client requested playback of the content. If there is a retransmission request for either the metadata or the additional content data, the data is retransmitted.
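
A rough sketch of this transmission order (related metadata first, then media data from the requested position) is shown below in Python; the server objects, the send callable, and the frame-indexed media list are assumptions for illustration, not the actual server interfaces.

    def serve_pull_request(metadata_packets, media_frames, start_frame, send):
        # metadata_packets: iterable of bytes holding the interworking metadata
        # media_frames: list of (frame_number, encoded_payload) pairs in order
        # send: whatever transport callable the streaming protocol provides
        for pkt in metadata_packets:            # related information goes out first
            send(pkt)
        for frame_number, payload in media_frames:
            if frame_number >= start_frame:     # stream only from the requested position
                send(payload)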

Third, when a request for a stream with a different encoding rate is received from the client, the server switches to that stream and transmits it. Depending on the protocol, it is also possible to divide the encoded data stream into base and enhancement layers and transmit them over different ports (logical channels), particularly to support clients with various bandwidths in push services.

In the present invention, in order to provide 3DTV service, the linked content information and the synchronization information for accessing the linked content are conceptually required. Therefore, as an embodiment for expressing this, the former is described in the form of a reference information descriptor (see FIG. 3 below), and the latter is described in a synchronization information structure (see FIG. 4 below). However, the presentation form is not necessarily limited to this and can be flexibly changed depending on the application.

First, the linked content information is divided, according to what it indicates, into access information for the additional content and access information for the basic content. The former can be located inside the Virtual Channel Table (VCT) or Event Information Table (EIT) of the Program and System Information Protocol (PSIP) of a real-time broadcast stream, or inside the Program Map Table (PMT) of the MPEG-2 transport stream (TS) Program Specific Information (PSI). As for the latter, when the additional content is transmitted as an MPEG-2 TS, the access information for the basic content can be placed in the same positions where the access information for the additional content can be placed, and a design in which it is placed elsewhere is also within the scope of the present invention.

3 defines the syntax of a linkage information descriptor according to an embodiment of the present invention.

The reference information descriptor includes at least one of the number of contents to be linked included in the descriptor, URL information of the contents to be linked, and the type of contents to be linked. The content may be an elementary stream in the form of a file or streaming.

The descriptor_tag of FIG. 3 identifies that the descriptor is a reference information descriptor of the linked content. descriptor_length represents the length of the corresponding descriptor. linkage_content_number represents the number of content to be linked included in the descriptor. wakeup_time represents the start time of the linked content.

Furthermore, URI information (linkage_content_URI) for accessing content linked through a streaming service is required. Since the URI information has a variable length, the URI information includes the length of the URI information of the content to be linked through linkage_content_URI_length and indicates the URI information of the content to be linked with the corresponding stream with linkage_content_URI. URI information can be in various forms depending on the protocol used. Some examples are: In case of HTTP streaming, it is a server address and content name. In case of multicast, it is a multicast IP address. In case of RTP streaming, it is an IP address and a port number of a streaming server. In addition, if metadata including synchronization information is delivered in a separate stream from the additional content, URI information (linkage_metadata_URL) is additionally required to access the metadata.

The linkage_content_type illustrated in FIG. 3 means a type of additional content to be linked with the basic content. In the present specification, the additional content to be used for the 3DTV service is defined as, for example, an MPEG-2 TS type. Since the additional content is composed of an elementary stream in the TS, the additional content also includes the ts_id and elementary_pid of the TS including the interlocked content to indicate its access. In addition, in consideration of the variety of content formats for streaming additional content in the future, it is configured as a field that can be extended to the type of content. In addition, in the present invention, the embodiment mainly describes the case of stereoscopic video, but may be extended to multiview video or other types of linked services.
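
To make the descriptor concrete, the following Python sketch serializes a linkage information descriptor with the fields discussed above. The tag value, field widths, and type codes are assumptions for illustration; the actual bit layout is the one defined by FIG. 3.

    import struct

    LINKAGE_INFO_DESCRIPTOR_TAG = 0xA0        # placeholder tag value, not from the spec

    def build_linkage_info_descriptor(contents):
        # contents: list of dicts describing each piece of linked content
        body = bytearray()
        body.append(len(contents))                           # linkage_content_number
        for c in contents:
            uri = c["linkage_content_URI"].encode("utf-8")
            body += struct.pack(">I", c["wakeup_time"])      # start time of linked content
            body.append(c["linkage_content_type"])           # e.g. MPEG-2 TS based stream
            body.append(len(uri))                            # linkage_content_URI_length
            body += uri                                      # linkage_content_URI
            if c["linkage_content_type"] == 0x01:            # assumed code: content in a TS
                body += struct.pack(">HH", c["ts_id"], c["elementary_pid"])
        return bytes([LINKAGE_INFO_DESCRIPTOR_TAG, len(body)]) + bytes(body)

    descriptor = build_linkage_info_descriptor([{
        "wakeup_time": 0,
        "linkage_content_type": 0x01,
        "linkage_content_URI": "http://example.com/3dtv/additional.ts",
        "ts_id": 1,
        "elementary_pid": 0x1FF,
    }])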

Here, the descriptor of FIG. 3 may also be used to refer to the basic content from within the MPEG-2 TS containing the additional content. In this case, the descriptor carries information for accessing the basic content instead of the additional content.

Although not shown in FIG. 3, it is also within the scope of the present invention to design to further include data indicating codec information of an additional video file or stream. It is a design considering the case in which the base view video and the additional view video are compressed using different encoding methods, and thus have a technical effect of improving the efficiency of data processing and reducing the possibility of error.

4 defines syntax of pairing information according to an embodiment of the present invention. The data structure shown in FIG. 4 may correspond to the synchronization information (pairing information) described with the previous drawings. Of course, the scope of the present invention is not limited to including all the fields defined in FIG. 4, and addition/modification/deletion of some fields according to the needs of those skilled in the art is also included in the scope of the present invention.

The identifier field shown in FIG. 4 means that timing information is included after the synchronization information identifier. In the linkage information descriptor of FIG. 3, the linked content exists as much as the value of linkage_content_number, and in order to distinguish each of the contents, the value of content_index_id is used in the sync information (pairing information) of FIG. 4.

In the for loop below the linkage_content_number field of the descriptor illustrated in FIG. 3, content_index_id starts at 1, and this value is incremented by 1 on each iteration of the loop.

Finally, sync_information illustrated in FIG. 4 selectively defines a frame number or time code value as a kind of marker for finding the playback point at which the basic content is linked with the additional content.
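
The following Python sketch shows one way such pairing information could be packed and parsed as bytes. The identifier value, the field widths, and the type codes are illustrative assumptions, not the layout defined by FIG. 4.

    import struct

    PAIRING_IDENTIFIER = 0x50        # hypothetical value signalling that timing info follows
    SYNC_TYPE_FRAME_NUMBER = 0
    SYNC_TYPE_TIME_CODE = 1

    def build_pairing_info(content_index_id, sync_type, sync_value):
        # identifier (1 byte), content_index_id (1 byte), sync type (1 byte),
        # sync_information (4 bytes: a frame number, or a packed time code)
        return struct.pack(">BBBI", PAIRING_IDENTIFIER, content_index_id,
                           sync_type, sync_value)

    def parse_pairing_info(buf):
        identifier, content_index_id, sync_type, sync_value = struct.unpack(">BBBI", buf[:7])
        assert identifier == PAIRING_IDENTIFIER
        return content_index_id, sync_type, sync_value

    blob = build_pairing_info(content_index_id=1,
                              sync_type=SYNC_TYPE_FRAME_NUMBER, sync_value=18162)
    assert parse_pairing_info(blob) == (1, SYNC_TYPE_FRAME_NUMBER, 18162)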

5 is a block diagram of a receiving apparatus for a 3DTV hybrid linked service according to an embodiment of the present invention.

As shown in FIG. 5, the receiving device consists of a part (first part) for receiving the basic content delivered through the broadcasting network, a part (second part) for requesting and receiving the additional content delivered through a channel or network different from the broadcasting network, and a part (third part) for providing a service by interworking the two contents.

First, the part (first part) for receiving basic content will be described.

The reception processor 501 shown in FIG. 5 performs RF signal reception, demodulation, channel decoding, and the like, and simultaneously extracts a service selected by a user from a multiplexed stream. In addition, the basic content received through the reception processor 501 is decoded in a predetermined manner by the basic content decoder 508.

The signaling information and metadata analyzing unit 502 determines whether there is a service linked with the broadcast program, and delivers the described coordinated service and additional content information to the service manager 503. The service manager 503 initiates connection with a transmission device for obtaining additional content so as to configure a broadcast program linked service (3DTV service), and manages a service configured through the content if the content is available. Meanwhile, configuration information of the service may be obtained from metadata or signaling information transmitted to a broadcasting network or a communication network.

The coordinated reproduction information of the additional content may be obtained from the signaling information and metadata analyzing unit 502 or the receiving processing unit 501, and this information is transmitted to the synchronization unit 504.

Next, a part (second part) for requesting and receiving additional content will be described. The protocol processor 505 generates the additional content request in the form of a packet using the additional content access information received from the service manager 503 and transmits the additional content request to the streaming server.

For access to the additional content, the receiver can connect to the streaming server using linkage_content_URI, linkage_content_type, and the like, and, when the additional content is multicast from the streaming server, it can exchange the messages needed to join the group carrying the additional content.
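
As a minimal sketch of the multicast case, the Python code below joins a group carrying the additional content, assuming that linkage_content_URI resolved to an IPv4 multicast address and UDP port; the address and port used in the example are placeholders.

    import socket
    import struct

    def join_additional_content_group(group_ip, port):
        sock = socket.socket(socket.AF_INET, socket.SOCK_DGRAM, socket.IPPROTO_UDP)
        sock.setsockopt(socket.SOL_SOCKET, socket.SO_REUSEADDR, 1)
        sock.bind(("", port))
        # IGMP join: tell the kernel we are interested in this multicast group.
        mreq = struct.pack("4s4s", socket.inet_aton(group_ip), socket.inet_aton("0.0.0.0"))
        sock.setsockopt(socket.IPPROTO_IP, socket.IP_ADD_MEMBERSHIP, mreq)
        return sock

    # e.g. sock = join_additional_content_group("239.1.1.1", 5004)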

In addition, the protocol processor 505 may transmit control information requesting retransmission when a received packet has an error. The payload of a packet without errors is delivered to the streaming recovery unit 506. The streaming recovery unit 506 checks and sets the parameters for decoding and playing the additional content to create the streaming and playback environment. In addition, depending on the protocol, the encoding rate may be determined from the metadata for setting the streaming environment sent by the server, or information may be transmitted to the server so that the server determines the encoding rate. Thereafter, the received content data stream is transferred to the additional content decoder 507.

Among the received metadata, the information for the linked service and the access information of the broadcast program are transmitted to the service manager 503, and the coordinated reproduction information is transmitted to the synchronizer 504. In addition, the streaming recovery unit 506 may determine that the reception of the content data stream is delayed or is received at a too low rate due to a change in network traffic, and may request an encoded stream having a different data rate from the streaming server according to the traffic. This can be determined by the input rate and playback rate of the buffer.
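
The buffer-based rate decision described above can be illustrated with the toy Python function below, which compares the rate at which data enters the buffer with the rate at which it is consumed for playback; the threshold logic and the set of available rates are assumptions, not values from this specification.

    def choose_encoding_rate(input_rate_bps, playback_rate_bps, available_rates_bps):
        # If the network keeps up with playback, keep the highest-rate encoding.
        if input_rate_bps >= playback_rate_bps:
            return max(available_rates_bps)
        # Otherwise pick the highest encoding rate the measured input rate can sustain.
        sustainable = [r for r in available_rates_bps if r <= input_rate_bps]
        return max(sustainable) if sustainable else min(available_rates_bps)

    # e.g. choose_encoding_rate(3.0e6, 8.0e6, [2.0e6, 4.0e6, 8.0e6]) returns 2.0e6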

Finally, the part (third part) providing a service by interworking the two contents will be described.

For example, in the process of implementing the linked service for 3D broadcasting, the basic content constituting the broadcast program is reproduced in real time in the same manner as the operation in the legacy receiver, and additionally, additional content is played in association.

The synchronizer 504 searches for the corresponding frame in order to synchronize and reproduce, on a frame-by-frame basis, the additional content linked to the basic content. To do this, it finds the point at which both the content_index_id and the sync_information carried in the synchronization information (e.g., FIG. 3 or FIG. 4) multiplexed with each of the two contents match.

If the reception times of the two contents differ, the faster side is buffered so that the screen can be played in 3D once the sync_information values match. When a pull-type streaming method is used, however, the frame position of the additional content synchronized to the basic content may be obtained from the synchronization information (for example, FIG. 3 or FIG. 4), and the server may be requested to stream from that position. In this case, the synchronization unit 504 sends a request control signal to the streaming recovery unit 506.
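
A simplified Python sketch of this frame matching is given below: frames from both paths are buffered and a pair is released only when their (content_index_id, sync_information) keys agree, so the earlier-arriving side simply waits in its queue. The queue contents and interfaces are illustrative, not the receiver's actual internal structures.

    from collections import deque

    def pair_frames(base_queue, additional_queue):
        # Each queue holds (content_index_id, sync_information, frame_data) tuples
        # in arrival order. Yields (base_frame, additional_frame) pairs whose keys match.
        pending = {}
        while base_queue or additional_queue:
            for queue, side in ((base_queue, "base"), (additional_queue, "additional")):
                if not queue:
                    continue
                cid, sync, frame = queue.popleft()
                key = (cid, sync)
                buffered = pending.pop(key, None)
                if buffered is None:
                    pending[key] = (side, frame)          # buffer the faster side
                elif side == "base":
                    yield frame, buffered[1]
                else:
                    yield buffered[1], frame

    # e.g. list(pair_frames(deque([(1, 100, "L100")]), deque([(1, 100, "R100")])))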

The renderer 509 synthesizes the two images decoded by the additional content decoder 507 and the basic content decoder 508, using the pairing information and synchronization information received from the synchronization unit 504, and renders and outputs the data for the 3D service.

Meanwhile, referring to FIG. 1 to FIG. 5 described above, the digital transmission apparatus providing the 3D service using a plurality of different networks will be described.

First, the transmission device encodes a reference video using the first encoder. The first encoder may correspond to the basic content encoder 202 shown in FIG. 2.

The transmitting apparatus multiplexes the metadata for the additional video and the encoded reference video. This may be performed by the broadcast stream multiplexer 204 shown in FIG. 2.

The transmitting device transmits the multiplexed data to the digital receiver through a unidirectional first network (for example, the broadcasting network 206 or the terrestrial channel shown in FIG. 2). This may be performed by the transmitter 205 shown in FIG. 2.

The transmission device encodes the additional video using a second encoder. The second encoder may correspond to the additional content encoder 207 shown in FIG. 2.

The transmitting device transmits the encoded additional video to the digital receiver through a bidirectional second network (for example, the bidirectional communication network 211 or the IP network shown in FIG. 2). This may be performed by the protocol processor 210 shown in FIG. 2.

The metadata may include, for example, first information indicating a number associated with the additional video, second information identifying the type (file or stream) of the additional video, third information indicating a URI length for a service associated with the additional video, and fourth information defining the URI bytes for a service related to the additional video. The first information corresponds to the linkage_content_number field shown in FIG. 3, the second information corresponds to the linkage_content_type field shown in FIG. 3, the third information corresponds to the linkage_content_URI_length field shown in FIG. 3, and the fourth information corresponds to the linkage_content_URI byte field shown in FIG. 3.

Further, the first information is designed to identify the number of points for accessing the additional video associated with the reference video, for example.

For example, when the service is an additional video file download broadcasting service, the third information indicates the URI length of a file corresponding to the additional video; when the service is an additional video streaming broadcasting service, it indicates the length of the URI for the streaming service.

The fourth information includes, for example, URI information for each file constituting the additional video when the service is an additional video file download broadcast service, and, when the service is an additional video streaming broadcast service, it indicates the URI information for receiving the entire stream for interworking between the reference video and the additional video.

In addition, the above-described metadata further includes fifth information indicating a service start time associated with the additional view video. The fifth information corresponds to, for example, the wakeup time field shown in FIG. 3. The metadata means additional information necessary for processing additional video provided through, for example, an IP network, and may further include information for receiving and 3D playing the additional video provided through a network server.

The service start time corresponds to start time information of an additional video file, and corresponds to start time information of a program when the additional video is provided as a streaming service.

Furthermore, subtitle information among the auxiliary data is encoded by the first encoder (basic content encoder 202 of FIG. 2) together with the bit stream of the reference video before multiplexing of the multiplexer 204 shown in FIG. 2. In addition, signaling section information of the auxiliary data is multiplexed together with the metadata without passing through the first encoder (basic content encoder 202 of FIG. 2). The auxiliary data refers to data related to, for example, access control and closed captioning.

In addition, as described above, it is also within the scope of the present invention that the reference image and the additional image are synchronized in units of frames at the same time.

Meanwhile, referring to FIG. 1 to FIG. 5 described above, another digital transmission apparatus for providing 3D service using multiple different networks will be described.

First, the transmission device encodes a reference video using the first encoder. The first encoder may correspond to the basic content encoder 202 shown in FIG. 2.

The multiplexer of the transmitting apparatus multiplexes the encoded base view video and the media pairing information for synchronization of the additional view video with the base view video. The multiplexer may correspond to the broadcast stream multiplexer 204 shown in FIG. 2. In addition, the media pairing information selectively defines a time code or a frame number. For example, some of the data shown in FIG. 4 may be used.

The transmitting device transmits the multiplexed data to the digital receiver through a unidirectional first network (for example, the broadcasting network 206 shown in FIG. 2). This may be designed to be performed by the transmitter 205 shown in FIG. 2.

The transmission device encodes the additional video using a second encoder. The second encoder may correspond to the additional content encoder 207 shown in FIG. 2.

Furthermore, the transmitting device transmits the encoded additional video to the digital receiver through a bidirectional second network (for example, the bidirectional communication network 211 shown in FIG. 2). This may be designed to be performed by the protocol processor 210 shown in FIG. 2.

The media pairing information (for example, FIG. 4) is defined at the packetized elementary stream (PES) level. Therefore, there is an advantage of increasing the efficiency of data transmission.

Meanwhile, the transmission device may be designed to further transmit an event information table (EIT) including a linkage information descriptor that defines reference information of the additional video. As described above, the reference information descriptor is included in a descriptor loop for each event in the EIT. The reference information descriptor may be implemented using some or all of the syntax defined in FIG. 3.

The reference information descriptor may include, for example, first information indicating a number associated with the additional video (e.g., linkage_content_number of FIG. 3), second information indicating a URI length for a service related to the additional video (e.g., linkage_content_URI_length of FIG. 3), and third information defining the URI bytes for a service related to the additional view video (e.g., linkage_content_URI of FIG. 3).

The first information identifies the number of points for accessing the additional video associated with the reference video.

The second information indicates the URI length of a file corresponding to the additional video when the service is an additional video file download broadcast service, and indicates the URI length for the streaming service when the service is an additional video streaming broadcast service.

The third information includes URI information for each file constituting the additional video when the service is an additional video file download broadcasting service, and, when the service is an additional video streaming broadcast service, it indicates the URI information for receiving the entire stream for interworking between the reference video and the additional video.

Therefore, according to embodiments of the present invention, there is a technical effect of pairing the basic video and the additional video for the 3DTV service regardless of whether the additional video is in a file format or a streaming format and, furthermore, regardless of whether the additional video is transmitted in real time or non-real time.

FIGS. 6 to 10 will be described below with reference to FIGS. 1 to 5 described above. Those skilled in the art may infer or interpret FIGS. 6 to 10 with reference to the previous figures.

6 is a diagram illustrating the transmission apparatus shown in FIG. 2 in more detail.

Elements constituting the 3D broadcast program include a 3D image, an audio signal, and auxiliary data, and the 3D image basically includes a left image and a right image. The left video and the right video are divided into independent video elementary streams of the reference video and the additional video, respectively. The auxiliary data includes caption information, program / channel signaling section data, and the like. The caption information is transmitted together with the bitstream of the reference video signal, and the signaling section data is transmitted through multiplexing. In the case of the hybrid 3DTV broadcasting program, as shown in FIG. 6, the reference video is transmitted through the terrestrial channel, and the additional video is transmitted through the IP network.

First, the transmission process of the reference video signal will be described.

The 3D content server 601 generates an audio signal and a reference video signal. As shown in FIG. 6, the additional video signal may be generated by the 3D content server 601 or may be generated from another device. The additional video signal will be described later.

The sound signal generated by the 3D content server 601 is encoded by the sound signal encoder 602, and the reference video signal is encoded by the reference video encoder 603. In addition, the reference image encoder 603 may be designed to additionally encode some of the auxiliary data described above.

The program multiplexer 604 multiplexes the two data streams encoded by the audio signal encoder 602 and the reference image encoder 603, and the channel multiplexer 605 multiplexes the output of the program multiplexer 604 together with the channel signaling information and the metadata for the additional video.

The final data multiplexed by the channel multiplexer 605 is transmitted to the receiver 650 through the terrestrial channel transmitter 606.

Embodiments related to transmission of the additional video signal briefly described above can be classified into three types.

In an embodiment, the additional video signal generated by the 3D content server 601 is encoded by the additional video encoder 607 in real time. The above embodiment is suitable for a hybrid 3DTV real-time broadcasting service and a more detailed processing method will be described with reference to FIG. 7 below.

In another embodiment, the additional video signal generated by the 3D content server 601 is encoded by the additional video encoder 608 in non real time. Another embodiment of the present invention is suitable for a hybrid 3DTV download and playback service and a detailed processing method will be described with reference to FIG. 8 below.

In another embodiment, an additional video signal generated by a device other than the 3D content server 601 is transmitted to the receiver directly through the additional video VOD server 609 in non real time. Another embodiment of the present invention is suitable for a hybrid 3DTV VoD service and a detailed processing method will be described with reference to FIG. 9.

Furthermore, the additional video signals encoded by the additional video encoders 607 and 608 are transmitted to the receiver 650 through the streaming web server 610. Of course, as illustrated in FIG. 6, a switching module 611 is additionally designed for selecting at least two paths. However, omitting the switching module 611 according to the needs of those skilled in the art is also within the scope of the present invention.

FIG. 7 illustrates the transmission apparatus illustrated in FIG. 2 or 6 when a real-time broadcasting service is provided among 3DTV hybrid services. Hereinafter, a hybrid 3DTV real-time broadcast service will be described with reference to FIG. 7.

Under the control of the controller 701, the 3D content server 702 processes the left image and the right image separately. For example, the left image (base image) is encoded by the MPEG-2 encoder 703 and then passed to the TS multiplexer 704. The TS multiplexer 704 multiplexes the metadata generated by the PSIP generation unit 705 and the encoded left image (base image), and transmits the result to the receiver through the broadcasting network in TS form. The metadata has been sufficiently described with the previous drawings, so its description is not repeated here.

Meanwhile, the right image (additional image) is encoded by the streaming encoder 706 and then transmitted to the receiver by the streaming web server 707. Unlike the transmission of the left image, the transmission of the right image uses a two-way IP network. Thus, an embodiment of the present invention is applicable to a hybrid 3DTV real-time broadcasting service.

FIG. 8 illustrates the transmission apparatus illustrated in FIG. 2 or 6 when a download broadcast service is provided among 3DTV hybrid services. Hereinafter, a hybrid 3DTV download and playback service will be described with reference to FIG. 8.

Under the control of the first controller 801, the left content server 802 transmits a left image (basic image) to the MPEG-2 encoder 803. The first controller 801 is used for the left image.

In addition, the TS multiplexer 805 multiplexes the metadata generated by the PSIP generation unit 804 and the encoded left image (base image), and transmits the result to the receiver through the broadcasting network in TS form. The metadata has been sufficiently described with the previous drawings, so its description is not repeated here.

Under the control of the second controller 806, the right content server 807 transmits the right image (additional image) to the streaming encoder 808. The second controller 806 is used for the right image. In addition, the streaming web server 809 transmits the right image to the receiver through a two-way IP network. Unlike the transmission of the left image, this path uses the two-way IP network, so this embodiment of the present invention is applicable to hybrid 3DTV download and playback services.

FIG. 9 illustrates the transmission apparatus illustrated in FIG. 2 or 6 when a VOD service is provided among 3DTV hybrid services. Hereinafter, a hybrid 3DTV Video On Demand (VoD) service will be described with reference to FIG. 9.

Under the control of the controller 901, the left content server 902 transmits the left image (base image) to the MPEG-2 encoder 903. In addition, the TS multiplexer 905 multiplexes the metadata generated by the PSIP generation unit 904 and the encoded left image (base image), and transmits the result to the receiver through the broadcasting network in TS form. The metadata has been sufficiently described with the previous drawings, so its description is not repeated here.

Meanwhile, the right image (additional image) is transmitted to the receiver through the right-image VOD server 906. Unlike the transmission of the left image, the transmission of the right image uses a two-way IP network, so that an embodiment of the present invention is applicable to a hybrid 3DTV VOD service.

FIG. 10 is a simplified view of the receiving device shown in FIG. 5.

The reception apparatus according to an embodiment of the present invention includes, for example, a reference image processing part 1010, an additional image processing part 1020, a data processor 1030, a display module 1040, and the like.

Furthermore, as shown in FIG. 10, the reference image processing part 1010 further includes a PSI / PSIP decoder 1011, a PES parsing module 1012, a first video decoder 1013, and the like. The reference image processing part 1010 generates a reference image of the 3D service by performing demultiplexing and decoding on the real-time reference image stream received in real time.

First, the PSI / PSIP decoder 1011 extracts a PSI / PSIP stream included in a real time reference video stream. The PSI / PSIP decoder 1011 extracts PES packets, synchronization information streams, interworking information, etc. related to the reference video by using configuration information and pairing descriptors of the reference video stream and the synchronization information stream.

The PES packet related to the reference image is transmitted to the PES parsing module 1012, and pairing information is transmitted to the reception / storage module 1021 of the additional image processing part 1020 under the control of the controller 1050. Configuration information of the reference video stream and the synchronization information stream is included in, for example, a Program Map Table (PMT).

Accordingly, the PSI / PSIP decoder 1011 analyzes a descriptor of the PMT and identifies whether the corresponding video is a reference video or an additional video, and whether the corresponding video is a left / right video.

The PES parsing module 1012 receives the PES packets related to the reference video from the PSI/PSIP decoder 1011 and generates a reference video stream composed of a video ES by parsing the PES packets. That is, the PES parsing module 1012 assembles the reference video stream as a video ES from the PES packets and, with the Decoding Time Stamp (DTS) and Program Clock Reference (PCR) values handled in the same manner as in the existing broadcasting standard, transmits it to the first video decoder 1013. According to an embodiment of the present invention, for example, the reference video stream may be an MPEG-2 video stream. Meanwhile, the stream including the synchronization information may also be processed by the PES parsing module 1012 or by a separate PES parsing module.
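
For reference, the sketch below shows how a PTS can be read from an MPEG-2 PES header in Python; this reflects standard MPEG-2 systems layout rather than anything specific to this invention, and the function assumes that data starts at the first byte of a PES packet.

    def parse_pes_pts(data):
        # Read the 33-bit PTS from an MPEG-2 PES header, or return None if absent.
        assert data[0:3] == b"\x00\x00\x01"       # PES start code prefix
        flags = data[7]                            # PTS_DTS_flags are the top two bits
        if not flags & 0x80:
            return None                            # no PTS present
        ts = data[9:14]                            # 5-byte PTS field
        pts = ((ts[0] >> 1) & 0x07) << 30
        pts |= ts[1] << 22
        pts |= (ts[2] >> 1) << 15
        pts |= ts[3] << 7
        pts |= ts[4] >> 1
        return pts                                 # value in 90 kHz clock units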

The additional image processing part 1020 may include a reception / storage module 1021, a file / stream parsing module 1022, and a second video decoder 1023. The additional image processing part 1020 generates an additional image by receiving and decoding a stream or a file related to the additional image that provides a 3D service in association with the reference image.

The additional video stream or additional video file is received and stored in the reception/storage module 1021. A stream received in real time can be decoded immediately without being stored, whereas a file is received in advance and stored in the form of a file.

The reception/storage module 1021 receives the interworking information (pairing information and the like) through the PSI/PSIP decoder 1011 and the controller 1050, receives the stream or file indicated by the interworking information, and matches the received additional video stream or file with the reference video.
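
One way to picture the matching step, under assumed field names, is the sketch below: the interworking information carries the URI of the additional view, and the reception/storage module either opens a file that was downloaded in advance (file-based service) or opens a streaming connection over the IP network.

```python
# Illustrative matching of the interworking information to a stored file or a
# live stream. The dictionary keys and the download directory are assumptions.
import os
import urllib.request

def resolve_additional_view(interworking_info: dict, download_dir: str = "downloads"):
    uri = interworking_info["uri"]                       # taken from the linkage information
    local = os.path.join(download_dir, os.path.basename(uri))
    if interworking_info.get("is_file_service") and os.path.exists(local):
        return open(local, "rb")                         # additional video file downloaded in advance
    return urllib.request.urlopen(uri)                   # real-time stream over the bidirectional IP network
```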

The file/stream parsing module 1022 receives the file or stream identification information and the synchronization information from the PES parsing module 1012 and the controller 1050 of the reference image processing part 1010, parses the file or stream of the additional image that matches the reference image into a video ES, and transmits the video ES to the second video decoder 1023. In addition, the file/stream parsing module 1022 parses the synchronization information so as to synchronize with the reference video, and transmits the corresponding additional video to the second video decoder 1023 so that it is decoded at the time point (extracted in consideration of the DTS) at which the reference video is decoded. At this time, some of the syntax shown in FIG. 3 or FIG. 4 may be used, and this also belongs to the scope of the present invention.
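
A minimal sketch of that frame-level pairing, assuming the parsed additional view has been indexed by timestamp, is given below: given the DTS of the reference frame about to be decoded, the module looks up the additional-view access unit with the same (or the nearest following) timestamp so that both views reach their decoders for the same rendering time.

```python
# Pick the additional-view access unit matching a reference-frame DTS.
# additional_index: list of (dts, access_unit) tuples sorted by dts (an assumption).
import bisect

def pick_matching_access_unit(additional_index, reference_dts):
    timestamps = [dts for dts, _ in additional_index]
    pos = bisect.bisect_left(timestamps, reference_dts)
    if pos == len(additional_index):   # reference DTS beyond the index: fall back to the last unit
        pos -= 1
    return additional_index[pos][1]
```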

The second video decoder 1023 generates the additional video by receiving and decoding, from the file/stream parsing module 1022, the additional video stream or the video-ES-type stream generated based on the file.

The data processor 1030 composes a 3D image based on the reference image received from the first video decoder 1013 of the reference image processing part 1010 and the additional image received from the second video decoder 1023 of the additional image processing part 1020, and outputs it through the display module 1040.
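
As a simple illustration of the composition step, the sketch below packs the decoded left and right frames side by side with NumPy; the actual output format (frame packing, line interleaving, frame-sequential output, and so on) depends on the display module and is not fixed by this example.

```python
# Side-by-side packing of a decoded stereo pair; one of several possible 3D output formats.
import numpy as np

def compose_side_by_side(left: np.ndarray, right: np.ndarray) -> np.ndarray:
    assert left.shape == right.shape, "both views must share the same resolution"
    return np.concatenate([left, right], axis=1)  # (H, 2W, C) frame handed to the display module
```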

Furthermore, another embodiment of a digital receiving apparatus for providing 3D services using a plurality of different networks shown in FIG. 10 will be described as follows.

The first communication interface module receives metadata for the additional view video and the encoded reference video via the first unidirectional network. The reference picture is encoded, for example, in a first format. The demultiplexer demultiplexes the received metadata and the encoded reference picture. The PSI / PSIP decoder shown in FIG. 10 may be designed to perform the functions of the first communication interface module and the demultiplexer.

The decoder decodes the encoded reference video. The function of the decoder may be designed to be performed by the first video decoder 1013 illustrated in FIG. 10.

The second communication interface module receives the additional video by using the metadata. The metadata includes, for example, information necessary for processing an additional video transmitted through a bidirectional second network. The function of the second communication interface module may be designed to be performed by the reception / storage module 1021 shown in FIG. 10.

The controller provides a 3D service based on the processed reference image and the additional image. The function of the controller may be designed to be performed by the data processor 1030 illustrated in FIG. 10.

Referring to FIG. 10, another embodiment of a digital receiving apparatus for providing 3D service using multiple different networks will be described in detail as follows.

The first communication interface module receives, through the unidirectional first network, the encoded reference picture and media pairing information for synchronizing the additional view video with the reference view video. The media pairing information is included in a packetized elementary stream (PES) and selectively defines a time code or a frame number. A further description of the media pairing information is omitted since it has been sufficiently given above.
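
A tiny sketch of that selective payload, with assumed field names and flag values, is given below; the normative syntax is the one shown in FIG. 4.

```python
# Hypothetical representation of the media pairing information carried in the PES:
# depending on a type flag it refers to the matching additional-view frame either
# by time code or by frame number. Field names and flag values are assumptions.
from dataclasses import dataclass
from typing import Optional

@dataclass
class MediaPairingInfo:
    sync_type: int                    # assumed: 0 = time code, 1 = frame number
    time_code: Optional[str] = None   # e.g. "01:02:03:04"
    frame_number: Optional[int] = None

    def sync_key(self):
        # Key used to look up the matching additional-view frame.
        return self.time_code if self.sync_type == 0 else self.frame_number
```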

The demultiplexer demultiplexes the media pairing information and the encoded reference picture included in the PES. The PSI / PSIP decoder 1011 of FIG. 10 may be designed to perform the functions of the first communication interface module and the demultiplexer.

The PES parsing module 1012 depacketizes the PES including the media pairing information.

The first video decoder 1013 decodes the encoded reference video.

The second communication interface module receives the additional video via the bidirectional second network. The second communication interface module may correspond to the reception / storage module 1021 of FIG. 10.

The controller is designed to provide a 3D service based on the depacketized media pairing information, the decoded reference video, and the additional video. The controller may be designed to correspond to the data processor 1030 of FIG. 10.

The first communication interface module receives an event information table (EIT) including a linkage information descriptor that defines reference information of the additional video, wherein the linkage information descriptor is included in the descriptor loop configured for each event in the EIT. The linkage information descriptor has been fully described above with reference to FIG. 3.
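
Where the descriptor sits can be sketched as follows, assuming the EIT section has already been split into per-event descriptor loops and using a hypothetical tag value; the real tag and field layout are those of FIG. 3.

```python
# Scan the per-event descriptor loops of an EIT for the linkage information
# descriptor. Tag value is a placeholder; events are (event_id, descriptor_loop) pairs.
HYPOTHETICAL_LINKAGE_INFO_TAG = 0xB1

def find_linkage_descriptors(events):
    found = {}
    for event_id, loop in events:
        i = 0
        while i + 2 <= len(loop):
            tag, length = loop[i], loop[i + 1]
            if tag == HYPOTHETICAL_LINKAGE_INFO_TAG:
                found[event_id] = loop[i + 2:i + 2 + length]
            i += 2 + length
    return found
```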

The apparatus and the control method according to an embodiment of the present invention may implement other embodiments by combining the above-described drawings with one another or with matters obvious to those skilled in the art, and such embodiments also fall within the scope of the present invention.

Meanwhile, the operating method of the electronic device of the present invention can be implemented as processor-readable code on a recording medium readable by the processor included in the electronic device. The processor-readable recording medium includes all kinds of recording devices in which data that can be read by the processor is stored. Examples of the processor-readable recording medium include a ROM, a RAM, a CD-ROM, a magnetic tape, a floppy disk, and an optical data storage device, and the recording medium may also be implemented in the form of a carrier wave, such as transmission over the Internet. In addition, the processor-readable recording medium may be distributed over network-connected computer systems so that processor-readable code can be stored and executed in a distributed fashion.

In addition, although the preferred embodiments of the present invention have been shown and described above, the present invention is not limited to the specific embodiments described above; various modifications can of course be made by those skilled in the art to which the invention belongs without departing from the spirit of the invention claimed in the claims, and these modifications should not be understood separately from the technical spirit or prospect of the present invention.

Claims (29)

In the control method of a digital transmission device providing a 3D service using a plurality of different networks (multiple networks),
Encoding the reference video using a first encoder;
Multiplexing metadata for an additional picture and the encoded reference picture;
Transmitting the multiplexed data to a digital receiver via a unidirectional first network; And
Transmitting the additional video to the digital receiver via a second bidirectional network;
And the metadata corresponds to information necessary for processing an additional video transmitted through the bidirectional second network.
The method of claim 1,
The metadata includes at least one of:
First information indicating a number associated with the additional video;
Second information for identifying a type of a file or a stream of the additional video;
Third information indicating a URI length for a service related to the additional view video; and
Fourth information defining a URI byte for a service related to the additional view video.
The method of claim 2,
The metadata further includes data indicating codec information of the additional video file or stream.
The method of claim 2,
The additional video is encoded through a second encoder, and the first encoder and the second encoder use different encoding schemes.
The method of claim 2,
The first information indicates the number of points for accessing the additional view video associated with the reference video.
The method of claim 5,
The third information indicates the URI length of the file corresponding to the additional video when the service is an additional video file download broadcasting service, and indicates the URI length for the streaming service when the service is an additional video streaming broadcast service.
The method according to claim 6,
The fourth information includes URI information for each file constituting the additional video when the service is an additional video file download broadcasting service, and indicates URI information for receiving the entire stream for interworking between the reference video and the additional video when the service is an additional video streaming broadcast service.
The method of claim 2,
The metadata further includes fifth information indicating a service start time associated with the additional view video.
9. The method of claim 8,
The service start time corresponds to start time information of the additional video file, and, when the additional video is provided as a streaming service, corresponds to start time information of a program.
The method of claim 1,
Subtitle information of the auxiliary data is encoded in the first encoder together with the bit stream of the reference video before the multiplexing step,
The signaling section information of the auxiliary data is multiplexed together with the metadata without passing through the first encoder.
The method of claim 1,
The unidirectional first network corresponds to a terrestrial channel, and the bidirectional second network corresponds to an IP network.
The method of claim 1,
The reference video and the additional video are synchronized in units of frames at the same time (rendering time).
In the digital transmission device for providing a 3D service using a plurality of different networks (multiple networks),
A first encoder for encoding the reference video into a predetermined first format;
A multiplexer for multiplexing metadata for an additional picture and the encoded reference picture;
A first transmission module for transmitting the multiplexed data to a digital receiver via a first unidirectional network;
A second encoder for encoding the additional video in a second preset format; And
A second transmission module configured to transmit the encoded additional view video to the digital receiver through a bidirectional second network;
And the metadata corresponds to information necessary for processing additional video transmitted through the bidirectional second network.
The method of claim 13,
The metadata includes at least one of:
First information indicating a number associated with the additional video;
Second information for identifying a type of a file or a stream of the additional video;
Third information indicating a URI length for a service related to the additional view video;
Fourth information defining a URI byte for a service related to the additional view video; and
Fifth information indicating a service start time associated with the additional video.
15. The method of claim 14,
The metadata further includes data indicating codec information of the additional video file or stream.
The method of claim 13,
The first format corresponds to the MPEG-2 format,
And the second format corresponds to an H.264 format.
15. The method of claim 14,
The service start time corresponds to start time information of the additional video file, and, when the additional video is provided as a streaming service, corresponds to start time information of a program.
15. The method of claim 14,
The service start time corresponds to start time information of the additional video file, and, when the additional video is provided as a streaming service, corresponds to start time information of a program.
The method of claim 13,
The reference video and the additional video are synchronized in units of frames at the same time (rendering time).
In the control method of a digital receiving device providing a 3D service using a plurality of different networks,
Receiving, via the unidirectional first network, metadata for the additional picture and the encoded reference picture, wherein the reference picture is encoded in the first format;
Demultiplexing the received metadata and an encoded reference picture;
Decoding the encoded reference video;
Receiving the additional video using the metadata, wherein the metadata includes information necessary for processing the additional video transmitted through the bidirectional second network; And
Providing a 3D service based on the processed reference image and the additional image;
A control method of a digital receiving device providing a 3D service using a plurality of different networks, comprising the steps described above.
21. The method of claim 20,
The additional video is encoded in a second format, and the first format and the second format are different from each other.
The method of claim 21,
The first format corresponds to the MPEG-2 type,
And the second format corresponds to the H.264 type.
21. The method of claim 20,
The metadata includes at least one of:
First information indicating a number associated with the additional video;
Second information for identifying a type of a file or a stream of the additional video;
Third information indicating a URI length for a service related to the additional view video;
Fourth information defining a URI byte for a service related to the additional view video; and
Fifth information indicating a service start time associated with the additional video.
24. The method of claim 23,
The metadata further includes data indicating codec information of the additional video file or stream.
In the digital receiving device providing a 3D service using a plurality of different networks,
A first communication interface module for receiving, via the unidirectional first network, metadata for the additional picture and the encoded reference picture, wherein the reference picture is encoded in the first format;
A demultiplexer for demultiplexing the received metadata and the encoded reference picture;
A decoder for decoding the encoded reference video;
A second communication interface module for receiving the additional video using the metadata, wherein the metadata includes information necessary for processing the additional video transmitted through the bidirectional second network; And
A digital receiving apparatus for providing a 3D service using a plurality of different networks, including a controller for providing a 3D service based on the processed reference image and the additional image.
26. The method of claim 25,
The additional video is encoded in a second format, and the first format and the second format are different from each other.
The method of claim 26,
The first format corresponds to the MPEG-2 type,
And the second format corresponds to an H.264 type.
26. The method of claim 25,
The metadata includes at least one of:
First information indicating a number associated with the additional video;
Second information for identifying a type of a file or a stream of the additional video;
Third information indicating a URI length for a service related to the additional view video;
Fourth information defining a URI byte for a service related to the additional view video; and
Fifth information indicating a service start time associated with the additional video.
29. The method of claim 28,
The metadata further includes data indicating codec information of the additional video file or stream.
KR1020120132306A 2011-11-22 2012-11-21 Transmitter/receiver for 3dtv broadcasting, and method for controlling the same KR20130056829A (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
PCT/KR2012/009872 WO2013077629A1 (en) 2011-11-22 2012-11-21 Transmitting and receiving apparatus for 3d tv broadcasting and method for controlling same

Applications Claiming Priority (2)

Application Number Priority Date Filing Date Title
KR20110122001 2011-11-22
KR1020110122001 2011-11-22

Publications (1)

Publication Number Publication Date
KR20130056829A true KR20130056829A (en) 2013-05-30

Family

ID=48664756

Family Applications (1)

Application Number Title Priority Date Filing Date
KR1020120132306A KR20130056829A (en) 2011-11-22 2012-11-21 Transmitter/receiver for 3dtv broadcasting, and method for controlling the same

Country Status (1)

Country Link
KR (1) KR20130056829A (en)

Cited By (7)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
WO2016028119A1 (en) * 2014-08-22 2016-02-25 엘지전자 주식회사 Broadcast signal transmitting method, broadcast signal transmitting device, broadcast signal receiving method, and broadcast signal receiving device
US10645674B2 (en) 2014-08-22 2020-05-05 Lg Electronics Inc. Method for transmitting broadcast signals, apparatus for transmitting broadcast signals, method for receiving broadcast signals and apparatus for receiving broadcast signals
US11212772B2 (en) 2014-08-22 2021-12-28 Lg Electronics Inc. Method for transmitting broadcast signals, apparatus for transmitting broadcast signals, method for receiving broadcast signals and apparatus for receiving broadcast signals
US11678346B2 (en) 2014-08-22 2023-06-13 Lg Electronics Inc. Method for transmitting broadcast signals, apparatus for transmitting broadcast signals, method for receiving broadcast signals and apparatus for receiving broadcast signals
WO2016163603A1 (en) * 2015-04-05 2016-10-13 엘지전자 주식회사 Method and device for transmitting and receiving broadcast signal for broadcast service on basis of xml subtitle
US10595099B2 (en) 2015-04-05 2020-03-17 Lg Electronics Inc. Method and device for transmitting and receiving broadcast signal for broadcast service on basis of XML subtitle
WO2019112398A1 (en) * 2017-12-08 2019-06-13 주식회사 에어코드 Method and system for providing sign language broadcast service using companion screen service

Similar Documents

Publication Publication Date Title
US11678022B2 (en) Transmission device, transmission method, reception device, and reception method
KR101639358B1 (en) Transmission apparatus and method, and reception apparatus and method for providing 3d service using the content and additional image seperately transmitted with the reference image transmitted in real time
CA2839553C (en) Media content transceiving method and transceiving apparatus using same
EP2690876A2 (en) Heterogeneous network-based linked broadcast content transmitting/receiving device and method
US20130271657A1 (en) Receiving apparatus for providing hybrid service, and hybrid service providing method thereof
US20130276046A1 (en) Receiving apparatus for receiving a plurality of signals through different paths and method for processing signals thereof
KR20130050953A (en) Method for transceiving media files and device for transmitting/receiving using same
CN103329551A (en) Reception device for receiving a plurality of real-time transfer streams, transmission device for transmitting same, and method for playing multimedia content
US20130271568A1 (en) Transmitting system and receiving apparatus for providing hybrid service, and service providing method thereof
KR20110123658A (en) Method and system for transmitting/receiving 3-dimensional broadcasting service
US20140204962A1 (en) Transmitting device, receiving device, and transceiving method thereof
KR20130076803A (en) Apparatus and method for transmitting and receiving contents based on internet
KR20120103510A (en) Transmission apparatus and method, and reception apparatus and method for providing program associated stereoscopic broadcasting service
KR20130069581A (en) Apparatus and method for transmitting and receiving contents based on internet
US20230188767A1 (en) Transmission device, transmission method, reception device, and reception method
KR20130069582A (en) Apparatus and method for transmitting and receiving contents based on internet
KR101790526B1 (en) Apparatus and method for transmitting and receiving contents based on internet
KR20130056829A (en) Transmitter/receiver for 3dtv broadcasting, and method for controlling the same
KR20130084971A (en) Apparatus and method for transmitting and receiving contents based on internet
KR102016674B1 (en) Receiving device for providing hybryd service and method thereof
KR101808672B1 (en) Transmission apparatus and method, and reception apparatus and method for providing 3d service using the content and additional image seperately transmitted with the reference image transmitted in real time
KR20130115975A (en) Transmitting system and receiving device for providing hybrid service, and methods thereof
KR20130116154A (en) Receiving device for a plurality of signals through different paths and method for processing the signals thereof

Legal Events

Date Code Title Description
WITN Withdrawal due to no request for examination