WO2013019042A1 - Transmission apparatus and method and reception apparatus and method for providing a 3D service by linking a reference video transmitted in real time with additional video and content transmitted separately - Google Patents
- Publication number
- WO2013019042A1 (PCT/KR2012/006045)
- Authority
- WO
- WIPO (PCT)
- Prior art keywords
- video
- stream
- additional
- information
- real time
- Prior art date
Classifications
-
- H—ELECTRICITY
- H04—ELECTRIC COMMUNICATION TECHNIQUE
- H04N—PICTORIAL COMMUNICATION, e.g. TELEVISION
- H04N13/00—Stereoscopic video systems; Multi-view video systems; Details thereof
- H04N13/10—Processing, recording or transmission of stereoscopic or multi-view image signals
- H04N13/194—Transmission of image signals
-
- H—ELECTRICITY
- H04—ELECTRIC COMMUNICATION TECHNIQUE
- H04N—PICTORIAL COMMUNICATION, e.g. TELEVISION
- H04N7/00—Television systems
- H04N7/08—Systems for the simultaneous or sequential transmission of more than one television signal, e.g. additional information signals, the signals occupying wholly or partially the same frequency band, e.g. by time division
-
- H—ELECTRICITY
- H04—ELECTRIC COMMUNICATION TECHNIQUE
- H04H—BROADCAST COMMUNICATION
- H04H20/00—Arrangements for broadcast or for distribution combined with broadcast
- H04H20/18—Arrangements for synchronising broadcast or distribution via plural systems
-
- H—ELECTRICITY
- H04—ELECTRIC COMMUNICATION TECHNIQUE
- H04H—BROADCAST COMMUNICATION
- H04H20/00—Arrangements for broadcast or for distribution combined with broadcast
- H04H20/40—Arrangements for broadcast specially adapted for accumulation-type receivers
-
- H—ELECTRICITY
- H04—ELECTRIC COMMUNICATION TECHNIQUE
- H04N—PICTORIAL COMMUNICATION, e.g. TELEVISION
- H04N13/00—Stereoscopic video systems; Multi-view video systems; Details thereof
-
- H—ELECTRICITY
- H04—ELECTRIC COMMUNICATION TECHNIQUE
- H04N—PICTORIAL COMMUNICATION, e.g. TELEVISION
- H04N13/00—Stereoscopic video systems; Multi-view video systems; Details thereof
- H04N13/10—Processing, recording or transmission of stereoscopic or multi-view image signals
- H04N13/106—Processing image signals
- H04N13/161—Encoding, multiplexing or demultiplexing different image signal components
-
- H—ELECTRICITY
- H04—ELECTRIC COMMUNICATION TECHNIQUE
- H04N—PICTORIAL COMMUNICATION, e.g. TELEVISION
- H04N19/00—Methods or arrangements for coding, decoding, compressing or decompressing digital video signals
- H04N19/50—Methods or arrangements for coding, decoding, compressing or decompressing digital video signals using predictive coding
- H04N19/597—Methods or arrangements for coding, decoding, compressing or decompressing digital video signals using predictive coding specially adapted for multi-view video sequence encoding
-
- H—ELECTRICITY
- H04—ELECTRIC COMMUNICATION TECHNIQUE
- H04N—PICTORIAL COMMUNICATION, e.g. TELEVISION
- H04N21/00—Selective content distribution, e.g. interactive television or video on demand [VOD]
- H04N21/20—Servers specifically adapted for the distribution of content, e.g. VOD servers; Operations thereof
- H04N21/23—Processing of content or additional data; Elementary server operations; Server middleware
- H04N21/235—Processing of additional data, e.g. scrambling of additional data or processing content descriptors
- H04N21/2353—Processing of additional data, e.g. scrambling of additional data or processing content descriptors specifically adapted to content descriptors, e.g. coding, compressing or processing of metadata
-
- H—ELECTRICITY
- H04—ELECTRIC COMMUNICATION TECHNIQUE
- H04N—PICTORIAL COMMUNICATION, e.g. TELEVISION
- H04N21/00—Selective content distribution, e.g. interactive television or video on demand [VOD]
- H04N21/20—Servers specifically adapted for the distribution of content, e.g. VOD servers; Operations thereof
- H04N21/23—Processing of content or additional data; Elementary server operations; Server middleware
- H04N21/236—Assembling of a multiplex stream, e.g. transport stream, by combining a video stream with other content or additional data, e.g. inserting a URL [Uniform Resource Locator] into a video stream, multiplexing software data into a video stream; Remultiplexing of multiplex streams; Insertion of stuffing bits into the multiplex stream, e.g. to obtain a constant bit-rate; Assembling of a packetised elementary stream
-
- H—ELECTRICITY
- H04—ELECTRIC COMMUNICATION TECHNIQUE
- H04N—PICTORIAL COMMUNICATION, e.g. TELEVISION
- H04N21/00—Selective content distribution, e.g. interactive television or video on demand [VOD]
- H04N21/20—Servers specifically adapted for the distribution of content, e.g. VOD servers; Operations thereof
- H04N21/23—Processing of content or additional data; Elementary server operations; Server middleware
- H04N21/236—Assembling of a multiplex stream, e.g. transport stream, by combining a video stream with other content or additional data, e.g. inserting a URL [Uniform Resource Locator] into a video stream, multiplexing software data into a video stream; Remultiplexing of multiplex streams; Insertion of stuffing bits into the multiplex stream, e.g. to obtain a constant bit-rate; Assembling of a packetised elementary stream
- H04N21/2362—Generation or processing of Service Information [SI]
-
- H—ELECTRICITY
- H04—ELECTRIC COMMUNICATION TECHNIQUE
- H04N—PICTORIAL COMMUNICATION, e.g. TELEVISION
- H04N21/00—Selective content distribution, e.g. interactive television or video on demand [VOD]
- H04N21/20—Servers specifically adapted for the distribution of content, e.g. VOD servers; Operations thereof
- H04N21/23—Processing of content or additional data; Elementary server operations; Server middleware
- H04N21/238—Interfacing the downstream path of the transmission network, e.g. adapting the transmission rate of a video stream to network bandwidth; Processing of multiplex streams
- H04N21/2381—Adapting the multiplex stream to a specific network, e.g. an Internet Protocol [IP] network
-
- H—ELECTRICITY
- H04—ELECTRIC COMMUNICATION TECHNIQUE
- H04N—PICTORIAL COMMUNICATION, e.g. TELEVISION
- H04N21/00—Selective content distribution, e.g. interactive television or video on demand [VOD]
- H04N21/40—Client devices specifically adapted for the reception of or interaction with content, e.g. set-top-box [STB]; Operations thereof
- H04N21/43—Processing of content or additional data, e.g. demultiplexing additional data from a digital video stream; Elementary client operations, e.g. monitoring of home network or synchronising decoder's clock; Client middleware
- H04N21/4302—Content synchronisation processes, e.g. decoder synchronisation
- H04N21/4307—Synchronising the rendering of multiple content streams or additional data on devices, e.g. synchronisation of audio on a mobile phone with the video output on the TV screen
- H04N21/43072—Synchronising the rendering of multiple content streams or additional data on devices, e.g. synchronisation of audio on a mobile phone with the video output on the TV screen of multiple content streams on the same device
-
- H—ELECTRICITY
- H04—ELECTRIC COMMUNICATION TECHNIQUE
- H04N—PICTORIAL COMMUNICATION, e.g. TELEVISION
- H04N21/00—Selective content distribution, e.g. interactive television or video on demand [VOD]
- H04N21/60—Network structure or processes for video distribution between server and client or between remote clients; Control signalling between clients, server and network components; Transmission of management data between server and client, e.g. sending from server to client commands for recording incoming content stream; Communication details between server and client
- H04N21/65—Transmission of management data between client and server
- H04N21/654—Transmission by server directed to the client
- H04N21/6543—Transmission by server directed to the client for forcing some client operations, e.g. recording
-
- H—ELECTRICITY
- H04—ELECTRIC COMMUNICATION TECHNIQUE
- H04N—PICTORIAL COMMUNICATION, e.g. TELEVISION
- H04N21/00—Selective content distribution, e.g. interactive television or video on demand [VOD]
- H04N21/80—Generation or processing of content or additional data by content creator independently of the distribution process; Content per se
- H04N21/81—Monomedia components thereof
- H04N21/816—Monomedia components thereof involving special video data, e.g 3D video
-
- H—ELECTRICITY
- H04—ELECTRIC COMMUNICATION TECHNIQUE
- H04N—PICTORIAL COMMUNICATION, e.g. TELEVISION
- H04N21/00—Selective content distribution, e.g. interactive television or video on demand [VOD]
- H04N21/40—Client devices specifically adapted for the reception of or interaction with content, e.g. set-top-box [STB]; Operations thereof
- H04N21/43—Processing of content or additional data, e.g. demultiplexing additional data from a digital video stream; Elementary client operations, e.g. monitoring of home network or synchronising decoder's clock; Client middleware
- H04N21/432—Content retrieval operation from a local storage medium, e.g. hard-disk
- H04N21/4325—Content retrieval operation from a local storage medium, e.g. hard-disk by playing back content from the storage medium
-
- H—ELECTRICITY
- H04—ELECTRIC COMMUNICATION TECHNIQUE
- H04N—PICTORIAL COMMUNICATION, e.g. TELEVISION
- H04N21/00—Selective content distribution, e.g. interactive television or video on demand [VOD]
- H04N21/40—Client devices specifically adapted for the reception of or interaction with content, e.g. set-top-box [STB]; Operations thereof
- H04N21/43—Processing of content or additional data, e.g. demultiplexing additional data from a digital video stream; Elementary client operations, e.g. monitoring of home network or synchronising decoder's clock; Client middleware
- H04N21/434—Disassembling of a multiplex stream, e.g. demultiplexing audio and video streams, extraction of additional data from a video stream; Remultiplexing of multiplex streams; Extraction or processing of SI; Disassembling of packetised elementary stream
- H04N21/4344—Remultiplexing of multiplex streams, e.g. by modifying time stamps or remapping the packet identifiers
-
- H—ELECTRICITY
- H04—ELECTRIC COMMUNICATION TECHNIQUE
- H04N—PICTORIAL COMMUNICATION, e.g. TELEVISION
- H04N21/00—Selective content distribution, e.g. interactive television or video on demand [VOD]
- H04N21/40—Client devices specifically adapted for the reception of or interaction with content, e.g. set-top-box [STB]; Operations thereof
- H04N21/45—Management operations performed by the client for facilitating the reception of or the interaction with the content or administrating data related to the end-user or to the client device itself, e.g. learning user preferences for recommending movies, resolving scheduling conflicts
- H04N21/462—Content or additional data management, e.g. creating a master electronic program guide from data received from the Internet and a Head-end, controlling the complexity of a video stream by scaling the resolution or bit-rate based on the client capabilities
- H04N21/4622—Retrieving content or additional data from different sources, e.g. from a broadcast channel and the Internet
-
- H—ELECTRICITY
- H04—ELECTRIC COMMUNICATION TECHNIQUE
- H04N—PICTORIAL COMMUNICATION, e.g. TELEVISION
- H04N21/00—Selective content distribution, e.g. interactive television or video on demand [VOD]
- H04N21/60—Network structure or processes for video distribution between server and client or between remote clients; Control signalling between clients, server and network components; Transmission of management data between server and client, e.g. sending from server to client commands for recording incoming content stream; Communication details between server and client
- H04N21/61—Network physical structure; Signal processing
- H04N21/6106—Network physical structure; Signal processing specially adapted to the downstream path of the transmission network
- H04N21/6125—Network physical structure; Signal processing specially adapted to the downstream path of the transmission network involving transmission via Internet
Definitions
- The present invention relates to a transmission apparatus and method and a reception apparatus and method for providing a 3D service, and more particularly to providing a 3D service by linking additional video and content, transmitted separately, with a reference video transmitted in real time.
- NRT (Non-Real-Time)
- The present invention proposes a system for delivering a high-quality 3D service by delivering content over a transmission network other than the broadcast network and linking that content with content delivered in real time, thereby overcoming the bandwidth limitations of the existing broadcast network.
- An object of the present invention is to provide a high-quality 3D service by linking additional video and content, transmitted separately, with a reference video transmitted in real time; for example, a previously received 2D video file is combined with 2D content received as a real-time stream to implement a linked 3D service.
- Another object of the present invention is to provide a transmission apparatus and method and a reception apparatus and method usable in a conventional broadcast system, in which signaling describing the reference relationship between the two videos and time information for synchronization between frames are inserted, so that the two different pieces of content can be linked at the receiver and frame synchronization can be provided for the stereoscopic video service.
- The transmission method of the present invention for providing a 3D service, in which a reference video transmitted in real time is linked with additional video and content transmitted separately, includes: generating a real-time reference video stream based on the reference video and transmitting it to a receiver in real time; and transmitting the additional video and content, which provide the 3D service in combination with the reference video, to the receiver separately from the reference video stream. The real-time reference video stream may include linkage information, i.e. information about the additional video and content to be linked with the reference video, and synchronization information for synchronizing the reference video with the additional video and content.
- The additional video and content may be transmitted as a stream or a file, in real time or non-real time.
- The linkage information may include: a descriptor tag (descriptor_tag) identifying the linkage descriptor, the descriptor that carries the linkage information; descriptor length information (descriptor_length) indicating the length of the linkage descriptor; linkage media number information (linkage_media_number) indicating the number of files and streams to be linked that are listed in the linkage descriptor; media index id information (media_index_id), an id value identifying each file or stream to be linked; start time information (start_time) indicating the service start time of the file or stream to be linked; linkage URL information (linkage_URL) indicating the URL where the file or stream to be linked is located; URL length information (linkage_URL_length) indicating the length of the URL information; and linkage media type information (linkage_media_type) indicating the type of the file or stream to be linked.
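As a concrete illustration, the linkage descriptor fields listed above can be serialized and parsed as a simple tag-length-value structure. The patent names the fields but not their bit layout, so the tag value (`0xA0`) and the field widths below are illustrative assumptions only, sketched in Python:

```python
import struct

LINKAGE_DESCRIPTOR_TAG = 0xA0  # assumed private descriptor tag value

def build_linkage_descriptor(entries):
    """Serialize a list of linked-media entries into one descriptor.

    Each entry: (media_index_id, start_time, linkage_media_type, url).
    Field widths are assumptions: 8-bit counts/ids, 32-bit start_time.
    """
    body = bytearray()
    body.append(len(entries))                     # linkage_media_number
    for media_index_id, start_time, media_type, url in entries:
        url_bytes = url.encode("ascii")
        body.append(media_index_id)               # media_index_id
        body += struct.pack(">I", start_time)     # start_time
        body.append(media_type)                   # linkage_media_type
        body.append(len(url_bytes))               # linkage_URL_length
        body += url_bytes                         # linkage_URL
    # descriptor_tag + descriptor_length prefix the body
    return bytes([LINKAGE_DESCRIPTOR_TAG, len(body)]) + bytes(body)

def parse_linkage_descriptor(data):
    """Inverse of build_linkage_descriptor: recover the entry list."""
    tag, length = data[0], data[1]
    assert tag == LINKAGE_DESCRIPTOR_TAG and length == len(data) - 2
    count, pos, entries = data[2], 3, []
    for _ in range(count):
        media_index_id = data[pos]
        start_time = struct.unpack_from(">I", data, pos + 1)[0]
        media_type = data[pos + 5]
        url_len = data[pos + 6]
        url = data[pos + 7:pos + 7 + url_len].decode("ascii")
        entries.append((media_index_id, start_time, media_type, url))
        pos += 7 + url_len
    return entries
```

A round trip (`parse_linkage_descriptor(build_linkage_descriptor(entries))`) returns the original entry list; a real system would instead fix these widths in its signaling specification.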
- The synchronization information may include: a synchronization information identifier identifying the synchronization information; a 2D/3D flag (2D_3D_flag) indicating whether the service currently supported by the broadcast stream is 2D or 3D; media index id information (media_index_id), an id value identifying the file or stream to be linked; and frame number information (frame_number), a counter value used to find the playback point at which the reference video and the additional video and content are linked.
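The synchronization information is small enough to pack into a fixed record. The layout below is an assumption (8-bit identifier and flags, 32-bit frame_number counter); the identifier value `0xB0` is likewise hypothetical:

```python
import struct

SYNC_INFO_ID = 0xB0  # assumed synchronization information identifier

def pack_sync_info(is_3d, media_index_id, frame_number):
    """Pack the sync record: identifier, 2D_3D_flag, media_index_id,
    frame_number (big-endian, assumed widths)."""
    return struct.pack(">BBBI", SYNC_INFO_ID, 1 if is_3d else 0,
                       media_index_id, frame_number)

def unpack_sync_info(buf):
    """Recover the fields; raises if the identifier does not match."""
    ident, flag, media_index_id, frame_number = struct.unpack(">BBBI", buf)
    assert ident == SYNC_INFO_ID
    return {"2D_3D_flag": flag, "media_index_id": media_index_id,
            "frame_number": frame_number}
```

The receiver uses `media_index_id` to pick the linked file or stream and `frame_number` to locate the matching playback point in it.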
- Generating the real-time reference video stream may include: a video encoding step of encoding the reference video to generate a reference video stream; a PES packetization step of packetizing the reference video stream into PES packets; a PSI/PSIP generation step of generating PSI/PSIP (Program Specific Information / Program and System Information Protocol) based on the linkage information; and a multiplexing step of multiplexing the PSI/PSIP and the PES packets to generate the real-time reference video stream.
- In the video encoding step the reference video may be encoded into an MPEG-2 video stream, and in the multiplexing step the PSI/PSIP and the PES packets may be multiplexed into an MPEG-2 TS stream.
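The multiplexing step ultimately splits PES data into fixed 188-byte MPEG-2 TS packets. The sketch below shows only the packet framing and is deliberately simplified: a real multiplexer pads short packets with an adaptation field rather than payload stuffing, and carries PSI/PSIP tables on their own PIDs.

```python
TS_PACKET_SIZE = 188
SYNC_BYTE = 0x47

def make_ts_packets(pid, payload):
    """Split one PES payload into MPEG-2 TS packets on the given PID.

    Simplified sketch: no adaptation field; the payload_unit_start
    indicator is set on the first packet and the continuity counter
    wraps modulo 16. Short final packets are 0xFF-stuffed (a real mux
    would use adaptation-field stuffing instead).
    """
    packets, cc, pos, first = [], 0, 0, True
    while pos < len(payload):
        chunk = payload[pos:pos + TS_PACKET_SIZE - 4]
        header = bytes([
            SYNC_BYTE,
            (0x40 if first else 0x00) | ((pid >> 8) & 0x1F),  # PUSI + PID hi
            pid & 0xFF,                                       # PID lo
            0x10 | cc,                                        # payload only + CC
        ])
        body = chunk + b"\xff" * (TS_PACKET_SIZE - 4 - len(chunk))
        packets.append(header + body)
        cc = (cc + 1) % 16
        first = False
        pos += len(chunk)
    return packets
```

Each packet carries at most 184 payload bytes, so a 400-byte PES payload yields three packets.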
- Transmitting the additional video and content may include: a video encoding step of encoding the additional video and content to generate an elementary stream; and a file/stream generation step of generating an additional video file or an additional video stream from the elementary stream according to the transmission type, where the video encoding step or the file/stream generation step may also generate the synchronization information or the linkage information.
- In the file/stream generation step the elementary stream may be packaged in either MP4 or TS format, and the generated additional video file or additional video stream may be transmitted to the receiver in real time or non-real time.
- The synchronization information may be packetized by a separate PES packetizer, distinct from the first PES packetizer that packetizes the reference video stream, and transmitted as a separate stream; it may be carried in the header of the PES packets produced by the first PES packetizer; or it may be encoded into the video sequence itself.
- The reference video may be packetized together with information identifying the start time of the 3D service, so that the reference video can be synchronized with the synchronization information.
- The linkage information may be included in at least one of the virtual channel table (VCT) and the event information table (EIT) of the PSIP, and the program map table (PMT) of the MPEG-2 TS PSI, of the real-time reference video stream.
- The transmission apparatus of the present invention for providing a 3D service, in which a reference video transmitted in real time is linked with additional video and content transmitted separately, includes: a real-time reference video stream generator that generates a real-time reference video stream based on the reference video and transmits it to the receiver in real time; and an additional video and content transmitter that transmits the additional video and content, which provide the 3D service in combination with the reference video, to the receiver separately from the reference video stream. The real-time reference video stream may include linkage information, i.e. information about the additional video and content to be linked with the reference video, and synchronization information for synchronizing the reference video with the additional video and content.
- The additional video and content may be transmitted as a stream or a file, in real time or non-real time.
- The reception method of the present invention for providing a 3D service, in which a reference video transmitted in real time is linked with additional video and content transmitted separately, includes: a reference video generation step of demultiplexing and decoding the real-time reference video stream received in real time to generate the reference video of the 3D service; an additional video generation step of separately receiving and decoding the additional video stream or additional video file carrying the additional video and content that provide the 3D service in combination with the reference video, to generate the additional video; and a playback step of playing back a 3D stereoscopic video based on the reference video and the additional video. In the reference video generation step and the additional video generation step, decoding is performed synchronously based on the linkage information, i.e. information about the additional video and content to be linked, and the synchronization information for synchronizing the reference video with the additional video, both of which are included in the real-time reference video stream.
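The synchronized decoding described above ultimately pairs each decoded reference frame with the additional frame that carries the same frame_number counter from the synchronization information. A minimal sketch of that pairing (the `(frame_number, data)` tuple representation is hypothetical, and dropping unmatched frames is a simplification; a real receiver would buffer and wait):

```python
def pair_frames(reference_frames, additional_frames):
    """Pair decoded frames by the frame_number counter, yielding
    (left, right) pairs for stereoscopic playback.

    Both inputs are sequences of (frame_number, frame_data) tuples;
    frames with no counterpart are dropped in this sketch.
    """
    add_by_num = dict(additional_frames)
    return [(ref, add_by_num[num])
            for num, ref in reference_frames if num in add_by_num]
```

For example, reference frames 1..3 paired against additional frames 2..4 yield stereo pairs for frames 2 and 3 only.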
- Generating the reference video may include: a PSI/PSIP decoding step of decoding the PSI/PSIP (Program Specific Information / Program and System Information Protocol) included in the real-time reference video stream to extract the PES packets and the linkage information; a PES parsing step of parsing the PES packets to generate a reference video stream consisting of a video ES; and a video decoding step of decoding the reference video stream to generate the reference video.
- The synchronization information may be obtained from a synchronization information stream through a parser separate from the first PES parser that parses the PES packets to generate the reference video stream, by analyzing the header of the PES packets through the first PES parser, or from the reference video stream itself.
- In the PSI/PSIP decoding step, whether a given video is the reference video or an additional video may be determined by analyzing the configuration information of the reference video stream included in the PMT (Program Map Table) of the PSI/PSIP carried in the real-time reference video stream, and left/right video information and the linkage information may be extracted through the linkage descriptor included in at least one of the virtual channel table (VCT) and event information table (EIT) of the PSIP and the PMT of the MPEG-2 TS PSI.
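Extracting the linkage descriptor from a VCT, EIT, or PMT reduces to walking the standard MPEG-2 descriptor loop, a sequence of (tag, length, body) records. A minimal walker, assuming `wanted_tag` is whatever value the system assigns to the linkage descriptor:

```python
def find_descriptors(loop_bytes, wanted_tag):
    """Walk an MPEG-2 descriptor loop and return the bodies of all
    descriptors whose tag matches wanted_tag.

    loop_bytes is the raw descriptor loop: repeated records of
    1-byte tag, 1-byte length, then `length` body bytes.
    """
    out, pos = [], 0
    while pos + 2 <= len(loop_bytes):
        tag, length = loop_bytes[pos], loop_bytes[pos + 1]
        if tag == wanted_tag:
            out.append(loop_bytes[pos + 2:pos + 2 + length])
        pos += 2 + length
    return out
```

The same walker serves the program-level and ES-level descriptor loops of the PMT as well as the descriptor loops of the PSIP tables.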
- The additional video generation step may include: a reception/storage step of receiving and storing the additional video stream or additional video file together with the linkage information; a file/stream parsing step of receiving the synchronization information generated in the reference video generation step and generating an elementary stream in video ES form from the additional video stream or file that matches the reference video; and a video decoding step of decoding the generated video ES elementary stream to generate the additional video.
- In the reception/storage step, the stream or file to be linked may be identified through the linkage media type information (linkage_media_type), which indicates its type, and the linkage URL information (linkage_URL), which indicates where it is stored.
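Matching a linkage descriptor entry against locally stored media in the reception/storage step might look like the following sketch. The entry tuple layout and the media-type code points are assumptions carried over from the descriptor sketch, not values defined by the patent:

```python
# Assumed linkage_media_type code points (illustrative only)
MEDIA_TYPE_MP4_FILE = 0
MEDIA_TYPE_TS_STREAM = 1

def select_linked_media(entries, store):
    """Pick the stored file or stream matching a linkage entry.

    entries: (media_index_id, start_time, linkage_media_type, url)
    tuples from the parsed linkage descriptor.
    store:   mapping of linkage_URL -> local storage path.
    Returns (media_index_id, media_type, path) for the first match,
    or None if nothing linked has been stored yet.
    """
    for media_index_id, start_time, media_type, url in entries:
        if url in store:
            return media_index_id, media_type, store[url]
    return None
```

The returned `media_index_id` is then matched against the one carried in the synchronization information before decoding begins.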
- The reception apparatus of the present invention for providing a 3D service, in which a reference video transmitted in real time is linked with additional video and content transmitted separately, includes: a reference video generator that demultiplexes and decodes the real-time reference video stream received in real time to generate the reference video of the 3D service; an additional video generator that separately receives and decodes the additional video stream or additional video file carrying the additional video and content that provide the 3D service in combination with the reference video, to generate the additional video; and a playback unit that plays back a 3D stereoscopic video based on the reference video and the additional video. The reference video generator and the additional video generator perform decoding synchronously based on the linkage information, i.e. information about the additional video and content to be linked with the reference video, and the synchronization information for synchronizing the reference video with the additional video, both of which are included in the real-time reference video stream.
- According to the transmission apparatus and method and the reception apparatus and method of the present invention, which provide a 3D service by linking additional video and content transmitted separately with a reference video transmitted in real time, a hybrid of real-time broadcasting and non-real-time transmission of pre-stored content becomes possible. Synchronizing two videos of different formats received at different times provides a foundation for composing stereoscopic video and enables a linked service that makes use of storage media.
- FIG. 1 shows a system for providing a 3D service by interworking additional video and content transmitted separately from a reference video transmitted in real time according to an embodiment of the present invention, in which both real-time and non-real-time transmission take place from the transmitting end to the receiving end.
- FIG. 2 is a diagram illustrating an interworking descriptor for providing a 3D service by interworking additional video and content transmitted separately from a reference video transmitted in real time according to an embodiment of the present invention
- FIG. 3 is a diagram illustrating a synchronization information descriptor for providing a 3D service by interworking additional video and content transmitted separately from a reference video transmitted in real time according to an embodiment of the present invention
- FIG. 4 illustrates the process by which a transmission apparatus for providing a 3D service by interworking additional video and content transmitted separately from a reference video transmitted in real time generates the real-time reference video stream and the additional video stream or file, according to an embodiment of the present invention.
- FIG. 5A is a block diagram illustrating a configuration in which the additional video and content transmission unit transmits an additional video stream to a receiving device through a broadcasting network according to an embodiment of the present invention
- FIG. 5B is a block diagram illustrating a configuration in which the additional video and content transmission unit transmits an additional video stream or an additional video file to a receiving device through an IP network according to another embodiment of the present invention
- FIG. 6 is a block diagram illustrating a configuration of a transmission apparatus for providing a 3D service in association with content transmitted in a non real time in a real time broadcasting service environment according to an embodiment of the present invention
- FIG. 7 is a block diagram illustrating a configuration of a transmission apparatus for providing a 3D service by interworking additional video and content transmitted separately from a reference video transmitted in real time according to another embodiment of the present invention.
- FIG. 8 is a diagram illustrating a state in which synchronization information is included in a PES packet header of a transmission device for providing a 3D service by interworking additional video and content transmitted separately from a reference video transmitted in real time according to another embodiment of the present invention.
- FIG. 9 is a block diagram illustrating a configuration of a transmission apparatus for providing a 3D service by interworking additional video and content transmitted separately from a reference video transmitted in real time according to another embodiment of the present invention.
- FIG. 10 is a block diagram illustrating a process of generating a reference image and an additional image of a receiving apparatus for providing a 3D service by interworking an additional image and content transmitted separately from a reference image transmitted in real time according to an embodiment of the present invention.
- FIG. 11 is a block diagram illustrating a configuration of a receiving apparatus for providing a 3D service by interworking additional video and content transmitted separately from a reference video transmitted in real time according to an embodiment of the present invention
- FIG. 12 is a block diagram illustrating a configuration of a receiving apparatus for providing a 3D service by interworking additional video and content transmitted separately from a reference video transmitted in real time according to another embodiment of the present invention
- FIG. 13 is a block diagram illustrating a configuration of a receiving apparatus for providing a 3D service by interworking an additional video and content transmitted separately from a reference video transmitted in real time according to another embodiment of the present invention.
- Terms such as first and second may be used to describe various components, but the components should not be limited by these terms; the terms are used only to distinguish one component from another.
- the first component may be referred to as the second component, and similarly, the second component may also be referred to as the first component.
- The 3D reference video is transmitted in real time according to the MPEG-2 TS technical standard, while the additional video may be transmitted in advance according to the ATSC NRT technical standard. Since the two videos arrive at the receiving terminal in different formats and at different times, the receiving terminal must recognize and analyze the linkage information and the synchronization information carried with the reference video.
- Although the present invention is described in terms of a broadcast service using MPEG-2 TS and NRT technology, the technical field is not necessarily limited to broadcast services; the invention is applicable to any field in which the constituent images of 3D content have different reception points and therefore require interworking information and synchronization information between the images.
- The portion described as the additional video is not necessarily limited to video information for providing the additional video, and may be extended to cover content as well as additional video.
- The 3D service providing system may include a real-time reference video stream generator 100, an additional video and content transmitter 110, an MPEG-2 TS analyzer 120, a reference image generator 130, an additional image analyzer 140, a receiver/storage unit 150, and a 3D playback unit 160.
- The transmitter transmits a reference video to the MPEG-2 TS analyzer 120 through the real-time reference video stream generator 100.
- The transmitter transmits the additional video 20 and the content through the additional video and content transmitter 110 according to the ATSC NRT standard.
- The transmission does not necessarily follow the ATSC NRT standard; the additional video 20 and the content may also be transmitted in real time through a broadcasting network, or delivered over an IP network.
- the additional video 20 refers to a 2D video that can provide a 3D service in conjunction with the reference video 10 which is 2D video content.
- The additional image 20 may be encoded based on the NRT standard in the NRT transmission server 110 and transmitted to the MPEG-2 TS analyzer 120 in the form of MPEG-2 TS in real time.
- the additional video and content transmission unit 110 delivers interworking information and synchronization information to the real-time reference video stream generator 100.
- The real-time reference video stream generator 100 may insert 3D start display screen information into the stream and transmit it, in order to clarify the point at which provision of the 3D service starts.
- the MPEG-2 TS analyzer 120 transmits the real-time reference video stream to the reference video generator 130 and the stream or file related to the additional video to the additional video analyzer 140.
- The additional video stream transmitted in real time passes from the additional video analyzer 140 through the reception/storage unit 150 and enters the 3D playback unit 160 in real time to be output as a 3D stereoscopic image.
- the non-real time stream or file is stored in the reception / storage unit 150 through the additional image analyzer 140.
- the real-time reference video stream is decoded into the reference video 10 through the reference video generator 130 and transmitted to the 3D playback unit 160.
- The MPEG-2 TS analyzer 120 extracts the interworking information and the synchronization information included in the received real-time reference video stream and transmits them to the reception/storage unit 150, in the same manner as the transmission and reception of the real-time reference video stream.
- The receiver/storage unit 150 searches, through the interworking information and the synchronization information, for an additional video related stream or file to be linked with the reference video 10, and transmits the additional video 20 synchronized with the reference video 10 to the 3D playback unit 160 to output a stereoscopic image on the screen.
- The interworking information may be located in a Virtual Channel Table (VCT) or an Event Information Table (EIT) of the Program and System Information Protocol (PSIP) of the real-time reference video stream, or in a Program Map Table (PMT) of the MPEG-2 Transport Stream (TS) Program Specific Information (PSI).
- The interworking descriptor may include a descriptor tag 210 (descriptor_tag), descriptor length information 220 (descriptor_length), interworking media number information 230 (linkage_media_number), media index id information 240 (media_index_id), start time information 250 (start_time), URL length information 260 (linkage_URL_length), linkage URL information 270 (linkage_file_URL), linkage media type information 280 (linkage_media_type), and track ID 290 (track_id). Not all of the above information need be included; only some of it may be included.
- The interworking descriptor is a descriptor related to interworking information, and may include information on the number of streams or files to be interworked, URL information of the streams or files to be interworked, and information on their types, expressed in descriptor syntax.
- The descriptor tag 210, which is the first information included in the interworking descriptor, identifies the descriptor as an interworking descriptor.
- the descriptor tag 210 may have a length value of 8 bits.
- the descriptor length information 220 indicates the length of the interworking descriptor.
- the descriptor length information 220 may have a length value of 8 bits.
- The interworking media number information 230 indicates the number of streams or files to be interworked in the interworking descriptor.
- The interworking media number information 230 may also have a length value of 8 bits.
- For each stream or file to be interworked, the following information may be further included.
- the media index id information 240 indicates an ID value for identifying a stream or file to be linked.
- the media index id information 240 may have a length value of 8 bits.
- The start time information 250 indicates the start time of a stream or file to be interworked.
- The start time information 250 may have a length value of 32 bits.
- the URL length information 260 indicates the length of the name of the stream or file to be linked. Since the URL information of the stream or file to be linked has a variable length, the length of URL information of the stream or file to be linked may be determined by the receiver through the URL length information 260.
- the URL length information 260 may have a length value of 8 bits.
- The linkage URL information 270 indicates the name of a stream or file to be interworked.
- The stream or file to be interworked may be transmitted in real time, or may be stored in advance in the receiving terminal through an NRT service, so URL information for it is required. The URL information of the stream or file to be interworked with the reference video stream can therefore be checked through the linkage URL information 270.
- The linkage URL information 270 may have a variable bit length.
- The linkage media type information 280 indicates the type of stream or file to be interworked with the reference video.
- The additional video to be used for the 3D service may be generated as an MP4 file type.
- The linkage media type information 280 may be configured as an extensible field in consideration of the diversity of formats of the stream or file generated based on the additional video.
- the track ID 290 indicates a track ID of the stream or file when the stream or file type to be linked is a specific type such as MP4.
- the track ID 290 may have a length value of 32 bits.
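As a rough illustrative sketch of the field layout described above, the interworking descriptor could be serialized as follows. The field order and bit widths follow the description; the tag value 0xA0, the helper names, and the sample values are hypothetical and are not taken from any standard or from this disclosure.

```python
import struct

# Hypothetical tag value; real descriptor tags are assigned by the applicable standard.
LINKAGE_DESCRIPTOR_TAG = 0xA0

def pack_linkage_descriptor(entries):
    """Serialize the interworking descriptor for a list of linked media.

    Each entry carries: media_index_id (8 bits), start_time (32 bits),
    linkage_URL_length (8 bits), linkage_file_URL (variable),
    linkage_media_type (8 bits), track_id (32 bits).
    """
    body = bytearray([len(entries)])                 # linkage_media_number, 8 bits
    for e in entries:
        url = e["url"].encode("ascii")
        body.append(e["media_index_id"])             # media_index_id, 8 bits
        body += struct.pack(">I", e["start_time"])   # start_time, 32 bits
        body.append(len(url))                        # linkage_URL_length, 8 bits
        body += url                                  # linkage_file_URL, variable
        body.append(e["media_type"])                 # linkage_media_type, 8 bits
        body += struct.pack(">I", e["track_id"])     # track_id, 32 bits
    # descriptor_tag and descriptor_length are 8 bits each and cover the body.
    return bytes([LINKAGE_DESCRIPTOR_TAG, len(body)]) + bytes(body)

# One linked MP4 file (all values are sample data).
desc = pack_linkage_descriptor([{
    "media_index_id": 1, "start_time": 3600,
    "url": "add_view.mp4", "media_type": 0x01, "track_id": 2,
}])
```

A receiver reading such a descriptor would use linkage_URL_length to delimit the variable-length URL, as noted above.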
- The synchronization information descriptor may include a synchronization information identifier 310, a 3D distinguishing flag 320 (2D_3D_flag), media index id information 330 (media_index_id), and frame number information 340 (frame_number).
- Since the reference video is transmitted in real time and the additional video is transmitted either in real time or in advance in non-real time, synchronization between the contents is essential to compose a stereoscopic video. Therefore, synchronization information to be commonly applied to the base view video and the additional view video should be included in order to synchronize the two contents.
- synchronization information (also referred to as timing information) is synchronization information of a base view video and an additional view video, and may be included in a real time reference video stream and transmitted.
- Synchronization information may be included in the MPEG-2 video stream, or may be included in the private data section of the PES header, and may be transmitted in the form of a TS packet having a separate PID (Packet Identifier) by defining a new stream.
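The option of carrying the synchronization information in TS packets with a separate PID can be illustrated with a simplified sketch of the fixed 188-byte MPEG-2 TS packet. The helper name and PID value are hypothetical, and the padding is simplified: a real multiplexer pads short payloads through the adaptation field rather than with raw stuffing bytes.

```python
def ts_packet(pid, payload, pusi=True, cc=0):
    """Wrap a payload in a minimal 188-byte MPEG-2 TS packet (simplified sketch)."""
    assert len(payload) <= 184 and pid <= 0x1FFF
    header = bytes([
        0x47,                                   # sync byte
        (0x40 if pusi else 0x00) | (pid >> 8),  # PUSI + top 5 bits of the 13-bit PID
        pid & 0xFF,                             # low 8 bits of the PID
        0x10 | (cc & 0x0F),                     # payload only + continuity counter
    ])
    # Simplified: pad with 0xFF instead of adaptation-field stuffing.
    return (header + payload).ljust(188, b"\xff")

# Synchronization information carried on its own (hypothetical) PID.
sync_pkt = ts_packet(0x0100, b"sync-info-bytes")
```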
- timing information is synchronization information transmitted through a payload of a real time reference video stream.
- the synchronization information includes a synchronization information identifier 310.
- the synchronization information identifier 310 indicates that synchronization information exists after the identifier 310.
- the synchronization information identifier 310 may have a length value of 8 bits.
- The 3D distinguishing flag 320 indicates whether the currently transmitted broadcast stream is for 2D or 3D consumption.
- The 3D distinguishing flag 320 may have a length value of 1 bit. For example, if the 3D distinguishing flag 320 has a value of '1', the currently transmitted stream may be a stream for providing a 3D service, and if it has a value of '0', the stream may be for a 2D service.
- When the 3D distinguishing flag 320 indicates that the stream is for providing a 3D service, the following information may be further included.
- the media index id information 330 indicates an id value for identifying a stream or a file to be linked with the reference video.
- The media index id information 330 may be used in the synchronization information to distinguish each stream or file.
- In the syntax, i represents the media index: the media index id information 330 is initially defined as one, and each time the loop iterates it increases by one.
- the media index id information 330 may have a length value of 8 bits.
- The frame number information 340 indicates a counter value for finding the playback point at which the reference video is to be interworked with the additional video. That is, when the pictures of the reference video are counted and the 3D service is linked from the i-th picture, the synchronization information may be transmitted with the value i carried in the frame number information 340.
- the additional video also has a counter value.
- the frame number information 340 may have a length value of 32 bits.
- the receiving side has an advantage of performing synchronization with a very small amount of information using the frame number information 340 and the media index id information 330.
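As an illustration of how little data this synchronization information needs, the fields above could be packed and parsed as follows. The identifier value 0xB0, the bit position of the 2D_3D flag within its byte, and the helper names are assumptions made for illustration only.

```python
import struct

SYNC_INFO_ID = 0xB0  # hypothetical synchronization information identifier value

def pack_sync_info(is_3d, media_index_id=0, frame_number=0):
    flag = 0x80 if is_3d else 0x00   # 2D_3D_flag in the top bit (assumed layout)
    if not is_3d:
        return bytes([SYNC_INFO_ID, flag])
    # For 3D, the linked media id (8 bits) and frame counter (32 bits) follow.
    return bytes([SYNC_INFO_ID, flag, media_index_id]) + struct.pack(">I", frame_number)

def parse_sync_info(buf):
    assert buf[0] == SYNC_INFO_ID    # identifier announces that sync info follows
    if not (buf[1] & 0x80):
        return {"3d": False}         # 2D stream: nothing more to read
    return {"3d": True,
            "media_index_id": buf[2],
            "frame_number": struct.unpack(">I", buf[3:7])[0]}

# 3D service linked with media 1, starting at reference picture 1500.
info = parse_sync_info(pack_sync_info(True, media_index_id=1, frame_number=1500))
```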
- the synchronization information may be transmitted in a separate stream.
- A transmission apparatus may include a real-time reference video stream generator including an image storage unit 400, a video encoder 410, a PES packetizer 420, and a multiplexer 430, and an additional video and content transmitter including a video encoder 440 and a file/stream generator 460.
- In relation to the reference video 402, the real-time reference video stream generator generates a real-time reference video stream by encoding, packetizing, and multiplexing the reference video 402.
- the reference image 402 is stored in the image storage unit 400 together with the additional image 404.
- the video encoder 410 receives the reference image 402 from the image storage unit 400 and encodes the reference image to generate a reference image stream.
- the video encoder 410 may be an MPEG-2 image encoder, and the reference image 402 may be encoded into an MPEG-2 image stream.
- The PES packetizer 420 receives the reference video stream from the video encoder 410 and packetizes it to generate a PES packet. In this case, the PES packetizer inserts a 3D start display screen into the reference video 402 so that synchronization with the reference video 402 can be performed based on the start time of the 3D broadcast.
- The multiplexer 430 receives a PES packet related to the reference video from the PES packetizer 420, and generates a real-time reference video stream by receiving and multiplexing the PSI/PSIP from a PSI/PSIP generator (not shown).
- the multiplexer 430 may generate the real-time reference video stream in the form of MPEG-2 TS packet.
- The additional video and content transmitter encodes the additional video 404 and the content, and generates and multiplexes a stream or file to produce the additional video stream or the additional video file.
- the video encoder 440 generates an elementary stream by receiving and encoding the additional image 404 and the content from the image storage unit 400.
- the elementary stream may have a video ES form.
- The file/stream generator 460 generates an additional video stream or an additional video file from the elementary stream that the video encoder 440 generated based on the additional video 404 and the content.
- the stream generator 462 may be a muxer and generates an additional video stream by multiplexing an elementary stream.
- the additional video stream may be an MPEG-2 TS stream.
- the additional video stream may be transmitted in real time through a transmission form of streaming.
- the file generator 464 generates an additional video file based on the elementary stream.
- the file may be an MP4 file.
- The additional video file may be received in real time and reproduced immediately, or it may be transmitted in advance in non-real time and stored on the receiving side; a 3D stereoscopic video may then be generated in association with the reference video 402 transmitted in real time.
- The real-time reference video stream generator and the additional video and content transmitter each include a transmitter, and transmit the stream or file generated through the multiplexer 430 and the file/stream generator 460 to the receiver.
- FIG. 5A is a block diagram illustrating a configuration in which an additional video and content transmission unit transmits an additional video stream to a receiving device through a broadcasting network according to an embodiment of the present invention.
- the additional video and content transmitter 500 may transmit the additional video stream to the receiving device 520 through the broadcast network 510.
- the transmission form may be a streaming form.
- The base view video and the additional view video are simultaneously transmitted to the receiving device 520 in real time, but through separate streams. Therefore, by including the interworking information and the synchronization information in the stream, or by delivering them through a separate stream, the reference video and the additional video transmitted in real time can be synchronized.
- FIG. 5B is a block diagram illustrating a configuration in which an additional video and a content transmission unit transmit an additional video stream or an additional video file to a receiving device through an IP network according to another embodiment of the present invention.
- the additional image and content transmitter 550 may transmit the additional image to the receiving device 570 through the IP network 560.
- the reception device 570 may request transmission of the additional video to the additional video and content transmitter 550 through the IP network 560.
- When the additional video and content transmitter 550 receives the request, it transmits the additional video in the form of a stream or a file in response.
- When transmitted in streaming form, the additional video may be delivered in real time or in non-real time; files may likewise be sent in real time or non-real time.
- the additional video and the content may be transmitted to the receiving device 570 without a separate video request.
- FIG. 6 is a block diagram illustrating a configuration of a transmission apparatus for providing a 3D service by interworking additional video and content transmitted separately from a reference video transmitted in real time according to an embodiment of the present invention.
- the transmission apparatus for providing a 3D service according to an embodiment of the present invention may include a real-time reference video stream generator 600 and an additional video and content transmitter 660.
- The real-time reference video stream generator 600 may include an image storage unit 610, a video encoder 620, a PES packetizer set 630, a PSI/PSIP generator 640, and a multiplexer 650.
- The real-time reference video stream generator 600 generates a real-time reference video stream based on the reference video 602 and transmits it to the receiver in real time.
- the image storage unit 610 stores the reference image 602 and the additional image 606.
- the reference image 602 represents a left image of the 3D service as the image for the 3D service.
- the additional image 606 is a 2D image constituting a 3D screen in cooperation with the reference image 602 and represents a 3D right image.
- the 3D left image and the 3D right image may be interchanged with each other in some cases.
- The reference images 602 may be arranged in the broadcast programming order, and are transmitted to the video encoder 620 in that order.
- the reference image 602 may include information indicating the start display screen 604 of the 3D TV.
- The image storage unit 610 stores the reference image 602 and the additional image 606; the reference image 602 is transmitted to the video encoder 620 for generating the real-time reference video stream, and the additional image 606 is transmitted to the video encoder 662 for generating the additional video stream or the additional video file.
- The image storage unit 610 receives the synchronization information 608 from the video encoder 662 included in the additional video and content transmitter 660, stores it, and then transmits it to the PES packetizer 634.
- the video encoder 620 generates a reference video stream by receiving the reference video 602 from the image storage 610 and encoding the reference video.
- the video encoder 620 may be an MPEG-2 image encoder, and the reference image 602 may be encoded into an MPEG-2 image stream.
- the PES packetizer set 630 may include two PES packetizers 632 and 634.
- the PES packetizer 632 generates a PES packet by receiving a reference video stream from the video encoder 620 and packetizing it.
- In this case, the PES packetizer 632 inserts the 3D start display screen 604 into the reference video 602 so that the reference video 602 can be synchronized with the synchronization information 608 based on the start time of the 3D broadcast.
- Through the 3D start display screen 604, the user can see that consumption of the 3D service is possible.
- the other PES packetizer 634 receives the synchronization information 608 from the image storage unit 610 and generates a PES packet based thereon. That is, the PES packetizer 634 may generate a packet different from the PES packet generated by the PES packetizer 632, and the synchronization information 608 included therein may be located in the payload of the PES packet. In addition, the synchronization information 608 may be multiplexed into a separate stream and transmitted to the receiving side.
- The PSI/PSIP generator 640 receives the interworking information 642 from the file/stream generator 664 of the additional video and content transmitter 660 and generates the PSI/PSIP based on the interworking information 642.
- The PSI/PSIP generator 640 may packetize the interworking information 642 so that it is included in at least one of a Virtual Channel Table (VCT) of the PSIP, an Event Information Table (EIT), and a Program Map Table (PMT) of the MPEG-2 TS PSI.
- the EIT and the PMT may include information related to the interworking of the non-real time content based on the 3D service configuration information and the time value for informing the progress time of the corresponding service.
- the PMT may include configuration information of the reference video stream and the synchronization information stream.
- So that processing of the reference video stream and the synchronization information stream can differ according to the stream type, the stereoscopic_video_info_descriptor may include information indicating whether the corresponding video is the reference video 602 or the additional video 606, and whether it is the left image or the right image.
- The multiplexer 650 receives a PES packet related to the reference video and a PES packet related to the synchronization information from the PES packetizers 632 and 634, receives the PSI/PSIP from the PSI/PSIP generator 640, and multiplexes them to generate the real-time reference video stream. In this case, a stream including the synchronization information may be generated separately from the stream related to the reference video.
- the multiplexer 650 may generate the real-time reference video stream in the form of MPEG-2 TS packet.
- the present invention may include a transmitter, which transmits a real-time reference video stream to a receiver.
- The additional video and content transmitter 660 may include a video encoder 662 and a file/stream generator 664.
- The additional video and content transmitter 660 receives the additional video 606 from the image storage unit 610 of the real-time reference video stream generator 600 and, based on the received additional video 606, generates an additional video stream or an additional video file and transmits it to the receiver in real time or in non-real time.
- the video encoder 662 generates an elementary stream by receiving an additional image 606 as an input from the image storage unit 610.
- the video encoder 662 may be an element that is different from the video encoder 620 included in the real-time reference video stream generator 600, and may use an encoder having a different standard.
- the video encoder 662 may generate synchronization information 608 for synchronization with the reference image 602 based on the additional image 606 related information.
- the video encoder 662 may transmit the synchronization information 608 to the image storage unit 610.
- The file/stream generator 664 receives the elementary stream encoded by the video encoder 662 as input, and generates an additional video file or an additional video stream. According to an embodiment of the present invention, the file/stream generator 664 may generate the elementary stream in the form of an MP4 file. In addition, the file/stream generator 664 may generate an additional video stream in the form of MPEG-2 TS packets. While generating the additional video file or additional video stream based on the elementary stream, the file/stream generator 664 may acquire information on the generated stream or file and, based on the obtained information, generate the interworking information 642 using a specific descriptor.
- The generated interworking information 642 is transmitted to the real-time reference video stream generator 600 and included in the real-time reference video stream through the PSI/PSIP generator 640 and the multiplexer 650.
- the additional video and content transmitter 660 may further include a transmitter, and the transmitter transmits the generated additional video stream or additional video file to the receiver in real time or non-real time.
- FIG. 7 is a block diagram illustrating a configuration of a transmission apparatus for providing a 3D service by interworking an additional video and content transmitted separately from a reference video transmitted in real time according to another embodiment of the present invention.
- the transmission apparatus according to another embodiment of the present invention includes a configuration for transmitting synchronization information 708 through PES private data of a header of a PES packet.
- Components not described among the components related to FIG. 7 perform the same functions as those of FIG. 6.
- The synchronization information 708, generated by the video encoder of the additional video and content transmitter based on the additional video and the content, is included in the PES private data of the PES header when the PES packetizer 730 generates a PES packet based on the reference video stream, and is then multiplexed.
- In this case, only one PES packetizer 730 is required and no separate packetizer is needed, so an efficient configuration can be achieved. That is, the synchronization information 708 is included in the reference video stream and is not transmitted as a stream separate from the reference video stream.
- FIG. 8 is a view showing a state in which synchronization information 802 is included in a PES packet header 800 of a transmission device for providing a 3D service by interworking additional video and content transmitted separately from a reference video transmitted in real time according to another embodiment of the present invention.
- synchronization information 802 is included in the PES packet header 800.
- The synchronization information 802 may be included and transmitted in different ways depending on the real-time stream: it may be included in the MPEG-2 video stream, it may be defined as a new stream and transmitted as TS packets having a separate PID, or, as shown in FIG. 8, it may be included in the PES private data of the PES header 800 and transmitted.
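The PES private data option can be sketched as follows; per ISO/IEC 13818-1, PES_private_data is a fixed 16-byte field signalled by the PES_private_data_flag in the PES extension. The helper name and sample values are hypothetical, and the header below is a minimal subset (PTS/DTS and other optional fields omitted).

```python
def pes_header_with_private_data(stream_id, sync_info):
    """Minimal PES header carrying sync info in the PES_private_data field.

    ISO/IEC 13818-1 fixes PES_private_data at 16 bytes; shorter data is padded.
    """
    assert len(sync_info) <= 16
    pd = sync_info.ljust(16, b"\x00")
    header_data = bytes([0x80]) + pd      # PES_private_data_flag set + 16 bytes
    return bytes([
        0x00, 0x00, 0x01,                 # packet_start_code_prefix
        stream_id,
        0x00, 0x00,                       # PES_packet_length (0: unbounded, video)
        0x80,                             # '10' marker bits, no scrambling
        0x01,                             # PES_extension_flag set
        len(header_data),                 # PES_header_data_length
    ]) + header_data

# Video stream id 0xE0 with three (sample) bytes of synchronization information.
hdr = pes_header_with_private_data(0xE0, b"\xb0\x80\x01")
```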
- FIG. 9 is a block diagram illustrating a configuration of a transmission apparatus for providing a 3D service by interworking additional video and content transmitted separately from a reference video transmitted in real time according to another embodiment of the present invention.
- a transmission apparatus according to another embodiment of the present invention includes a configuration to transmit synchronization information 908 included in an MPEG-2 video sequence.
- Components not described among the components related to FIG. 9 perform the same functions as those of FIG. 6.
- The synchronization information 908 generated through the video encoder 962 based on the additional image 906 is not transmitted to the image storage unit 910 but is transmitted directly to the video encoder 920 of the real-time reference video stream generator 900. Accordingly, instead of being included in the PES private data of the PES packet header, the synchronization information 908 may be located in the PES payload of the PES packet, or may be included in the video sequence through the video encoder 920 and encoded. According to an embodiment of the present invention, when the video encoder 920 generates an MPEG-2 video stream, it includes the synchronization information 908 in the MPEG-2 video sequence and encodes it. The encoded MPEG-2 video stream is transmitted to the receiver through the PES packetizer 930 and the multiplexer 950.
- FIG. 10 is a block diagram illustrating a process of generating a reference image and an additional image of a receiving apparatus for providing a 3D service by interworking an additional image and content transmitted separately from a reference image transmitted in real time according to an embodiment of the present invention.
- A receiving apparatus may include a reference image generator including a demultiplexer 1010 and a video decoder 1030, an additional image generator including a receiver/storage unit 1050, a file/stream parser 1060, and a video decoder 1070, and a playback unit 1040.
- the reference image generator may include a demultiplexer 1010 and a video decoder 1030.
- the reference image generator generates a reference image of the 3D service by performing demultiplexing and decoding on the real-time reference image stream received in real time.
- the demultiplexer 1010 may extract the reference video stream by receiving and demultiplexing the real-time reference video stream, and extract synchronization information and interworking information.
- the extracted reference video stream is decoded by the video decoder 1030 to generate a reference video, and the synchronization information is transmitted to the additional video generator to decode the additional video generated based on the additional video stream or the additional video file. Can be.
- the additional image generator may include a receiver / storage unit 1050, a file / stream parser 1060, and a video decoder 1070.
- the additional video generator generates the additional video by receiving and decoding the additional video stream or the additional video file related to the additional video providing the 3D service in real time or non-real time through a broadcasting network or an IP network.
- The additional video stream or additional video file may be received in real time by the reception/storage unit 1050 and reproduced as an image through a parsing and decoding process without a storage step, or it may be received in file form in real time, stored, and then reproduced. That is, the additional video stream or the additional video file may be received and stored before the corresponding real-time reference video stream.
- the file / stream parser 1060 includes a stream parser 1062 and a file parser 1064.
- the stream parsing unit 1062 performs a function of parsing a stream. That is, the video ES type stream may be generated by demultiplexing the additional video stream. According to an embodiment of the present invention, the stream parsing unit 1062 may demultiplex an additional video stream having an MPEG-2 TS form to generate a stream having a video ES form.
- the file parsing unit 1064 may generate a video ES-type stream by parsing a file transmitted in real time or an additional image file transmitted in non-real time, that is, transmitted in advance.
- the file / stream parsing unit 1060 parses the synchronization information to synchronize with the reference video, and then decodes the corresponding additional video at a time point at which the reference video is decoded (extracted in consideration of DTS).
- the stream of the video signal is transmitted to the video decoder 1070.
- the generated video ES stream is decoded by the video decoder 1070 to become an additional picture.
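On the receiving side, the pairing implied by the frame number information could be sketched as follows: once the 3D service is linked from reference picture frame_number, additional frame j pairs with reference picture frame_number + j. The helper name and the dictionary layout are illustrative assumptions, not the disclosed implementation.

```python
def select_additional_frame(sync, ref_picture_index, additional_frames):
    """Pick the additional-view frame paired with the current reference picture.

    sync["frame_number"] is the reference picture count at which the 3D
    interval starts, as carried in the synchronization information.
    """
    offset = ref_picture_index - sync["frame_number"]
    if 0 <= offset < len(additional_frames):
        return additional_frames[offset]
    return None  # outside the 3D interval: fall back to 2D playback

# Sample pairing: the 3D service starts at reference picture 1500.
frame = select_additional_frame({"frame_number": 1500}, 1502, ["R0", "R1", "R2"])
```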
- the playback unit 1040 configures and plays a stereoscopic image based on the reference image received from the video decoder 1030 of the reference image generator and the additional image received from the video decoder 1070 of the additional image generator.
- FIG. 11 is a block diagram illustrating a configuration of a receiving apparatus for providing a 3D service in association with content received in non real time in a real time broadcasting service environment according to an embodiment of the present invention.
- the receiving apparatus may include a reference image generator 1100, an additional image generator 1150, and a playback unit 1160.
- The reference image generator 1100 includes a demultiplexer 1110 and a video decoder 1120, and the demultiplexer 1110 includes a PSI/PSIP decoder 1112 and PES parsers 1114 and 1116.
- the reference image generator 1100 generates a reference image of the 3D service by performing demultiplexing and decoding on the real-time reference image stream received in real time.
- the PSI / PSIP decoder 1112 extracts a PSI / PSIP stream included in a real time reference video stream.
- the PSI / PSIP decoder 1112 extracts the PES packets related to the reference video, the synchronization information stream, and the interworking information, using the configuration information of the reference video stream and the synchronization information stream together with the interworking descriptor.
- the PES packets related to the reference video are transmitted to the PES parser 1114, the synchronization information stream is transmitted to the other PES parser 1116, and the interworking information is transmitted to the receiver / storage 1152 of the additional video generator 1150. The configuration information of the reference video stream and the synchronization information stream is included in the PMT.
- the PSI / PSIP decoder 1112 analyzes the stereoscopic_video_info_descriptor of the PMT to check whether the corresponding video is a reference video or an additional video, and whether it is a left or right image.
- the PES parser 1114 receives the PES packets related to the reference video from the PSI / PSIP decoder 1112 and parses them to generate a reference video stream composed of a video ES. That is, the PES parser 1114 assembles the reference video stream as a video ES from the PES packets and, when the values of the Decoding Time Stamp (DTS) and the Program Clock Reference (PCR) coincide according to the existing broadcasting standard, transmits it to the video decoder 1120.
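The DTS comparison above presupposes extracting the DTS from the PES header. As a hedged sketch of the MPEG-2 systems-layer field layout (a 33-bit timestamp split over five marker-delimited bytes), assuming a header in which both PTS and DTS are present:

```python
def parse_pes_dts(pes: bytes):
    """Extract the DTS (in 90 kHz ticks) from a PES packet, if present.

    Per the MPEG-2 systems layer, PTS/DTS are 33-bit values packed
    into 5 bytes with marker bits. Returns None when the header's
    PTS_DTS_flags field does not signal '11' (PTS and DTS present).
    """
    assert pes[:3] == b'\x00\x00\x01'          # PES start code prefix
    flags = pes[7]
    if (flags >> 6) != 0b11:                   # need both PTS and DTS
        return None
    b = pes[14:19]                             # DTS follows the 5 PTS bytes
    return (((b[0] >> 1) & 0x07) << 30) | (b[1] << 22) | \
           ((b[2] >> 1) << 15) | (b[3] << 7) | (b[4] >> 1)
```

A DTS of 90000 corresponds to one second on the 90 kHz system clock.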
- the reference video stream may be an MPEG-2 video stream.
- the stream including the synchronization information is transmitted to another PES parser 1116.
- the PES parser 1116 extracts synchronization information for configuring a 3D screen from the synchronization information stream.
- the PES parser 1116 transmits synchronization information of a time point corresponding to the DTS of the reference video to the file / stream parser 1154 of the additional video generator 1150.
- the video decoder 1120 receives a reference video stream from the PES parser 1114 and decodes the reference video stream to generate a reference video.
- the video decoder 1120 may generate a reference image based on the MPEG-2 image stream.
- the video decoder 1120 decodes the corresponding video at the time point indicated by the DTS signaled in the PMT.
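The decode-time check can be illustrated as a comparison between the 27 MHz PCR and the 90 kHz DTS: dividing the PCR by 300 puts both on a common 90 kHz timebase. A sketch only (the surrounding scheduling loop and any decoder latency offsets are omitted):

```python
def due_for_decode(dts: int, pcr: int) -> bool:
    """True once the program clock has reached the frame's DTS.

    PCR ticks at 27 MHz while DTS ticks at 90 kHz, so the PCR is
    divided by 300 before comparing.
    """
    return pcr // 300 >= dts
```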
- the additional image generator 1150 may include a receiver / storage 1152, a file / stream parser 1154, and a video decoder 1156.
- the additional image generator 1150 generates an additional image by receiving and decoding a stream or file related to the additional image that provides the 3D service in association with the reference image.
- the additional video stream and the additional video file are received and stored by the reception / storage 1152.
- a stream can be decoded immediately upon reception without being stored, whereas a file is received in advance and stored in file form.
- the receiver / storage 1152 receives the interworking information from the PSI / PSIP decoder 1112 and matches the stream and file indicated by the interworking information with the received additional video stream and file.
- a plurality of additional video streams and files may be matched with the reference video by analyzing the interworking information.
- the interworking media type information 280 and the interworking URL information 270 in the interworking information may be interpreted to identify the file to be linked that is stored in the receiver / storage 1152.
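The matching step above can be sketched as a lookup of the stored additional-video entry named by the interworking descriptor's media-type and URL fields. The field names and values below are illustrative stand-ins modeled on the media type information 280 and URL information 270 referenced in the text, not the descriptor's actual syntax:

```python
from dataclasses import dataclass

@dataclass
class InterworkingInfo:
    media_type: str   # e.g. "file" or "stream" (illustrative values)
    url: str          # identifier of the additional video to link

def match_linked_media(info: InterworkingInfo, stored: dict):
    """Return the stored additional-video entry named by the interworking info.

    `stored` maps URL -> media payload, standing in for the receiver /
    storage unit. Returns None when the linked item has not yet arrived.
    """
    if info.media_type not in ("file", "stream"):
        return None
    return stored.get(info.url)
```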
- the file / stream parser 1154 receives the file and stream identification information and the synchronization information from the PES parser 1116 of the reference image generator 1100, parses the additional image file or stream that matches the reference image, converts it into video ES form, and transmits it to the video decoder 1156.
- the file / stream parsing unit 1154 parses the synchronization information to synchronize with the reference video, extracts the additional video corresponding to the time point at which the reference video is decoded (identified from the DTS), and transmits it to the video decoder 1156.
- the video decoder 1156 generates the additional video by receiving and decoding the video-ES-type stream generated by the file / stream parser 1154 from the additional video stream or file. The generated additional image is transmitted to the playback unit 1160.
- the video decoder 1156 may be a separate component from the video decoder 1120 of the reference image generator 1100, or may be the same component. That is, one video decoder may decode both the reference video stream and the additional video file.
- the playback unit 1160 composes and plays a stereoscopic image based on the reference image received from the video decoder 1120 of the reference image generator 1100 and the additional image received from the video decoder 1156 of the additional image generator 1150.
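The composition step can be illustrated with one common stereoscopic frame-packing layout, side-by-side, in which the left and right views share a single frame. This is a generic illustration of frame packing, not the specific format the apparatus uses:

```python
def pack_side_by_side(left, right):
    """Pack two equal-height views (lists of pixel rows) into one
    side-by-side stereoscopic frame: each output row is the left
    view's row followed by the right view's row.
    """
    if len(left) != len(right):
        raise ValueError("views must have the same height")
    return [l_row + r_row for l_row, r_row in zip(left, right)]
```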
- FIG. 12 is a block diagram illustrating a configuration of a receiving apparatus for providing a 3D service in association with content received in a non real time in a real time broadcasting service environment according to another embodiment of the present invention.
- the receiving apparatus includes a configuration for receiving stereo information transmitted through PES private data and playing a stereoscopic video.
- Components not described among the components related to FIG. 12 perform the same functions as in FIG. 11.
- the demultiplexer 1210 includes a PSI / PSIP decoder 1212 and a PES parser 1214, and does not include a separate PES parser; that is, whereas the embodiment of FIG. 11 provides a separate PES parser for parsing the synchronization information stream, this embodiment does not.
- the synchronization information can be extracted by the PES parser 1214, which generates the reference video stream, by analyzing the private data in the header of the PES packet. The extracted synchronization information is transmitted to the file / stream parser 1254.
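Carrying synchronization data in the PES header relies on the 16-byte PES_private_data field of the MPEG-2 systems layer. The sketch below reads it under a simplifying assumption: the header carries only the PES extension, with no PTS/DTS or other optional fields preceding it (a real parser must walk those fields first):

```python
def extract_pes_private_data(pes: bytes):
    """Return the 16-byte PES_private_data field, if present.

    Present only when PES_extension_flag (bit 0 of the flags byte)
    and, inside the extension, PES_private_data_flag (bit 7) are set.
    Assumes no other optional header fields precede the extension.
    """
    assert pes[:3] == b'\x00\x00\x01'      # PES start code prefix
    flags = pes[7]
    if not (flags & 0x01):                 # PES_extension_flag
        return None
    ext_flags = pes[9]                     # first byte of the extension
    if not (ext_flags & 0x80):             # PES_private_data_flag
        return None
    return pes[10:26]
```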
- the file / stream parser 1254 parses the synchronization information and transfers the stream related to the image that matches the reference image to the video decoder 1256.
- the video decoded by the video decoder 1256 is composed into a stereoscopic video and reproduced by the playback unit 1260.
- FIG. 13 is a block diagram illustrating a configuration of a receiving apparatus for providing a 3D service in association with content received in a non real time in a real time broadcasting service environment according to another embodiment of the present invention.
- a receiving apparatus includes a configuration for receiving stereo information transmitted through a stream included in an MPEG-2 video sequence and reproducing a stereoscopic video.
- Components not described among the components related to FIG. 13 perform the same functions as in FIG. 11.
- the demultiplexer 1310 includes a PSI / PSIP decoder 1312 and a PES parser 1314, and does not include a separate PES parser.
- the video decoder 1320 extracts the synchronization information from each MPEG-2 video sequence. The extracted synchronization information is transmitted to the file / stream parser 1354.
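Embedding synchronization information in each MPEG-2 video sequence would place it in user_data segments, which are delimited by the start code 0x000001B2 in the video elementary stream. A minimal scan for those segments (the interpretation of the payload bytes as synchronization information is left abstract):

```python
def find_user_data(es: bytes):
    """Return the payloads of user_data segments (start code 0x000001B2)
    in an MPEG-2 video ES. Each payload runs to the next start code.
    """
    out = []
    i = es.find(b'\x00\x00\x01\xB2')
    while i != -1:
        j = es.find(b'\x00\x00\x01', i + 4)      # next start code ends payload
        out.append(es[i + 4: j if j != -1 else len(es)])
        i = es.find(b'\x00\x00\x01\xB2', i + 4)
    return out
```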
- the file / stream parser 1354 parses the synchronization information and transmits the stream related to the image that matches the reference video to the video decoder 1356.
- the video decoded by the video decoder 1356 is composed into a stereoscopic video and reproduced by the playback unit 1360.
Landscapes
- Engineering & Computer Science (AREA)
- Signal Processing (AREA)
- Multimedia (AREA)
- Library & Information Science (AREA)
- Testing, Inspecting, Measuring Of Stereoscopic Televisions And Televisions (AREA)
Abstract
The present invention relates to a transmission method for providing a 3D service by linking a reference image transmitted in real time with an additional image and content transmitted separately from said reference image. The method comprises: generating a real-time reference image stream, which consists of generating a real-time reference image stream based on a reference image and transmitting the generated stream in real time to a receiver; and transmitting an additional image and content, which consists of transmitting to the receiver, separately from the reference image stream, an additional image and content for providing a 3D service in association with said reference image. The real-time reference image stream includes linkage information, that is, information on the additional image and content to be linked with the reference image, as well as synchronization information for synchronizing the additional image and content with the reference image.
Priority Applications (2)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
EP12819673.0A EP2739043A4 (fr) | 2011-07-29 | 2012-07-27 | Appareil et procédé d'émission et appareil et procédé de réception permettant de fournir un service 3d par le biais d'une liaison avec une image de référence émise en temps réel ainsi qu'avec une image et un contenu supplémentaires émis séparément |
US14/235,490 US20140160238A1 (en) | 2011-07-29 | 2012-07-27 | Transmission apparatus and method, and reception apparatus and method for providing 3d service using the content and additional image seperately transmitted with the reference image transmitted in real time |
Applications Claiming Priority (4)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
KR1020110075823 | 2011-07-29 | ||
KR10-2011-0075823 | 2011-07-29 | ||
KR1020120082635A KR101639358B1 (ko) | 2011-07-29 | 2012-07-27 | 실시간으로 전송되는 기준 영상과 별도로 전송되는 부가 영상 및 콘텐츠를 연동하여 3d 서비스를 제공하기 위한 전송 장치 및 방법, 및 수신 장치 및 방법 |
KR10-2012-0082635 | 2012-07-27 |
Publications (1)
Publication Number | Publication Date |
---|---|
WO2013019042A1 true WO2013019042A1 (fr) | 2013-02-07 |
Family
ID=47894609
Family Applications (1)
Application Number | Title | Priority Date | Filing Date |
---|---|---|---|
PCT/KR2012/006045 WO2013019042A1 (fr) | 2011-07-29 | 2012-07-27 | Appareil et procédé d'émission et appareil et procédé de réception permettant de fournir un service 3d par le biais d'une liaison avec une image de référence émise en temps réel ainsi qu'avec une image et un contenu supplémentaires émis séparément |
Country Status (4)
Country | Link |
---|---|
US (1) | US20140160238A1 (fr) |
EP (1) | EP2739043A4 (fr) |
KR (1) | KR101639358B1 (fr) |
WO (1) | WO2013019042A1 (fr) |
Families Citing this family (81)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
US9594814B2 (en) | 2012-09-07 | 2017-03-14 | Splunk Inc. | Advanced field extractor with modification of an extracted field |
US10394946B2 (en) | 2012-09-07 | 2019-08-27 | Splunk Inc. | Refining extraction rules based on selected text within events |
US8682906B1 (en) | 2013-01-23 | 2014-03-25 | Splunk Inc. | Real time display of data field values based on manual editing of regular expressions |
US9753909B2 (en) | 2012-09-07 | 2017-09-05 | Splunk, Inc. | Advanced field extractor with multiple positive examples |
US8751963B1 (en) | 2013-01-23 | 2014-06-10 | Splunk Inc. | Real time indication of previously extracted data fields for regular expressions |
US20150019537A1 (en) | 2012-09-07 | 2015-01-15 | Splunk Inc. | Generating Reports from Unstructured Data |
US9582585B2 (en) | 2012-09-07 | 2017-02-28 | Splunk Inc. | Discovering fields to filter data returned in response to a search |
US8788525B2 (en) | 2012-09-07 | 2014-07-22 | Splunk Inc. | Data model for machine data for semantic search |
US20140208217A1 (en) | 2013-01-22 | 2014-07-24 | Splunk Inc. | Interface for managing splittable timestamps across event records |
US9152929B2 (en) | 2013-01-23 | 2015-10-06 | Splunk Inc. | Real time display of statistics and values for selected regular expressions |
US9277251B2 (en) * | 2013-03-15 | 2016-03-01 | Echostar Technologies L.L.C. | Geographically independent determination of segment boundaries within a video stream |
US8738629B1 (en) | 2013-05-03 | 2014-05-27 | Splunk Inc. | External Result Provided process for retrieving data stored using a different configuration or protocol |
US9916367B2 (en) | 2013-05-03 | 2018-03-13 | Splunk Inc. | Processing system search requests from multiple data stores with overlapping data |
WO2015050408A1 (fr) * | 2013-10-04 | 2015-04-09 | Samsung Electronics Co., Ltd. | Procédé et appareil permettant de partager et d'afficher des informations d'écriture |
US9838756B2 (en) | 2014-05-20 | 2017-12-05 | Electronics And Telecommunications Research Institute | Method and apparatus for providing three-dimensional territorial broadcasting based on non real time service |
US20160019316A1 (en) | 2014-07-21 | 2016-01-21 | Splunk Inc. | Wizard for creating a correlation search |
US9251221B1 (en) | 2014-07-21 | 2016-02-02 | Splunk Inc. | Assigning scores to objects based on search query results |
US9509765B2 (en) | 2014-07-31 | 2016-11-29 | Splunk Inc. | Asynchronous processing of messages from multiple search peers |
US9047246B1 (en) | 2014-07-31 | 2015-06-02 | Splunk Inc. | High availability scheduler |
US10503698B2 (en) | 2014-07-31 | 2019-12-10 | Splunk Inc. | Configuration replication in a search head cluster |
US9813528B2 (en) | 2014-07-31 | 2017-11-07 | Splunk Inc. | Priority-based processing of messages from multiple servers |
US9128779B1 (en) | 2014-07-31 | 2015-09-08 | Splunk Inc. | Distributed tasks for retrieving supplemental job information |
US10133806B2 (en) | 2014-07-31 | 2018-11-20 | Splunk Inc. | Search result replication in a search head cluster |
US10296616B2 (en) | 2014-07-31 | 2019-05-21 | Splunk Inc. | Generation of a search query to approximate replication of a cluster of events |
US20160092045A1 (en) | 2014-09-30 | 2016-03-31 | Splunk, Inc. | Event View Selector |
US9990423B2 (en) | 2014-09-30 | 2018-06-05 | Splunk Inc. | Hybrid cluster-based data intake and query |
US9935864B2 (en) | 2014-09-30 | 2018-04-03 | Splunk Inc. | Service analyzer interface |
US10235460B2 (en) | 2014-09-30 | 2019-03-19 | Splunk Inc. | Sharing configuration information for searches in data intake and query systems |
US10261673B2 (en) | 2014-10-05 | 2019-04-16 | Splunk Inc. | Statistics value chart interface cell mode drill down |
US11231840B1 (en) | 2014-10-05 | 2022-01-25 | Splunk Inc. | Statistics chart row mode drill down |
US10447555B2 (en) | 2014-10-09 | 2019-10-15 | Splunk Inc. | Aggregate key performance indicator spanning multiple services |
US10417225B2 (en) | 2015-09-18 | 2019-09-17 | Splunk Inc. | Entity detail monitoring console |
US9491059B2 (en) | 2014-10-09 | 2016-11-08 | Splunk Inc. | Topology navigator for IT services |
US10536353B2 (en) | 2014-10-09 | 2020-01-14 | Splunk Inc. | Control interface for dynamic substitution of service monitoring dashboard source data |
US10235638B2 (en) | 2014-10-09 | 2019-03-19 | Splunk Inc. | Adaptive key performance indicator thresholds |
US10505825B1 (en) | 2014-10-09 | 2019-12-10 | Splunk Inc. | Automatic creation of related event groups for IT service monitoring |
US9760240B2 (en) | 2014-10-09 | 2017-09-12 | Splunk Inc. | Graphical user interface for static and adaptive thresholds |
US11455590B2 (en) | 2014-10-09 | 2022-09-27 | Splunk Inc. | Service monitoring adaptation for maintenance downtime |
US11296955B1 (en) | 2014-10-09 | 2022-04-05 | Splunk Inc. | Aggregate key performance indicator spanning multiple services and based on a priority value |
US9864797B2 (en) | 2014-10-09 | 2018-01-09 | Splunk Inc. | Defining a new search based on displayed graph lanes |
US11200130B2 (en) | 2015-09-18 | 2021-12-14 | Splunk Inc. | Automatic entity control in a machine data driven service monitoring system |
US10474680B2 (en) | 2014-10-09 | 2019-11-12 | Splunk Inc. | Automatic entity definitions |
US11501238B2 (en) | 2014-10-09 | 2022-11-15 | Splunk Inc. | Per-entity breakdown of key performance indicators |
US9146962B1 (en) | 2014-10-09 | 2015-09-29 | Splunk, Inc. | Identifying events using informational fields |
US9146954B1 (en) | 2014-10-09 | 2015-09-29 | Splunk, Inc. | Creating entity definition from a search result set |
US9130832B1 (en) | 2014-10-09 | 2015-09-08 | Splunk, Inc. | Creating entity definition from a file |
US11275775B2 (en) | 2014-10-09 | 2022-03-15 | Splunk Inc. | Performing search queries for key performance indicators using an optimized common information model |
US10592093B2 (en) | 2014-10-09 | 2020-03-17 | Splunk Inc. | Anomaly detection |
US11087263B2 (en) | 2014-10-09 | 2021-08-10 | Splunk Inc. | System monitoring with key performance indicators from shared base search of machine data |
US10305758B1 (en) | 2014-10-09 | 2019-05-28 | Splunk Inc. | Service monitoring interface reflecting by-service mode |
US11671312B2 (en) | 2014-10-09 | 2023-06-06 | Splunk Inc. | Service detail monitoring console |
US11755559B1 (en) | 2014-10-09 | 2023-09-12 | Splunk Inc. | Automatic entity control in a machine data driven service monitoring system |
US10417108B2 (en) | 2015-09-18 | 2019-09-17 | Splunk Inc. | Portable control modules in a machine data driven service monitoring system |
US10193775B2 (en) | 2014-10-09 | 2019-01-29 | Splunk Inc. | Automatic event group action interface |
US9158811B1 (en) | 2014-10-09 | 2015-10-13 | Splunk, Inc. | Incident review interface |
US10209956B2 (en) | 2014-10-09 | 2019-02-19 | Splunk Inc. | Automatic event group actions |
US9286413B1 (en) | 2014-10-09 | 2016-03-15 | Splunk Inc. | Presenting a service-monitoring dashboard using key performance indicators derived from machine data |
US9210056B1 (en) | 2014-10-09 | 2015-12-08 | Splunk Inc. | Service monitoring interface |
US9842160B2 (en) | 2015-01-30 | 2017-12-12 | Splunk, Inc. | Defining fields from particular occurences of field labels in events |
US11544248B2 (en) | 2015-01-30 | 2023-01-03 | Splunk Inc. | Selective query loading across query interfaces |
US10013454B2 (en) | 2015-01-30 | 2018-07-03 | Splunk Inc. | Text-based table manipulation of event data |
US9977803B2 (en) | 2015-01-30 | 2018-05-22 | Splunk Inc. | Column-based table manipulation of event data |
US11442924B2 (en) | 2015-01-30 | 2022-09-13 | Splunk Inc. | Selective filtered summary graph |
US9922082B2 (en) | 2015-01-30 | 2018-03-20 | Splunk Inc. | Enforcing dependency between pipelines |
US9916346B2 (en) | 2015-01-30 | 2018-03-13 | Splunk Inc. | Interactive command entry list |
US9922084B2 (en) | 2015-01-30 | 2018-03-20 | Splunk Inc. | Events sets in a visually distinct display format |
US11615073B2 (en) | 2015-01-30 | 2023-03-28 | Splunk Inc. | Supplementing events displayed in a table format |
US10061824B2 (en) | 2015-01-30 | 2018-08-28 | Splunk Inc. | Cell-based table manipulation of event data |
US10915583B2 (en) | 2015-01-30 | 2021-02-09 | Splunk Inc. | Suggested field extraction |
US10726037B2 (en) | 2015-01-30 | 2020-07-28 | Splunk Inc. | Automatic field extraction from filed values |
US9967351B2 (en) | 2015-01-31 | 2018-05-08 | Splunk Inc. | Automated service discovery in I.T. environments |
US10198155B2 (en) | 2015-01-31 | 2019-02-05 | Splunk Inc. | Interface for automated service discovery in I.T. environments |
US9836598B2 (en) | 2015-04-20 | 2017-12-05 | Splunk Inc. | User activity monitoring |
US10726030B2 (en) | 2015-07-31 | 2020-07-28 | Splunk Inc. | Defining event subtypes using examples |
US10942960B2 (en) | 2016-09-26 | 2021-03-09 | Splunk Inc. | Automatic triage model execution in machine data driven monitoring automation apparatus with visualization |
US10942946B2 (en) | 2016-09-26 | 2021-03-09 | Splunk, Inc. | Automatic triage model execution in machine data driven monitoring automation apparatus |
US11106442B1 (en) | 2017-09-23 | 2021-08-31 | Splunk Inc. | Information technology networked entity monitoring with metric selection prior to deployment |
US11093518B1 (en) | 2017-09-23 | 2021-08-17 | Splunk Inc. | Information technology networked entity monitoring with dynamic metric and threshold selection |
US11159397B2 (en) | 2017-09-25 | 2021-10-26 | Splunk Inc. | Lower-tier application deployment for higher-tier system data monitoring |
CN112690002B (zh) * | 2018-09-14 | 2023-06-02 | 华为技术有限公司 | 点云译码中属性层和指示的改进 |
US11676072B1 (en) | 2021-01-29 | 2023-06-13 | Splunk Inc. | Interface for incorporating user feedback into training of clustering model |
Citations (3)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
KR20090066191A (ko) * | 2007-12-18 | 2009-06-23 | 한국전자통신연구원 | 영상정보의 분리 전송을 이용한 3차원 입체방송 송수신장치 및 그 방법 |
KR20100086440A (ko) * | 2009-01-22 | 2010-07-30 | 한국전자통신연구원 | 지상파 디지털멀티미디어방송에서 비실시간 스테레오스코픽 서비스 수행 방법 및 지상파 디지털멀티미디어방송 수신 장치 |
KR20110014821A (ko) * | 2009-08-06 | 2011-02-14 | 한국방송공사 | 입체방송을 위한 계층적 방송 시스템 및 방법 |
Family Cites Families (6)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
KR100972792B1 (ko) | 2008-11-04 | 2010-07-29 | 한국전자통신연구원 | 스테레오스코픽 영상을 동기화하는 장치 및 방법과 이를 이용한 스테레오스코픽 영상 제공 장치 및 방법 |
US8099752B2 (en) * | 2008-12-03 | 2012-01-17 | Sony Corporation | Non-real time services |
EP2211556A1 (fr) * | 2009-01-22 | 2010-07-28 | Electronics and Telecommunications Research Institute | Procédé de traitement de services stéréoscopiques en différé dans une diffusion multimédia numérique terrestre et appareil pour recevoir une diffusion multimédia numérique terrestre |
WO2010117129A2 (fr) * | 2009-04-07 | 2010-10-14 | Lg Electronics Inc. | Emetteur de diffusion, récepteur de diffusion et procédé de traitement de données vidéo 3d de ces émetteur et récepteur |
JP4962525B2 (ja) * | 2009-04-08 | 2012-06-27 | ソニー株式会社 | 再生装置、再生方法、およびプログラム |
KR101372376B1 (ko) * | 2009-07-07 | 2014-03-14 | 경희대학교 산학협력단 | 디지털 방송 시스템의 스테레오스코픽 비디오 수신 방법 |
2012
- 2012-07-27 KR KR1020120082635A patent/KR101639358B1/ko active IP Right Grant
- 2012-07-27 US US14/235,490 patent/US20140160238A1/en not_active Abandoned
- 2012-07-27 EP EP12819673.0A patent/EP2739043A4/fr not_active Withdrawn
- 2012-07-27 WO PCT/KR2012/006045 patent/WO2013019042A1/fr active Application Filing
Also Published As
Publication number | Publication date |
---|---|
KR101639358B1 (ko) | 2016-07-13 |
US20140160238A1 (en) | 2014-06-12 |
KR20130014428A (ko) | 2013-02-07 |
EP2739043A4 (fr) | 2015-03-18 |
EP2739043A1 (fr) | 2014-06-04 |
Similar Documents
Publication | Publication Date | Title |
---|---|---|
WO2013019042A1 (fr) | Appareil et procédé d'émission et appareil et procédé de réception permettant de fournir un service 3d par le biais d'une liaison avec une image de référence émise en temps réel ainsi qu'avec une image et un contenu supplémentaires émis séparément | |
WO2012121572A2 (fr) | Procédé et dispositif d'émission pour transmettre un service de radiodiffusion stéréoscopique lié à un programme, et procédé et dispositif de réception pour celui-ci | |
WO2010053246A2 (fr) | Appareil et procédé de synchronisation d’images stéréoscopiques et appareil et procédé d’élaboration d’images stéréoscopiques les utilisant | |
WO2012077982A2 (fr) | Emetteur et récepteur destinés à émettre et recevoir un contenu multimédia, et procédé de reproduction associé | |
WO2012128563A2 (fr) | Dispositif et procédé d'émission/réception de contenu radiodiffusé lié grâce à un réseau hétérogène | |
KR101828639B1 (ko) | 멀티미디어 흐름을 동기화시키기 위한 방법 및 대응하는 장치 | |
WO2010041896A2 (fr) | Système de réception et procédé de traitement de données | |
WO2013154402A1 (fr) | Appareil de réception permettant de recevoir une pluralité de signaux sur différents chemins et procédé de traitement de ses signaux | |
WO2012099359A2 (fr) | Dispositif de réception destiné à recevoir une pluralité de flux de transfert en temps réel, dispositif d'émission conçu pour émettre ces flux et procédé permettant de lire un contenu multimédia | |
WO2013043000A1 (fr) | Appareil et procédé de lecture d'un contenu diffusé dans un système de diffusion | |
WO2010041905A2 (fr) | Système de réception et méthode de traitement de données | |
WO2011108903A2 (fr) | Procédé et appareil pour la transmission et la réception dans la fourniture d'une pluralité de services de diffusion de tv 3d interactive de transport | |
WO2013154397A1 (fr) | Système de transmission et appreil de réception assurant un service hybride, procédé de fourniture du service correspondant | |
WO2013154350A1 (fr) | Appareil récepteur fournissant des services hybrides, et procédé de fourniture de services hybrides associé | |
JP2013179664A (ja) | 放送データ送信方法およびその装置 | |
WO2012121571A2 (fr) | Procédé et dispositif d'émission / réception d'un service de radiodiffusion stéréoscopique en temps différé | |
WO2011049372A2 (fr) | Procédé et appareil de génération d'un flux, procédé et appareil de traitement d'un flux | |
WO2010090439A2 (fr) | Procédé de transmission, appareil de transmission, procédé de réception et appareil de réception pour un service de diffusion multimédia numérique | |
WO2011046271A1 (fr) | Récepteur de diffusion et son procédé de traitement de données vidéo 3d | |
WO2012144857A2 (fr) | Récepteur permettant de recevoir et d'afficher une pluralité de flux par l'intermédiaire de routes séparées, procédé permettant de traiter la pluralité de flux et procédé de transmission associé | |
WO2011159093A2 (fr) | Mécanisme de fourniture hybride dans un système de transmission multimédia | |
WO2012023787A2 (fr) | Récepteur numérique et procédé de traitement de contenu dans un récepteur numérique | |
WO2012070716A1 (fr) | Procédé pour la transmission du type de compatibilité de service dans une diffusion numérique | |
WO2017171391A1 (fr) | Procédé et appareil d'émission et de réception de signaux de diffusion | |
WO2014010830A1 (fr) | Procédé et appareil de transmission et de réception de paquets dans un service de transmission hybride de mmt |
Legal Events
Date | Code | Title | Description |
---|---|---|---|
121 | Ep: the epo has been informed by wipo that ep was designated in this application |
Ref document number: 12819673 Country of ref document: EP Kind code of ref document: A1 |
|
WWE | Wipo information: entry into national phase |
Ref document number: 14235490 Country of ref document: US |
|
NENP | Non-entry into the national phase |
Ref country code: DE |
|
REEP | Request for entry into the european phase |
Ref document number: 2012819673 Country of ref document: EP |
|
WWE | Wipo information: entry into national phase |
Ref document number: 2012819673 Country of ref document: EP |