KR20130116154A - Receiving device for a plurality of signals through different paths and method for processing the signals thereof - Google Patents


Info

Publication number
KR20130116154A
KR20130116154A
Authority
KR
South Korea
Prior art keywords
signal
synchronization
information
type information
synchronization information
Prior art date
Application number
KR1020120116023A
Other languages
Korean (ko)
Inventor
박홍석
윤국진
이재준
이진영
임현정
장용석
정원식
주유성
허남호
황승오
Original Assignee
삼성전자주식회사
한국전자통신연구원
Priority date
Filing date
Publication date
Application filed by 삼성전자주식회사, 한국전자통신연구원 filed Critical 삼성전자주식회사
Priority to US13/861,068 priority Critical patent/US20130276046A1/en
Priority to PCT/KR2013/003137 priority patent/WO2013154402A1/en
Publication of KR20130116154A publication Critical patent/KR20130116154A/en

Classifications

    • HELECTRICITY
    • H04ELECTRIC COMMUNICATION TECHNIQUE
    • H04NPICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N21/00Selective content distribution, e.g. interactive television or video on demand [VOD]
    • H04N21/20Servers specifically adapted for the distribution of content, e.g. VOD servers; Operations thereof
    • H04N21/23Processing of content or additional data; Elementary server operations; Server middleware
    • H04N21/238Interfacing the downstream path of the transmission network, e.g. adapting the transmission rate of a video stream to network bandwidth; Processing of multiplex streams
    • H04N21/2389Multiplex stream processing, e.g. multiplex stream encrypting
    • H04N21/23892Multiplex stream processing, e.g. multiplex stream encrypting involving embedding information at multiplex stream level, e.g. embedding a watermark at packet level
    • HELECTRICITY
    • H04ELECTRIC COMMUNICATION TECHNIQUE
    • H04NPICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N21/00Selective content distribution, e.g. interactive television or video on demand [VOD]
    • H04N21/40Client devices specifically adapted for the reception of or interaction with content, e.g. set-top-box [STB]; Operations thereof
    • H04N21/43Processing of content or additional data, e.g. demultiplexing additional data from a digital video stream; Elementary client operations, e.g. monitoring of home network or synchronising decoder's clock; Client middleware
    • H04N21/4302Content synchronisation processes, e.g. decoder synchronisation
    • HELECTRICITY
    • H04ELECTRIC COMMUNICATION TECHNIQUE
    • H04NPICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N21/00Selective content distribution, e.g. interactive television or video on demand [VOD]
    • H04N21/40Client devices specifically adapted for the reception of or interaction with content, e.g. set-top-box [STB]; Operations thereof
    • H04N21/43Processing of content or additional data, e.g. demultiplexing additional data from a digital video stream; Elementary client operations, e.g. monitoring of home network or synchronising decoder's clock; Client middleware
    • H04N21/434Disassembling of a multiplex stream, e.g. demultiplexing audio and video streams, extraction of additional data from a video stream; Remultiplexing of multiplex streams; Extraction or processing of SI; Disassembling of packetised elementary stream
    • H04N21/4343Extraction or processing of packetized elementary streams [PES]

Landscapes

  • Engineering & Computer Science (AREA)
  • Multimedia (AREA)
  • Signal Processing (AREA)
  • Two-Way Televisions, Distribution Of Moving Picture Or The Like (AREA)

Abstract

A receiving device is disclosed. The device includes a first receiver that receives a first signal through a radio frequency (RF) broadcast network, a second receiver that receives a second signal through an Internet Protocol (IP) communication network, and a signal processor that detects synchronization type information from at least one of the first and second signals, selects synchronization information corresponding to the synchronization type information from among the synchronization information included in the first and second signals, and synchronizes the first and second signals according to the selected synchronization information. Accordingly, signals received through different paths can be synchronized and output.

Description

RECEIVING DEVICE FOR A PLURALITY OF SIGNALS THROUGH DIFFERENT PATHS AND METHOD FOR PROCESSING THE SIGNALS THEREOF

The present invention relates to a receiving apparatus and a signal processing method thereof, and more particularly, to a receiving apparatus and a signal processing method for synchronizing a plurality of signals received through different paths.

With the development of electronic technology, various types of electronic devices have been developed and spread. Representative examples of these electronic devices include a receiving device such as a TV.

Recently, as TV performance has improved, even multimedia content such as 3D content or full-HD content is being serviced. This type of content has a larger data size than existing content.

However, the transmission bandwidth available in a broadcasting network is limited, and so is the size of the content that can be transmitted over the current broadcasting network. To fit within this constraint, the resolution must inevitably be reduced, with the problem that image quality deteriorates accordingly.

In order to solve this problem, there have been attempts to provide various types of media data through various transmission environments. However, since these data are transmitted through different paths, it is not known whether the data is related to each other on the receiving device side, and thus there is a problem in that they cannot be properly synchronized.

Therefore, there is a need for a method that can properly synchronize these contents.

The present invention addresses the above-described need, and an object of the present invention is to provide a receiving apparatus and a signal processing method that receive a plurality of signals transmitted through different networks and synchronize and output them.

The reception device according to an embodiment of the present invention for achieving the above object includes: a first receiver that receives a first signal through a radio frequency (RF) broadcast network; a second receiver that receives a second signal through an Internet Protocol (IP) communication network; and a signal processor configured to detect synchronization type information from at least one of the first signal and the second signal, select synchronization information corresponding to the synchronization type information from among the synchronization information included in the first signal and the second signal, and synchronize the first signal and the second signal according to the selected synchronization information.

The signal processor may detect the synchronization type information from a reserved area or a descriptor area in at least one of the first signal and the second signal.

Alternatively, the signal processor may detect the synchronization type information from a reserved or descriptor region of a Program and System Information Protocol Virtual Channel Table (PSIP VCT) or an Event Information Table (EIT) of at least one of the first signal and the second signal.

Alternatively, the signal processor may detect the synchronization type information from a private stream or a metadata stream included in at least one of the first signal and the second signal.

The signal processor may include: a first demux that demuxes the first signal to detect first video data; a second demux that, when the second signal is in transport stream form, demuxes the second signal to detect second video data; a file parser that, when the second signal is in file format, parses the second signal to detect the second video data; a first decoder that decodes the first video data demuxed by the first demux; a second decoder that decodes the second video data detected by the second demux or the file parser; and a rendering unit that combines the first video data decoded by the first decoder and the second video data decoded by the second decoder and performs rendering.

Here, at least one of the first demux, the second demux, the file parser, the first decoder, the second decoder, and the rendering unit may selectively operate according to the value of the synchronization type information to detect the synchronization information.

Alternatively, the signal processor may include: a first demux that demuxes the first signal to detect first video data; a second demux that, when the second signal is in transport stream form, demuxes the second signal to detect second video data; a file parser that, when the second signal is in file format, parses the second signal to detect the second video data; a first decoder that decodes the first video data demuxed by the first demux; a second decoder that decodes the second video data detected by the second demux or the file parser; a rendering unit that combines the first video data decoded by the first decoder and the second video data decoded by the second decoder and performs rendering; and a controller configured to detect the synchronization type information from at least one of the first signal and the second signal and to control at least one of the first demux, the second demux, the file parser, the first decoder, the second decoder, and the rendering unit to detect the synchronization information and perform synchronization.

Here, when the synchronization type information designates a type that uses a time code or frame number recorded at the PES level as the synchronization information, the controller may control the first demux and the second demux, or the first demux and the file parser, to detect video data having the same time code or frame number. When the synchronization type information designates a type that uses a time code or frame number recorded at the ES level as the synchronization information, the controller may control the first and second decoders to decode the first and second signals, respectively, and output video data having the same time code or frame number. When the synchronization type information designates a type that uses a time code or frame number recorded at the video level as the synchronization information, the controller may control the rendering unit to detect and render video data having the same time code or frame number from the decoded first signal and the decoded second signal, respectively.
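The level-dependent control described above can be sketched as a simple dispatch. This is an illustrative sketch only; the level labels and component names are assumptions for illustration, not elements of the disclosed apparatus:

```python
# Sketch of the controller's level-dependent dispatch described above.
# The level labels and component names are illustrative assumptions.
SYNC_LEVELS = {
    "PES": ("first_demux", "second_demux_or_file_parser"),
    "ES": ("first_decoder", "second_decoder"),
    "video": ("rendering_unit",),
}

def components_to_control(sync_level: str) -> tuple:
    """Return the components the controller drives to match time codes
    or frame numbers at the given synchronization level."""
    try:
        return SYNC_LEVELS[sync_level]
    except KeyError:
        raise ValueError(f"unknown synchronization level: {sync_level}")
```

For instance, ES-level synchronization would be carried out by the two decoders, while video-level synchronization falls to the rendering unit after decoding.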

The synchronization type information may have one of: a first value designating a video level watermark timecode as the synchronization information; a second value designating a video level watermark frame number as the synchronization information; a third value designating an ES level SMPTE timecode as the synchronization information; a fourth value designating a PES level SMPTE timecode as the synchronization information; a fifth value designating a PES level frame number as the synchronization information; a sixth value designating a PES level counterpart PTS as the synchronization information; and a seventh value designating a video level watermark counterpart PTS as the synchronization information.
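The seven type values can be encoded as an enumeration. In the sketch below, 0x01 through 0x05 follow the bit values later stated for FIG. 2; the 0x06 and 0x07 codes for the counterpart-PTS types are assumed continuations and are not specified in the text:

```python
from enum import IntEnum

class MediaSyncPairType(IntEnum):
    """Illustrative encoding of the synchronization type values.
    0x01-0x05 follow the values given for FIG. 2; 0x06 and 0x07
    are assumed continuations for the counterpart-PTS types."""
    VIDEO_WM_TIMECODE = 0x01         # video level watermark timecode
    VIDEO_WM_FRAME_NUMBER = 0x02     # video level watermark frame number
    ES_SMPTE_TIMECODE = 0x03         # ES level SMPTE timecode
    PES_SMPTE_TIMECODE = 0x04        # PES level SMPTE timecode
    PES_FRAME_NUMBER = 0x05          # PES level frame number
    PES_COUNTERPART_PTS = 0x06       # PES level counterpart PTS (assumed value)
    VIDEO_WM_COUNTERPART_PTS = 0x07  # video level watermark counterpart PTS (assumed value)
```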

The first signal may include one of left eye image data and right eye image data of 3D content, and the second signal may include another one of left eye image data and right eye image data of the 3D content.

Alternatively, one of the first signal and the second signal may include 2D content data, and the other may include at least one of multilingual audio data, multilingual subtitle data, UHD broadcast data, depth map data, and other-viewpoint view data corresponding to the 2D content data.

Meanwhile, a signal processing method of a receiving device according to an embodiment of the present invention includes: receiving a first signal and a second signal through an RF broadcast network and an IP communication network, respectively; detecting synchronization type information from at least one of the first signal and the second signal; selecting synchronization information corresponding to the synchronization type information from among the synchronization information included in the first signal and the second signal; and synchronizing the first signal and the second signal according to the selected synchronization information.

The synchronization type information may be information recorded in a reserved area or a descriptor area in at least one of the first signal and the second signal.

Alternatively, the synchronization type information may be information recorded in a reserved or descriptor region of a Program and System Information Protocol Virtual Channel Table (PSIP VCT) or an Event Information Table (EIT) of at least one of the first signal and the second signal.

Alternatively, the synchronization type information may be recorded in a private stream or a metadata stream included in at least one of the first signal and the second signal.

The synchronization type information may have one of: a first value designating a video level watermark timecode as the synchronization information; a second value designating a video level watermark frame number as the synchronization information; a third value designating an ES level SMPTE timecode as the synchronization information; a fourth value designating a PES level SMPTE timecode as the synchronization information; a fifth value designating a PES level frame number as the synchronization information; a sixth value designating a PES level counterpart PTS as the synchronization information; and a seventh value designating a video level watermark counterpart PTS as the synchronization information.

As described above, according to various embodiments of the present disclosure, data included in a plurality of signals received through a plurality of different communication networks can be synchronized using synchronization type information that designates the appropriate synchronization information. Accordingly, synchronization information appropriate to the situation can be used.

1 is a block diagram showing the configuration of a transmission and reception system according to an embodiment of the present invention;
2 is a table showing examples of synchronization type information;
3 to 10 are diagrams illustrating a method of transmitting synchronization type information and a structure of synchronization type information according to various embodiments of the present disclosure;
11 is a view for explaining the configuration and operation of a transmission system according to an embodiment of the present invention;
12 is a view for explaining the configuration and operation of a receiving apparatus according to an embodiment of the present invention;
13 is a diagram illustrating an example of a structure in which a time code is recorded in a GOP header;
14 is a view for explaining a method of transmitting a time code using Supplemental enhancement information (SEI) defined in AVC (Advanced Video Coding: ISO / IEC 14496-10);
15 is a diagram illustrating an example of synchronization type information in the form of a watermark carried in an ES payload of a first signal transmitted through an RF broadcasting network;
16 is a diagram illustrating an example of synchronization type information in the form of a watermark carried in an ES payload of a second signal transmitted through an IP communication network;
17 is a diagram illustrating an example of synchronization type information carried on a second signal based on an MP4 file format;
18 is a view for explaining the configuration and operation of a transmission system according to another embodiment of the present invention;
19 is a view for explaining the configuration and operation of a receiving apparatus according to another embodiment of the present invention;
20 is a diagram illustrating an example of PES level synchronization type information of a first signal transmitted through an RF broadcast network;
21 is a diagram illustrating an example of PES level synchronization type information of a second signal transmitted through an IP communication network;
22 is a diagram illustrating an example of PES level synchronization type information carried on a second signal based on an MP4 file format;
23 is a view for explaining the configuration and operation of a transmission system according to another embodiment of the present invention;
24 is a view for explaining the configuration and operation of a receiving apparatus according to another embodiment of the present invention;
25 is a table showing synchronization type information to which a value for designating a PES level counterpart PTS as synchronization information is added;
26 is an example of a signal format including a PES level counterpart PTS,
27 is a table showing synchronization type information with a value specifying a video level watermark counterpart PTS as synchronization information;
28 is a diagram for describing a method of transmitting synchronization type information through a private or metadata stream of a first signal transmitted through an RF broadcast network;
29 is a view for explaining a method of transmitting synchronization type information through a private or metadata stream of a second signal transmitted through an IP communication network;
30 is a view for explaining a method of transmitting synchronization type information through a private or metadata stream of a second signal based on an MP4 file format;
31 is a table for explaining syntax of synchronization type information;
32 is a diagram illustrating still another example of the signal processing unit configuration, and
33 is a flowchart illustrating a signal processing method of a receiving apparatus according to an embodiment of the present invention.

Hereinafter, the present invention will be described in detail with reference to the accompanying drawings.

1 is a block diagram showing the configuration of a transmission and reception system according to an embodiment of the present invention. According to FIG. 1, a transmission / reception system includes a plurality of transmission devices 1 and 2 (100-1 and 100-2) and a reception device 200.

The transmitting apparatuses 1 and 2 (100-1, 100-2) transmit different signals through different paths. For example, the transmitting apparatus 1 (100-1) may transmit a first signal through an RF broadcast network, and the transmitting apparatus 2 (100-2) may transmit a second signal through an IP communication network.

The first signal and the second signal each include different data constituting one multimedia content. For example, in the case of 3D content, the left eye image and the right eye image may be included in the first signal and the second signal, respectively. Alternatively, one content may be divided into video data and audio data, or into video data, subtitle data, and other data, which are then included in the first signal and the second signal, respectively.

The first signal includes first synchronization information together with the first data, and the second signal includes second synchronization information together with the second data.

Various information may be used as the first and second synchronization information. Specifically, the synchronization information may be an SMPTE time code, a frame number, or various other information according to its form, and may be classified into a video level, an ES level, a PES level, and the like according to the method by which it is provided.

The transmitting apparatuses 1 and 2 may provide synchronization information in various forms and by various methods according to the type of content, transmission environment, encoding method, content size, and the like. As such, a discriminator is needed that can distinguish among the synchronization information provided in these various forms and methods.

The synchronization type information serves as this discriminator. That is, the synchronization type information is a value that designates which information, among the various pieces of information included in at least one of the first signal and the second signal, is to be used as the synchronization information. The synchronization type information may also be referred to as a pair type or a media synchronization pair type.

The reception device 200 checks the synchronization type information transmitted from at least one of the transmission devices 1 and 2, and detects information designated as the synchronization information from the first signal and the second signal. The receiving device 200 compares the detected synchronization information and synchronizes and outputs the first signal and the second signal.

According to FIG. 1, the receiver 200 includes a first receiver 210, a second receiver 220, and a signal processor 230.

The first receiver 210 receives a first signal through a radio frequency broadcast network. The second receiver 220 receives a second signal through an IP protocol network.

The second signal may be transmitted in the form of a real time transport stream or in the form of an MP4 file. When implemented in the form of a real-time transport stream, the second signal may be transmitted and received using a protocol such as RTP or HTTP. In the case of using HTTP, a metadata file must be provided to obtain a second signal.

The metadata indicates where the multimedia content can be received. The metadata file may include information the client needs to know in advance, such as the position on the content timeline corresponding to each of the plural separate files, the URL of the source providing each file, and its size. Metadata files are classified in various ways depending on the type of HTTP-based streaming. That is, in the smooth streaming method, an ism (Internet Information Service (IIS) Smooth Streaming Media) file is used as the metadata file. In Internet Engineering Task Force (IETF) HTTP Live Streaming, an m3u8 file is used as the metadata file. In adaptive HTTP streaming Release 9 adopted by 3GPP, adaptive HTTP streaming Release 2 adopted by OIPF, and the dynamic adaptive streaming over HTTP scheme adopted by MPEG, a Media Presentation Description (MPD) may be used as the metadata file. The second receiver 220 may detect, from the first signal, address information of a source from which the metadata file can be obtained, access the source to obtain the metadata file, and then receive the second signal using the metadata file. Alternatively, according to another embodiment, the metadata file may be included directly in the first signal. In this case, the second receiver 220 may obtain the metadata file directly from the first signal and receive the second signal.
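The correspondence between streaming scheme and metadata file format described above can be summarized as a plain lookup. The scheme keys below are illustrative names chosen for this sketch:

```python
# Metadata file format used by each HTTP-based streaming scheme,
# as described above. The scheme keys are illustrative names.
METADATA_FORMAT = {
    "smooth_streaming": "ism",   # IIS Smooth Streaming Media
    "hls": "m3u8",               # IETF HTTP Live Streaming
    "3gpp_ahs_rel9": "MPD",      # 3GPP adaptive HTTP streaming Release 9
    "oipf_ahs_rel2": "MPD",      # OIPF adaptive HTTP streaming Release 2
    "mpeg_dash": "MPD",          # MPEG dynamic adaptive streaming over HTTP
}

def metadata_format(scheme: str) -> str:
    """Return the metadata file format for a given streaming scheme."""
    return METADATA_FORMAT[scheme]
```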

The signal processor 230 detects synchronization type information from at least one of the first signal and the second signal. The signal processor 230 selects synchronization information corresponding to the synchronization type information from the synchronization information included in the first signal and the second signal. The first signal and the second signal are synchronized and output according to the selected synchronization information.

2 is a table for explaining various examples of synchronization type information. According to FIG. 2, the synchronization type information includes a first value for designating a video level watermark timecode as synchronization information, a second value for designating a video level watermark frame number as synchronization information, and an ES level SMPTE timecode as synchronization information. A third value to designate, a fourth value to designate a PES level SMPTE timecode as synchronization information, and a fifth value to designate a PES level frame number as synchronization information. The first to fifth values may be represented by bit values such as 0x01, 0x02, 0x03, 0x04, 0x05, and the like.

In this case, the video level means that synchronization information is carried in the video data payload region of the PES packet. The ES level means that synchronization information is carried in the video data ES header area of the PES packet. The PES level means that synchronization information is carried in a payload region of a data packet separately provided in a PES packet.

Meanwhile, the time code is a series of pulse signals produced by a time code generator, and is a signal standard developed for easy editing management. When creating and editing content, the same time code is used for synchronized management of the left and right eye images. Therefore, the time code maintains the same pairing regardless of when the stream is generated or delivered. Specifically, the SMPTE (Society of Motion Picture and Television Engineers) time code may be used. That is, in SMPTE 12M, the time code is expressed in the form "hour:minute:second:frame". The SMPTE time code may be classified into longitudinal time code (LTC) or vertical interval time code (VITC) according to the recording method. LTC consists of a total of 80 bits of data, including time information (25 bits), user information (32 bits), synchronization information (16 bits), a storage area (4 bits), and a frame mode display (2 bits). VITC is recorded on two horizontal lines within the vertical blanking interval of the video signal. SMPTE RP-188 defines an interface specification that allows LTC- or VITC-type timecode to be transmitted as ancillary data.
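The SMPTE 12M "hour:minute:second:frame" form described above can be parsed and converted to an absolute frame index for comparison. A minimal sketch, assuming a fixed, non-drop-frame rate:

```python
def parse_smpte(timecode: str) -> tuple:
    """Parse an SMPTE 12M 'hour:minute:second:frame' time code
    into its four integer fields."""
    h, m, s, f = (int(part) for part in timecode.split(":"))
    return h, m, s, f

def smpte_to_frame_index(timecode: str, fps: int = 30) -> int:
    """Convert a non-drop-frame time code to an absolute frame index,
    assuming a constant frame rate `fps`."""
    h, m, s, f = parse_smpte(timecode)
    return ((h * 60 + m) * 60 + s) * fps + f
```

For example, at 30 frames per second, "01:00:00:02" maps to frame index 108002.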

The frame number means identification information such as a number assigned to each frame. The frame number may be recorded in an event information table (EIT), a PMT, a private stream, a transport stream header, or the like of the first signal or the second signal.

As described above, the receiving apparatus 200 detects a time code or a frame number according to the synchronization type information, and performs synchronization based on the detected information. A method of synchronizing using a time code or a frame number will be described in detail later.
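The matching step — detect the time code or frame number in each signal and pair frames whose markers are equal — can be sketched as follows. This is a simplified illustration under the assumption that each frame carries one marker; the actual apparatus performs this at the demux, decoder, or renderer depending on the synchronization type:

```python
def pair_frames(first, second):
    """Pair frames from two streams whose synchronization markers
    (time code or frame number) are equal. Each input is a list of
    (marker, frame_payload) tuples; frames without a counterpart
    in the other stream are dropped."""
    index = {marker: payload for marker, payload in second}
    return [(payload, index[marker])
            for marker, payload in first
            if marker in index]
```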

Meanwhile, the synchronization type information specifying the type of synchronization information may be transmitted by a first method using a reserved area or descriptor area of a program map table (PMT), a second method using a reserved or descriptor region of a PSIP virtual channel table (VCT) or an event information table (EIT), or a third method using a private stream or a metadata stream.

Hereinafter, various examples of a method of transmitting synchronization type information will be described.

FIG. 3 is a diagram for describing the first method, which transmits the synchronization type information using a PMT reserved area or descriptor area. FIG. 3(a) shows the syntax of the program map table. According to FIG. 3(a), a reserved area and a descriptor are defined in the program map table, and the synchronization type information may be recorded in either. FIG. 3(b) shows the syntax of a new descriptor defined for the synchronization type information. According to FIG. 3(b), 24-bit synchronization type information may be recorded in the new descriptor.
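A descriptor carrying a 24-bit field as described for FIG. 3(b) could be serialized and parsed as sketched below. The descriptor tag value 0xA0 is a hypothetical choice for illustration; the actual tag would be defined by the new descriptor's syntax:

```python
import struct

DESCRIPTOR_TAG = 0xA0  # hypothetical tag for the new descriptor

def pack_sync_type_descriptor(sync_type: int) -> bytes:
    """Build a descriptor carrying 24-bit synchronization type
    information: tag (8 bits), length (8 bits), payload (24 bits)."""
    payload = sync_type.to_bytes(3, "big")
    return struct.pack("BB", DESCRIPTOR_TAG, len(payload)) + payload

def unpack_sync_type_descriptor(data: bytes) -> int:
    """Recover the 24-bit synchronization type value from the descriptor."""
    tag, length = data[0], data[1]
    if tag != DESCRIPTOR_TAG or length != 3:
        raise ValueError("not a synchronization type descriptor")
    return int.from_bytes(data[2:2 + length], "big")
```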

FIG. 4 is a diagram for describing a second method of transmitting using a reserved or descriptor area of a PSIP VCT (Program and System Information Protocol Virtual Channel Table).

FIG. 4(a) shows the bit stream syntax of a VCT providing channel or event information in ATSC A/65. According to FIG. 4(a), synchronization type information may be recorded in a reserved area or descriptor area in the VCT. FIG. 4(b) shows the syntax of the synchronization type information recorded in the descriptor area.

FIG. 5 is a diagram for describing the second method applied to an event information table (EIT) reserved or descriptor region. FIG. 5(a) shows the bit stream syntax of the EIT. According to FIG. 5(a), synchronization type information may be recorded in a reserved area or descriptor area in the EIT. FIG. 5(b) shows the syntax of the synchronization type information recorded in the descriptor area.

FIG. 6 is a diagram for describing the third method, which transmits the synchronization type information using a private stream or a metadata stream. When implemented in the third manner, the synchronization type information may be provided using the structure of a private stream or metadata stream intended for additional information or additional data. In the PMT, a private stream is defined by stream_type = 0x05 or 0x06, and a metadata stream by stream_type = 0x14, 0x15, or 0x16. FIGS. 6(a) to 6(d) show that, in addition to the PES video packet (b) and the audio packet (c), the synchronization type information is provided through a private stream or metadata stream defined with stream_type = 0x15 or 0x05.
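A receiver scanning the PMT would classify elementary streams by these stream_type codes before looking for the synchronization type information. A minimal sketch of that classification:

```python
# stream_type codes as described above for the third method.
PRIVATE_STREAM_TYPES = {0x05, 0x06}
METADATA_STREAM_TYPES = {0x14, 0x15, 0x16}

def stream_kind(stream_type: int) -> str:
    """Classify a PMT stream_type as private, metadata, or other."""
    if stream_type in PRIVATE_STREAM_TYPES:
        return "private"
    if stream_type in METADATA_STREAM_TYPES:
        return "metadata"
    return "other"
```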

FIG. 7 is a table for explaining the syntax of the synchronization type information provided through a stream as shown in FIG. 6. According to FIG. 7, the synchronization type information (media_synchronization_pair_type) consists of 8 bits.

As described above, the synchronization type information may be transmitted in various ways. Hereinafter, the providing form and the providing method of the synchronization information according to the synchronization type information will be described in detail.

First, FIGS. 8 to 12 show a configuration of synchronization information transmitted at a video level and a configuration of a transmission / reception system using the synchronization information.

According to the table shown in FIG. 2 described above, if the synchronization type information has a value of 0x01 or 0x02, the synchronization information is provided at the video level. That is, the synchronization information is inserted frame by frame through a video watermarking method. Watermarking methods are divided into embedding in the spatial domain and embedding in the frequency domain. Specifically, the synchronization information may be inserted into the least significant bits (LSBs) of the image in the spatial domain, or inserted after transforming the image into the frequency domain.
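The spatial-domain LSB variant mentioned above can be sketched with plain integer pixel samples. This is a toy illustration of the principle (fixed bit width, first pixels of the frame), not the watermarking scheme actually used by the apparatus:

```python
def embed_lsb(pixels, value, bits=32):
    """Embed `value` into the LSBs of the first `bits` pixel samples
    (spatial-domain watermark, most significant bit first)."""
    out = list(pixels)
    for i in range(bits):
        bit = (value >> (bits - 1 - i)) & 1
        out[i] = (out[i] & ~1) | bit  # overwrite only the LSB
    return out

def extract_lsb(pixels, bits=32):
    """Recover the embedded value from the pixel LSBs."""
    value = 0
    for i in range(bits):
        value = (value << 1) | (pixels[i] & 1)
    return value
```

Because only the least significant bit of each sample changes, each carrier pixel differs from the original by at most 1, keeping the mark visually negligible.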

8 is a diagram illustrating a form of providing a first signal transmitted through an RF broadcast network. According to FIG. 8, pair information is provided in a watermark format in an ES payload region of a video packet carrying actual video data. Pair information may be regarded as information for PTS and DTS correction provided to resolve mismatch of PCR-based PTS and DTS synchronization information between two or more streams.

FIG. 9 is a diagram illustrating a form of providing a second signal transmitted in a transport stream form through an IP communication network. According to FIG. 9, watermark-type synchronization information (pair information) is carried in an ES payload region of a video packet carrying actual video data.

FIG. 10 is a diagram illustrating a form of a second signal transmitted in the MP4 file format. According to FIG. 10, synchronization information in watermark format may be carried in the video data of the mdat region, which contains the actual video content.

As described above, the synchronization information may be stored in various locations, and the transmission apparatuses 1 and 2 (100-1, 100-2) may generate the synchronization type information according to the location where the synchronization information to be used is stored and provide it to the reception apparatus 200.

FIG. 11 shows an example of a configuration of a transmission system including the transmission apparatuses 1 and 2. According to FIG. 11, the transmission system includes a plurality of source devices 300-1 and 300-2, a transmission device 1 100-1, and a transmission device 2 100-2.

The source device 1 300-1 refers to a content server that provides pre-recorded content, and the source device 2 300-2 refers to a live source that provides content in real time. The raw video provided by the source devices 1 and 2 300-1 and 300-2 includes synchronization information in watermark format. FIG. 11 shows a state in which a time code (such as 01.00.00.00, 01.00.00.01, and 01.00.00.02) or frame numbers (such as frames #1, #2, and #3), sequentially assigned to each frame, are included in the form of a watermark.

FIG. 11 shows the case in which the left-eye raw video data is provided from each of the source devices 1 and 2 300-1 and 300-2 to the transmitting device 1 100-1, and the right-eye raw video data is provided to the transmitting device 2 100-2. However, this is only an example; the data of the source device 1 300-1 may be provided to the transmitting device 1 100-1, and the data of the source device 2 300-2 may be provided to the transmitting device 2 100-2.

The transmission device 1 100-1 includes an encoder 110-1, a muxer 120-1, and a modulator 130-1. The encoder 110-1 encodes the raw data using the MPEG2 encoding scheme to generate a video elementary stream (ES), and then provides it to the muxer 120-1. The muxer 120-1 muxes additional data with the video ES to generate an MPEG2-TS (transport stream), and then provides it to the modulator 130-1. The modulator 130-1 performs modulation using the ATSC 8-VSB modulation method and outputs the modulated signal.

The transmission apparatus 2 100-2 includes an encoder 110-2, a muxer 120-2, a file generator 130-2, and a server 140-2. The encoder 110-2 encodes the raw data using the Advanced Video Coding (AVC) scheme to generate a video ES. When the content is transmitted in TS format, the encoder 110-2 may provide the video ES to the muxer 120-2. The muxer 120-2 muxes additional data with the video ES and provides it to the server 140-2.

On the other hand, when transmitting in the MP4 file format, the encoder 110-2 may provide the video ES to the file generator 130-2. The file generator 130-2 converts the video ES into a file format and provides it to the server 140-2.

The server 140-2 stores the video data provided by the muxer 120-2 or the file generator 130-2. When the server 140-2 receives a request for the video data from the receiving apparatus 200, the server 140-2 streams the stored TS or provides the stored file to the receiving apparatus 200 through the IP communication network, according to the request. Although FIG. 11 illustrates a configuration in which 3D video data is requested and provided, the same processing may be performed on 2D video data or audio data instead of 3D video data.

In FIG. 11, the synchronization type information may be inserted into a PMT, a PSIP VCT, an EIT, or the like in the TS or file by the encoders 110-1 and 110-2 or the muxers 120-1 and 120-2, or may be generated as a separate private stream, metadata stream, or the like. The synchronization type information may be provided from the source apparatuses 1 and 2 300-1 and 300-2 and generated by the encoders 110-1 and 110-2 or the muxers 120-1 and 120-2, or a value stored in advance in a separate storage unit (not shown) may be used as it is. Alternatively, the synchronization type information may be generated by a separately provided controller (not shown) and provided to the encoders 110-1 and 110-2 or the muxers 120-1 and 120-2. When such a controller is provided, the controller may selectively determine the synchronization information to be used according to user commands input to the transmitting apparatuses 1 and 2. When the synchronization information is determined, the controller generates the synchronization information and the synchronization type information corresponding to the transmission method, and provides them to the encoders 110-1 and 110-2 or the muxers 120-1 and 120-2.

FIG. 12 is a diagram illustrating an example of a configuration of a receiving apparatus that receives the first signal and the second signal transmitted by the transmission system of FIG. 11. According to FIG. 12, the receiving apparatus includes a first receiver 210, a second receiver 220, and a signal processor 230.

The first receiver 210 receives a first signal through a radio frequency broadcast network. The second receiver 220 receives a second signal through an IP protocol network.

The signal processor 230 detects synchronization type information from at least one of the first signal and the second signal. The signal processor 230 then selects the synchronization information corresponding to the synchronization type information from among the synchronization information included in the first signal and the second signal, and synchronizes the first signal and the second signal according to the selected synchronization information.

For example, if a time code is designated by the synchronization type information, the receiving apparatus 200 detects the time code recorded in each video frame of the first signal and the time code recorded in each video frame of the second signal. The signal processor 230 compares the detected time codes and selects video frames having the same time code from among the video frames of the first signal and the video frames of the second signal. The time stamp difference between the selected video frames may then be checked, and the time stamps may be corrected according to the identified value to achieve synchronization. That is, according to the MPEG standard, a transport stream for transmitting broadcast data includes a PCR (Program Clock Reference) and a PTS (Presentation Time Stamp). The PCR refers to reference time information that allows a receiving apparatus (set-top box or TV) conforming to the MPEG standard to align its time reference with the transmitters 100-1 and 100-2. The receiving apparatus 200 adjusts the value of its system time clock (STC) according to the PCR. The PTS refers to a time stamp indicating a reproduction time for synchronizing video and audio in a broadcast system according to the MPEG standard; in this specification, it is called a time stamp. As shown in FIG. 1, when different signals are transmitted from different transmission devices 100-1 and 100-2, the PCR may differ according to the characteristics of the transmission devices 100-1 and 100-2. Therefore, even if playback is performed according to the time stamps set based on the PCR, synchronization may not be achieved. When the time code is designated as the synchronization information by the synchronization type information, the signal processor 230 may perform synchronization by correcting the time stamps of video frames having the same time code in each signal so that they coincide with each other.
Alternatively, the signal processor 230 may directly synchronize the decoding, scaling, and rendering of video frames based on the time code, without correcting the time stamps.
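The time-stamp correction described above can be sketched as follows. The frame records, field names, and concrete PTS values (in 90 kHz units, as in MPEG-2 systems) are illustrative assumptions, not the disclosed implementation.

```python
def pts_offset_from_timecodes(frames_a, frames_b):
    """Find frames of the two signals sharing a time code and return PTS_a - PTS_b."""
    index_b = {f["timecode"]: f["pts"] for f in frames_b}
    for f in frames_a:
        if f["timecode"] in index_b:
            return f["pts"] - index_b[f["timecode"]]
    raise ValueError("no common time code found")

def correct_timestamps(frames_b, offset):
    """Shift the second signal's PTS values so that matching frames coincide."""
    return [dict(f, pts=f["pts"] + offset) for f in frames_b]

# Two signals carrying the same content with mismatched PCR-based time stamps.
signal1 = [{"timecode": "01:00:00:00", "pts": 900000},
           {"timecode": "01:00:00:01", "pts": 903003}]
signal2 = [{"timecode": "01:00:00:00", "pts": 180000},
           {"timecode": "01:00:00:01", "pts": 183003}]
offset = pts_offset_from_timecodes(signal1, signal2)
corrected = correct_timestamps(signal2, offset)
```

After correction, frames with the same time code carry the same time stamp, so ordinary PTS-driven rendering produces synchronized playback.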

As another example, frame index information such as a frame number may be designated as the synchronization information by the synchronization type information. The frame index information refers to identification information provided for each frame; typically, this may be a frame number. The frame index information may be recorded in an event information table (EIT), a PMT, a private stream, a transport stream header, or the like of a real-time transport stream. The signal processor 230 may correct the time stamps of frames having the same frame index so that they become identical. Accordingly, by processing the video frames of each signal based on the corrected time stamps, synchronization can be achieved naturally.

According to FIG. 12, the signal processor 230 may include a first demux 231, a first decoder 232, a renderer 233, a second demux 234, a file parser 235, and a second decoder 236. The first demux 231, the first decoder 232, the renderer 233, the second demux 234, the file parser 235, the second decoder 236, and the like may selectively operate according to the value of the synchronization type information to detect the synchronization information.

According to FIG. 12, the first signal received by the first receiver 210 is transmitted to the first demux 231 in the MPEG2-TS format. The first demux 231 demuxes the received transport stream and outputs a video ES. The first decoder 232 is implemented with an MPEG2 decoder to decode the video ES. The decoded raw data is provided to the renderer 233.

Meanwhile, the second signal received by the second receiver 220 may be received in MPEG2-TS format or in a file format. Here, MPEG2-TS refers to a transport stream encoded in the MPEG2 scheme and then modulated and transmitted in the ATSC 8-VSB scheme. An MPEG2-TS is passed to the second demux 234. The second demux 234 demuxes the received transport stream to detect the video ES and provides it to the second decoder 236. Meanwhile, a second signal received in a file format is provided to the file parser 235. The file parser 235 parses the received file and provides the parsing result to the second decoder 236. The second decoder 236 decodes the video data provided by the second demux 234 or the file parser 235 in the AVC manner, and provides the decoded video data to the renderer 233.

According to FIG. 12, the decoded data provided by the first decoder 232 and the second decoder 236, that is, the raw data, may include synchronization information such as a time code or a frame number in the form of a watermark. As described above, the renderer 233 detects the watermark-format synchronization information included in each piece of raw data, and then either directly synchronizes the frames based on the synchronization information or corrects the time stamp of each frame so that rendering can be performed according to the corrected time stamps.

Next, synchronization information may be sent at the ES level.

FIGS. 13 to 17 are diagrams illustrating the configuration of synchronization information transmitted at the ES level and the configuration of a transmission/reception system using the synchronization information.

According to the table shown in FIG. 2 described above, if the synchronization type information has a value of 0x03, the synchronization information is provided at the ES level. That is, the synchronization information may be inserted into the ES header area. In more detail, the synchronization information takes the form of an SMPTE time code; in the case of MPEG2, the time code may be provided at the ES header level through a GoP header, and in the case of AVC, through a picture timing SEI.

FIG. 13 shows an example of the syntax structure of a GoP header in an MPEG stream in which a time code is recorded in the GoP header. According to FIG. 13, the time code may be recorded as 25 bits of data. As shown in FIG. 13, the time code may be delivered to the receiving apparatus 200 in GoP units.
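The 25-bit time_code field can be sketched as a pack/unpack pair. The bit layout below (drop_frame_flag, hours, minutes, marker_bit, seconds, pictures) follows the MPEG-2 GoP header definition; the function names are illustrative assumptions.

```python
def pack_gop_timecode(drop, hours, minutes, seconds, pictures):
    """Pack the 25-bit time_code of the MPEG-2 GoP header:
    drop_frame_flag(1) hours(5) minutes(6) marker_bit(1) seconds(6) pictures(6)."""
    value = drop & 1
    value = (value << 5) | (hours & 0x1F)
    value = (value << 6) | (minutes & 0x3F)
    value = (value << 1) | 1              # marker_bit is always '1'
    value = (value << 6) | (seconds & 0x3F)
    value = (value << 6) | (pictures & 0x3F)
    return value

def unpack_gop_timecode(value):
    """Recover the time code fields from the packed 25-bit value."""
    pictures = value & 0x3F
    seconds = (value >> 6) & 0x3F
    minutes = (value >> 13) & 0x3F
    hours = (value >> 19) & 0x1F
    drop = (value >> 24) & 1
    return drop, hours, minutes, seconds, pictures

tc = pack_gop_timecode(0, 1, 0, 0, 2)     # time code 01:00:00:02
```

Because one time_code is carried per GoP header, the receiver recovers a time code only at GoP granularity and counts pictures within the GoP for frame-accurate matching.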

FIG. 14 shows a method of transmitting a time code using supplemental enhancement information (SEI) defined in AVC (Advanced Video Coding: ISO/IEC 14496-10). According to FIG. 14, the time code may be delivered using seconds_value, minutes_value, hours_value, and n_frames defined in the picture timing SEI.
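The mapping between an SMPTE-style time code and these SEI fields can be sketched as follows. The field names follow the picture timing SEI of ISO/IEC 14496-10; the "HH:MM:SS:FF" string form and the function name are assumptions for illustration.

```python
def timecode_to_sei_fields(timecode):
    """Split an 'HH:MM:SS:FF' time code into picture timing SEI fields."""
    hours, minutes, seconds, frames = (int(p) for p in timecode.split(":"))
    return {"n_frames": frames,
            "seconds_value": seconds,
            "minutes_value": minutes,
            "hours_value": hours}

fields = timecode_to_sei_fields("01:00:00:02")
```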

FIG. 15 is a diagram illustrating a form of providing a first signal transmitted through an RF broadcast network. According to FIG. 15, PES packets are transmitted according to the stream types defined in the PMT. Among the PES packets, the synchronization information is included in the ES header of the video packet.

FIG. 16 is a diagram illustrating a form of providing a TS-based second signal transmitted through an IP communication network. According to FIG. 16, the synchronization information may be included at the ES header level in the form of an SMPTE time code and transmitted.

FIG. 17 is a diagram illustrating a form of providing a second signal based on the MP4 file format transmitted through an IP communication network. According to FIG. 17, the synchronization information is included in the form of an SMPTE time code in the ES header portion of the mdat area, which carries the actual video content.

FIG. 18 is a diagram for explaining the configuration and operation of a transmission system that transmits content with synchronization information at the ES level. According to FIG. 18, the transmission system includes a plurality of source devices 300-1 and 300-2, a transmission device 1 100-1, and a transmission device 2 100-2. Since the basic configuration is the same as the embodiment of FIG. 11, the description of overlapping portions will be omitted.

Referring to FIG. 18, an SMPTE time code is included in the VBI section of the raw data transmitted from the source device 1 300-1 implemented as a content server and the source device 2 300-2 implemented as a live source. Accordingly, the MPEG2 encoder 110-1 and the AVC encoder 110-2 included in the transmission apparatuses 1 and 2 100-1 and 100-2 each capture the SMPTE time code and include it in the ES level stream. The muxer 120-1 and the file generator 130-2 then generate a TS or a file including the time code at the ES level, and provide it to the subsequent components, that is, the modulator 130-1 or the server 140-2.

FIG. 19 shows an example of a configuration of a receiving apparatus using synchronization information included at the ES level. Since the basic configuration of the receiving apparatus of FIG. 19 is the same as that of the receiving apparatus of FIG. 12, the description of overlapping portions will be omitted.

According to FIG. 19, the first receiver 210 and the second receiver 220 receive the first signal and the second signal, respectively, and provide them to the signal processor 230. Each received signal contains an ES level SMPTE time code, that is, the synchronization information.

The first decoder 232 and the second decoder 236 in the signal processor 230 extract the SMPTE time codes provided at the ES level and provide them to the renderer 233. The renderer 233 compares the time code of the first signal with the time code of the second signal, and performs synchronization processing so that frames having the same time code are synchronized and output. Since the synchronization processing method has been described above, redundant description is omitted.

FIGS. 20 to 24 are diagrams illustrating the configuration of synchronization information transmitted at the PES level and the configuration of a transmission/reception system using the synchronization information.

According to the table shown in FIG. 2 described above, if the synchronization type information has a value of 0x04 or 0x05, the synchronization information is provided at the PES level. In more detail, the synchronization information may be an SMPTE time code or a frame number. This synchronization information may be transmitted through a private stream or a metadata PES stream having the same reproduction time information, that is, the same time stamp.

FIG. 20 is a diagram illustrating a form of providing a first signal transmitted through an RF broadcast network. According to FIG. 20, the SMPTE time code or frame number serving as the synchronization information is inserted into the payload portion of a separate PES packet other than the video packet or audio packet, and transmitted.

FIG. 21 is a diagram illustrating a form of providing a TS-based second signal transmitted through an IP communication network. According to FIG. 21, a stream type is designated in the PMT, and the SMPTE time code or frame number serving as the synchronization information is inserted into the payload portion of a separate PES packet other than the video packet according to the designated stream type, and transmitted.

FIG. 22 is a diagram illustrating a form of providing an MP4-based second signal transmitted through an IP communication network. According to FIG. 22, a value for calculating the frame number may be provided implicitly using stts (Time-to-Sample Atom), ctts (Composition Time to Sample Atom), and stss (Sync Sample Atom), which provide the existing frame reproduction time information in the moov header. That is, the MP4 file provides, through stts, ctts, and stss, a relative time value with respect to the playback time from the file start position, similar to the PTS and DTS of a TS. However, the frame number does not have a specific time unit, because only the relative order between frames is provided, such as #1 and #2. Therefore, by referring to the relative time values provided by stts, ctts, and the like, it is possible to infer the order of the frames, that is, the frame number.
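The frame-number inference from stts described above can be sketched as follows. The (sample_count, sample_delta) entry format follows the stts box of ISO/IEC 14496-12; the timescale and sample values here are illustrative assumptions.

```python
def frame_times_from_stts(stts_entries, timescale):
    """Expand run-length stts entries into (frame_number, decode_time_seconds) pairs."""
    frames, ticks, number = [], 0, 0
    for sample_count, sample_delta in stts_entries:
        for _ in range(sample_count):
            frames.append((number, ticks / timescale))
            number += 1
            ticks += sample_delta
    return frames

# 30 fps content: 5 frames, each lasting 3000 ticks at a 90000 Hz timescale.
frames = frame_times_from_stts([(5, 3000)], 90000)
```

Since stts only encodes per-sample durations, the frame number is recovered by counting samples in decode order, which is exactly the implicit provision described above.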

Alternatively, the SMPTE time code or frame number may be explicitly provided through a separate box extension. More specifically, an additional box may be defined in the ISO base media file format (14496-12), or a time code may be provided by extending a field in a predefined box. For example, a time code may be provided by extending the "stss (sync sample table)" box that provides random access.

FIG. 23 is a diagram for explaining the configuration and operation of a transmission system that provides synchronization information at the PES level. Since the transmission system of FIG. 23 is basically the same as the transmission systems of FIGS. 11 and 18, the description of overlapping portions is omitted.

Referring to FIG. 23, the source device 1 300-1 implemented as a content server and the source device 2 300-2 implemented as a live source include an SMPTE time code in the VBI section of the raw video data, or include a separate start marker indicating the starting point of a program unit. For example, when the synchronization information is an SMPTE time code, the MPEG2 encoder 110-1 and the AVC encoder 110-2 each capture the SMPTE time code included in the VBI section of the raw video data and include it in the ES level stream. On the other hand, if the synchronization information is a frame number, the file generator 130-2 recognizes the start marker and transfers the generated frame number to the muxer 120-1 in the transmission device 1 100-1. Accordingly, the frame number value may be included in the first signal.

In FIG. 23, the entity that generates and inserts the PES level synchronization information may be determined variously according to whether a transport stream or an MP4 file is used. For example, in the case of a transport stream, either the encoder or the muxer may serve that role, and a separate PES can be generated directly in the encoder. Alternatively, the PES level synchronization information may be generated by extracting the time code, frame number, and the like delivered at the ES level. On the other hand, in the case of MP4, the time code and frame number delivered at the ES level through the encoder can be extracted, and the corresponding information can be inserted by extending a part of the moov portion of the file format in the file generator. As described above, the intermediate processing for generating the stream may be variously combined or configured.

FIG. 24 is a diagram for describing the configuration and operation of a receiving device according to an embodiment in which synchronization information is transmitted at the PES level. According to FIG. 24, the receiving apparatus 200 includes first and second receivers 210 and 220 and a signal processor 230. Among the components of FIG. 24, redundant descriptions of parts identical to those in FIGS. 12 and 19 will be omitted.

The first signal received through the first receiver 210 is provided to the first demux 231. The first demux 231 extracts the synchronization information provided at the PES level, that is, the SMPTE time code or frame number, and provides the extracted information to the renderer 233. The second signal received through the second receiver 220 is provided to the second demux 234. The second demux 234 likewise extracts the synchronization information and provides it to the renderer 233. The renderer 233 synchronizes and outputs the video frames of the first signal and the second signal based on the provided synchronization information.

As described above, the receiving apparatus 200 may determine which synchronization information to use based on the synchronization type information, and perform synchronization accordingly. Therefore, the synchronization can be performed in a manner optimized for the configuration of the system while remaining compatible with the existing broadcasting system.

Meanwhile, although FIG. 2 illustrates five pieces of synchronization type information, further synchronization type information may be added.

FIG. 25 shows synchronization type information including a value of 0x06. According to FIG. 25, synchronization type information for using the PES level counterpart PTS as the synchronization information may also be provided. That is, when the synchronization type information is set to 0x06, the first signal should carry the PTS value of the second signal to be paired with it, and the second signal should carry the PTS value of the first signal to be paired with it. Such a PTS value may be provided through a private stream corresponding to stream type 0x05 or 0x06, or a metadata stream corresponding to stream type 0x14, 0x15, or 0x16, in a form similar to that shown in FIGS. 20 and 21. FIG. 26 shows an example thereof.

FIG. 26 shows that the time stamp information of the second signal (b), that is, the PTS, is stored in the payload region of a private data stream or metadata stream defined as stream type 0x15 or 0x05 in the PMT of the first signal (a). As such, when the time stamp information of the counterpart among the signals constituting one content is carried in the first signal, the receiving apparatus 200 may check the time stamp information, select the video frame of the second signal corresponding to the current video frame of the first signal, and process it. As a result, synchronization can be achieved.
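The counterpart-PTS matching described above can be sketched as follows. The frame records and field names are illustrative assumptions; the 90 kHz PTS values are examples only.

```python
def select_counterpart_frame(current_frame, second_signal_frames):
    """Pick the second-signal frame whose PTS equals the counterpart PTS
    carried with the current first-signal frame."""
    target = current_frame["counterpart_pts"]
    for f in second_signal_frames:
        if f["pts"] == target:
            return f
    return None  # counterpart frame not (yet) available

# First-signal frame carrying the PTS of its pair in the second signal.
first = {"pts": 900000, "counterpart_pts": 183003}
second = [{"pts": 180000}, {"pts": 183003}, {"pts": 186006}]
pair = select_counterpart_frame(first, second)
```

No time-stamp correction is needed in this scheme: each frame directly names the PTS of the frame it must be rendered with.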

On the other hand, when time stamp information is used as synchronization information, the time stamp information can also be recorded at the video level.

FIG. 27 shows synchronization type information specifying time stamp information at the video level. According to FIG. 27, a value of 0x07 is newly defined, and the video level watermark counterpart PTS is designated as the synchronization information corresponding to the newly defined value.

If the synchronization type information is 0x07, the transmission system includes the PTS value of the counterpart stream to be synchronized in the video level of each signal in the form of a watermark.

Meanwhile, according to another embodiment of the present invention, the synchronization type information and the synchronization information may be provided together through a private data stream or a metadata stream.

FIG. 28 is a diagram illustrating a structure for transmitting synchronization type information and synchronization information together in a first signal transmitted through an RF broadcast network. Referring to FIG. 28, the PMT of the first signal indicates that synchronization type information (Pair_type) and synchronization information (Pair information) are included in a payload region of a private data stream or a metadata stream defined as stream type 0x15 or 0x05.

FIG. 29 illustrates that the synchronization type information (Pair_type) and the synchronization information (Pair information) are included in the payload region of the private data stream or the metadata stream in the TS-based second signal transmitted through the IP communication network.

FIG. 30 illustrates a structure in which the synchronization type information (Pair_type) and the synchronization information (Pair information) are simultaneously transmitted through a PES level private data stream or metadata stream in an MP4-based second signal transmitted through an IP communication network.

FIG. 31 is a table illustrating an example of the syntax of the synchronization type information and a description of each part of the syntax. According to FIG. 31, the synchronization type information includes syntax elements such as identifier, media_index_id, and media_synchronization_pair_type. In addition, an SMPTE time code or a frame number is recorded depending on whether the media_synchronization_pair_type is 0x04 or 0x05.

As described above, the synchronization type information is transmitted to the receiving device in various ways, and the receiving device can detect the necessary synchronization information and perform synchronization based on the received information. Although the above-described exemplary embodiments describe detecting the synchronization type information from both the first signal and the second signal, the synchronization type information may be included in only one of the first signal and the second signal. For example, when the synchronization type information is checked from the first signal serving as a reference, the receiving device may detect the synchronization information corresponding to that synchronization type information from the second signal matching the first signal and use it.

Meanwhile, as described above, the receiving apparatus may be implemented in various forms.

FIG. 32 is a block diagram illustrating still another example of the configuration of the signal processor included in the receiving apparatus. Referring to FIG. 32, the signal processor 230 may include a first demux 231, a first decoder 232, a renderer 233, a second demux 234, a file parser 235, a second decoder 236, and a controller 237.

The first demux 231 demuxes the first signal to detect the first video data. When the second signal is in the form of a transport stream, the second demux 234 demuxes the second signal to detect the second video data. On the other hand, when the second signal is in a file format, the file parser 235 parses the second signal to detect the second video data. The controller 237 may process the second signal by selectively driving the second demux 234 or the file parser 235 according to the format of the second signal.

The first decoder 232 decodes the first video data demuxed by the first demux 231, and the second decoder 236 decodes the second video data detected by the second demux 234 or the file parser 235.

The renderer 233 combines the first video data decoded by the first decoder and the second video data decoded by the second decoder to perform rendering.

The controller 237 controls each of these components based on the synchronization type information to process the first signal and the second signal to be synchronized.

That is, the controller 237 detects the synchronization type information from at least one of the first signal and the second signal. As described above, the synchronization type information may be transmitted in various forms according to the transmission scheme. If a component detects the synchronization type information during signal processing, the controller 237 is provided with that information. The controller 237 then controls at least one of the first demux 231, the first decoder 232, the renderer 233, the second demux 234, the file parser 235, and the second decoder 236 so that the synchronization information is detected and synchronization is performed according to the synchronization type information.

Specifically, if the synchronization type information specifies that the time code or frame number recorded at the PES level is to be used as the synchronization information, the controller 237 controls the first demux 231 and the second demux 234, or the first demux 231 and the file parser 235, to detect video data having the same time code or frame number.

Alternatively, if the synchronization type information specifies that the time code or frame number recorded at the ES level is to be used as the synchronization information, the controller 237 may control the first and second decoders 232 and 236 to decode the first and second signals, respectively, and then output video data whose time codes or frame numbers match.

Alternatively, if the synchronization type information specifies that the time code or frame number recorded at the video level is to be used as the synchronization information, the controller 237 may control the renderer 233 to detect video data whose time codes or frame numbers match from the decoded first signal and the decoded second signal, and render them.

As described above, the signal processing unit may be configured in various forms to process the synchronization type information and the synchronization information.

FIG. 33 is a flowchart illustrating a signal processing method of a receiving apparatus according to various embodiments of the present disclosure. Referring to FIG. 33, the receiving apparatus receives a first signal and a second signal through an RF broadcast network and an IP communication network, respectively (S3310). The first signal and the second signal refer to signals into which the data constituting one content is divided. As described above, the first signal and the second signal may be divided in various ways according to the type of content.

When the first signal and the second signal are received, the receiving device detects the synchronization type information from at least one of the received signals (S3320). As described with reference to FIG. 2, the synchronization type information may be set to a first value designating a video level watermark time code as the synchronization information, a second value designating a video level watermark frame number as the synchronization information, a third value designating an ES level SMPTE time code as the synchronization information, a fourth value designating a PES level SMPTE time code as the synchronization information, or a fifth value designating a PES level frame number as the synchronization information. Alternatively, according to an embodiment, as shown in FIG. 25 or FIG. 27, a sixth value designating a PES level counterpart PTS as the synchronization information, or a seventh value designating a video level watermark counterpart PTS as the synchronization information, may also be included.
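The enumerated values can be modeled as a simple lookup table. The 0x03 through 0x07 assignments follow the descriptions above; assigning 0x01 to the time code and 0x02 to the frame number within the video level pair is an assumption here, since the text only states that both values denote the video level.

```python
# Illustrative mapping of media_synchronization_pair_type values;
# 0x01/0x02 ordering within the video level is assumed, not stated.
SYNC_TYPES = {
    0x01: "video-level watermark time code",
    0x02: "video-level watermark frame number",
    0x03: "ES-level SMPTE time code",
    0x04: "PES-level SMPTE time code",
    0x05: "PES-level frame number",
    0x06: "PES-level counterpart PTS",
    0x07: "video-level watermark counterpart PTS",
}

def describe_sync_type(media_synchronization_pair_type):
    """Return a human-readable description of the 8-bit sync type value."""
    return SYNC_TYPES.get(media_synchronization_pair_type, "reserved")
```

A receiving device would use such a table to dispatch to the demux, decoder, or renderer that is responsible for extracting the designated synchronization information.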

When the synchronization type information is confirmed, the receiving apparatus selects the synchronization information corresponding to the synchronization type information from among the synchronization information included in the first signal and the second signal, and synchronizes the first signal and the second signal according to the selected synchronization information (S3330). The receiving device may be implemented in various forms as described above. In addition, synchronization may be performed by correcting and matching time stamps, or by selecting and processing frames based on time codes or frame numbers.

Meanwhile, as described above, in order to include the synchronization type information in the first signal and the second signal, a process of loading the synchronization type information must be performed in the transmission system. Such a process can be sufficiently understood based on the contents described with reference to the various drawings above, and thus its illustration is omitted.

The system described above may be applied to various environments for transmitting and receiving data whose time stamps do not match. That is, in addition to 3D content consisting of a left eye image and a right eye image, it may be used for various types of hybrid services that transmit content separately over a broadcasting network and a communication network.

For example, the present invention may be applied to a data broadcasting service system that transmits 2D broadcasting through a broadcasting network and transmits data such as multilingual audio and multilingual subtitles through a communication network. Alternatively, the present invention may be applied to a UHD broadcasting service system that transmits 2D broadcasting through a broadcasting network and transmits UHD broadcasting data through a communication network. In addition, the present invention may be applied to a multi-view broadcasting service system that transmits 2D broadcasting through a broadcasting network and transmits data such as a depth map or a different-viewpoint view through a communication network, or to a multi-angle service system that transmits 2D broadcasting through a broadcasting network and provides image data of another shooting angle through a communication network.

In addition, in the above examples, 2D broadcasting is illustrated as being transmitted only through a broadcasting network, but this is merely an example for utilizing an existing broadcasting system, and the present invention is not necessarily limited thereto. That is, multilingual audio data, multilingual subtitle data, UHD broadcast data, depth map data, and different-viewpoint view data corresponding to 2D content data may also be transmitted through the broadcasting network.

In addition, in the above various examples, a hybrid system using both an RF broadcasting network and an IP communication network is illustrated. However, the type of communication network may be variously set.

The signal processing method of the transmitting apparatus or the signal processing method of the receiving apparatus according to the various embodiments described above may be coded in software and mounted on various devices.

Specifically, according to an embodiment of the present invention, a non-transitory readable medium may be provided in which a program is stored for performing the steps of: receiving a first signal and a second signal through an RF broadcasting network and an IP communication network, respectively; detecting synchronization type information from at least one of the first signal and the second signal; and selecting synchronization information corresponding to the synchronization type information from among the synchronization information included in the first signal and the second signal, and synchronizing the first signal and the second signal according to the selected synchronization information.
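The receiving, detecting, selecting, and synchronizing steps above can be sketched as follows. This is a toy model under stated assumptions, not the claimed implementation: signals are modeled as plain dictionaries, and synchronization is reduced to computing a time stamp offset; all function names are hypothetical.

```python
def detect_sync_type(signal):
    # The reserved/descriptor area carrying the type value is modeled as a dict key.
    return signal.get("sync_type")

def select_sync_info(signal, sync_type):
    # Select the synchronization information matching the signaled type.
    return signal["sync_info"][sync_type]

def process_signals(first, second):
    # Detect the type from either signal, then select and compare.
    sync_type = detect_sync_type(first) or detect_sync_type(second)
    info_first = select_sync_info(first, sync_type)
    info_second = select_sync_info(second, sync_type)
    # Offset needed to align the second signal's clock to the first signal's.
    return info_first - info_second
```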

A non-transitory readable medium is not a medium that stores data for a short period of time, such as a register, cache, or memory, but a medium that stores data semi-permanently and is readable by a device. Specifically, the various applications or programs described above may be stored in a non-transitory readable medium such as a CD, a DVD, a hard disk, a Blu-ray disc, a USB memory, a memory card, or a ROM.

While the present invention has been particularly shown and described with reference to exemplary embodiments thereof, the invention is not limited to the disclosed exemplary embodiments. It will be understood by those skilled in the art that various changes in form and detail may be made therein without departing from the spirit and scope of the present invention.

210: first receiver 220: second receiver
230: signal processing unit 100-1, 100-2: transmitting apparatus 1, 2
300-1, 300-2: source device 1, 2

Claims (15)

A first receiver configured to receive a first signal through an RF broadcast network;
A second receiver configured to receive a second signal through an IP (Internet Protocol) communication network;
And a signal processor configured to detect synchronization type information from at least one of the first signal and the second signal, select synchronization information corresponding to the synchronization type information from among the synchronization information included in the first signal and the second signal, and synchronize the first signal and the second signal according to the selected synchronization information.
The receiving device of claim 1,
The signal processing unit,
And detecting the synchronization type information from a reserved area or a descriptor area of at least one of the first signal and the second signal.
The receiving device of claim 1,
The signal processing unit,
And detecting the synchronization type information from a reserved or descriptor region of a PSIP VCT (Program and System Information Protocol Virtual Channel Table) or an EIT of at least one of the first signal and the second signal.
The receiving device of claim 1,
The signal processing unit,
And the synchronization type information is detected from a private stream or a metadata stream included in at least one of the first signal and the second signal.
The receiving device of claim 1,
The signal processing unit,
A first demux for demuxing the first signal to detect first video data;
A second demux for detecting second video data by demuxing the second signal when the second signal is in the form of a transport stream;
A file parser for parsing the second signal and detecting second video data when the second signal is a file format;
A first decoder for decoding the first video data demuxed in the first demux;
A second decoder for decoding the second video data detected by the second demux or file parser;
And a rendering unit which combines the first video data decoded by the first decoder and the second video data decoded by the second decoder to perform rendering.
Wherein at least one of the first demux, the second demux, the file parser, the first decoder, the second decoder, and the rendering unit selectively operates according to the value of the synchronization type information to detect the synchronization information.
The receiving device of claim 1,
The signal processing unit,
A first demux for demuxing the first signal to detect first video data;
A second demux for detecting second video data by demuxing the second signal when the second signal is in the form of a transport stream;
A file parser for parsing the second signal and detecting second video data when the second signal is a file format;
A first decoder for decoding the first video data demuxed in the first demux;
A second decoder for decoding the second video data detected by the second demux or file parser;
A rendering unit which performs rendering by combining first video data decoded by the first decoder and second video data decoded by the second decoder; And
And a controller configured to detect the synchronization type information from at least one of the first signal and the second signal, and to control at least one of the first demux, the second demux, the file parser, the first decoder, the second decoder, and the rendering unit to detect the synchronization information according to the synchronization type information and perform synchronization.
The receiving device according to claim 6,
The control unit,
Wherein, if the synchronization type information specifies a type using a time code or frame number recorded at a PES level as the synchronization information, the control unit controls the first demux and the second demux, or the first demux and the file parser, to detect video data having the same time code or frame number, respectively;
If the synchronization type information specifies a type using a time code or frame number recorded at an ES level as the synchronization information, the control unit controls the first and second decoders to decode the first and second signals, respectively, and output video data whose time codes or frame numbers match; and
If the synchronization type information specifies a type using a time code or frame number recorded at a video level as the synchronization information, the control unit controls the rendering unit to detect and render video data whose time codes or frame numbers match from the decoded first signal and the decoded second signal.
8. The receiving device according to any one of claims 1 to 7,
The synchronization type information,
Includes one of a first value for designating a video level watermark timecode as the synchronization information, a second value for designating a video level watermark frame number as the synchronization information, a third value for designating an ES level SMPTE timecode as the synchronization information, a fourth value for designating a PES level SMPTE timecode as the synchronization information, a fifth value for designating a PES level frame number as the synchronization information, a sixth value for designating a PES level counterpart PTS as the synchronization information, and a seventh value for designating a video level watermark counterpart PTS as the synchronization information.
8. The receiving device according to any one of claims 1 to 7,
The first signal includes one of left eye image data and right eye image data of 3D content, and the second signal includes the other one of the left eye image data and the right eye image data of the 3D content.
8. The receiving device according to any one of claims 1 to 7,
One of the first signal and the second signal includes 2D content data,
The other one of the first signal and the second signal includes at least one of multilingual audio data, multilingual subtitle data, UHD broadcast data, depth map data, and different-viewpoint view data corresponding to the 2D content data.
In the signal processing method of the receiving device,
Receiving a first signal and a second signal through an RF broadcast network and an IP communication network, respectively;
Detecting synchronization type information from at least one of the first signal and the second signal;
And a signal processing step of selecting synchronization information corresponding to the synchronization type information from among the synchronization information included in the first signal and the second signal, and synchronizing the first signal and the second signal according to the selected synchronization information.
12. The method of claim 11,
The synchronization type information,
And information recorded in a reserved area or a descriptor area of at least one of the first signal and the second signal.
12. The method of claim 11,
The synchronization type information,
And the synchronization type information is information recorded in a reserved or descriptor area of a PSIP VCT (Program and System Information Protocol Virtual Channel Table) or an EIT of at least one of the first signal and the second signal.
12. The method of claim 11,
The synchronization type information,
And the synchronization type information is information detected from a private stream or a metadata stream included in at least one of the first signal and the second signal.
15. The method according to any one of claims 11 to 14,
The synchronization type information,
Includes one of a first value for designating a video level watermark timecode as the synchronization information, a second value for designating a video level watermark frame number as the synchronization information, a third value for designating an ES level SMPTE timecode as the synchronization information, a fourth value for designating a PES level SMPTE timecode as the synchronization information, a fifth value for designating a PES level frame number as the synchronization information, a sixth value for designating a PES level counterpart PTS as the synchronization information, and a seventh value for designating a video level watermark counterpart PTS as the synchronization information.
KR1020120116023A 2012-04-13 2012-10-18 Receiving device for a plurality of signals through different paths and method for processing the signals thereof KR20130116154A (en)

Priority Applications (2)

Application Number Priority Date Filing Date Title
US13/861,068 US20130276046A1 (en) 2012-04-13 2013-04-11 Receiving apparatus for receiving a plurality of signals through different paths and method for processing signals thereof
PCT/KR2013/003137 WO2013154402A1 (en) 2012-04-13 2013-04-15 Receiving apparatus for receiving a plurality of signals through different paths and method for processing signals thereof

Applications Claiming Priority (2)

Application Number Priority Date Filing Date Title
US201261623735P 2012-04-13 2012-04-13
US61/623,735 2012-04-13

Publications (1)

Publication Number Publication Date
KR20130116154A true KR20130116154A (en) 2013-10-23

Family

ID=49635420

Family Applications (1)

Application Number Title Priority Date Filing Date
KR1020120116023A KR20130116154A (en) 2012-04-13 2012-10-18 Receiving device for a plurality of signals through different paths and method for processing the signals thereof

Country Status (1)

Country Link
KR (1) KR20130116154A (en)


Legal Events

Date Code Title Description
WITN Withdrawal due to no request for examination