KR20130116154A - Receiving device for a plurality of signals through different paths and method for processing the signals thereof - Google Patents
Receiving device for a plurality of signals through different paths and method for processing the signals thereof
- Publication number
- KR20130116154A (application KR1020120116023A)
- Authority
- KR
- South Korea
- Prior art keywords
- signal
- synchronization
- information
- type information
- synchronization information
- Prior art date
Classifications
-
- H—ELECTRICITY
- H04—ELECTRIC COMMUNICATION TECHNIQUE
- H04N—PICTORIAL COMMUNICATION, e.g. TELEVISION
- H04N21/00—Selective content distribution, e.g. interactive television or video on demand [VOD]
- H04N21/20—Servers specifically adapted for the distribution of content, e.g. VOD servers; Operations thereof
- H04N21/23—Processing of content or additional data; Elementary server operations; Server middleware
- H04N21/238—Interfacing the downstream path of the transmission network, e.g. adapting the transmission rate of a video stream to network bandwidth; Processing of multiplex streams
- H04N21/2389—Multiplex stream processing, e.g. multiplex stream encrypting
- H04N21/23892—Multiplex stream processing, e.g. multiplex stream encrypting involving embedding information at multiplex stream level, e.g. embedding a watermark at packet level
-
- H—ELECTRICITY
- H04—ELECTRIC COMMUNICATION TECHNIQUE
- H04N—PICTORIAL COMMUNICATION, e.g. TELEVISION
- H04N21/00—Selective content distribution, e.g. interactive television or video on demand [VOD]
- H04N21/40—Client devices specifically adapted for the reception of or interaction with content, e.g. set-top-box [STB]; Operations thereof
- H04N21/43—Processing of content or additional data, e.g. demultiplexing additional data from a digital video stream; Elementary client operations, e.g. monitoring of home network or synchronising decoder's clock; Client middleware
- H04N21/4302—Content synchronisation processes, e.g. decoder synchronisation
-
- H—ELECTRICITY
- H04—ELECTRIC COMMUNICATION TECHNIQUE
- H04N—PICTORIAL COMMUNICATION, e.g. TELEVISION
- H04N21/00—Selective content distribution, e.g. interactive television or video on demand [VOD]
- H04N21/40—Client devices specifically adapted for the reception of or interaction with content, e.g. set-top-box [STB]; Operations thereof
- H04N21/43—Processing of content or additional data, e.g. demultiplexing additional data from a digital video stream; Elementary client operations, e.g. monitoring of home network or synchronising decoder's clock; Client middleware
- H04N21/434—Disassembling of a multiplex stream, e.g. demultiplexing audio and video streams, extraction of additional data from a video stream; Remultiplexing of multiplex streams; Extraction or processing of SI; Disassembling of packetised elementary stream
- H04N21/4343—Extraction or processing of packetized elementary streams [PES]
Landscapes
- Engineering & Computer Science (AREA)
- Multimedia (AREA)
- Signal Processing (AREA)
- Two-Way Televisions, Distribution Of Moving Picture Or The Like (AREA)
Abstract
A receiving device is disclosed. The device includes a first receiver that receives a first signal through a radio frequency (RF) broadcast network, a second receiver that receives a second signal through an Internet protocol (IP) communication network, and a signal processor that detects synchronization type information from at least one of the first signal and the second signal, selects the synchronization information corresponding to the synchronization type information from among the synchronization information included in the first signal and the second signal, and synchronizes the first signal and the second signal according to the selected synchronization information. Accordingly, signals received through different paths can be synchronized and output.
Description
The present invention relates to a receiving apparatus and a signal processing method thereof, and more particularly, to a receiving apparatus and a signal processing method for synchronizing a plurality of signals received through different paths.
With the development of electronic technology, various types of electronic devices have been developed and spread. Representative examples of these electronic devices include a receiving device such as a TV.
Recently, as the performance of TV is improved, even multimedia contents such as 3D content or full HD content are being serviced. This type of content has a larger data size than existing content.
However, the transmission bandwidth available in a broadcasting network is limited, so there is a limit to the size of the content that can be transmitted over the current broadcasting network. To meet this constraint, the resolution must inevitably be reduced, which degrades the image quality accordingly.
In order to solve this problem, there have been attempts to provide various types of media data through various transmission environments. However, since these data are transmitted through different paths, the receiving device cannot tell whether the data are related to each other, and thus cannot synchronize them properly.
Therefore, there is a need for a method that can properly synchronize these contents.
The present invention addresses the above-described need, and an object of the present invention is to provide a receiving apparatus and a signal processing method for receiving, synchronizing, and outputting a plurality of signals transmitted through different networks.
To achieve the above object, a receiving device according to an embodiment of the present invention includes: a first receiver that receives a first signal through a radio frequency (RF) broadcast network; a second receiver that receives a second signal through an Internet protocol (IP) communication network; and a signal processor that detects synchronization type information from at least one of the first signal and the second signal, selects synchronization information corresponding to the synchronization type information from among the synchronization information included in the first signal and the second signal, and synchronizes the first signal and the second signal according to the selected synchronization information.
The signal processor may detect the synchronization type information from a reserved area or a descriptor area in at least one of the first signal and the second signal.
Alternatively, the signal processor may detect the synchronization type information from a reserved area or descriptor area of a Program and System Information Protocol (PSIP) Virtual Channel Table (VCT) or an Event Information Table (EIT) in at least one of the first signal and the second signal.
Alternatively, the signal processor may detect the synchronization type information from a private stream or a metadata stream included in at least one of the first signal and the second signal.
The signal processor may include: a first demultiplexer (demux) that demuxes the first signal to detect first video data; a second demux that, when the second signal is in the form of a transport stream, demuxes the second signal to detect second video data; a file parser that, when the second signal is in a file format, parses the second signal to detect the second video data; a first decoder that decodes the first video data demuxed by the first demux; a second decoder that decodes the second video data detected by the second demux or the file parser; and a renderer that combines the first video data decoded by the first decoder and the second video data decoded by the second decoder to perform rendering.
Here, at least one of the first demux, the second demux, the file parser, the first decoder, the second decoder, and the renderer may operate selectively according to the value of the synchronization type information to detect the synchronization information.
Alternatively, the signal processor may include: a first demux that demuxes the first signal to detect first video data; a second demux that, when the second signal is in the form of a transport stream, demuxes the second signal to detect second video data; a file parser that, when the second signal is in a file format, parses the second signal to detect the second video data; a first decoder that decodes the first video data demuxed by the first demux; a second decoder that decodes the second video data detected by the second demux or the file parser; a renderer that combines the first video data decoded by the first decoder and the second video data decoded by the second decoder to perform rendering; and a controller that detects the synchronization type information from at least one of the first signal and the second signal to determine the synchronization type, and controls at least one of the first demux, the second demux, the file parser, the first decoder, the second decoder, and the renderer to detect the synchronization information and perform synchronization.
Here, when the synchronization type information specifies a type that uses a time code or frame number recorded at the PES level as the synchronization information, the controller may control the first demux and the second demux, or the first demux and the file parser, to detect video data having the same time code or frame number. When the synchronization type information specifies a type that uses a time code or frame number recorded at the ES level as the synchronization information, the controller may control the first and second decoders to decode the first and second signals, respectively, and then output video data having the same time code or frame number. When the synchronization type information specifies a type that uses a time code or frame number recorded at the video level as the synchronization information, the controller may control the renderer to detect and render video data having the same time code or frame number from the decoded first signal and the decoded second signal, respectively.
The synchronization type information may have one of: a first value designating a video level watermark time code as the synchronization information; a second value designating a video level watermark frame number as the synchronization information; a third value designating an ES level SMPTE time code as the synchronization information; a fourth value designating a PES level SMPTE time code as the synchronization information; a fifth value designating a PES level frame number as the synchronization information; a sixth value designating a PES level counterpart PTS as the synchronization information; and a seventh value designating a video level watermark counterpart PTS as the synchronization information.
The first signal may include one of left eye image data and right eye image data of 3D content, and the second signal may include another one of left eye image data and right eye image data of the 3D content.
Alternatively, one of the first signal and the second signal may include 2D content data, and the other may include at least one of multilingual audio data, multilingual subtitle data, UHD broadcast data, depth map data, and other-viewpoint video data corresponding to the 2D content data.
Meanwhile, according to an embodiment of the present invention, a signal processing method of a receiving device includes: receiving a first signal and a second signal through an RF broadcast network and an IP communication network, respectively; detecting synchronization type information from at least one of the first signal and the second signal; selecting synchronization information corresponding to the synchronization type information from among the synchronization information included in the first signal and the second signal; and synchronizing the first signal and the second signal according to the selected synchronization information.
The synchronization type information may be information recorded in a reserved area or a descriptor area in at least one of the first signal and the second signal.
Alternatively, the synchronization type information may be information recorded in a reserved area or descriptor area of a Program and System Information Protocol (PSIP) Virtual Channel Table (VCT) or an Event Information Table (EIT) in at least one of the first signal and the second signal.
Alternatively, the synchronization type information may be recorded in a private stream or a metadata stream included in at least one of the first signal and the second signal.
The synchronization type information may have one of: a first value designating a video level watermark time code as the synchronization information; a second value designating a video level watermark frame number as the synchronization information; a third value designating an ES level SMPTE time code as the synchronization information; a fourth value designating a PES level SMPTE time code as the synchronization information; a fifth value designating a PES level frame number as the synchronization information; a sixth value designating a PES level counterpart PTS as the synchronization information; and a seventh value designating a video level watermark counterpart PTS as the synchronization information.
As described above, according to various embodiments of the present disclosure, data included in a plurality of signals received through a plurality of different communication networks may be synchronized using synchronization type information that designates the appropriate synchronization information. Accordingly, suitable synchronization information can be used depending on the situation.
1 is a block diagram showing the configuration of a transmission and reception system according to an embodiment of the present invention;
2 is a table showing examples of synchronization type information;
3 to 10 are diagrams illustrating a method of transmitting synchronization type information and a structure of synchronization type information according to various embodiments of the present disclosure;
11 is a view for explaining the configuration and operation of a transmission system according to an embodiment of the present invention;
12 is a view for explaining the configuration and operation of a receiving apparatus according to an embodiment of the present invention;
13 is a diagram illustrating an example of a structure in which a time code is recorded in a GOP header;
14 is a view for explaining a method of transmitting a time code using Supplemental enhancement information (SEI) defined in AVC (Advanced Video Coding: ISO / IEC 14496-10);
15 is a diagram illustrating an example of synchronization type information in the form of a watermark carried in an ES payload of a first signal transmitted through an RF broadcasting network;
16 is a diagram illustrating an example of synchronization type information in the form of a watermark carried in an ES payload of a second signal transmitted through an IP communication network;
17 is a diagram illustrating an example of synchronization type information carried on a second signal based on an MP4 file format;
18 is a view for explaining the configuration and operation of a transmission system according to another embodiment of the present invention;
19 is a view for explaining the configuration and operation of a receiving apparatus according to another embodiment of the present invention;
20 is a diagram illustrating an example of PES level synchronization type information of a first signal transmitted through an RF broadcast network;
21 is a diagram illustrating an example of PES level synchronization type information of a second signal transmitted through an IP communication network;
22 is a diagram illustrating an example of PES level synchronization type information carried on a second signal based on an MP4 file format;
23 is a view for explaining the configuration and operation of a transmission system according to another embodiment of the present invention;
24 is a view for explaining the configuration and operation of a receiving apparatus according to another embodiment of the present invention;
25 is a table showing synchronization type information to which a value for designating a PES level counterpart PTS as synchronization information is added;
26 is an example of a signal format including a PES level counterpart PTS,
27 is a table showing synchronization type information with a value specifying a video level watermark counterpart PTS as synchronization information;
28 is a diagram for describing a method of transmitting synchronization type information through a private or metadata stream of a first signal transmitted through an RF broadcast network;
29 is a view for explaining a method of transmitting synchronization type information through a private or metadata stream of a second signal transmitted through an IP communication network;
30 is a view for explaining a method of transmitting synchronization type information through a private or metadata stream of a second signal based on an MP4 file format;
31 is a table for explaining syntax of synchronization type information;
32 is a diagram illustrating still another example of the signal processing unit configuration, and
33 is a flowchart illustrating a signal processing method of a receiving apparatus according to an embodiment of the present invention.
Hereinafter, the present invention will be described in detail with reference to the accompanying drawings.
1 is a block diagram showing the configuration of a transmission and reception system according to an embodiment of the present invention. According to FIG. 1, a transmission/reception system includes a plurality of transmitting devices and a receiving device.
The transmitting devices transmit a first signal and a second signal, respectively, to the receiving device through different transmission networks.
The first signal and the second signal each include different data constituting one multimedia content. For example, in the case of 3D content, the left eye image and the right eye image may be included in the first signal and the second signal, respectively. Alternatively, one content may be divided into video data and audio data, or into video data, subtitle data, and other data, and included in the first signal and the second signal, respectively.
The first signal includes first synchronization information together with the first data, and the second signal includes second synchronization information together with the second data.
Various information may be used as the first and second synchronization information. Specifically, the synchronization information may be an SMPTE time code, a frame number, or various other information depending on its form, and may be classified into a video level, an ES level, a PES level, and the like depending on how it is provided.
Since such synchronization information may be provided in various forms and at various levels, the receiving device needs a separator that designates which synchronization information to use.
Synchronization type information means such a separator. That is, the synchronization type information refers to a value that designates which information, among the various pieces of information included in at least one of the first signal and the second signal, is to be used as the synchronization information. The synchronization type information may alternatively be referred to as a pair type or a media synchronization pair type.
According to FIG. 1, the receiving device includes a first receiver, a second receiver, and a signal processor. The first receiver receives the first signal through the RF broadcast network, and the second receiver receives the second signal through the IP communication network.
The second signal may be transmitted in the form of a real-time transport stream or in the form of an MP4 file. When implemented as a real-time transport stream, the second signal may be transmitted and received using a protocol such as RTP or HTTP. When HTTP is used, a metadata file must be provided so that the second signal can be obtained.
The metadata is information indicating where the multimedia content can be received. The metadata file may include information that the client needs to know in advance, such as the position on the content timeline corresponding to each of a plurality of separate files, the URL of the source providing each file, and its size. Metadata files vary according to the type of HTTP-based streaming: the smooth streaming method uses an ism (Internet Information Service (IIS) Smooth streaming Media) file as the metadata file; Internet Engineering Task Force (IETF) HTTP live streaming uses an m3u8 file; and adaptive HTTP streaming Rel. 9 adopted by 3GPP, adaptive HTTP streaming Rel. 2 adopted by OIPF, and the dynamic adaptive streaming over HTTP scheme adopted by MPEG may use a Media Presentation Description (MPD) as the metadata file. The receiving device may obtain the second signal by referring to such a metadata file.
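As a rough illustration, the metadata file type named above can be recognized from its file extension alone. The sketch below is only illustrative; the function name and the example URL are hypothetical and not part of any cited specification.

```python
# Mapping from metadata-file extension to the HTTP-based streaming scheme
# it belongs to, following the description above.
METADATA_FORMATS = {
    ".ism": "IIS Smooth Streaming",
    ".m3u8": "IETF HTTP Live Streaming",
    ".mpd": "MPEG-DASH Media Presentation Description",
}

def metadata_format(url: str) -> str:
    """Classify an HTTP-streaming metadata file by its extension."""
    for extension, scheme in METADATA_FORMATS.items():
        if url.lower().endswith(extension):
            return scheme
    raise ValueError(f"unrecognized metadata file: {url}")
```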
2 is a table for explaining various examples of synchronization type information. According to FIG. 2, the synchronization type information includes a first value designating a video level watermark time code as the synchronization information, a second value designating a video level watermark frame number, a third value designating an ES level SMPTE time code, a fourth value designating a PES level SMPTE time code, and a fifth value designating a PES level frame number. The first to fifth values may be represented by bit values such as 0x01, 0x02, 0x03, 0x04, and 0x05.
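The table of FIG. 2 can be sketched as a simple lookup. The numeric values 0x01 to 0x05 follow the description above; the level and kind labels are descriptive names chosen for this sketch, not terms fixed by any standard.

```python
# Illustrative mapping of synchronization type values to the designated
# synchronization information, following the table of FIG. 2.
SYNC_TYPES = {
    0x01: ("video", "watermark_timecode"),
    0x02: ("video", "watermark_frame_number"),
    0x03: ("es", "smpte_timecode"),
    0x04: ("pes", "smpte_timecode"),
    0x05: ("pes", "frame_number"),
}

def select_sync_info(sync_type: int):
    """Return the (level, kind) of synchronization information to use."""
    try:
        return SYNC_TYPES[sync_type]
    except KeyError:
        raise ValueError(f"unknown synchronization type: {sync_type:#04x}")
```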
In this case, the video level means that synchronization information is carried in the video data payload region of the PES packet. The ES level means that synchronization information is carried in the video data ES header area of the PES packet. The PES level means that synchronization information is carried in a payload region of a data packet separately provided in a PES packet.
On the other hand, the time code is a series of pulse signals generated by a time code generator, a signal standard developed for easy editing management. When creating and editing content, the same time code is used for synchronized management of the left eye and right eye images. Therefore, the time code pair remains the same regardless of when the stream is generated or delivered. Specifically, an SMPTE (Society of Motion Picture and Television Engineers) time code may be used. In SMPTE 12M, the time code is represented in the form of "hour:minute:second:frame". The SMPTE time code may be classified into longitudinal time code (LTC) or vertical interval time code (VITC) according to the recording method. In the case of LTC, a total of 80 bits of data may be configured, including time information (25 bits), user information (32 bits), synchronization information (16 bits), a storage area (4 bits), and a frame mode display (2 bits). VITC is recorded on two horizontal lines within the vertical blanking interval of the video signal. SMPTE RP-188 defines an interface specification that allows LTC or VITC type time code to be transmitted as ancillary data.
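The "hour:minute:second:frame" representation of SMPTE 12M lends itself to a small helper for comparing time codes across streams. A minimal non-drop-frame sketch follows; the frame rate is an assumed parameter rather than something fixed by the description.

```python
def parse_smpte_timecode(tc: str):
    """Parse an SMPTE "HH:MM:SS:FF" time code into its four fields."""
    hours, minutes, seconds, frames = (int(part) for part in tc.split(":"))
    return hours, minutes, seconds, frames

def timecode_to_frames(tc: str, fps: int = 30) -> int:
    """Convert a non-drop-frame time code to an absolute frame count,
    which makes time codes from two streams directly comparable."""
    h, m, s, f = parse_smpte_timecode(tc)
    return ((h * 60 + m) * 60 + s) * fps + f
```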
The frame number means identification information such as a number assigned to each frame. The frame number may be recorded in an event information table (EIT), a PMT, a private stream, a transport stream header, or the like of the first signal or the second signal.
As described above, the receiving device selects the synchronization information designated by the synchronization type information and uses it to synchronize the first signal and the second signal.
Meanwhile, the synchronization type information that specifies the type of synchronization information may be transmitted in several ways: a first method that uses a reserved area or descriptor area of a program map table (PMT); a second method that uses a reserved area or descriptor area of a PSIP Virtual Channel Table (VCT) or an Event Information Table (EIT); and a third method that uses a private stream or a metadata stream.
Hereinafter, various examples of a method of transmitting synchronization type information will be described.
FIG. 3 is a diagram for describing the first method, which transmits the synchronization type information using a PMT reserved area or descriptor area. FIG. 3(a) shows the syntax of the program map table. According to FIG. 3(a), a reserved area and descriptors are defined in the program map table, and the synchronization type information may be recorded in such a reserved area or descriptor area. FIG. 3(b) shows the syntax of a new descriptor defined for the synchronization type information. According to FIG. 3(b), 24-bit synchronization type information may be recorded in the new descriptor.
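A descriptor in MPEG-2 Systems is laid out as a tag byte, a length byte, and a payload, so a 24-bit synchronization type field would occupy three payload bytes. The sketch below assembles such a descriptor; the tag value used in the test is hypothetical, since the description does not fix one.

```python
def build_sync_type_descriptor(descriptor_tag: int, sync_type: int) -> bytes:
    """Assemble a PMT descriptor carrying 24 bits of synchronization type
    information: tag (8 bits), length (8 bits), then three payload bytes."""
    payload = sync_type.to_bytes(3, "big")
    return bytes([descriptor_tag, len(payload)]) + payload
```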
FIG. 4 is a diagram for describing a second method of transmitting using a reserved or descriptor area of a PSIP VCT (Program and System Information Protocol Virtual Channel Table).
FIG. 4(a) shows the bit stream syntax of a VCT providing channel or event information in ATSC A/65. According to FIG. 4(a), the synchronization type information may be recorded in a reserved area or descriptor area in the VCT. FIG. 4(b) shows the syntax of the synchronization type information recorded in the descriptor area.
5 is a diagram for describing the second method as applied to an event information table (EIT) reserved area or descriptor area. FIG. 5(a) shows the bit stream syntax of the EIT. According to FIG. 5(a), the synchronization type information may be recorded in a reserved area or descriptor area in the EIT. FIG. 5(b) shows the syntax of the synchronization type information recorded in the descriptor area.
FIG. 6 is a diagram for describing the third method, which transmits the synchronization type information using a private stream or a metadata stream. When implemented in the third manner, the synchronization type information may be provided using the structure of a private stream or metadata stream for carrying additional information or additional data. A private stream is defined with stream type = 0x05 or 0x06 in the PMT, and a metadata stream with stream type = 0x14, 0x15, or 0x16. FIGS. 6(a) to 6(d) show that, in addition to the PES video packet (b) and the audio packet (c), the synchronization type information is provided through a private stream or metadata stream defined with stream type = 0x15 or 0x05.
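The stream type check implied above is straightforward; a minimal sketch using the stream type values named in the description:

```python
# Stream types from the description: private streams and metadata streams
# that may carry the synchronization type information.
PRIVATE_STREAM_TYPES = {0x05, 0x06}
METADATA_STREAM_TYPES = {0x14, 0x15, 0x16}

def may_carry_sync_type_info(stream_type: int) -> bool:
    """Check whether a PMT stream_type denotes a private or metadata stream
    of the kind that the third method uses."""
    return stream_type in PRIVATE_STREAM_TYPES | METADATA_STREAM_TYPES
```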
FIG. 7 is a table for explaining the syntax of the synchronization type information provided through a stream as shown in FIG. 6. According to FIG. 7, the synchronization type information (media_synchronization_pair_type) is composed of 8 bits.
As described above, the synchronization type information may be transmitted in various ways. Hereinafter, the providing form and the providing method of the synchronization information according to the synchronization type information will be described in detail.
First, FIGS. 8 to 12 show a configuration of synchronization information transmitted at a video level and a configuration of a transmission / reception system using the synchronization information.
According to the table shown in FIG. 2 described above, if the synchronization type information has a value of 0x01 or 0x02, the synchronization information is provided at the video level. That is, the synchronization information is inserted frame by frame through a video watermarking method. Watermarking methods are divided into embedding in the spatial domain and embedding in the frequency domain. Specifically, the synchronization information may be inserted into the least significant bits (LSBs) of the image in the spatial domain, or inserted after the image is transformed into the frequency domain.
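A spatial-domain LSB watermark of the kind described can be sketched as follows: a frame number is written into the least significant bits of the first few pixel values and read back on the receiving side. This is purely illustrative; a real watermark would spread and protect the bits against compression.

```python
def embed_frame_number(pixels, frame_number, bits=16):
    """Write `bits` bits of a frame number, MSB first, into the LSBs of
    the first `bits` pixel values (spatial-domain LSB watermark)."""
    out = list(pixels)
    for i in range(bits):
        bit = (frame_number >> (bits - 1 - i)) & 1
        out[i] = (out[i] & ~1) | bit
    return out

def extract_frame_number(pixels, bits=16):
    """Recover the embedded frame number from the pixel LSBs."""
    value = 0
    for i in range(bits):
        value = (value << 1) | (pixels[i] & 1)
    return value
```

Since only the least significant bit of each affected pixel changes, the visual impact is at most one intensity step per pixel.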
8 is a diagram illustrating the form in which the first signal transmitted through the RF broadcast network is provided. According to FIG. 8, pair information is provided in a watermark format in the ES payload region of the video packet carrying the actual video data. The pair information may be regarded as correction information for resolving mismatches of PCR-based PTS and DTS synchronization information between two or more streams.
FIG. 9 is a diagram illustrating a form of providing a second signal transmitted in a transport stream form through an IP communication network. According to FIG. 9, watermark-type synchronization information (pair information) is carried in an ES payload region of a video packet carrying actual video data.
10 is a diagram illustrating a form of a second signal transmitted in an MP4 file format. According to FIG. 10, synchronization information in a watermark format may be carried on video data of an mdat region provided with an actual video value.
As described above, the synchronization information may be stored in various locations, and the receiving device may detect it from the location designated by the synchronization type information and use it for synchronization.
11 shows an example of a configuration of a transmission system including a plurality of transmitting devices.
In FIG. 11, the left eye raw video data is provided to one transmitting device, and the right eye raw video data is provided to the other transmitting device.
The encoders encode the provided video data to generate video elementary streams (ES), and the processors multiplex the encoded data to configure transport streams.
On the other hand, when transmitting in the MP4 file format, the encoder 110-2 may provide the video ES to the file generator 130-2. The file generator 130-2 converts the video ES into a file format and provides it to the server 140-2.
The server 140-2 stores the video data provided by the processor 120-2 or the file generator 130-2. When the server 140-2 receives a request for the video data from the receiving device, it transmits the stored video data to the receiving device.
In FIG. 11, the synchronization type information may be inserted into the PMT, PSIP VCT, EIT, or the like of the TS or file by the encoders 110-1 and 110-2 or the processors 120-1 and 120-2, or may be generated as a separate private stream or metadata stream. Such synchronization type information may be provided from the transmission system to the receiving device.
FIG. 12 is a diagram illustrating an example of a configuration of a receiving apparatus that receives the first signal and the second signal transmitted by the transmission system of FIG. 11. According to FIG. 12, the receiving device includes a first receiver, a second receiver, and a signal processor.
The first receiver receives the first signal through the RF broadcast network, and the second receiver receives the second signal through the IP communication network.
For example, if a time code is designated as the synchronization information by the synchronization type information, the signal processor may detect the time code from each signal and synchronize video frames having the same time code.
As another example, frame index information such as a frame number may be designated as the synchronization information by the synchronization type information. Frame index information means identification information provided for each frame, typically a frame number. The frame index information may be recorded in an event information table (EIT), a PMT, a private stream, a transport stream header, or the like of a real-time transport stream. The signal processor may compare the frame index information detected from each signal and synchronize frames having the same index.
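Synchronization by frame index can be sketched as pairing buffered frames that carry the same index. The data layout below (lists of index/frame tuples) is an assumption made for illustration, not the structure of any actual stream.

```python
def synchronize_by_index(first_frames, second_frames):
    """Pair frames from two streams that share the same frame index.
    Each input is a list of (frame_index, frame_data) tuples; the result
    is a list of (frame_index, first_data, second_data) triples."""
    second_by_index = dict(second_frames)
    return [
        (index, data, second_by_index[index])
        for index, data in first_frames
        if index in second_by_index
    ]
```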
According to FIG. 12, the first signal received by the first receiver is demultiplexed by the first demux, and the detected first video data is decoded by the first decoder.
Meanwhile, the second signal received by the second receiver is demultiplexed by the second demux when it is in transport stream form, or parsed by the file parser when it is in a file format, and the detected second video data is decoded by the second decoder.
According to FIG. 12, the decoded data provided by the first decoder and the second decoder are combined by the renderer and rendered in synchronization.
Next, synchronization information may be sent at the ES level.
13 to 17 are diagrams illustrating a configuration of synchronization information transmitted at an ES level and a configuration of a transmission / reception system using the synchronization information.
According to the table shown in FIG. 2 described above, if the synchronization type information has a value of 0x03, the synchronization information is provided at the ES level. That is, the synchronization information may be inserted in the ES header area. More specifically, the synchronization information takes the form of an SMPTE time code, which may be provided at the ES header level through the GOP header in the case of MPEG-2, or through the picture timing SEI in the case of AVC.
13 shows an example of the syntax structure of a GOP header in an MPEG stream in which a time code is recorded. According to FIG. 13, the time code may be recorded as 25 bits of data. As shown in FIG. 13, the time code may be delivered to the receiving device in the GOP header.
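The 25-bit GOP-header time code can be unpacked with plain bit operations. The field widths used here (drop_frame_flag 1 bit, hours 5, minutes 6, a marker bit, seconds 6, pictures 6) follow the MPEG-2 video GOP header layout of ISO/IEC 13818-2:

```python
def parse_gop_timecode(bits25: int):
    """Decode the 25-bit time_code of an MPEG-2 GOP header into
    (drop_frame, hours, minutes, seconds, pictures)."""
    pictures = bits25 & 0x3F            # 6 bits
    seconds = (bits25 >> 6) & 0x3F      # 6 bits
    # bit 12 is a marker bit, always '1'; it is skipped here
    minutes = (bits25 >> 13) & 0x3F     # 6 bits
    hours = (bits25 >> 19) & 0x1F       # 5 bits
    drop_frame = (bits25 >> 24) & 0x1   # 1 bit
    return drop_frame, hours, minutes, seconds, pictures
```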
14 shows a method of transmitting a time code using the Supplemental Enhancement Information (SEI) defined in AVC (Advanced Video Coding: ISO/IEC 14496-10). According to FIG. 14, the time code may be delivered using seconds_value, minutes_value, hours_value, and n_frames defined in the picture timing SEI.
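Combining the four picture timing SEI fields named above back into an "hour:minute:second:frame" string is a one-liner; a minimal sketch:

```python
def sei_fields_to_timecode(hours_value, minutes_value, seconds_value, n_frames):
    """Assemble an "HH:MM:SS:FF" string from the picture timing SEI fields
    named in the description (AVC, ISO/IEC 14496-10)."""
    return f"{hours_value:02d}:{minutes_value:02d}:{seconds_value:02d}:{n_frames:02d}"
```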
FIG. 15 is a diagram illustrating how the first signal transmitted through an RF broadcast network is provided. According to FIG. 15, PES packets are transmitted according to the stream types defined in the PMT. Among the PES packets, the synchronization information is included in the ES header of the video packet.
FIG. 16 is a diagram illustrating how the TS-based second signal transmitted through an IP communication network is provided. According to FIG. 16, the synchronization information may be included at the ES header level in the form of an SMPTE time code and transmitted.
FIG. 17 is a diagram illustrating how the second signal, based on the MP4 file format and transmitted through an IP communication network, is provided. According to FIG. 17, the synchronization information is included in the form of an SMPTE time code in the ES header portion of the mdat area, where the actual video data is carried.
FIG. 18 is a diagram for explaining the configuration and operation of a transmission system that transmits content at the ES level. According to FIG. 18, the transmission system includes a plurality of source devices 300-1 and 300-2, a
Referring to FIG. 18, an SMPTE time code is included in a VBI section of raw data transmitted from
FIG. 19 shows an example of the configuration of a reception apparatus using synchronization information included at the ES level. Since the basic configuration of the receiving apparatus of FIG. 19 is the same as that of the receiving apparatus of FIG. 12, description of the overlapping portions will be omitted.
According to FIG. 19, the
The
FIGS. 20 to 24 are diagrams illustrating the configuration of synchronization information transmitted at the PES level and the configuration of a transmission/reception system using the synchronization information.
According to the table shown in FIG. 2 described above, if the synchronization type information has a value of 0x04 or 0x05, the synchronization information is provided at the PES level. In more detail, the synchronization information may be an SMPTE time code or a frame number. This synchronization information may be transmitted through a private stream or metadata PES stream having the same reproduction time information, that is, the same time stamp.
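Because the PES-level synchronization information travels in a private or metadata stream stamped with the same PTS as the video frame it describes, a receiver can pair the two by time stamp. A minimal sketch under that assumption (the packet model and the stream-type set are illustrative):

```python
from typing import List, NamedTuple, Optional

class PesPacket(NamedTuple):
    stream_type: int   # PMT stream type, e.g. 0x1B (AVC video), 0x05/0x15 (sync info)
    pts: int           # 90 kHz presentation time stamp
    payload: bytes

# Private streams (0x05, 0x06) and metadata streams (0x14, 0x15, 0x16)
SYNC_STREAM_TYPES = {0x05, 0x06, 0x14, 0x15, 0x16}

def sync_info_for(video_pkt: PesPacket, packets: List[PesPacket]) -> Optional[bytes]:
    """Return the payload (SMPTE time code or frame number) of the sync-info
    PES packet that carries the same PTS as the given video packet."""
    for p in packets:
        if p.stream_type in SYNC_STREAM_TYPES and p.pts == video_pkt.pts:
            return p.payload
    return None
```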
FIG. 20 is a diagram illustrating how the first signal transmitted through an RF broadcast network is provided. According to FIG. 20, the SMPTE time code or frame number serving as the synchronization information is inserted into the payload portion of a separate PES packet other than the video and audio packets, and transmitted.
FIG. 21 is a diagram illustrating how the TS-based second signal transmitted through an IP communication network is provided. According to FIG. 21, a stream type is designated in the PMT, and the SMPTE time code or frame number serving as the synchronization information is inserted into the payload portion of a separate PES packet other than the video packet, according to the designated stream type, and transmitted.
FIG. 22 is a diagram illustrating how the MP4-based second signal transmitted through an IP communication network is provided. According to FIG. 22, a value for calculating a frame number can be provided implicitly using the stts (Time-to-Sample) atom, ctts (Composition Time to Sample) atom, and stss (Sync Sample) atom, which provide frame reproduction time information in the moov header. That is, an MP4 file provides relative time values for playback, measured from the file start position and analogous to the PTS and DTS of a TS, through stts, ctts, and stss. The frame number, by contrast, has no specific time unit, since it only expresses the relative order between frames, such as #1 and #2. Therefore, by referring to the relative time values provided by stts, ctts, and the like, the order of the frames, that is, the frame number, can be inferred.
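The implicit derivation described above amounts to walking the stts run-length table until the accumulated duration covers a given time value. A sketch (simplified: ctts/stss handling is omitted, and the data shape is illustrative):

```python
def frame_index_from_stts(stts_entries, time_value):
    """Walk the stts run-length table [(sample_count, sample_delta), ...]
    and return the 0-based frame index whose duration covers time_value
    (expressed in the track's time scale)."""
    start, index = 0, 0
    for count, delta in stts_entries:
        span = count * delta
        if time_value < start + span:
            return index + (time_value - start) // delta
        start += span
        index += count
    return index  # time past the last sample
```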
Alternatively, the SMPTE time code or frame number may be explicitly provided through a separate box extension. More specifically, an additional box may be defined in the ISO base media file format (ISO/IEC 14496-12), or the time code may be provided by extending a field in a predefined box. For example, the time code may be provided by extending the "stss (sync sample table)" box that provides random access.
FIG. 23 is a diagram for explaining the configuration and operation of a transmission system that provides synchronization information at the PES level. Since the transmission system of FIG. 23 is basically the same in structure as the transmission systems of FIGS. 11 and 18, description of the overlapping parts is omitted.
Referring to FIG. 23, the
In FIG. 23, the entity that generates and inserts the PES-level synchronization information may vary depending on whether a transport stream or an MP4 file is used. For example, in the case of a transport stream, it may be either the encoder or the muxer, or a separate PES may be generated directly in the encoder. Alternatively, the PES-level synchronization information may be generated by extracting the time code, frame number, or the like delivered at the ES level. On the other hand, in the case of
FIG. 24 is a diagram for describing the configuration and operation of a reception device according to an embodiment in which synchronization information is transmitted at the PES level. According to FIG. 24, the
The first signal received through the
As described above, the
Meanwhile, although five pieces of synchronization type information are illustrated in FIG. 2, further synchronization type information may be added.
FIG. 25 shows synchronization type information including a value of 0x06. According to FIG. 25, synchronization type information for using the PES-level counterpart PTS as the synchronization information may also be provided. That is, when the synchronization type information is set to 0x06, the first signal should carry the PTS value of the second signal to be paired with it, and the second signal should carry the PTS value of the first signal to be paired with it. Such a PTS value may be provided through a private stream corresponding to stream type 0x05 or 0x06, or a metadata stream corresponding to stream type 0x14, 0x15, or 0x16, in a form similar to that shown in FIGS. 20 and 21. FIG. 26 shows an example thereof.
According to FIG. 26, the time stamp information of the second signal (b), that is, its PTS, is stored in the payload region of the private data stream or metadata stream defined as stream type 0x15 or 0x05 in the PMT of the first signal (a). As such, when the time stamp information of the counterpart among the signals constituting one content is carried in the first signal, the receiving
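With a 0x06-type pairing, each first-signal frame carries the PTS of its counterpart in the second signal, so the renderer only needs to look up the second-signal frame bearing exactly that PTS. A minimal sketch under that assumption (the data shapes are illustrative):

```python
def find_counterpart_frame(counterpart_pts, second_frames):
    """second_frames: list of (pts, frame) tuples buffered from the second
    signal. Return the frame whose PTS equals the counterpart PTS carried
    in the first signal, or None if it has not arrived yet."""
    for pts, frame in second_frames:
        if pts == counterpart_pts:
            return frame
    return None
```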
On the other hand, when time stamp information is used as synchronization information, the time stamp information can also be recorded at the video level.
FIG. 27 shows synchronization type information specifying video-level time stamp information. According to FIG. 27, a value of 0x07 is newly defined, and the video-level watermark counterpart PTS is designated to correspond to the newly defined value.
If the synchronization type information is 0x07, the transmission system includes the PTS value of the counterpart stream to be synchronized in the video level of each signal in the form of a watermark.
Meanwhile, according to another embodiment of the present invention, the synchronization type information and the synchronization information may be provided together through a private data stream or a metadata stream.
FIG. 28 is a diagram illustrating a structure for transmitting synchronization type information and synchronization information together in a first signal transmitted through an RF broadcast network. Referring to FIG. 28, the PMT of the first signal indicates that synchronization type information (Pair_type) and synchronization information (Pair information) are included in a payload region of a private data stream or a metadata stream defined as stream type 0x15 or 0x05.
FIG. 29 illustrates that the synchronization type information (Pair_type) and the synchronization information (Pair information) are included in the payload region of the private data stream or the metadata stream in the TS-based second signal transmitted through the IP communication network.
FIG. 30 illustrates a structure in which the synchronization type information (Pair_type) and the synchronization information (Pair information) are simultaneously transmitted through a PES-level private data stream or metadata stream in the MP4-based second signal transmitted through an IP communication network.
FIG. 31 is a table illustrating an example of the syntax of the synchronization type information and a description of each part of the syntax. According to FIG. 31, the synchronization type information includes syntax elements such as identifier, media_index_id, and media_synchronization_pair_type. In addition, an SMPTE time code or a frame number is recorded depending on whether media_synchronization_pair_type is 0x04 or 0x05.
As described above, the synchronization type information may be transmitted to the receiving device in various ways, and the receiving device can detect the necessary synchronization information based on it and perform synchronization. Although the above-described exemplary embodiments detect the synchronization type information from both the first signal and the second signal, the synchronization type information may be included in only one of the first signal and the second signal. For example, when the synchronization type information is checked from the first signal serving as a reference, the receiving device may detect the synchronization information corresponding to that synchronization type information and use it for the second signal that matches the first signal.
Meanwhile, as described above, the receiving apparatus may be implemented in various forms.
FIG. 32 is a block diagram illustrating still another example of the configuration of the signal processing unit included in the receiving apparatus. Referring to FIG. 32, the
The
The
The
The
That is, the
Specifically, the
Alternatively, if the synchronization type information specifies that the time code or frame number recorded at the ES level is to be used as the synchronization information, the
Alternatively, if the synchronization type information specifies that the time code or frame number recorded at the video level is to be used as the synchronization information, the
As described above, the signal processing unit may be configured in various forms to process the synchronization type information and the synchronization information.
FIG. 33 is a flowchart illustrating a signal processing method of a receiving apparatus according to various embodiments of the present disclosure. Referring to FIG. 33, the reception apparatus receives a first signal and a second signal through an RF broadcast network and an IP communication network, respectively (S3310). The first signal and the second signal are signals into which data constituting one content has been divided. As described above, they may be divided in various ways according to the type of content.
When the first signal and the second signal are received, the receiving device detects the synchronization type information from at least one of the received signals (S3320). As described with reference to FIG. 2, the synchronization type information may be set to a first value designating a video-level watermark time code as the synchronization information, a second value designating a video-level watermark frame number, a third value designating an ES-level SMPTE time code, a fourth value designating a PES-level SMPTE time code, or a fifth value designating a PES-level frame number. Alternatively, according to an embodiment, as shown in FIG. 25 or FIG. 27, a sixth value designating a PES-level counterpart PTS as the synchronization information, or a seventh value designating a video-level watermark counterpart PTS, may also be included.
When the synchronization type information is confirmed, the reception apparatus selects the synchronization information corresponding to the synchronization type information from among the synchronization information included in the first signal and the second signal, and synchronizes the first signal and the second signal according to the selected synchronization information (S3330). The reception device may be implemented in various forms as described above. Synchronization may be performed by correcting and matching time stamps, or by selecting and processing frames based on time codes or frame numbers.
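The selection-and-matching step (S3330) can be sketched as pairing frames of the two signals whose selected synchronization key (time code, frame number, or counterpart PTS) coincides. A sketch assuming the first-through-seventh values listed above map to 0x01 through 0x07; the key names and frame model are illustrative:

```python
# Assumed mapping of synchronization type values to the level/key used.
SYNC_KEYS = {
    0x01: "video-watermark-timecode",
    0x02: "video-watermark-frame-number",
    0x03: "es-smpte-timecode",
    0x04: "pes-smpte-timecode",
    0x05: "pes-frame-number",
    0x06: "pes-counterpart-pts",
    0x07: "video-watermark-counterpart-pts",
}

def synchronize(first_frames, second_frames, sync_type):
    """Pair frames of the two signals whose sync value matches. Each frame
    is a dict holding the value detected at the level the type selects."""
    key = SYNC_KEYS[sync_type]          # unknown type -> KeyError
    index = {f[key]: f for f in second_frames}
    return [(f, index[f[key]]) for f in first_frames if f[key] in index]
```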
Meanwhile, in order to carry the synchronization type information in the first signal and the second signal as described above, a process of inserting the synchronization type information must be performed in the transmission system. Such a process can be sufficiently understood from the various drawings described above, and thus its illustration is omitted.
The system described above may be applied to various environments for transmitting and receiving data whose time stamps do not match. That is, in addition to 3D content consisting of a left-eye image and a right-eye image, it can be used for various types of hybrid services that transmit the components of a content separately over a broadcasting network and a communication network.
For example, the present invention may be applied to a data broadcasting service system that transmits 2D broadcasting through a broadcasting network and transmits data such as multilingual audio and multilingual subtitles through a network. Alternatively, it may be applied to a UHD broadcasting service system that transmits 2D broadcasting through a broadcasting network and transmits UHD broadcasting data through a network. In addition, it may be applied to a multi-view broadcasting service system that transmits 2D broadcasting through a broadcasting network and data such as a depth map or another-viewpoint view through a network, or to a multi-angle service system that transmits 2D broadcasting through a broadcasting network and provides image data of another photographing angle through a network.
In addition, although 2D broadcasting is illustrated in the above examples as being transmitted only through a broadcasting network, this is only an example for utilizing an existing broadcasting system, and the invention is not necessarily limited thereto. That is, multilingual audio data, multilingual subtitle data, UHD broadcast data, depth map data, and other-viewpoint view data corresponding to the 2D content data may also be transmitted through the broadcast network.
In addition, in the above various examples, a hybrid system using both an RF broadcasting network and an IP communication network is illustrated. However, the type of communication network may be variously set.
The signal processing method of the transmitting apparatus or the signal processing method of the receiving apparatus according to the various embodiments described above may be coded in software and mounted on various devices.
Specifically, according to an embodiment of the present invention, a non-transitory readable medium may be provided that stores a program for performing: receiving a first signal and a second signal through an RF broadcasting network and an IP communication network, respectively; detecting synchronization type information from at least one of the first signal and the second signal; and a signal processing step of selecting the synchronization information corresponding to the synchronization type information from among the synchronization information included in the first signal and the second signal, and synchronizing the first signal and the second signal according to the selected synchronization information.
A non-transitory readable medium is not a medium that stores data for a short period of time, such as a register, cache, or memory, but a medium that stores data semi-permanently and is readable by the apparatus. In particular, the various applications or programs described above may be stored in non-transitory readable media such as a CD, DVD, hard disk, Blu-ray disc, USB, memory card, or ROM, and the like.
While the present invention has been particularly shown and described with reference to exemplary embodiments thereof, the invention is not limited to the disclosed exemplary embodiments. It will be understood by those skilled in the art that various changes in form and detail may be made therein without departing from the spirit and scope of the present invention.
210: first receiver 220: second receiver
230: signal processing unit 100-1, 100-2: transmitting
300-1, 300-2:
Claims (15)
A second receiver configured to receive a second signal through an Internet Protocol Communication Network;
Detecting synchronization type information from at least one of the first signal and the second signal, selecting the synchronization information corresponding to the synchronization type information from the synchronization information included in the first signal and the second signal, And a signal processor configured to synchronize the first signal and the second signal according to the selected synchronization information.
The signal processing unit,
And receiving the synchronization type information from a reserved area or descriptor area in at least one of the first signal and the second signal.
The signal processing unit,
And receiving the synchronization type information from at least one PSIP VCT (Program and System Information Protocol Virtual Channel Table), EIT reserved or descriptor region of the first signal and the second signal.
The signal processing unit,
And the synchronization type information is detected from a private stream or a metadata stream included in at least one of the first signal and the second signal.
The signal processing unit,
A first demux for demuxing the first signal to detect first video data;
A second demux for detecting second video data by demuxing the second signal when the second signal is in the form of a transport stream;
A file parser for parsing the second signal and detecting second video data when the second signal is a file format;
A first decoder for decoding the first video data demuxed in the first demux;
A second decoder for decoding the second video data detected by the second demux or file parser;
And a rendering unit which combines the first video data decoded by the first decoder and the second video data decoded by the second decoder to perform rendering.
At least one of the first demux, the second demux, the file parser, the first decoder, the second decoder, and the rendering unit may selectively operate according to a value of the synchronization type information to detect the synchronization information. Receiving device, characterized in that.
The signal processing unit,
A first demux for demuxing the first signal to detect first video data;
A second demux for detecting second video data by demuxing the second signal when the second signal is in the form of a transport stream;
A file parser for parsing the second signal and detecting second video data when the second signal is a file format;
A first decoder for decoding the first video data demuxed in the first demux;
A second decoder for decoding the second video data detected by the second demux or file parser;
A rendering unit which performs rendering by combining first video data decoded by the first decoder and second video data decoded by the second decoder; And
Detecting the synchronization type information from at least one of the first signal and the second signal, and detecting the synchronization information according to the synchronization type information to perform synchronization; And a controller for controlling at least one of a file parser, the first decoder, the second decoder, and the renderer.
The control unit,
If the synchronization type information is information specifying a type using a time code or frame number recorded at a PES level as the synchronization information, the first demux to detect video data having the same time code or frame number, respectively. And controlling the second demux or the first demux and the file parser,
If the synchronization type information is information specifying a type using a time code or frame number recorded at an ES level as the synchronization information, the time code or frame number matches after decoding the first and second signals, respectively. Control the first and second decoders to output video data;
If the synchronization type information is information specifying a type using a time code or frame number recorded at a video level as the synchronization information, the time code or frame number from the decoded first signal and the decoded second signal. And the rendering unit controls the rendering unit to detect and render matching video data, respectively.
The synchronization type information,
A first value for designating a video level watermark timecode as the synchronization information, a second value for designating a video level watermark frame number as the synchronization information, a third value for designating an ES level SMPTE timecode as the synchronization information, A fourth value for designating a PES level SMPTE timecode as the synchronization information, a fifth value for designating a PES level frame number as the synchronization information, a sixth value for designating a PES level counterpart PTS as the synchronization information, and a video level water And a seventh value for designating a mark counterpart PTS as the synchronization information.
The first signal includes one of left eye image data and right eye image data of 3D content, and the second signal includes another one of left eye image data and right eye image data of the 3D content.
One of the first signal and the second signal includes 2D content data,
The other one of the first signal and the second signal includes at least one of multilingual audio data, multilingual subtitle data, UHD broadcast data, depth map data, and other viewpoint view data corresponding to the 2D content data. Receiving device.
Receiving a first signal and a second signal through an RF broadcast network and an IP communication network, respectively;
Detecting synchronization type information from at least one of the first signal and the second signal;
A signal processing step of synchronizing the first signal and the second signal according to the selected synchronization information by selecting synchronization information corresponding to the synchronization type information among the synchronization information included in the first signal and the second signal; Signal processing method comprising a.
The synchronization type information,
And information recorded in a reserved area or a descriptor area of at least one of the first signal and the second signal.
The synchronization type information,
And at least one of the first signal and the second signal is information recorded in a PSIP VCT (Program and System Information Protocol Virtual Channel Table), an EIT reserved, or a descriptor area.
The synchronization type information,
And a private stream or a metadata stream included in at least one of the first signal and the second signal.
The synchronization type information,
A first value for designating a video level watermark timecode as the synchronization information, a second value for designating a video level watermark frame number as the synchronization information, a third value for designating an ES level SMPTE timecode as the synchronization information, A fourth value for designating a PES level SMPTE timecode as the synchronization information, a fifth value for designating a PES level frame number as the synchronization information, a sixth value for designating a PES level counterpart PTS as the synchronization information, and a video level water And one of a seventh value for designating a mark counterpart PTS as the synchronization information.
Priority Applications (2)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
US13/861,068 US20130276046A1 (en) | 2012-04-13 | 2013-04-11 | Receiving apparatus for receiving a plurality of signals through different paths and method for processing signals thereof |
PCT/KR2013/003137 WO2013154402A1 (en) | 2012-04-13 | 2013-04-15 | Receiving apparatus for receiving a plurality of signals through different paths and method for processing signals thereof |
Applications Claiming Priority (2)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
US201261623735P | 2012-04-13 | 2012-04-13 | |
US61/623,735 | 2012-04-13 |
Publications (1)
Publication Number | Publication Date |
---|---|
KR20130116154A true KR20130116154A (en) | 2013-10-23 |
Family
ID=49635420
Family Applications (1)
Application Number | Title | Priority Date | Filing Date |
---|---|---|---|
KR1020120116023A KR20130116154A (en) | 2012-04-13 | 2012-10-18 | Receiving device for a plurality of signals through different paths and method for processing the signals thereof |
Country Status (1)
Country | Link |
---|---|
KR (1) | KR20130116154A (en) |
-
2012
- 2012-10-18 KR KR1020120116023A patent/KR20130116154A/en not_active Application Discontinuation
Similar Documents
Publication | Publication Date | Title |
---|---|---|
US9628771B2 (en) | Transmitter and receiver for transmitting and receiving multimedia content, and reproduction method therefor | |
US20130276046A1 (en) | Receiving apparatus for receiving a plurality of signals through different paths and method for processing signals thereof | |
JP5977760B2 (en) | Receiving device for receiving a plurality of real-time transmission streams, its transmitting device, and multimedia content reproducing method | |
US20130271657A1 (en) | Receiving apparatus for providing hybrid service, and hybrid service providing method thereof | |
US9392256B2 (en) | Method and apparatus for generating 3-dimensional image datastream including additional information for reproducing 3-dimensional image, and method and apparatus for receiving the 3-dimensional image datastream | |
EP2728858B1 (en) | Receiving apparatus and receiving method thereof | |
CA2967245C (en) | Transmission device, transmission method, reception device, and reception method | |
US20110010739A1 (en) | Method and apparatus for transmitting/receiving stereoscopic video in digital broadcasting system | |
EP2744214A2 (en) | Transmitting device, receiving device, and transceiving method thereof | |
US9210354B2 (en) | Method and apparatus for reception and transmission | |
US9516086B2 (en) | Transmitting device, receiving device, and transceiving method thereof | |
US20100208750A1 (en) | Method and appartus for generating three (3)-dimensional image data stream, and method and apparatus for receiving three (3)-dimensional image data stream | |
US20130271568A1 (en) | Transmitting system and receiving apparatus for providing hybrid service, and service providing method thereof | |
KR20110123658A (en) | Method and system for transmitting/receiving 3-dimensional broadcasting service | |
US20140157342A1 (en) | Reception device, transmission device, reception method, and transmission method | |
EP2822282B1 (en) | Signal processing device and method for 3d service | |
US20140147088A1 (en) | Transmission device, receiving/playing device, transmission method, and receiving/playing method | |
KR102016674B1 (en) | Receiving device for providing hybryd service and method thereof | |
KR20130056829A (en) | Transmitter/receiver for 3dtv broadcasting, and method for controlling the same | |
KR101191498B1 (en) | System and Method for synchronization of 3D broadcasting service using real-time broadcasting and non-real time additional broadcasting data | |
KR20130116154A (en) | Receiving device for a plurality of signals through different paths and method for processing the signals thereof | |
KR20150006340A (en) | Method and apparatus for providing three-dimensional video | |
KR20130115975A (en) | Transmitting system and receiving device for providing hybrid service, and methods thereof | |
KR20140004045A (en) | Transmission apparatus and method, and reception apparatus and method for providing 3d service using the content and additional image seperately transmitted with the reference image transmitted in real time |
Legal Events
Date | Code | Title | Description |
---|---|---|---|
WITN | Withdrawal due to no request for examination |