KR101699367B1 - Method for 3dtv multiplexing and apparatus thereof - Google Patents

Method for 3DTV multiplexing and apparatus thereof

Info

Publication number
KR101699367B1
KR101699367B1 (application KR1020120051878A)
Authority
KR
South Korea
Prior art keywords
value
pes
video
image
3dtv
Prior art date
Application number
KR1020120051878A
Other languages
Korean (ko)
Other versions
KR20130128101A
Inventor
조숙희
김종호
정세윤
추현곤
최진수
김진웅
Original Assignee
한국전자통신연구원 (Electronics and Telecommunications Research Institute)
Priority date
Filing date
Publication date
Application filed by 한국전자통신연구원
Priority to KR1020120051878A
Priority to US13/717,492
Publication of KR20130128101A
Application granted
Publication of KR101699367B1


Classifications

    • H ELECTRICITY
    • H04 ELECTRIC COMMUNICATION TECHNIQUE
    • H04N PICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N13/00 Stereoscopic video systems; Multi-view video systems; Details thereof
    • H04N13/10 Processing, recording or transmission of stereoscopic or multi-view image signals
    • H04N13/106 Processing image signals
    • H04N13/167 Synchronising or controlling image signals
    • H04N21/00 Selective content distribution, e.g. interactive television or video on demand [VOD]
    • H04N21/20 Servers specifically adapted for the distribution of content, e.g. VOD servers; Operations thereof
    • H04N21/23 Processing of content or additional data; Elementary server operations; Server middleware
    • H04N21/236 Assembling of a multiplex stream, e.g. transport stream, by combining a video stream with other content or additional data, e.g. inserting a URL [Uniform Resource Locator] into a video stream, multiplexing software data into a video stream; Remultiplexing of multiplex streams; Insertion of stuffing bits into the multiplex stream, e.g. to obtain a constant bit-rate; Assembling of a packetised elementary stream
    • H04N21/23608 Remultiplexing multiplex streams, e.g. involving modifying time stamps or remapping the packet identifiers
    • H04N21/2365 Multiplexing of several video streams
    • H04N21/2368 Multiplexing of audio and video streams
    • H04N21/242 Synchronization processes, e.g. processing of PCR [Program Clock References]
    • H04N21/80 Generation or processing of content or additional data by content creator independently of the distribution process; Content per se
    • H04N21/81 Monomedia components thereof
    • H04N21/816 Monomedia components thereof involving special video data, e.g. 3D video

Abstract

A 3DTV multiplexing method according to the present invention includes deriving a frame unit delay value for a left image and a right image based on a left video PES corresponding to the left image and a right video PES corresponding to the right image, performing synchronization on the left video PES and the right video PES based on the frame unit delay value, and multiplexing the synchronized left video PES and the synchronized right video PES to generate a 3DTV TS. According to the present invention, video service efficiency can be improved.

Description

METHOD FOR 3DTV MULTIPLEXING AND APPARATUS THEREOF

The present invention relates to image processing, and more particularly, to a 3DTV multiplexing method and apparatus.

Along with UDTV service, digital broadcasting using 3D video is attracting attention as a next-generation broadcasting service following HDTV. With the development of related technologies, such as the launch of high-quality commercial stereoscopic displays, 3DTV services that allow each household to enjoy 3D video are expected to become available within the next few years. In particular, the 3D broadcasting services currently provided commercially or on a trial basis mainly use stereoscopic video composed of left and right images.

In processing a 3D image, a plurality of mutually correlated images may be stored or processed together. Beyond 3D images, a plurality of correlated images may also be stored or processed together for free-viewpoint images, panorama images, multi-view images, and multi-divided images. Here, for example, a multi-divided image may be an ultra-high-resolution image, with a resolution 4 to 16 times that of an HD image, divided into a plurality of HD images. When a plurality of correlated images are stored or processed together in this manner, the images must be synchronized with each other on a frame basis.

SUMMARY OF THE INVENTION

An object of the present invention is to provide a method and apparatus for generating a 3DTV TS capable of improving video service efficiency.

Another object of the present invention is to provide a 3DTV multiplexing method and apparatus that can improve video service efficiency.

Still another object of the present invention is to provide an image synchronization method and apparatus for improving image service efficiency.

1. One embodiment of the present invention is a 3DTV multiplexing method. The method includes deriving a frame unit delay value for a left image and a right image based on a left video PES (Packetized Elementary Stream) corresponding to the left image and a right video PES corresponding to the right image; performing synchronization on the left video PES and the right video PES based on the frame unit delay value; and multiplexing the synchronized left video PES and the synchronized right video PES to generate a 3DTV TS (Transport Stream), wherein the frame unit delay value is a value indicating the time difference between the left image and the right image in units of frames.

2. The method of claim 1, wherein deriving the frame unit delay value comprises: extracting a first synchronization information value from a first video ES (Elementary Stream) in the left video PES; extracting a second synchronization information value from a second video ES in the right video PES; and deriving the frame unit delay value based on the first synchronization information and the second synchronization information.

3. The method of claim 2, wherein the first synchronization information value is a value included in the first video ES and counted in units of frames, and the second synchronization information value is a value included in the second video ES and counted in units of frames, and wherein, in the step of deriving the frame unit delay value, the difference between the first synchronization information and the second synchronization information may be determined as the frame unit delay value.

4. The method of claim 2, wherein the first synchronization information value is a value included in the first video ES in the form of a time code, and the second synchronization information value is a value included in the second video ES in the form of a time code, and wherein deriving the frame unit delay value comprises deriving a time difference value in seconds between the first synchronization information and the second synchronization information, and deriving the frame unit delay value by multiplying the time difference value by the number of frames per second.
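The two derivations above (frame counters in claim 3, time codes in claim 4) can be sketched roughly as follows. The helper names and the SMPTE-style `HH:MM:SS:FF` time-code layout are illustrative assumptions, not details taken from the present invention:

```python
def delay_from_counters(left_counter: int, right_counter: int) -> int:
    """Claim 3: both ESs carry a per-frame counter; the frame unit
    delay is simply the difference between the two counters."""
    return left_counter - right_counter

def delay_from_timecodes(left_tc: str, right_tc: str, fps: int) -> int:
    """Claim 4: both ESs carry a time code; convert the difference
    in seconds to frames by multiplying by the frame rate."""
    def to_seconds(tc: str) -> float:
        hh, mm, ss, ff = (int(x) for x in tc.split(":"))
        return hh * 3600 + mm * 60 + ss + ff / fps
    diff_seconds = to_seconds(left_tc) - to_seconds(right_tc)
    return round(diff_seconds * fps)

print(delay_from_counters(1502, 1499))                          # 3
print(delay_from_timecodes("00:00:10:05", "00:00:10:02", 30))   # 3
```

Either way, the result is a signed frame count: its magnitude is how far apart the two streams are, and its sign indicates which stream leads.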

5. The method of claim 1, wherein the synchronized left video PES further comprises a first PTS (Presentation Time Stamp) and a first DTS (Decoding Time Stamp), the synchronized right video PES further comprises a second PTS and a second DTS, and the method further comprises modifying the values of the first PTS, the first DTS, the second PTS, and the second DTS to new values based on a third PTS input from a clock.

6. The method of claim 5, wherein, in the modifying step, the value of the first PTS and the value of the second PTS are each modified to the value of the third PTS, the value of the first DTS is modified to the value obtained by adding the third PTS value to the first DTS value minus the first PTS value, and the value of the second DTS is modified to the value obtained by adding the third PTS value to the second DTS value minus the second PTS value.
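The timestamp modification of claim 6 can be written out directly: both presentation times collapse onto the new clock value, while each stream keeps its original decode-to-presentation offset. The function name and the sample tick values are illustrative assumptions:

```python
def resync_timestamps(first_pts, first_dts, second_pts, second_dts, third_pts):
    """Claim 6: re-stamp both PESs against a common clock value (all
    values in 90 kHz MPEG-2 system-clock ticks)."""
    new_first_pts = third_pts
    new_second_pts = third_pts
    # Each DTS is shifted by the same amount as its PTS, preserving
    # the per-stream DTS - PTS offset.
    new_first_dts = third_pts + (first_dts - first_pts)
    new_second_dts = third_pts + (second_dts - second_pts)
    return new_first_pts, new_first_dts, new_second_pts, new_second_dts

# Left stream decoded 3003 ticks before presentation, right stream 6006.
print(resync_timestamps(90000, 86997, 95000, 88994, 180000))
# (180000, 176997, 180000, 173994)
```

After this step, both PESs present at the same instant (the third PTS), which is what makes frame-aligned multiplexing into one 3DTV TS possible.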

7. The method of claim 1, further comprising generating a 3DTV PSI, which is 3D program configuration information, based on a left video PSI (Program Specific Information) corresponding to the left image and a right video PSI corresponding to the right image, wherein, in the 3DTV TS generation step, the synchronized left video PES, the synchronized right video PES, and the 3DTV PSI may be multiplexed.

8. The method of claim 7, wherein the left video PSI includes a first PAT (Program Association Table) and a first PMT (Program Map Table), and the right video PSI includes a second PAT and a second PMT, and wherein the 3DTV PSI generating step generates a third PAT having information corresponding to both the first PMT and the second PMT by reconstructing the first PAT and the second PAT, and, in whichever of the first and second PMTs corresponds to the additional stream, changes the stream type value and inserts a program information descriptor, in which information indicating the program type provided in digital broadcasting is defined, and a video information descriptor, in which information indicating the characteristics of the ESs constituting the video data is defined.
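A minimal sketch of the PSI reconstruction described above (claim 8): the two single-program PATs are merged into a third PAT referencing both PMTs, and the PMT of the additional stream is retagged. The dict layout, PID values, and descriptor contents are illustrative assumptions only:

```python
def build_3dtv_pat(left_pat: dict, right_pat: dict) -> dict:
    """Merge two single-program PATs into a third PAT that references
    both PMTs (program_number -> PMT PID)."""
    merged = dict(left_pat)
    merged.update(right_pat)
    return merged

def mark_additional_stream(pmt: dict, stream_type: int, descriptors: list) -> dict:
    """In the PMT of the additional (e.g. right-image) stream, change the
    stream type value and attach program/video information descriptors."""
    updated = dict(pmt)
    updated["stream_type"] = stream_type
    updated["descriptors"] = list(pmt.get("descriptors", [])) + descriptors
    return updated

left_pat = {1: 0x0100}    # program 1 -> PMT PID 0x0100
right_pat = {2: 0x0200}   # program 2 -> PMT PID 0x0200
print(build_3dtv_pat(left_pat, right_pat))  # {1: 256, 2: 512}
```

A receiver parsing the third PAT then sees both programs and can locate each PMT, while the descriptors on the additional stream tell it how the two video ESs combine into one 3D program.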

9. The method of claim 1, further comprising extracting the left video PES from a left video TS (Transport Stream) corresponding to the left image and extracting the right video PES from a right video TS corresponding to the right image.

10. Another embodiment of the present invention is a 3DTV multiplexing apparatus. The apparatus includes a delay value calculation module for deriving a frame unit delay value for a left image and a right image based on a left video PES (Packetized Elementary Stream) corresponding to the left image and a right video PES corresponding to the right image; a synchronization module for performing synchronization on the left video PES and the right video PES based on the frame unit delay value; and a 3DTV TS packetizer for performing multiplexing on the synchronized left video PES and the synchronized right video PES to generate a 3DTV TS (Transport Stream), wherein the frame unit delay value is a value representing the time difference between the left image and the right image in units of frames.

11. The apparatus of claim 10, wherein the delay value calculation module comprises: a first synchronization information extractor for extracting a first synchronization information value from a first video ES (Elementary Stream) in the left video PES; a second synchronization information extractor for extracting a second synchronization information value from a second video ES in the right video PES; and a delay value calculator for deriving the frame unit delay value based on the first synchronization information and the second synchronization information.

12. The apparatus of claim 10, wherein the synchronized left video PES further comprises a first PTS (Presentation Time Stamp) and a first DTS (Decoding Time Stamp), the synchronized right video PES further comprises a second PTS and a second DTS, and the synchronization module further comprises a PTS/DTS modification module configured to modify the values of the first PTS, the first DTS, the second PTS, and the second DTS to new values based on a third PTS input from a clock.

13. The apparatus of claim 10, further comprising a 3DTV PSI generation module for generating 3DTV PSI, which is 3D program configuration information, based on a left video PSI (Program Specific Information) corresponding to the left image and a right video PSI corresponding to the right image, wherein the 3DTV TS packetizer may perform multiplexing on the synchronized left video PES, the synchronized right video PES, and the 3DTV PSI.

14. The apparatus of claim 10, further comprising: a first de-packetizer for extracting the left video PES from a left video TS (Transport Stream) corresponding to the left image; and a second de-packetizer for extracting the right video PES from a right video TS corresponding to the right image.

15. Another embodiment of the present invention is an image synchronization method. The method includes deriving a frame unit delay value for a left image and a right image based on a left video PES (Packetized Elementary Stream) corresponding to the left image and a right video PES corresponding to the right image, and performing synchronization on the left video PES and the right video PES based on the frame unit delay value, wherein the frame unit delay value is a value representing the time difference between the left image and the right image in units of frames.

According to the 3DTV TS generation method of the present invention, video service efficiency can be improved.

According to the 3DTV multiplexing method of the present invention, video service efficiency can be improved.

According to the image synchronization method of the present invention, image service efficiency can be improved.

FIG. 1 is a diagram schematically showing an embodiment of a 3DTV TS generation process.
FIG. 2 is a diagram schematically showing another embodiment of the 3DTV TS generation process.
FIG. 3 is a block diagram schematically showing an embodiment of a 3DTV TS generating apparatus according to the present invention.
FIG. 4 is a block diagram schematically illustrating an embodiment of a DTV encoder configuration.
FIG. 5 is a block diagram schematically illustrating an embodiment of a configuration of a 3DTV multiplexer based on an automatic synchronization scheme according to the present invention.
FIG. 6 is a block diagram schematically illustrating an embodiment of a synchronization module included in the automatic-synchronization-based 3DTV multiplexer of FIG. 5.
FIG. 7 is a block diagram schematically illustrating an embodiment of the delay value calculation module included in the synchronization module of FIG. 6.
FIG. 8 is a flowchart schematically illustrating an embodiment of a 3DTV multiplexing method based on an automatic synchronization scheme according to the present invention.

Hereinafter, embodiments of the present invention will be described in detail with reference to the accompanying drawings. In the following description, detailed descriptions of known functions and configurations incorporated herein will be omitted when they may obscure the subject matter of the present disclosure.

It is to be understood that when an element is referred to as being "connected" or "coupled" to another element, it may be directly connected or coupled to the other element, or intervening elements may be present. In addition, the description of "including" a specific configuration in the present invention does not exclude other configurations, and means that additional configurations may be included in the practice of the present invention or within its technical scope.

The terms first, second, etc. may be used to describe various components, but the components should not be limited by the terms. The terms are used only for the purpose of distinguishing one component from another. For example, without departing from the scope of the present invention, the first component may be referred to as a second component, and similarly, the second component may also be referred to as a first component.

In addition, the components shown in the embodiments of the present invention are shown independently to represent different characteristic functions; this does not mean that each component is composed of a separate hardware or software unit. That is, the components are listed separately for convenience of explanation, and at least two of them may be combined into one component, or one component may be divided into a plurality of components to perform its function. Integrated and separated embodiments of the components are also included within the scope of the present invention, as long as they do not depart from its essence.

In addition, some components may not be essential for performing the essential functions of the present invention, but may be optional components used merely to improve performance. The present invention can be implemented with only the components essential for realizing its essence, excluding those used for performance improvement, and a structure including only the essential components, excluding the optional ones, is also included in the scope of the present invention.

FIG. 1 is a diagram schematically showing an embodiment of a 3DTV TS generation process. Here, the TS may mean a transport stream. FIG. 1 shows a 3DTV TS generation process for a stereoscopic 3DTV service.

When a plurality of images are stored or processed together as in a 3D image, a free view image, a multi-view image, a panoramic image, or the like, a plurality of image signals must be synchronized with each other on a frame basis. Since the stereoscopic video may be composed of a left video signal and a right video signal, the left video signal and the right video signal in the embodiment of FIG. 1 must be synchronized with each other on a frame basis.

In a 3D image, free-view image, multi-view image, panorama image, etc., a plurality of images can be encoded by different encoders. In this case, a plurality of transport streams can be output by a plurality of encoders. For example, in stereoscopic video, the left and right images can be individually encoded by a Moving Picture Experts Group (MPEG) -2 encoder and an AVC (Advanced Video Coding) encoder, respectively. At this time, as described above, a plurality of transport streams must be synchronized with each other on a frame basis. However, if there is no way to automatically synchronize a plurality of transport streams, a 3DTV dedicated encoder that encodes a plurality of images (e.g., left and right images in stereoscopic video) together can be used.

In the embodiment of FIG. 1, a 3DTV dedicated encoder may be used to generate a 3DTV TS. Referring to FIG. 1, the signal generator 110 may generate a left video signal and a right video signal. The left video output apparatus 120 may receive the left video signal and output a High Definition Serial Digital Interface (HD-SDI) signal corresponding to the left video signal (hereinafter referred to as the left video HD-SDI), and the right video output apparatus 130 may receive the right video signal and output an HD-SDI signal corresponding to the right video signal (hereinafter referred to as the right video HD-SDI). Here, HD-SDI is a standard specification for video transmission between HD broadcasting equipment. The dual stream encoder (3DTV dedicated encoder) 140 may generate a DVB-ASI (Digital Video Broadcasting Asynchronous Serial Interface) signal based on the left video HD-SDI and the right video HD-SDI. Here, DVB-ASI is a standard for serial transmission of compressed digital video/audio streams between devices. In FIG. 1, the DVB-ASI signal generated by the dual stream encoder 140 may correspond to the multiplexed 3DTV transport stream.

FIG. 2 is a diagram schematically showing another embodiment of the 3DTV TS generation process. FIG. 2 illustrates a 3DTV TS generation process for a stereoscopic 3DTV service.

As described above, when a plurality of images are stored or processed together as in a 3D image, a free view image, a multi-view image, a panoramic image, or the like, a plurality of image signals must be synchronized with each other on a frame basis. Since the stereoscopic video may be composed of a left video signal and a right video signal, the left video signal and the right video signal in the embodiment of FIG. 2 must be synchronized with each other on a frame basis.

In a 3D image, free-viewpoint image, multi-view image, panorama image, etc., a plurality of images can be encoded by different encoders, in which case a plurality of transport streams are output by the plurality of encoders. For example, as in the embodiment of FIG. 2, the left and right images in stereoscopic video can be separately encoded by an MPEG-2 encoder and an AVC encoder, respectively. As described above, the plurality of transport streams must then be synchronized with each other on a frame basis. If there is no way to automatically synchronize the plurality of transport streams, a method of manually synchronizing the plurality of video signals on a frame-by-frame basis may be used.

Referring to FIG. 2, the signal generator 210 may generate a left video signal and a right video signal. The left video output device 220 may receive the left video signal and output the left video HD-SDI, and the right video output device 230 may receive the right video signal and output the right video HD-SDI. The MPEG-2 encoder 240 may generate a DVB-ASI signal corresponding to the left image (hereinafter referred to as the left video DVB-ASI) based on the left video HD-SDI, and the AVC encoder may generate a DVB-ASI signal corresponding to the right image (hereinafter referred to as the right video DVB-ASI) based on the right video HD-SDI.

Referring again to FIG. 2, the 3DTV multiplexer/demultiplexer 260 may generate and output a DVB-ASI signal for monitoring by performing multiplexing based on the left video DVB-ASI and the right video DVB-ASI. The output DVB-ASI signal for monitoring can be input to the monitoring 3DTV terminal 270. At this point, the frame-by-frame time difference between the left image and the right image can be observed through the monitoring 3DTV terminal 270, since the time difference can be recognized directly by the human eye. The observed frame-by-frame time difference may then be input manually by a person to the 3DTV multiplexer/demultiplexer 260. The 3DTV multiplexer/demultiplexer 260 can then generate the final DVB-ASI signal by re-multiplexing the left video DVB-ASI and the right video DVB-ASI based on the input time difference information. The final DVB-ASI signal may correspond to the multiplexed 3DTV transport stream.

In the above-described embodiment, a 3DTV multiplexer/remultiplexer based on a manual synchronization scheme can be used. In other words, according to the above-described embodiment, a person visually confirms the frame unit time difference between the left image and the right image, and the left and right image frames can be manually synchronized based on that time difference.

Meanwhile, when a 3DTV dedicated encoder is used as in the embodiment of FIG. 1, new and expensive equipment is required to generate a 3DTV TS, and the 3DTV TS generation method of FIG. 1 also has the disadvantage that existing encoders cannot be used. In the embodiment of FIG. 2, the multiplexed output stream is reproduced on a monitoring 3DTV terminal, and a person manually performs synchronization while watching the reproduced image. The 3DTV TS generation method of FIG. 2 therefore necessarily requires a monitoring terminal and is accompanied by manual human operation. To solve these problems, a 3DTV multiplexing method based on an automatic synchronization scheme can be provided.

FIG. 3 is a block diagram schematically showing an embodiment of a 3DTV TS generating apparatus according to the present invention.

FIG. 3 shows a 3DTV TS generating apparatus for a stereoscopic 3DTV service. The left and right images constituting a stereoscopic image may be images of the same scene from different viewpoints. However, in the embodiments described below, the plurality of images processed together for 3DTV TS generation are handled as separate contents and/or programs even though they are images of the same scene.

The 3DTV TS generating apparatus shown at 310 of FIG. 3 may include a first DTV encoder 313, a second DTV encoder 316, and an automatic-synchronization-based 3DTV multiplexer 319. The 3DTV TS generating apparatus shown at 320 of FIG. 3 may include a multi-DTV encoder 323 and an automatic-synchronization-based 3DTV multiplexer 326.

The automatic-synchronization-based 3DTV multiplexer according to the present invention can receive two types of inputs, as shown at 310 and 320 of FIG. 3. At 310, the left and right video signals constituting the stereoscopic video can be separately encoded by the two independent encoders 313 and 316, respectively; in this case, two MPEG-2 TSs are input to the automatic-synchronization-based 3DTV multiplexer 319. At 320, the left and right video signals constituting the stereoscopic video can be encoded together in the multi-DTV encoder 323, which can encode a plurality of contents in one encoder. In this case, the signal input from the multi-DTV encoder 323 to the automatic-synchronization-based 3DTV multiplexer 326 may be a single MPEG-2 TS, and that one MPEG-2 TS may include two contents and/or programs.

Referring to FIG. 3, the first DTV encoder 313 may encode the video signal and/or audio signal included in the left video HD-SDI and output an MPEG-2 TS corresponding to the left image (hereinafter referred to as the left video MPEG-2 TS). The second DTV encoder 316 may encode the video signal and/or audio signal included in the right video HD-SDI and output an MPEG-2 TS corresponding to the right image (hereinafter referred to as the right video MPEG-2 TS). Here, each of the left video HD-SDI and the right video HD-SDI may include both a video signal and an audio signal, or may include no audio signal. The operation of each DTV encoder will be described later.

In FIG. 3, the automatic synchronization-based 3DTV multiplexer 319 can generate an MPEG-2 3DTV TS by performing multiplexing based on a left video MPEG-2 TS and a right video MPEG-2 TS. At this time, the generated MPEG-2 3DTV TS may correspond to the multiplexed 3DTV transport stream. The concrete operation of the 3DTV multiplexer 319 based on the automatic synchronization method will be described later.

Referring to FIG. 3, the multi-DTV encoder 323 may encode the left image HD-SDI and the right image HD-SDI to output one MPEG-2 TS. In this case, each of the left-side HD-SDI and the right-side HD-SDI may include both a video signal and an audio signal, and may not include an audio signal. In addition, the one MPEG-2 TS may include two contents and / or programs.

In FIG. 3, the automatic-synchronization-based 3DTV multiplexer 326 may perform multiplexing based on the single MPEG-2 TS generated by the multi-DTV encoder 323 to generate an MPEG-2 3DTV TS. The generated MPEG-2 3DTV TS may correspond to the multiplexed 3DTV transport stream. The concrete operation of the automatic-synchronization-based 3DTV multiplexer 326 will be described later.

At 310 of FIG. 3, two independent encoders 313 and 316 are used. Therefore, the PCR (Program Clock Reference) of the left video MPEG-2 TS generated by the first DTV encoder 313 and the PCR of the right video MPEG-2 TS generated by the second DTV encoder 316 can differ from each other. At 320 of FIG. 3, a multi-DTV encoder 323 is used, and the single MPEG-2 TS it generates may include a plurality of programs; in this case, the PCRs of the plurality of programs may be the same. Here, the PCR is a time reference value included in the transport stream and transmitted to the receiver so that the receiver can align its time reference with the transmitter. However, in both of the embodiments 310 and 320 of FIG. 3, the time information of the plurality of programs may not be synchronized with each other. In other words, in the encoded streams output by the encoder(s), the encoded stream corresponding to the left image (hereinafter referred to as the left video stream) and the encoded stream corresponding to the right image (hereinafter referred to as the right video stream) may not be synchronized with each other.

Accordingly, in order to provide a stereoscopic 3DTV service composed of two images, a left image and a right image, the left and right video streams output through the encoder(s) need to be automatically synchronized frame by frame. An MPEG-2 3DTV TS created based on the automatic synchronization scheme can enable a stereoscopic 3DTV service. The automatic synchronization scheme for a plurality of encoded streams (for example, the operation of the automatic-synchronization-based 3DTV multiplexer) will be described later.
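The automatic frame-by-frame alignment can be sketched roughly as follows. The list-of-PES representation and the drop-leading-frames policy are illustrative assumptions, not details specified by the present invention:

```python
def synchronize(left_pes: list, right_pes: list, frame_delay: int):
    """Once the frame unit delay is known, shift the leading stream so
    that both PES lists start on the same frame.

    frame_delay > 0 means the left stream leads by that many frames;
    frame_delay < 0 means the right stream leads."""
    if frame_delay > 0:
        left_pes = left_pes[frame_delay:]
    elif frame_delay < 0:
        right_pes = right_pes[-frame_delay:]
    # Pair up frame-aligned PES packets for multiplexing into one 3DTV TS.
    return list(zip(left_pes, right_pes))

pairs = synchronize(["L0", "L1", "L2", "L3"], ["R0", "R1", "R2"], 1)
print(pairs)  # [('L1', 'R0'), ('L2', 'R1'), ('L3', 'R2')]
```

The key point is that no human monitoring step is needed: the delay value derived from the embedded synchronization information drives the alignment directly.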

In the above-described embodiments, the output signal of the DTV encoder is described as an MPEG-2 TS, but the present invention is not limited thereto, and each output signal may correspond to another type of transport stream.

FIG. 4 is a block diagram schematically illustrating an embodiment of a DTV encoder configuration. The DTV encoder according to the embodiment of FIG. 4 includes an audio encoder 410, a video encoder 420, an audio packetizer 430, a video packetizer 440, a clock 450, a PSI generator 460, and a TS packetizer 470.

Referring to FIG. 4, the audio encoder 410 may encode audio data included in the HD-SDI to generate an audio ES (Elementary Stream). In addition, the video encoder 420 may encode the video data included in the HD-SDI to generate a video ES (Elementary Stream).

The audio packetizer 430 may generate an audio PES based on the audio ES and a clock signal including the PTS/DTS. In addition, the video packetizer 440 may generate a video PES based on the video ES and a clock signal including the PTS/DTS. The PTS and the DTS may be obtained from the clock 450. Here, the DTS (Decoding Time Stamp) may correspond to a value indicating the time at which the ES is to be decoded, and the PTS (Presentation Time Stamp) may correspond to a value indicating the time at which the decoded access unit should be reproduced. In addition, a PES (Packetized Elementary Stream) may mean a stream composed of packets generated by packetizing the bitstream of compressed video/audio data.

The PSI generator 460 can generate PSI (Program Specific Information) corresponding to the program configuration information. In MPEG-2, PSI may mean metadata, in table format, including the information necessary for demultiplexing a TS (Transport Stream) and reproducing the contained programs. In one embodiment, the PSI may include PAT (Program Association Table) information, PMT (Program Map Table) information, and the like. Here, the PAT may include a list of all programs currently available in the TS, a program number indicating which program is currently being transmitted, and a PID (Packet Identifier) corresponding to each program. Further, the PMT may include the program elements constituting one program and/or information on the video streams constituting the video data in the program.

The TS packetizer 470 may output an MPEG-2 TS signal by multiplexing the audio PES, the video PES, the PCR information generated by the clock 450, and the PAT information and PMT information generated by the PSI generator 460.

FIG. 5 is a block diagram schematically illustrating an embodiment of a 3DTV multiplexer configuration based on an automatic synchronization scheme according to the present invention.

Referring to FIG. 5, the 3DTV multiplexer based on the automatic synchronization scheme includes a first de-packetizer 510, a second de-packetizer 520, a synchronization module 530, a PTS/DTS modification module 540, a clock 550, a 3DTV PSI generation module 560, and a 3DTV TS packetizer 570.

Referring to FIG. 5, the first de-packetizer 510 receives a left video MPEG-2 TS and, based on it, can generate an audio PES corresponding to the left video, a video PES corresponding to the left video, and PSI corresponding to the left video (hereinafter referred to as left image PSI). The audio PES and video PES generated by the first de-packetizer 510 may be input to the synchronization module, and the left image PSI may be input to the 3DTV PSI generation module 560. Likewise, the second de-packetizer 520 receives a right video MPEG-2 TS and, based on it, can generate an audio PES corresponding to the right video, a video PES corresponding to the right video, and PSI corresponding to the right video (hereinafter referred to as right image PSI). The audio PES and video PES generated by the second de-packetizer 520 may be input to the synchronization module, and the right image PSI may be input to the 3DTV PSI generation module 560. That is, each MPEG-2 TS input to the automatic-synchronization-based 3DTV multiplexer can be separated into PES signals that are input to the synchronization module, while the PSI generated from each MPEG-2 TS is input to the 3DTV PSI generation module.

In FIG. 5, the signals input to the first de-packetizer 510 and the second de-packetizer 520 are both described as MPEG-2 TSs. However, the present invention is not limited thereto, and each input signal may correspond to another type of transport stream.

The synchronization module 530 may synchronize a plurality of input PESs and output a plurality of synchronized PESs. The synchronization module 530 may extract synchronization information from the ES (Elementary Stream) included in each of the input PESs and, based on the extracted synchronization information, perform frame-by-frame synchronization between the left video signal and the right video signal. The synchronized PESs output from the synchronization module 530 may include a left image audio PES, a left image video PES, a right image audio PES, and a right image video PES. Details of the operation and/or configuration of the synchronization module 530 and the synchronization information will be described later.

The PTS/DTS modification module 540 may modify the PTS value of each of the plurality of synchronized PESs to a new PTS value input from the clock 550. Also, the PTS/DTS modification module 540 can extract the existing PTS and DTS included in each of the plurality of synchronized input PESs. At this time, the PTS/DTS modification module 540 may modify the DTS value of each of the plurality of synchronized PESs to a new DTS value based on the extracted existing PTS value, the extracted existing DTS value, and the new PTS value. For example, the PTS/DTS modification module 540 may calculate, for each PES, the time difference between the extracted existing DTS value and the extracted existing PTS value, and obtain a new DTS value by adding the calculated time difference to the new PTS value. Here, a new DTS value may be obtained for each PES input to the PTS/DTS modification module 540, and the PTS/DTS modification module 540 can then replace the existing DTS value of each PES with the obtained new DTS value.

The following Equation 1 shows an embodiment of a process for obtaining a new DTS value for one video PES.

[Equation 1]

Diff_DTS_PTS_PES_video1 = current_DTS_PES_video1 - current_PTS_PES_video1

New_DTS_PES_video1 = New_PTS + Diff_DTS_PTS_PES_video1

Here, New_PTS may represent the new PTS value input from the clock 550. Also, current_DTS_PES_video1 and current_PTS_PES_video1 may represent the existing DTS value and the existing PTS value included in the one video PES, respectively. New_DTS_PES_video1 may indicate the new DTS value obtained by the PTS/DTS modification module 540. Since the PTS/DTS modification module 540 calculates a new DTS value for each PES, the number of times a new DTS value is calculated in the PTS/DTS modification module 540 is equal to the number of PESs input to the PTS/DTS modification module 540.
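As a concrete illustration, the re-stamping in Equation 1 can be sketched in Python as follows (the function name and the example tick values are illustrative assumptions, not part of the specification):

```python
def recompute_dts(new_pts, current_pts, current_dts):
    """Re-stamp a PES's DTS after its PTS is replaced by a new clock value.

    Per Equation 1, the original DTS-PTS offset is preserved under the
    new time base, so decode/display ordering is unchanged.
    """
    # Diff_DTS_PTS_PES_video1 = current_DTS_PES_video1 - current_PTS_PES_video1
    diff_dts_pts = current_dts - current_pts
    # New_DTS_PES_video1 = New_PTS + Diff_DTS_PTS_PES_video1
    return new_pts + diff_dts_pts

# Example: the original DTS precedes the PTS by 300 ticks; after the PTS
# is replaced with 5000, the DTS keeps the same 300-tick lead.
print(recompute_dts(new_pts=5000, current_pts=1000, current_dts=700))  # 4700
```

This calculation would be repeated once per PES input to the PTS/DTS modification module, as described above.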

The 3DTV PSI generation module 560 can generate 3DTV PSI corresponding to the program configuration information based on the left image PSI and the right image PSI. Here, the 3DTV PSI may include PAT information, PMT information, and the like.

As described above, the PAT may include a list of all programs currently available in the TS, a program number indicating which program is currently being transmitted, and a PID corresponding to each program. Meanwhile, the TS output from the DTV encoder according to the embodiment of FIG. 4 may include one program, and two TSs (a left TS and a right TS) are input to the 3DTV multiplexer of FIG. 5. Accordingly, the 3DTV TS output from the 3DTV multiplexer according to the embodiment of the present invention may include two programs (a program corresponding to the left image and a program corresponding to the right image). Therefore, the 3DTV PSI generation module 560 may reconstruct the PAT so that one PAT corresponds to and/or contains two pieces of PMT information. Here, the two pieces of PMT information may correspond to the PMT information of the left image PSI and the PMT information of the right image PSI, respectively.
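For illustration only, the PAT reconstruction described above can be modeled as merging two single-program PATs. Modeling a PAT as a mapping from program number to PMT PID is a simplifying assumption, and the program numbers and PIDs below are hypothetical (collisions are assumed to be resolved elsewhere):

```python
def rebuild_pat(left_pat, right_pat):
    """Merge two single-program PATs into one PAT that references both
    PMTs, one program per view (left and right)."""
    merged = dict(left_pat)   # program_number -> PMT PID
    merged.update(right_pat)
    return merged

# Hypothetical program numbers and PMT PIDs for the two views.
pat_3dtv = rebuild_pat({1: 0x100}, {2: 0x200})
print(pat_3dtv)  # {1: 256, 2: 512}
```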

Meanwhile, as described above, the PMT may include the program elements constituting one program and/or information on the video streams constituting the video data in the program. The PMT may include a program information descriptor (e.g., stereoscopic_program_info_descriptor) in which information indicating the program type provided in digital broadcasting is defined, a video information descriptor (e.g., stereoscopic_video_info_descriptor) in which information indicating the characteristics of an ES constituting video data is defined, and/or a stream type (e.g., stream_type).

The 3DTV PSI generation module 560 may change the stream type value in one of the two pieces of PMT information (the PMT information corresponding to the left image PSI and the PMT information corresponding to the right image PSI). Here, the stream type may be represented by stream_type, for example.

In one embodiment, it is assumed that TSs output from an MPEG-2 encoder and an AVC encoder are input to the 3DTV multiplexer of FIG. 5. Here, the TS output from the MPEG-2 encoder is referred to as an MPEG-2 TS, and the TS output from the AVC encoder is referred to as an AVC TS. At this time, in order to maintain compatibility between the existing DTV and the 3DTV, the 3DTV PSI generation module 560 may leave the PMT included in the MPEG-2 TS unmodified. In addition, the 3DTV PSI generation module 560 may change the stream type (e.g., stream_type) values for the video and audio streams in the PMT included in the AVC TS, so that a 3DTV receiver can recognize that the additional encoded stream for the 3DTV service was encoded by an AVC encoder.

In addition, the 3DTV PSI generation module 560 may insert and/or include, in the PMT of the additional encoded stream, a program information descriptor and a video information descriptor defined in the MPEG systems standard for 3DTV signaling. Here, the program information descriptor may be represented by, for example, stereoscopic_program_info_descriptor, and the video information descriptor may be represented by, for example, stereoscopic_video_info_descriptor. The following Tables 1 and 2 may represent embodiments of the program information descriptor (stereoscopic_program_info_descriptor) syntax and the video information descriptor (stereoscopic_video_info_descriptor) syntax, respectively, inserted or included in the PMT of the additional encoded stream.

[Table 1]

(Table 1: stereoscopic_program_info_descriptor syntax, provided as an image in the original publication)

[Table 2]

(Table 2: stereoscopic_video_info_descriptor syntax, provided as an image in the original publication)

Referring again to FIG. 5, the 3DTV TS packetizer 570 may multiplex the plurality of input synchronized PESs, the PCR, which is the time information generated by the clock 550, and the 3DTV PSI generated by the 3DTV PSI generation module 560, thereby outputting a 3DTV TS signal.

FIG. 6 is a block diagram schematically illustrating an embodiment of the synchronization module included in the automatic-synchronization-based 3DTV multiplexer of FIG. 5. The synchronization module according to the embodiment of FIG. 6 may include a first PES storage buffer 610, a second PES storage buffer 620, a delay value calculation module 630, and an output control module 640.

Referring to FIG. 6, a plurality of PESs may be input to the synchronization module. In one embodiment, the plurality of PESs input to the synchronization module may be respectively an audio PES corresponding to a left image, a video PES corresponding to a left image, an audio PES corresponding to a right image, and a video PES corresponding to a right image.

The plurality of PESs input to the synchronization module may be stored in the PES storage buffers. Referring to FIG. 6, the first PES storage buffer 610 may store the left image audio PES and the left image video PES. In addition, the second PES storage buffer 620 may store the right image audio PES and the right image video PES.

The delay value calculation module 630 can calculate the frame unit delay value between the left image (or the left image encoded stream) and the right image (or the right image encoded stream) based on the left image video PES and the right image video PES. Here, the frame unit delay value may mean the time difference between the left image and the right image in frame units. That is, the frame unit delay value may correspond to a value indicating how many frames of delay (and/or difference) exist between the left image and the right image. The delay value calculation module 630 may calculate the frame unit delay value based on the synchronization information included in the ES in the left video PES and the synchronization information included in the ES in the right video PES.

Once the frame unit delay value is obtained for the left image encoded stream and the right image encoded stream, it may keep the same value until the program ends, unless an error occurs in the left and/or right image encoded stream. Therefore, after calculating the frame unit delay value once, the synchronization module (and/or the delay value calculation module) need not calculate it for every PES. In this case, the synchronization module (and/or the delay value calculation module) may periodically recalculate the frame unit delay value and perform synchronization while checking whether the frame unit delay value has changed.

Details of the operation of the delay value calculation module 630 and the above-described synchronization information will be described later with reference to FIG. 7.

Referring again to FIG. 6, the output control module 640 may generate synchronized PESs based on the PESs input from the first PES storage buffer 610 and the second PES storage buffer 620 and the frame unit delay value, and output the generated synchronized PESs. Here, the PESs input from the first PES storage buffer 610 may be, for example, the left image audio PES and the left image video PES. In addition, the PESs input from the second PES storage buffer 620 may be, for example, the right image audio PES and the right image video PES.

Hereinafter, the left image video PES and the left image audio PES will be collectively referred to as the left image PES, and the right image video PES and the right image audio PES will be collectively referred to as the right image PES. In one embodiment, the output control module 640 may perform synchronization between the left and right images by delaying one of the left image PES signal and the right image PES signal by the frame unit delay value. That is, in this case, the output control module 640 can output a synchronized left image PES and a synchronized right image PES.

At this time, the signal delayed by the output control module 640 may correspond to the temporally preceding signal among the left image PES signal and the right image PES signal. In one embodiment, the output control module 640 may select the signal to be delayed based on the synchronization information included in the ES in the left image PES and the synchronization information included in the ES in the right image PES. For example, the output control module 640 may delay, by the frame unit delay value, whichever of the left image PES signal and the right image PES signal has the larger synchronization information value. Details of the synchronization information will be described later with reference to FIG. 7.
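A minimal sketch of this behavior, assuming counter-type synchronization information (the stream whose head frame carries the larger counter is temporally ahead and is effectively delayed by pairing frames with equal counters; all names and values are illustrative):

```python
def pair_synchronized(left_pes, right_pes):
    """Pair left/right PES entries carrying the same per-frame sync value.

    Each input is a list of (sync_value, payload) in arrival order.  The
    temporally preceding stream is implicitly held back until the lagging
    stream catches up, which is the delay the output control applies.
    """
    right_by_sync = {sync: payload for sync, payload in right_pes}
    return [(payload, right_by_sync[sync])
            for sync, payload in left_pes if sync in right_by_sync]

# Left is 3 frames ahead (head counters 5 vs 2): frames 5 and 6 pair up,
# and left's lead is absorbed by the buffer.
pairs = pair_synchronized([(5, "L5"), (6, "L6"), (7, "L7")],
                          [(2, "R2"), (5, "R5"), (6, "R6")])
print(pairs)  # [('L5', 'R5'), ('L6', 'R6')]
```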

FIG. 7 is a block diagram schematically illustrating an embodiment of the delay value calculation module included in the synchronization module of FIG. 6. The delay value calculation module according to the embodiment of FIG. 7 may include a first synchronization information extractor 710, a second synchronization information extractor 720, and a delay value calculator 730.

Referring to FIG. 7, the first synchronization information extractor 710 may extract the first synchronization information included in the video ES (hereinafter referred to as ES1) in the left video PES. Also, the second synchronization information extractor 720 can extract the second synchronization information included in the video ES (hereinafter referred to as ES2) in the right video PES. That is, the delay value calculation module can extract the synchronization information included in each of ES1 and ES2. The delay value calculator 730 may derive the frame unit delay value by calculating, based on the first synchronization information and the second synchronization information, how many frames apart the left image PES and the right image PES are. That is, the delay value calculator 730 can derive the time difference between the left image PES and the right image PES in frame units.

In one embodiment, the synchronization information value included in the video ES may be a value that is incremented by one in units of frames. That is, the synchronization information value may be a value counted in frame units and included in the video ES. In this case, in the embodiment of FIG. 7, the difference value between the first synchronization information value and the second synchronization information value may correspond to the frame unit delay value. For example, if the first synchronization information value in ES1 is 5 and the second synchronization information value in ES2 is 2, the frame unit delay value derived from the delay value calculator may be 3.
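Under this counter-type embodiment the delay computation reduces to a simple difference. A sketch (the absolute value is an assumption so that the result is a magnitude regardless of which stream leads):

```python
def frame_delay_from_counters(sync1, sync2):
    """Frame unit delay between two streams whose video ESs carry a
    counter incremented by one per frame."""
    return abs(sync1 - sync2)

# The example above: counters 5 and 2 -> delay of 3 frames.
print(frame_delay_from_counters(5, 2))  # 3
```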

In another embodiment, the synchronization information included in the video ES may be information in the form of a time code. That is, the synchronization information value may be a value included in the video ES in the form of a time code composed of hour, minute, second, frame, and the like. In this case, the delay value calculator 730 may calculate the difference between the first synchronization information value and the second synchronization information value (that is, the time difference in seconds between the left image PES and the right image PES) and multiply it by the number of frames per second to derive the frame unit delay value. For example, if the time difference value is 0.5 seconds and the number of frames per second is 30, the frame unit delay value may be 15.
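For the time-code embodiment, the same derivation can be sketched as follows (representing the time code as an (hour, minute, second, frame) tuple is an assumption about the field layout):

```python
def frame_delay_from_timecodes(tc1, tc2, fps):
    """Frame unit delay from two (hour, minute, second, frame) time codes:
    the time difference is converted to frames at `fps` frames per second."""
    def to_frames(hour, minute, second, frame):
        return ((hour * 60 + minute) * 60 + second) * fps + frame
    return abs(to_frames(*tc1) - to_frames(*tc2))

# The example above: streams 0.5 s apart at 30 fps -> 15 frames.
print(frame_delay_from_timecodes((0, 0, 1, 0), (0, 0, 0, 15), 30))  # 15
```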

When an MPEG-2 video encoder is used, the above-described synchronization information can be included in the user data area in the video ES. Here, the user data may be represented by user_data, for example. In this case, the synchronization information value may be a value that is incremented by 1 per frame and/or counted in frame units or, as another example, a value included in the form of a time code composed of hour, minute, second, frame, and the like. When an AVC and/or HEVC (High Efficiency Video Coding) encoder is used, the above-described synchronization information may be included in the video ES in the form of an SEI (Supplemental Enhancement Information) message. The position at which the synchronization information is inserted in the video ES and/or the type of the synchronization information may be predetermined. In this case, the delay value calculation module can know the position and/or type of the synchronization information in the video ES without any additional information.

The delay value calculation module (and/or the delay value calculator 730) may have a function of replacing or changing the synchronization information used for calculating the frame unit delay value with a null value after the calculation. The synchronization information may be inserted into the user data area in the video ES using a closed caption syntax. Here, a closed caption may be a character string provided in synchronization with the audio of a broadcast program and displayed on the screen only when the closed caption function is activated. In this case, when actual caption information is provided, confusion may arise between the closed caption data and the synchronization information used for synchronizing the plurality of video signals. Therefore, to avoid this confusion, the delay value calculation module (and/or the delay value calculator 730) may delete the synchronization information used in the frame unit delay value calculation.

FIG. 8 is a flowchart schematically illustrating an embodiment of a 3DTV multiplexing method based on an automatic synchronization scheme according to the present invention.

Referring to FIG. 8, the 3DTV multiplexer according to an embodiment of the present invention can extract a PES and PSI for each of a plurality of images (S810).

For example, in a stereoscopic 3DTV service, a left video TS and a right video TS may be input to the 3DTV multiplexer. At this time, the 3DTV multiplexer can extract or generate the left image audio PES, the left image video PES, and the left image PSI based on the left video TS. Also, the 3DTV multiplexer can extract or generate the right image audio PES, the right image video PES, and the right image PSI based on the right video TS.

Referring again to FIG. 8, the 3DTV multiplexer performs synchronization on the plurality of extracted or generated PESs and outputs a plurality of synchronized PESs (S820). For example, the 3DTV multiplexer can extract synchronization information from the ES included in each of the plurality of PESs and, based on the extracted synchronization information, perform frame-by-frame synchronization between the left video signal and the right video signal. The details of the synchronization method and the synchronization information have been described above and are not repeated here.

Also, the 3DTV multiplexer may modify the PTS/DTS values of each of the synchronized PESs to new PTS/DTS values based on the new PTS value input from the clock (S830). Concrete embodiments of the PTS/DTS modification method have been described above and are not repeated here.

Referring again to FIG. 8, the 3DTV multiplexer may generate a 3DTV PSI corresponding to the program configuration information based on the left image PSI and the right image PSI (S840). Here, the 3DTV PSI may include PAT information, PMT information, and the like.

In operation S850, the 3DTV multiplexer generates a 3DTV TS by performing multiplexing based on the plurality of synchronized PESs, the PCR, which is the time information generated by the clock, and the 3DTV PSI.

According to the above-described 3DTV TS generation method (and/or 3DTV multiplexing method based on the automatic synchronization scheme), a plurality of encoded streams can be automatically synchronized and multiplexed. The present invention can extract the synchronization information included in each encoded stream output from a plurality of encoders and synchronize and multiplex the plurality of encoded streams on a frame basis based on the extracted synchronization information. For example, in the case of a stereoscopic 3DTV service, a 3DTV multiplexing apparatus based on the automatic synchronization scheme according to the present invention can receive a left video MPEG-2 transport stream (TS) and a right video MPEG-2 TS. At this time, the apparatus can perform frame-by-frame synchronization of the left video transport stream and the right video transport stream based on the synchronization information included in the ES of each transport stream. The 3DTV multiplexing apparatus based on the automatic synchronization scheme can then generate and output a 3DTV transport stream by multiplexing the plurality of synchronized streams.

Although the above-described embodiments have been described with reference to the stereoscopic 3DTV service, the present invention is not limited thereto. The present invention can be applied, in the same or a similar manner as in the above-described embodiments, to any case in which a plurality of images are stored or processed together, such as free-viewpoint video, multi-view video, and panoramic video.

According to the present invention, the disadvantages of the 3DTV multiplexer based on the passive synchronization scheme can be overcome. The present invention can provide a 3DTV service using a plurality of general DTV encoders without an expensive dedicated 3DTV encoder, and is therefore economically advantageous. In addition, the present invention is expected to contribute to the activation of 3DTV services by minimizing the economic burden on 3DTV service providers. As described above, the multiplexing method based on the automatic synchronization scheme according to the present invention can be extended and applied to video services composed of a plurality of correlated images (and/or multiple images), such as multi-view 3D video, free-viewpoint video, and UHDTV service systems performing parallel processing.

In the above-described embodiments, the methods are described as a series of steps or blocks on the basis of flowcharts, but the present invention is not limited to the order of the steps, and some steps may occur in a different order from, or simultaneously with, the steps described above. It will also be understood by those skilled in the art that the steps shown in the flowcharts are not exclusive, that other steps may be included, and that one or more steps in a flowchart may be deleted without affecting the scope of the present invention.

The above-described embodiments include examples of various aspects. Although it is not possible to describe every possible combination for expressing the various aspects, one of ordinary skill in the art will recognize that other combinations are possible. Accordingly, the invention is intended to include all alternatives, modifications, and variations that fall within the scope of the following claims.

Claims (15)

A 3DTV multiplexing method comprising:
deriving a frame unit delay value for a left image and a right image based on a left image PES (Packetized Elementary Stream) corresponding to the left image and a right image PES corresponding to the right image;
performing synchronization on the left image PES and the right image PES based on the frame unit delay value;
generating a 3DTV TS (Transport Stream) by multiplexing the synchronized left image PES and the synchronized right image PES; and
generating 3DTV PSI, which is 3D program configuration information, based on left image PSI (Program Specific Information) corresponding to the left image and right image PSI corresponding to the right image,
wherein, in the 3DTV TS generation step, the multiplexing is performed on the synchronized left image PES, the synchronized right image PES, and the 3DTV PSI,
wherein the frame unit delay value is a value indicating a time difference between the left image and the right image in frame units,
wherein the left image PSI includes a first PAT (Program Association Table) and a first PMT (Program Map Table), and the right image PSI includes a second PAT and a second PMT, and
wherein, in the 3DTV PSI generation step:
a third PAT having information corresponding to both the first PMT and the second PMT is generated by reconstructing the first PAT and the second PAT;
a stream type value is changed in one PMT corresponding to an additional stream among the first PMT and the second PMT; and
a program information descriptor in which information indicating a program type provided in digital broadcasting is defined and a video information descriptor in which information indicating a characteristic of an ES constituting video data is defined are inserted into the one PMT corresponding to the additional stream.
2. The method according to claim 1, wherein the deriving of the frame unit delay value comprises:
extracting a first synchronization information value from a first video ES (Elementary Stream) in the left image PES and extracting a second synchronization information value from a second video ES in the right image PES; and
deriving the frame unit delay value based on the first synchronization information value and the second synchronization information value.
3. The method of claim 2, wherein the first synchronization information value is a value included in the first video ES and counted in frame units, and the second synchronization information value is a value included in the second video ES and counted in frame units, and
wherein, in the deriving of the frame unit delay value, the difference between the first synchronization information value and the second synchronization information value is determined as the frame unit delay value.
4. The method of claim 2, wherein the first synchronization information value is a value included in the first video ES in the form of a time code and the second synchronization information value is a value included in the second video ES in the form of a time code, and
wherein the deriving of the frame unit delay value comprises:
deriving a time difference value in seconds between the first synchronization information value and the second synchronization information value; and
deriving the frame unit delay value by multiplying the time difference value in seconds by a frames-per-second value.
5. The method according to claim 1, wherein the synchronized left image PES further includes a first PTS (Presentation Time Stamp) and a first DTS (Decoding Time Stamp), and the synchronized right image PES further includes a second PTS and a second DTS, and
wherein the performing of the synchronization further comprises modifying the values of the first PTS, the first DTS, the second PTS, and the second DTS to new values based on a third PTS input from a clock.
6. The method of claim 5, wherein, in the modifying:
the value of the first PTS and the value of the second PTS are modified to the value of the third PTS;
the value of the first DTS is modified to a value obtained by adding the third PTS value to the value obtained by subtracting the first PTS value from the first DTS value; and
the value of the second DTS is modified to a value obtained by adding the third PTS value to the value obtained by subtracting the second PTS value from the second DTS value.
7. (Deleted)

8. (Deleted)

9. The method according to claim 1,
further comprising: extracting the left image PES from a left image TS corresponding to the left image; and extracting the right image PES from a right image TS corresponding to the right image.
10. A 3DTV multiplexing apparatus comprising:
a delay value calculation module for deriving a frame unit delay value for a left image and a right image based on a left image PES (Packetized Elementary Stream) corresponding to the left image and a right image PES corresponding to the right image;
A synchronization module for performing synchronization on the left image PES and the right image PES based on the frame unit delay value;
A 3DTV TS packetizer for multiplexing the synchronized left image PES and the synchronized right image PES to generate a 3DTV TS (Transport Stream); And
A 3DTV PSI generation module for generating 3DTV PSI, which is 3D program configuration information, based on a left image PSI (Program Specific Information) corresponding to the left image and a right image PSI corresponding to the right image
wherein the 3DTV TS packetizer performs the multiplexing on the synchronized left image PES, the synchronized right image PES, and the 3DTV PSI,
wherein the left image PSI includes a first PAT (Program Association Table) and a first PMT (Program Map Table), and the right image PSI includes a second PAT and a second PMT,
wherein the 3DTV PSI generation module:
generates a third PAT having information corresponding to both the first PMT and the second PMT by reconstructing the first PAT and the second PAT;
changes a stream type value in one PMT corresponding to an additional stream among the first PMT and the second PMT; and
inserts, into the one PMT corresponding to the additional stream, a program information descriptor in which information indicating a program type provided in digital broadcasting is defined and a video information descriptor in which information indicating a characteristic of an ES constituting video data is defined, and
wherein the frame unit delay value is a value indicating a time difference between the left image and the right image in frame units.
The apparatus of claim 10,
Wherein the delay value calculation module comprises:
A first synchronization information extractor for extracting a first synchronization information value from a first video ES (Elementary Stream) in the left video PES;
A second synchronization information extractor for extracting a second synchronization information value from a second video ES in the right video PES; And
And a delay value calculator for deriving the frame unit delay value based on the first synchronization information value and the second synchronization information value.
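The claims do not fix how the synchronization information values encode time. Assuming they are timestamps for the same scene instant on a shared 90 kHz clock, the frame unit delay value could be derived as below; the default frame rate and the rounding policy are illustrative assumptions, not claim language:

```python
def frame_delay(sync_left, sync_right, fps=30000 / 1001):
    """Derive the left/right time difference in whole frames.

    sync_left and sync_right are assumed to be synchronization
    information values in 90 kHz ticks for the same scene instant;
    a positive result means the right stream lags the left.
    """
    ticks_per_frame = 90000 / fps      # 3003 ticks at 29.97 fps
    return round((sync_left - sync_right) / ticks_per_frame)

delay = frame_delay(sync_left=906009, sync_right=900000)  # two-frame lag
```

The synchronization module would then shift one PES stream by this many frames before the two are re-stamped and multiplexed.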
The apparatus of claim 10,
Wherein the synchronized left image PES further includes a first PTS (Presentation Time Stamp) and a first DTS (Decoding Time Stamp), the synchronized right image PES further includes a second PTS and a second DTS, and
The synchronization module further comprises a PTS/DTS modification module for modifying the values of the first PTS, the first DTS, the second PTS, and the second DTS to new values based on a third PTS input from a clock.
(claim deleted)
The apparatus of claim 10, further comprising:
A first de-packetizer for extracting the left image PES from a left image TS corresponding to the left image; And
A second de-packetizer for extracting the right image PES from a right image TS corresponding to the right image.
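De-packetizing a PES from a TS amounts to PID filtering plus payload reassembly. A compact sketch of the PID-filtering half, using the standard 188-byte packet layout of ISO/IEC 13818-1 (the concrete PID values in the example are arbitrary, and real code would also resynchronize on sync-byte loss and reassemble payloads into PES packets):

```python
TS_PACKET_SIZE = 188
SYNC_BYTE = 0x47

def payloads_for_pid(ts_bytes, pid):
    """Yield the payload of every TS packet carrying the given PID."""
    for off in range(0, len(ts_bytes) - TS_PACKET_SIZE + 1, TS_PACKET_SIZE):
        pkt = ts_bytes[off:off + TS_PACKET_SIZE]
        if pkt[0] != SYNC_BYTE:
            continue                        # lost sync; real code would resync
        pkt_pid = ((pkt[1] & 0x1F) << 8) | pkt[2]
        if pkt_pid != pid:
            continue
        afc = (pkt[3] >> 4) & 0x3           # adaptation_field_control
        if afc in (1, 3):                   # payload is present
            start = 4
            if afc == 3:                    # skip the adaptation field
                start += 1 + pkt[4]
            yield pkt[start:]
```

Concatenating the yielded payloads, starting at each packet whose payload_unit_start_indicator bit (bit 6 of the second header byte) is set, reconstructs one PES packet of the selected elementary stream.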
A 3DTV multiplexing method, the method comprising:
Deriving a frame unit delay value for the left image and the right image based on a left image PES (Packetized Elementary Stream) corresponding to the left image and a right image PES corresponding to the right image;
Performing synchronization on the left image PES and the right image PES based on the frame unit delay value; And
Generating 3DTV PSI, which is 3D program configuration information, based on left image PSI (Program Specific Information) corresponding to the left image and right image PSI corresponding to the right image
Wherein the frame unit delay value is a value indicating a time difference between the left image and the right image in frame units,
Wherein the left image PSI includes a first PAT (Program Association Table) and a first PMT (Program Map Table), the right image PSI includes a second PAT and a second PMT,
In the 3DTV PSI generation step,
Generating a third PAT having information corresponding to both the first PMT and the second PMT by reconstructing the first PAT and the second PAT,
Changing a stream type value in one PMT corresponding to an additional stream among the first PMT and the second PMT,
Inserting, into the one PMT corresponding to the additional stream, a program information descriptor in which information indicating a program type provided in digital broadcasting is defined, and a video information descriptor in which information indicating a characteristic of an ES constituting the video data is defined.
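Modeling a parsed PMT as a plain dictionary, the claimed PSI rewrite for the additional stream's PMT can be sketched as follows. The stream_type value 0x1B (H.264/AVC), the descriptor tags in the usage example, the dictionary layout, and the video_pid parameter are all illustrative assumptions, not values fixed by the claims:

```python
def rewrite_additional_pmt(pmt, video_pid, new_stream_type=0x1B,
                           program_info_desc=None, video_info_desc=None):
    """Apply the claimed changes to the additional stream's PMT.

    pmt is a parsed PMT modeled as a dict:
      {"program_info": [descriptor, ...],
       "streams": [{"stream_type": int, "pid": int, "descriptors": [...]}]}
    Only the ES whose PID equals video_pid is touched; the input is
    left unmodified and a rewritten copy is returned.
    """
    out = {"program_info": list(pmt["program_info"]),
           "streams": [dict(s) for s in pmt["streams"]]}
    for s in out["streams"]:
        if s["pid"] != video_pid:
            continue
        s["stream_type"] = new_stream_type        # change the stream type value
        if video_info_desc is not None:           # characteristics of the ES
            s["descriptors"] = s.get("descriptors", []) + [video_info_desc]
    if program_info_desc is not None:             # program type information
        out["program_info"] = out["program_info"] + [program_info_desc]
    return out

# Example: retype the additional video ES and attach hypothetical descriptors.
pmt = {"program_info": [],
       "streams": [{"stream_type": 0x02, "pid": 0x100, "descriptors": []}]}
new_pmt = rewrite_additional_pmt(pmt, video_pid=0x100,
                                 program_info_desc={"tag": 0x05, "data": b""},
                                 video_info_desc={"tag": 0x42, "data": b""})
```

A full implementation would also re-serialize the section, recompute section_length and the CRC-32, and regenerate the third PAT referencing both PMTs.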
KR1020120051878A 2012-05-16 2012-05-16 Method for 3dtv multiplexing and apparatus thereof KR101699367B1 (en)

Priority Applications (2)

Application Number Priority Date Filing Date Title
KR1020120051878A KR101699367B1 (en) 2012-05-16 2012-05-16 Method for 3dtv multiplexing and apparatus thereof
US13/717,492 US9270972B2 (en) 2012-05-16 2012-12-17 Method for 3DTV multiplexing and apparatus thereof

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
KR1020120051878A KR101699367B1 (en) 2012-05-16 2012-05-16 Method for 3dtv multiplexing and apparatus thereof

Publications (2)

Publication Number Publication Date
KR20130128101A KR20130128101A (en) 2013-11-26
KR101699367B1 true KR101699367B1 (en) 2017-02-14

Family

ID=49580985

Family Applications (1)

Application Number Title Priority Date Filing Date
KR1020120051878A KR101699367B1 (en) 2012-05-16 2012-05-16 Method for 3dtv multiplexing and apparatus thereof

Country Status (2)

Country Link
US (1) US9270972B2 (en)
KR (1) KR101699367B1 (en)

Families Citing this family (6)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US20130276046A1 (en) * 2012-04-13 2013-10-17 Electronics And Telecommunications Research Institute Receiving apparatus for receiving a plurality of signals through different paths and method for processing signals thereof
RU2015105986A (en) * 2012-08-27 2016-09-10 Сони Корпорейшн SENDING DEVICE, TRANSMISSION METHOD, RECEIVING DEVICE AND RECEIVING METHOD
TW201428675A (en) 2013-01-08 2014-07-16 Pixart Imaging Inc Video generating system and related method thereof
JP2015186036A (en) * 2014-03-24 2015-10-22 ソニー株式会社 Information processor, information processing system, information processing method, and program
US20180309972A1 (en) * 2015-11-11 2018-10-25 Sony Corporation Image processing apparatus and image processing method
US10560682B2 (en) 2017-01-13 2020-02-11 Gopro, Inc. Methods and apparatus for providing a frame packing arrangement for panoramic content

Citations (1)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
JP2012085166A (en) 2010-10-13 2012-04-26 Sony Corp Video signal processing device, video signal processing method, and computer program

Family Cites Families (6)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
KR100763441B1 (en) 2006-09-30 2007-10-04 광주과학기술원 Synchronized multiplexing method, device therefor, demultiplexing method and device therefor
KR100973138B1 (en) 2008-05-08 2010-07-29 한양대학교 산학협력단 Method and system for remultiplex transport stream of multi mode stream in digital broadcasting
KR100972792B1 (en) * 2008-11-04 2010-07-29 한국전자통신연구원 Synchronizer and synchronizing method for stereoscopic image, apparatus and method for providing stereoscopic image
MX2010011683A (en) * 2009-02-19 2010-11-30 Panasonic Corp Recording medium, reproduction device, and integrated circuit.
KR20120036724A (en) 2010-10-08 2012-04-18 한국전자통신연구원 Method and appartus for synchronizing 3-dimensional image
KR101831775B1 (en) * 2010-12-07 2018-02-26 삼성전자주식회사 Transmitter and receiver for transmitting and receiving multimedia content, and reproducing method thereof


Also Published As

Publication number Publication date
KR20130128101A (en) 2013-11-26
US9270972B2 (en) 2016-02-23
US20130307924A1 (en) 2013-11-21

Similar Documents

Publication Publication Date Title
JP6034420B2 (en) Method and apparatus for generating 3D video data stream in which additional information for playback of 3D video is inserted and apparatus thereof, and method and apparatus for receiving 3D video data stream in which additional information for playback of 3D video is inserted
JP5575949B2 (en) Broadcast data transmission method and apparatus
KR101683119B1 (en) Broadcast transmitter, Broadcast receiver and 3D video processing method thereof
JP5785193B2 (en) Data stream generating method and apparatus for providing 3D multimedia service, data stream receiving method and apparatus for providing 3D multimedia service
KR100864826B1 (en) Method and Apparatus for 3D still image service over digital broadcasting
US9055280B2 (en) Method and apparatus for transmitting digital broadcasting stream using linking information about multi-view video stream, and method and apparatus for receiving the same
EP2744214A2 (en) Transmitting device, receiving device, and transceiving method thereof
US9210354B2 (en) Method and apparatus for reception and transmission
KR101856093B1 (en) Content providing apparatus and method, and content reproduction apparatus and method for synchronization between left and right stream in the stationary-mobile hybrid 3dtv broadcast
KR101699367B1 (en) Method for 3dtv multiplexing and apparatus thereof
US9516086B2 (en) Transmitting device, receiving device, and transceiving method thereof
WO2012081874A2 (en) Signaling method for a stereoscopic video service and apparatus using the method
KR20150004318A (en) Signal processing device and method for 3d service
WO2013011834A1 (en) Transmitter, transmission method and receiver
KR20150057149A (en) System and method for providing 3d broadcast service provision based on re-transmission broadcast networks
KR20110068821A (en) Method and apparatus for receiving and transmitting
KR101191498B1 (en) System and Method for synchronization of 3D broadcasting service using real-time broadcasting and non-real time additional broadcasting data
KR20140053777A (en) Method and apparatus for decoder buffering in hybrid coded video system
Lee et al. Delivery system and receiver for service-compatible 3DTV broadcasting
KR20150006340A (en) Method and apparatus for providing three-dimensional video
KR101203483B1 (en) The method to transmit 3 dimensional broadcasting, and the receiver
KR20140053938A (en) Method for transmitting a signal
KR20140080701A (en) Stereoscopic 3dtv re-synchronizing method and its apparatus using caption data

Legal Events

Date Code Title Description
A201 Request for examination
E902 Notification of reason for refusal
E701 Decision to grant or registration of patent right
GRNT Written decision to grant