KR101699367B1 - Method for 3dtv multiplexing and apparatus thereof - Google Patents
- Publication number
- KR101699367B1 (application KR1020120051878A)
- Authority
- KR
- South Korea
- Prior art keywords
- value
- pes
- video
- image
- 3dtv
- Prior art date
Classifications
-
- H—ELECTRICITY
- H04—ELECTRIC COMMUNICATION TECHNIQUE
- H04N—PICTORIAL COMMUNICATION, e.g. TELEVISION
- H04N13/00—Stereoscopic video systems; Multi-view video systems; Details thereof
- H04N13/10—Processing, recording or transmission of stereoscopic or multi-view image signals
- H04N13/106—Processing image signals
- H04N13/167—Synchronising or controlling image signals
-
- H—ELECTRICITY
- H04—ELECTRIC COMMUNICATION TECHNIQUE
- H04N—PICTORIAL COMMUNICATION, e.g. TELEVISION
- H04N21/00—Selective content distribution, e.g. interactive television or video on demand [VOD]
- H04N21/20—Servers specifically adapted for the distribution of content, e.g. VOD servers; Operations thereof
- H04N21/23—Processing of content or additional data; Elementary server operations; Server middleware
- H04N21/236—Assembling of a multiplex stream, e.g. transport stream, by combining a video stream with other content or additional data, e.g. inserting a URL [Uniform Resource Locator] into a video stream, multiplexing software data into a video stream; Remultiplexing of multiplex streams; Insertion of stuffing bits into the multiplex stream, e.g. to obtain a constant bit-rate; Assembling of a packetised elementary stream
- H04N21/23608—Remultiplexing multiplex streams, e.g. involving modifying time stamps or remapping the packet identifiers
-
- H—ELECTRICITY
- H04—ELECTRIC COMMUNICATION TECHNIQUE
- H04N—PICTORIAL COMMUNICATION, e.g. TELEVISION
- H04N21/00—Selective content distribution, e.g. interactive television or video on demand [VOD]
- H04N21/20—Servers specifically adapted for the distribution of content, e.g. VOD servers; Operations thereof
- H04N21/23—Processing of content or additional data; Elementary server operations; Server middleware
- H04N21/236—Assembling of a multiplex stream, e.g. transport stream, by combining a video stream with other content or additional data, e.g. inserting a URL [Uniform Resource Locator] into a video stream, multiplexing software data into a video stream; Remultiplexing of multiplex streams; Insertion of stuffing bits into the multiplex stream, e.g. to obtain a constant bit-rate; Assembling of a packetised elementary stream
- H04N21/2365—Multiplexing of several video streams
-
- H—ELECTRICITY
- H04—ELECTRIC COMMUNICATION TECHNIQUE
- H04N—PICTORIAL COMMUNICATION, e.g. TELEVISION
- H04N21/00—Selective content distribution, e.g. interactive television or video on demand [VOD]
- H04N21/20—Servers specifically adapted for the distribution of content, e.g. VOD servers; Operations thereof
- H04N21/23—Processing of content or additional data; Elementary server operations; Server middleware
- H04N21/236—Assembling of a multiplex stream, e.g. transport stream, by combining a video stream with other content or additional data, e.g. inserting a URL [Uniform Resource Locator] into a video stream, multiplexing software data into a video stream; Remultiplexing of multiplex streams; Insertion of stuffing bits into the multiplex stream, e.g. to obtain a constant bit-rate; Assembling of a packetised elementary stream
- H04N21/2368—Multiplexing of audio and video streams
-
- H—ELECTRICITY
- H04—ELECTRIC COMMUNICATION TECHNIQUE
- H04N—PICTORIAL COMMUNICATION, e.g. TELEVISION
- H04N21/00—Selective content distribution, e.g. interactive television or video on demand [VOD]
- H04N21/20—Servers specifically adapted for the distribution of content, e.g. VOD servers; Operations thereof
- H04N21/23—Processing of content or additional data; Elementary server operations; Server middleware
- H04N21/242—Synchronization processes, e.g. processing of PCR [Program Clock References]
-
- H—ELECTRICITY
- H04—ELECTRIC COMMUNICATION TECHNIQUE
- H04N—PICTORIAL COMMUNICATION, e.g. TELEVISION
- H04N21/00—Selective content distribution, e.g. interactive television or video on demand [VOD]
- H04N21/80—Generation or processing of content or additional data by content creator independently of the distribution process; Content per se
- H04N21/81—Monomedia components thereof
- H04N21/816—Monomedia components thereof involving special video data, e.g. 3D video
Abstract
A 3DTV multiplexing method according to the present invention includes deriving a frame unit delay value for a left image and a right image based on a left image PES corresponding to the left image and a right image PES corresponding to the right image, performing synchronization on the left image PES and the right image PES based on the frame unit delay value, and multiplexing the synchronized left image PES and the synchronized right image PES to generate a 3DTV TS. According to the present invention, video service efficiency can be improved.
Description
The present invention relates to image processing, and more particularly, to a 3DTV multiplexing method and apparatus.
In addition to UDTV service, digital broadcasting service using 3D video is attracting attention as a next-generation broadcasting service following HDTV. With the development of related technologies, such as the launch of high-quality commercial stereoscopic displays, 3DTV services that allow each household to enjoy 3D video are expected to be available within the next few years. In particular, the 3D broadcasting service currently provided as a commercial or trial service is a service using stereoscopic video composed mainly of left and right images.
In the process of processing a 3D image, a plurality of images correlated with each other can be stored or processed together. The same applies not only to 3D images but also to the processing of free-view images, panorama images, multi-view images, and multi-divided images. Here, for example, a multi-divided image may be an ultrahigh-resolution image, with a resolution 4 to 16 times that of an HD image, divided into a plurality of HD images. When a plurality of correlated images are stored or processed together in this manner, the images must be synchronized with each other on a frame basis.
SUMMARY OF THE INVENTION
The present invention provides a method and apparatus for generating a 3DTV TS capable of improving video service efficiency.
It is another object of the present invention to provide a 3DTV multiplexing method and apparatus which can improve the efficiency of video service.
Another aspect of the present invention is to provide an image synchronization method and apparatus for improving image service efficiency.
1. One embodiment of the present invention is a 3DTV multiplexing method. The method includes deriving a frame unit delay value for a left image and a right image based on a left image PES (Packetized Elementary Stream) corresponding to the left image and a right image PES corresponding to the right image, performing synchronization on the left image PES and the right image PES based on the frame unit delay value, and multiplexing the synchronized left image PES and the synchronized right image PES to generate a 3DTV TS (Transport Stream), wherein the frame unit delay value is a value indicating a time difference between the left image and the right image in frame units.
2. The method of claim 1, wherein deriving the frame unit delay value comprises: extracting a first synchronization information value from a first video ES (Elementary Stream) in the left image PES; extracting a second synchronization information value from a second video ES in the right image PES; and deriving the frame unit delay value based on the first synchronization information value and the second synchronization information value.
3. The method of claim 2, wherein the first synchronization information value is a value counted in frame units and included in the first video ES, the second synchronization information value is a value counted in frame units and included in the second video ES, and in the step of deriving the frame unit delay value, the difference between the first synchronization information value and the second synchronization information value may be determined as the frame unit delay value.
4. The method of
5. The method of claim 1, wherein the synchronized left image PES further comprises a first PTS (Presentation Time Stamp) and a first DTS (Decoding Time Stamp), the synchronized right image PES further comprises a second PTS and a second DTS, and the method further comprises modifying the values of the first PTS, the first DTS, the second PTS, and the second DTS to new values based on a third PTS input from a clock.
6. The method of claim 5, wherein in the modifying step, the value of the first PTS and the value of the second PTS are modified to the value of the third PTS, the value of the first DTS is modified by adding the third PTS value to the value obtained by subtracting the first PTS value from the first DTS value, and the value of the second DTS is modified by adding the third PTS value to the value obtained by subtracting the second PTS value from the second DTS value.
7. The method of claim 1, further comprising generating a 3DTV PSI, which is 3D program configuration information, based on a left image PSI (Program Specific Information) corresponding to the left image and a right image PSI corresponding to the right image, wherein in the 3DTV TS generation step, the synchronized left image PES, the synchronized right image PES, and the 3DTV PSI may be multiplexed.
8. The method of claim 7, wherein the left image PSI includes a first PAT (Program Association Table) and a first PMT (Program Map Table), the right image PSI includes a second PAT and a second PMT, and the 3DTV PSI generating step generates a third PAT having information corresponding to both the first PMT and the second PMT by reconstructing the first PAT and the second PAT, changes a stream type value in the one of the first PMT and the second PMT that corresponds to an additional stream, and inserts, into the PMT corresponding to the additional stream, a program information descriptor in which information indicating a program type provided in digital broadcasting is defined and a video information descriptor in which information indicating the characteristics of the ES constituting the video data is defined.
9. The method of claim 1, further comprising extracting the left image PES from a left image TS (Transport Stream) corresponding to the left image and extracting the right image PES from a right image TS corresponding to the right image.
10. Another embodiment of the present invention is a 3DTV multiplexing apparatus. The apparatus includes a delay value calculation module for deriving a frame unit delay value for a left image and a right image based on a left image PES (Packetized Elementary Stream) corresponding to the left image and a right image PES corresponding to the right image, a synchronization module for performing synchronization on the left image PES and the right image PES based on the frame unit delay value, and a 3DTV TS packetizer for multiplexing the synchronized left image PES and the synchronized right image PES to generate a 3DTV TS (Transport Stream), wherein the frame unit delay value is a value representing a time difference between the left image and the right image in frame units.
11. The apparatus of claim 10, wherein the delay value calculation module comprises: a first synchronization information extractor for extracting a first synchronization information value from a first video ES (Elementary Stream) in the left image PES; a second synchronization information extractor for extracting a second synchronization information value from a second video ES in the right image PES; and a delay value calculator for deriving the frame unit delay value based on the first synchronization information value and the second synchronization information value.
12. The apparatus of claim 10, wherein the synchronized left image PES further comprises a first PTS (Presentation Time Stamp) and a first DTS (Decoding Time Stamp), the synchronized right image PES further comprises a second PTS and a second DTS, and the synchronization module further includes a PTS/DTS modification module configured to modify the values of the first PTS, the first DTS, the second PTS, and the second DTS to new values based on a third PTS input from a clock.
13. The apparatus of claim 10, further comprising a 3DTV PSI generation module for generating a 3DTV PSI, which is 3D program configuration information, based on a left image PSI (Program Specific Information) corresponding to the left image and a right image PSI corresponding to the right image, wherein the 3DTV TS packetizer may perform multiplexing on the synchronized left image PES, the synchronized right image PES, and the 3DTV PSI.
14. The apparatus of claim 10, further comprising: a first de-packetizer for extracting the left image PES from a left image TS (Transport Stream) corresponding to the left image; and a second de-packetizer for extracting the right image PES from a right image TS corresponding to the right image.
15. Another embodiment of the present invention is an image synchronization method. The method includes deriving a frame unit delay value for a left image and a right image based on a left image PES (Packetized Elementary Stream) corresponding to the left image and a right image PES corresponding to the right image, and performing synchronization on the left image PES and the right image PES based on the frame unit delay value, wherein the frame unit delay value is a value representing a time difference between the left image and the right image in frame units.
According to the 3DTV TS generation method of the present invention, video service efficiency can be improved.
According to the 3DTV multiplexing method of the present invention, video service efficiency can be improved.
According to the image synchronization method of the present invention, image service efficiency can be improved.
1 is a diagram schematically showing an embodiment of a 3DTV TS generation process.
2 is a diagram schematically showing another embodiment of the 3DTV TS generation process.
3 is a block diagram schematically showing an embodiment of a 3DTV TS generating apparatus according to the present invention.
4 is a block diagram schematically illustrating an embodiment of a DTV encoder configuration.
5 is a block diagram schematically illustrating an embodiment of a 3DTV multiplexer configuration based on an automatic synchronization scheme according to the present invention.
FIG. 6 is a block diagram schematically illustrating an embodiment of a synchronization module included in the automatic synchronization-based 3DTV multiplexer of FIG. 5.
7 is a block diagram schematically illustrating an embodiment of the delay value calculation module included in the synchronization module of FIG. 6.
FIG. 8 is a flowchart schematically illustrating an embodiment of a 3DTV multiplexing method based on an automatic synchronization scheme according to the present invention.
Hereinafter, embodiments of the present invention will be described in detail with reference to the drawings. In the following description of the embodiments of the present invention, a detailed description of known functions and configurations incorporated herein will be omitted when it may make the subject matter of the present disclosure rather unclear.
It is to be understood that when an element is referred to as being "connected" or "coupled" to another element, it may be directly connected or coupled to the other element, or intervening elements may be present. In addition, the description of "including" a specific configuration in the present invention does not exclude other configurations, and means that additional configurations can be included in the practice of the present invention or the technical scope of the present invention.
The terms first, second, etc. may be used to describe various components, but the components should not be limited by the terms. The terms are used only for the purpose of distinguishing one component from another. For example, without departing from the scope of the present invention, the first component may be referred to as a second component, and similarly, the second component may also be referred to as a first component.
In addition, the components shown in the embodiments of the present invention are shown independently to represent different characteristic functions, which does not mean that each component is composed of a separate hardware or software constituent unit. That is, each component is listed as a separate constituent unit for convenience of explanation; at least two of the constituent units may be combined into one constituent unit, or one constituent unit may be divided into a plurality of constituent units to perform its function. Integrated embodiments and separate embodiments of the components are also included within the scope of the present invention, unless they depart from the essence of the present invention.
In addition, some components may not be essential components that perform essential functions of the present invention, but optional components that merely improve performance. The present invention can be implemented with only the components essential for realizing the essence of the present invention, excluding the components used for performance improvement, and a structure including only the essential components, excluding the optional components used for performance improvement, is also included in the scope of the present invention.
1 is a diagram schematically showing an embodiment of a 3DTV TS generation process. Here, the TS may mean a transport stream. FIG. 1 shows a 3DTV TS generation process for a stereoscopic 3DTV service.
When a plurality of images are stored or processed together as in a 3D image, a free view image, a multi-view image, a panoramic image, or the like, a plurality of image signals must be synchronized with each other on a frame basis. Since the stereoscopic video may be composed of a left video signal and a right video signal, the left video signal and the right video signal in the embodiment of FIG. 1 must be synchronized with each other on a frame basis.
In a 3D image, free-view image, multi-view image, panorama image, etc., a plurality of images can be encoded by different encoders. In this case, a plurality of transport streams can be output by a plurality of encoders. For example, in stereoscopic video, the left and right images can be individually encoded by a Moving Picture Experts Group (MPEG) -2 encoder and an AVC (Advanced Video Coding) encoder, respectively. At this time, as described above, a plurality of transport streams must be synchronized with each other on a frame basis. However, if there is no way to automatically synchronize a plurality of transport streams, a 3DTV dedicated encoder that encodes a plurality of images (e.g., left and right images in stereoscopic video) together can be used.
In the embodiment of FIG. 1, a 3DTV dedicated encoder may be used to generate a 3DTV TS. Referring to FIG. 1, the
2 is a diagram schematically showing another embodiment of the 3DTV TS generation process. FIG. 2 illustrates a 3DTV TS generation process for a stereoscopic 3DTV service.
As described above, when a plurality of images are stored or processed together as in a 3D image, a free view image, a multi-view image, a panoramic image, or the like, a plurality of image signals must be synchronized with each other on a frame basis. Since the stereoscopic video may be composed of a left video signal and a right video signal, the left video signal and the right video signal in the embodiment of FIG. 2 must be synchronized with each other on a frame basis.
In a 3D image, free-view image, multi-view image, panorama image, etc., a plurality of images can be encoded by different encoders. At this time, a plurality of transport streams can be output by a plurality of encoders. For example, as in the embodiment of FIG. 2, left and right images in stereo-scopic video can be separately encoded by an MPEG-2 encoder and an AVC encoder, respectively. At this time, as described above, a plurality of transport streams must be synchronized with each other on a frame basis. However, if there is no way to automatically synchronize a plurality of transport streams, a method of manually synchronizing a plurality of video signals on a frame-by-frame basis may be used.
Referring to FIG. 2, the
Referring again to FIG. 2, the 3DTV multiplexer /
In the above-described embodiment, a 3DTV multiplexer/remultiplexer based on a passive synchronization scheme can be used. In other words, according to the above-described embodiment, a person visually confirms the frame unit time difference between the left image and the right image, and the left image frame and the right image frame can be manually synchronized based on that time difference.
Meanwhile, when a 3DTV dedicated encoder is used as in the embodiment of FIG. 1 described above, new expensive equipment is required for generating a 3DTV TS. Also, the 3DTV TS generation method according to the embodiment of FIG. 1 has the disadvantage that an existing encoder cannot be used. In the embodiment of FIG. 2, the multiplexed output stream is reproduced on a 3DTV terminal for monitoring, and a person manually performs synchronization while watching the reproduced image. Therefore, the 3DTV TS generation method of FIG. 2 has the disadvantage that a monitoring terminal is necessarily required. In addition, the 3DTV TS generation method of FIG. 2 has the disadvantage of requiring manual human operation. Therefore, in order to solve these problems, a 3DTV multiplexing method based on an automatic synchronization scheme can be provided.
3 is a block diagram schematically showing an embodiment of a 3DTV TS generating apparatus according to the present invention.
3 shows a 3DTV TS generation apparatus for a stereoscopic 3DTV service. The left and right images constituting the stereoscopic image may be images having different viewpoints for the same scene. However, in the embodiments described below, a plurality of images processed together for 3DTV TS generation will be handled as having separate contents and / or programs even if they are images for the same scene.
The 3DTV TS generating apparatus according to 310 of FIG. 3 may include a
The automatic synchronization-based 3DTV multiplexer according to the present invention can receive two types of inputs as shown in 310 and 320 of FIG. 3. In FIG. 3, the left and right video signals constituting the stereoscopic video can be separately encoded by the two
Referring to FIG. 3, the
In FIG. 3, the automatic synchronization-based
Referring to FIG. 3, the
In FIG. 3, the automatic synchronization-based
In FIG. 3, two
Accordingly, in order to provide a stereoscopic 3DTV service composed of two images including a left image and a right image, the left and right video streams output through the encoder(s) must be automatically synchronized frame by frame. The MPEG-2 3DTV TS created based on the automatic synchronization method can enable a stereoscopic 3DTV service. An automatic synchronization scheme for a plurality of encoded streams (for example, the operation of a 3DTV multiplexer based on an automatic synchronization scheme) will be described later.
In the above-described embodiment, the output signal of the DTV encoder is described as an MPEG-2 TS, but the present invention is not limited thereto and each output signal may correspond to another type of transport stream.
4 is a block diagram schematically illustrating an embodiment of a DTV encoder configuration. The DTV encoder according to the embodiment of FIG. 4 includes an
Referring to FIG. 4, the
The
The
The TS packetizer 470 multiplexes the audio PES, the video PES, the PCR information generated in the
5 is a block diagram schematically illustrating an embodiment of a 3DTV multiplexer configuration based on an automatic synchronization scheme according to the present invention.
5, a 3DTV multiplexer based on the automatic synchronization scheme includes a
5, the
5, all of the signals input to the
The
The PTS /
The following Equation 1 shows an embodiment of a process for obtaining a new DTS value for one video PES.
[Equation 1]
Diff_DTS_PTS_PES_video1 = current_DTS_PES_video1 - current_PTS_PES_video1
New_DTS_PES_video1 = New_PTS + Diff_DTS_PTS_PES_video1
Here, New_PTS may represent a new PTS value input from the clock.
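Equation 1 can be sketched in code as follows. This is an illustrative sketch, not the patented implementation; the function name and the use of 90 kHz system clock ticks are assumptions made for the example.

```python
def rebase_timestamps(pts: int, dts: int, new_pts: int) -> tuple:
    """Rebase a PES packet's PTS/DTS onto a new common clock (Equation 1).

    The DTS-PTS offset of the packet is preserved, so decode timing
    relative to presentation stays intact:
        Diff_DTS_PTS = current_DTS - current_PTS
        New_DTS      = New_PTS + Diff_DTS_PTS
    Values are assumed to be in 90 kHz MPEG-2 system clock ticks.
    """
    diff_dts_pts = dts - pts          # Diff_DTS_PTS_PES_video1
    new_dts = new_pts + diff_dts_pts  # New_DTS_PES_video1
    return new_pts, new_dts

# Example: a packet whose DTS precedes its PTS by 3003 ticks (one frame at 29.97 fps)
new_pts1, new_dts1 = rebase_timestamps(pts=90000, dts=86997, new_pts=180000)
```

Applying the same function to the left video PES and the right video PES with one shared New_PTS places both streams on a common timeline, which is the purpose of the modification step.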
The 3DTV
As described above, the PAT may include a list of all programs currently available in the TS, including a program number indicating which program is currently being transmitted and a PID corresponding to each program. Meanwhile, the TS output from the DTV encoder according to the embodiment of FIG. 4 may include one program, and two TSs (a left TS and a right TS) are input to the 3DTV multiplexer of FIG. 5. The 3DTV TS output from the 3DTV multiplexer according to the embodiment of the present invention may include two programs (a program corresponding to the left image and a program corresponding to the right image). Accordingly, the 3DTV
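The PAT reconstruction described above can be sketched as follows. This is a hedged illustration: each PAT is modeled as a plain program_number-to-PMT-PID mapping rather than the on-wire MPEG-2 PSI byte syntax, and the renumbering rule for colliding program numbers is an assumption made for the sketch.

```python
def merge_pats(left_pat: dict, right_pat: dict) -> dict:
    """Reconstruct a single PAT that lists both programs.

    Each input PAT (from a one-program DTV encoder TS) maps a
    program_number to the PID of its PMT. The merged PAT carries
    both entries, renumbering the right-image program if its
    program_number collides with a left-image one.
    """
    merged = dict(left_pat)                 # program_number -> PMT PID
    for prog_num, pmt_pid in right_pat.items():
        while prog_num in merged:           # avoid program_number clash
            prog_num += 1
        merged[prog_num] = pmt_pid
    return merged

# Left TS carries program 1 (PMT on PID 0x100); right TS also carries
# program 1 (PMT on PID 0x200), so it becomes program 2 in the merged PAT
pat_3dtv = merge_pats({1: 0x100}, {1: 0x200})
```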
On the other hand, as described above, the PMT may include the program elements constituting one program and/or video stream information constituting the video data in the program. The PMT may include a program information descriptor (e.g., stereoscopic_program_info_descriptor) in which information indicating a program type provided in digital broadcasting is defined, a video information descriptor (e.g., stereoscopic_video_info_descriptor) in which information indicating the characteristics of an ES constituting video data is defined, and/or a stream type (e.g., stream_type).
The 3DTV
In one embodiment, it is assumed that the TSs output from the MPEG-2 encoder and the AVC encoder are input to the 3DTV multiplexer of FIG. 5. Here, the TS output from the MPEG-2 encoder is referred to as an MPEG-2 TS, and the TS output from the AVC encoder is referred to as an AVC TS. At this time, in order to maintain compatibility between the existing DTV and the 3DTV, the 3DTV
In addition, the 3DTV
[Table 1]
[Table 2]
As with the TS packetizer 470 of FIG. 4, the 3DTV TS packetizer 570 may receive as inputs a plurality of synchronized PESs, a PCR which is time information generated from the
FIG. 6 is a block diagram schematically illustrating an embodiment of a synchronization module included in the automatic synchronization-based 3DTV multiplexer of FIG. The synchronization module according to the embodiment of FIG. 6 may include a first
Referring to FIG. 6, a plurality of PESs may be input to the synchronization module. In one embodiment, the plurality of PESs input to the synchronization module may be respectively an audio PES corresponding to a left image, a video PES corresponding to a left image, an audio PES corresponding to a right image, and a video PES corresponding to a right image.
The plurality of PESs input to the synchronization module may be stored in the PES storage buffer. Referring to FIG. 6, the first
The delay
Once the frame-by-frame delay value is obtained for the left image encoded stream and the right image encoded stream, the delay value can be maintained at the same value until the program ends, unless an error occurs in the left image encoded stream and/or the right image encoded stream. Therefore, after calculating the frame-by-frame delay value once, the synchronization module (and/or the delay value calculation module) may not calculate the frame-by-frame delay value for every PES. In this case, the synchronization module (and/or the delay value calculation module) may periodically recalculate the frame-by-frame delay value and perform synchronization while checking whether the frame-by-frame delay value has changed.
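The caching behavior described above (compute the frame-by-frame delay once, then only periodically re-verify it) could be sketched as follows. The class, its parameters, and the recheck interval are hypothetical stand-ins, not taken from the patent.

```python
class DelayTracker:
    """Cache the frame-unit delay value, rechecking it only periodically.

    The delay between the two encoded streams normally stays constant
    for the life of the program, so it is computed once and then
    re-verified every `recheck_every` frames instead of per PES.
    `compute_delay` stands in for the delay value calculator.
    """
    def __init__(self, compute_delay, recheck_every: int = 300):
        self.compute_delay = compute_delay
        self.recheck_every = recheck_every
        self.cached = None
        self.frames_seen = 0

    def delay(self, left_sync: int, right_sync: int) -> int:
        self.frames_seen += 1
        # Recompute only on the first call and then at the recheck period
        if self.cached is None or self.frames_seen % self.recheck_every == 0:
            self.cached = self.compute_delay(left_sync, right_sync)
        return self.cached
```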
Details of the operation of the delay
Referring again to FIG. 6, the
Hereinafter, the left image video PES and the left image audio PES will be collectively referred to as a left image PES, and the right image video PES and the right image audio PES will be collectively referred to as a right image PES. In one embodiment, the
At this time, the signal delayed by the
7 is a block diagram schematically illustrating an embodiment of the delay value calculation module included in the synchronization module of FIG. The delay value calculation module according to the embodiment of FIG. 7 may include a first
Referring to FIG. 7, the first
In one embodiment, the synchronization information value included in the video ES may be a value that is incremented by one in units of frames. That is, the synchronization information value may be a value counted in frame units and included in the video ES. In this case, in the embodiment of FIG. 7, the difference value between the first synchronization information value and the second synchronization information value may correspond to the frame unit delay value. For example, if the first synchronization information value in ES1 is 5 and the second synchronization information value in ES2 is 2, the frame unit delay value derived from the delay value calculator may be 3.
In another embodiment, the synchronization information included in the video ES may be information in the form of a time code. That is, the synchronization information value may be a value included in the video ES in the form of time code composed of hour, minute, second, frame and the like. In this case, the
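For the time-code form of the synchronization information, the frame unit delay can be derived by converting each hour/minute/second/frame time code to an absolute frame count and subtracting. This sketch assumes a fixed integer frame rate and non-drop-frame time codes; the function names are illustrative.

```python
def timecode_to_frames(hh: int, mm: int, ss: int, ff: int, fps: int) -> int:
    """Convert an hour/minute/second/frame time code to an absolute frame count."""
    return ((hh * 60 + mm) * 60 + ss) * fps + ff

def frame_delay(tc1, tc2, fps: int = 30) -> int:
    """Frame unit delay between two streams from their time-code sync values."""
    return timecode_to_frames(*tc1, fps) - timecode_to_frames(*tc2, fps)

# e.g. ES1 carries 00:00:01:05 while ES2 carries 00:00:01:02 at 30 fps
delay = frame_delay((0, 0, 1, 5), (0, 0, 1, 2), fps=30)  # 3 frames
```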
When an MPEG-2 video encoder is used, the above-described synchronization information can be included in the user data area in the video ES. Here, the user data may be represented by user_data, as an example. In this case, the value of the synchronization information may be a value that is incremented by 1 in frame units, and/or a value that is counted in frame units. As another example, the value of the synchronization information may be included in the form of a time code composed of hour, minute, second, frame, and the like. When an AVC and/or HEVC (High Efficiency Video Coding) encoder is used, the above-described synchronization information may be included in the video ES in the form of an SEI (Supplemental Enhancement Information) message. The location at which the synchronization information is inserted in the video ES and/or the type of the synchronization information may be predetermined. In that case, the delay value calculation module can know the position and/or type of the synchronization information in the video ES without any additional information.
The delay value calculation module (and/or the delay value calculator 730) may replace or change the value of the synchronization information used for calculating the frame unit delay value with a null value after calculating the frame unit delay value. Synchronization information using a closed caption syntax may be inserted into the user data area in the video ES. Here, a closed caption may be a character string provided in synchronization with the audio of a broadcast program, and may be displayed on the screen only when the closed caption function is activated. At this time, when actual caption information is provided, confusion may occur between the closed caption information and the synchronization information for synchronizing the plurality of video signals. Therefore, to reduce this confusion, the delay value calculation module (and/or the delay value calculator 730) may delete the synchronization information used in the frame unit delay value calculation.
FIG. 8 is a flowchart schematically illustrating an embodiment of a 3DTV multiplexing method based on an automatic synchronization scheme according to the present invention.
Referring to FIG. 8, the 3DTV multiplexer according to the embodiment of the present invention can extract PES and PSI for each of a plurality of images (S810).
For example, in a stereoscopic 3DTV service, a left video TS and a right video TS may be input to the 3DTV multiplexer. At this time, the 3DTV multiplexer can extract or generate the left video audio PES, the left video PES, and the left video PSI based on the left video TS. Also, the 3DTV multiplexer can extract or generate the right video audio PES, the right video PES, and the right video PSI based on the right video TS.
Referring again to FIG. 8, the 3DTV multiplexer performs synchronization on the plurality of extracted or generated PESs and outputs a plurality of synchronized PESs (S820). For example, the 3DTV multiplexer can extract synchronization information from the ES included in each of the plurality of PESs and, based on the extracted synchronization information, perform frame-by-frame synchronization of the left video signal and the right video signal. The details of the synchronization method and the synchronization information have been described above and are not repeated here.
Also, the 3DTV multiplexer may modify the PTS / DTS value of each synchronized PES to a new PTS / DTS value based on the new PTS value input from the clock (S830). Concrete embodiments of the PTS / DTS modification method have been described above and are not repeated here.
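The PTS / DTS modification described above can be sketched as follows: both PESs receive the same new PTS from the clock, and each DTS is rebased to the new PTS plus the original DTS minus the original PTS, preserving the decode-to-presentation offset. The function name and the tick values are illustrative assumptions.

```python
def rebase_timestamps(pts: int, dts: int, new_pts: int) -> tuple:
    """Rebase a PES's PTS/DTS pair onto a new PTS taken from the clock.
    New PTS = third PTS; new DTS = third PTS + (old DTS - old PTS)."""
    return new_pts, new_pts + (dts - pts)

# Both the left and right PES get the same third PTS, so after rebasing
# their presentation times coincide (values are illustrative ticks).
left_pts, left_dts = rebase_timestamps(pts=9000, dts=6000, new_pts=100000)
right_pts, right_dts = rebase_timestamps(pts=12000, dts=9000, new_pts=100000)
```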
Referring again to FIG. 8, the 3DTV multiplexer may generate a 3DTV PSI corresponding to the program configuration information based on the left image PSI and the right image PSI (S840). Here, the 3DTV PSI may include PAT information, PMT information, and the like.
In operation S850, the 3DTV multiplexer generates a 3DTV TS by performing multiplexing based on a plurality of synchronized PESs, PCR as time information generated in the clock, and 3DTV PSI.
According to the above-described 3DTV TS generation method (and/or the 3DTV multiplexing method based on the automatic synchronization scheme), a plurality of encoded streams can be automatically synchronized and multiplexed. The present invention can extract the synchronization information included in each encoded stream output from a plurality of encoders, and can synchronize and multiplex the plurality of encoded streams on a frame basis based on the extracted synchronization information. For example, in the case of a stereoscopic 3DTV service, the 3DTV multiplexing apparatus based on the automatic synchronization scheme according to the present invention can receive a left video MPEG-2 transport stream (TS) and a right video MPEG-2 transport stream (TS). At this time, the apparatus can perform frame-by-frame synchronization of the left video transport stream and the right video transport stream based on the synchronization information included in the ES of each transport stream. The 3DTV multiplexing apparatus based on the automatic synchronization scheme can then generate and output a 3DTV transport stream by multiplexing the plurality of synchronized streams.
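The overall flow of steps S810 to S850 can be sketched as follows, with plain Python lists standing in for the PES sequences and the output TS. All names and the sign convention for the delay are assumptions for illustration, not a real MPEG-2 TS implementation.

```python
def multiplex_3dtv(left_frames, right_frames, delay_frames):
    """Align two frame sequences by a frame unit delay, then interleave
    the synchronized pairs into a single output sequence (a stand-in
    for the multiplexed 3DTV TS)."""
    # Assumed sign convention: positive delay means the left stream
    # leads, so its first delay_frames frames have no right-eye partner
    # and are dropped; negative delay means the right stream leads.
    if delay_frames > 0:
        left_frames = left_frames[delay_frames:]
    elif delay_frames < 0:
        right_frames = right_frames[-delay_frames:]
    # Interleave the left/right pairs frame by frame.
    return [frame for pair in zip(left_frames, right_frames) for frame in pair]

ts = multiplex_3dtv(["L0", "L1", "L2"], ["R0", "R1"], delay_frames=1)
```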
Although the above-described embodiments have been described with reference to the stereoscopic 3DTV service, the present invention is not limited thereto. The present invention can be applied in the same or a similar manner whenever a plurality of images are stored or processed together, such as free-viewpoint video, multi-view video, and panoramic video.
According to the present invention, the disadvantages of the 3DTV multiplexer based on the passive synchronization scheme can be solved. The present invention can provide a 3DTV service using a plurality of general DTV encoders without expensive 3DTV-dedicated encoders, and is therefore economically advantageous. In addition, the present invention is expected to contribute to the activation of the 3DTV service by minimizing the economic burden on the 3DTV service provider. As described above, the multiplexing method based on the automatic synchronization scheme according to the present invention can be extended and applied to video services composed of a plurality of mutually correlated images (and/or multiple images), such as multi-view 3D video, free-viewpoint video, and UHDTV services that perform parallel processing.
In the above-described embodiments, the methods are described as a series of steps or blocks on the basis of flowcharts, but the present invention is not limited to the order of the steps; some steps may occur in a different order than, or simultaneously with, the steps described above. It will also be understood by those skilled in the art that the steps depicted in the flowcharts are not exclusive, that other steps may be included, and that one or more steps in a flowchart may be deleted without affecting the scope of the invention.
The above-described embodiments include examples of various aspects. While it is not possible to describe every possible combination for expressing various aspects, one of ordinary skill in the art will recognize that other combinations are possible. Accordingly, it is intended that the invention include all alternatives, modifications and variations that fall within the scope of the following claims.
Claims (15)
A 3DTV multiplexing method comprising:
Performing synchronization on a left image PES (Packetized Elementary Stream) and a right image PES based on a frame unit delay value;
Generating a 3DTV TS (Transport Stream) by multiplexing the synchronized left image PES and the synchronized right image PES; And
Generating 3DTV PSI, which is 3D program configuration information, based on left image PSI (Program Specific Information) corresponding to the left image and right image PSI corresponding to the right image
In the 3DTV TS generation step,
Performs the multiplexing on the synchronized left image PES, the synchronized right image PES, and the 3DTV PSI,
Wherein the frame unit delay value is a value indicating a time difference between the left image and the right image in frame units,
Wherein the left image PSI includes a first PAT (Program Association Table) and a first PMT (Program Map Table), the right image PSI includes a second PAT and a second PMT,
In the 3DTV PSI generation step,
Generating a third PAT having information corresponding to both the first PMT and the second PMT by reconstructing the first PAT and the second PAT,
Changing a stream type value in one PMT corresponding to an additional stream among the first PMT and the second PMT,
Inserting, into the one PMT corresponding to the additional stream, a program information descriptor in which information indicating a program type provided in digital broadcasting is defined, and a video information descriptor in which information indicating a characteristic of an ES constituting video data is defined.
Wherein the deriving of the frame unit delay value comprises:
Extracting a first synchronization information value from a first video ES (Elementary Stream) in the left video PES and extracting a second synchronization information value from a second video ES in the right video PES; And
And deriving the frame unit delay value based on the first synchronization information and the second synchronization information.
Wherein the first synchronization information value is a value included in the first video ES and counted in units of frames, and the second synchronization information value is a value included in the second video ES and counted in units of frames,
In the frame unit delay value deriving step,
Wherein the difference value between the first synchronization information and the second synchronization information is determined as the frame unit delay value.
Wherein the first synchronization information value is a value included in the first video ES in the form of a time code and the second synchronization information value is a value included in the second video ES in the form of a time code,
Wherein the deriving of the frame unit delay value comprises:
Deriving a time difference value in seconds between the first synchronization information and the second synchronization information; And
And deriving the frame unit delay value by multiplying the time difference value in seconds by a frames-per-second value.
The synchronized left picture PES further includes a first PTS (presentation time stamp) and a first DTS (Decoding Time Stamp), and the synchronized right picture PES further includes a second PTS and a second DTS,
The synchronization step may comprise:
Further comprising modifying the values of the first PTS, the first DTS, the second PTS, and the second DTS to new values based on a third PTS input from a clock.
In the modification step,
The value of the first PTS and the value of the second PTS are modified to the value of the third PTS,
Wherein the value of the first DTS is modified to a value obtained by adding the third PTS value to a value obtained by subtracting the first PTS value from the first DTS value,
Wherein the value of the second DTS is modified to a value obtained by subtracting the second PTS value from the second DTS value plus the third PTS value.
Further comprising: extracting the left image PES from a left image TS corresponding to the left image, and extracting the right image PES from a right image TS corresponding to the right image.
A 3DTV multiplexing apparatus comprising:
A synchronization module for performing synchronization on a left image PES (Packetized Elementary Stream) and a right image PES based on a frame unit delay value;
A 3DTV TS packetizer for multiplexing the synchronized left image PES and the synchronized right image PES to generate a 3DTV TS (Transport Stream); And
A 3DTV PSI generation module for generating 3DTV PSI, which is 3D program configuration information, based on a left image PSI (Program Specific Information) corresponding to the left image and a right image PSI corresponding to the right image
The 3DTV TS packetizer performs multiplexing on the synchronized left image PES, the synchronized right image PES, and the 3DTV PSI,
The left image PSI includes a first PAT (Program Association Table) and a first PMT (Program Map Table), and the right image PSI includes a second PAT and a second PMT,
The 3DTV PSI generation module includes:
Generates, by reconfiguring the first PAT and the second PAT, a third PAT having information corresponding to both the first PMT and the second PMT,
Changes a stream type value in one PMT corresponding to the additional stream among the first PMT and the second PMT,
Inserts, into the one PMT corresponding to the additional stream, a program information descriptor in which information indicating a program type provided in digital broadcasting is defined, and a video information descriptor in which information indicating a characteristic of an ES constituting video data is defined,
Wherein the frame unit delay value is a value indicating a time difference between the left image and the right image in frame units.
Wherein the delay value calculation module comprises:
A first synchronization information extractor for extracting a first synchronization information value from a first video ES (Elementary Stream) in the left video PES;
A second synchronization information extractor for extracting a second synchronization information value from a second video ES in the right video PES; And
And a delay value calculator for deriving the frame unit delay value based on the first synchronization information and the second synchronization information.
The synchronized left picture PES further includes a first PTS (presentation time stamp) and a first DTS (Decoding Time Stamp), and the synchronized right picture PES further includes a second PTS and a second DTS,
Wherein the synchronization module comprises:
Further comprising a PTS / DTS modification module for modifying the values of the first PTS, the first DTS, the second PTS, and the second DTS to new values based on a third PTS input from a clock.
A first de-packetizer for extracting the left image PES from a left image TS corresponding to the left image; And
And a second de-packetizer for extracting the right video PES from the right video TS corresponding to the right video.
Performing synchronization on the left image PES and the right image PES based on the frame unit delay value; And
Generating 3DTV PSI, which is 3D program configuration information, based on left image PSI (Program Specific Information) corresponding to the left image and right image PSI corresponding to the right image
Wherein the frame unit delay value is a value indicating a time difference between the left image and the right image in frame units,
Wherein the left image PSI includes a first PAT (Program Association Table) and a first PMT (Program Map Table), the right image PSI includes a second PAT and a second PMT,
In the 3DTV PSI generation step,
Generating a third PAT having information corresponding to both the first PMT and the second PMT by reconstructing the first PAT and the second PAT,
Changing a stream type value in one PMT corresponding to an additional stream among the first PMT and the second PMT,
Inserting, into the one PMT corresponding to the additional stream, a program information descriptor in which information indicating a program type provided in digital broadcasting is defined, and a video information descriptor in which information indicating a characteristic of an ES constituting video data is defined.
Priority Applications (2)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
KR1020120051878A KR101699367B1 (en) | 2012-05-16 | 2012-05-16 | Method for 3dtv multiplexing and apparatus thereof |
US13/717,492 US9270972B2 (en) | 2012-05-16 | 2012-12-17 | Method for 3DTV multiplexing and apparatus thereof |
Applications Claiming Priority (1)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
KR1020120051878A KR101699367B1 (en) | 2012-05-16 | 2012-05-16 | Method for 3dtv multiplexing and apparatus thereof |
Publications (2)
Publication Number | Publication Date |
---|---|
KR20130128101A KR20130128101A (en) | 2013-11-26 |
KR101699367B1 true KR101699367B1 (en) | 2017-02-14 |
Family
ID=49580985
Family Applications (1)
Application Number | Title | Priority Date | Filing Date |
---|---|---|---|
KR1020120051878A KR101699367B1 (en) | 2012-05-16 | 2012-05-16 | Method for 3dtv multiplexing and apparatus thereof |
Country Status (2)
Country | Link |
---|---|
US (1) | US9270972B2 (en) |
KR (1) | KR101699367B1 (en) |
Families Citing this family (6)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
US20130276046A1 (en) * | 2012-04-13 | 2013-10-17 | Electronics And Telecommunications Research Institute | Receiving apparatus for receiving a plurality of signals through different paths and method for processing signals thereof |
RU2015105986A (en) * | 2012-08-27 | 2016-09-10 | Сони Корпорейшн | SENDING DEVICE, TRANSMISSION METHOD, RECEIVING DEVICE AND RECEIVING METHOD |
TW201428675A (en) | 2013-01-08 | 2014-07-16 | Pixart Imaging Inc | Video generating system and related method thereof |
JP2015186036A (en) * | 2014-03-24 | 2015-10-22 | ソニー株式会社 | Information processor, information processing system, information processing method, and program |
US20180309972A1 (en) * | 2015-11-11 | 2018-10-25 | Sony Corporation | Image processing apparatus and image processing method |
US10560682B2 (en) | 2017-01-13 | 2020-02-11 | Gopro, Inc. | Methods and apparatus for providing a frame packing arrangement for panoramic content |
Citations (1)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
JP2012085166A (en) | 2010-10-13 | 2012-04-26 | Sony Corp | Video signal processing device, video signal processing method, and computer program |
Family Cites Families (6)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
KR100763441B1 (en) | 2006-09-30 | 2007-10-04 | 광주과학기술원 | Synchronized multiplexing method, device therefor, demultiplexing method and device therefor |
KR100973138B1 (en) | 2008-05-08 | 2010-07-29 | 한양대학교 산학협력단 | Method and system for remultiplex transport stream of multi mode stream in digital broadcasting |
KR100972792B1 (en) * | 2008-11-04 | 2010-07-29 | 한국전자통신연구원 | Synchronizer and synchronizing method for stereoscopic image, apparatus and method for providing stereoscopic image |
MX2010011683A (en) * | 2009-02-19 | 2010-11-30 | Panasonic Corp | Recording medium, reproduction device, and integrated circuit. |
KR20120036724A (en) | 2010-10-08 | 2012-04-18 | 한국전자통신연구원 | Method and appartus for synchronizing 3-dimensional image |
KR101831775B1 (en) * | 2010-12-07 | 2018-02-26 | 삼성전자주식회사 | Transmitter and receiver for transmitting and receiving multimedia content, and reproducing method thereof |
- 2012-05-16 KR KR1020120051878A patent/KR101699367B1/en active IP Right Grant
- 2012-12-17 US US13/717,492 patent/US9270972B2/en not_active Expired - Fee Related
Patent Citations (1)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
JP2012085166A (en) | 2010-10-13 | 2012-04-26 | Sony Corp | Video signal processing device, video signal processing method, and computer program |
Also Published As
Publication number | Publication date |
---|---|
KR20130128101A (en) | 2013-11-26 |
US9270972B2 (en) | 2016-02-23 |
US20130307924A1 (en) | 2013-11-21 |
Similar Documents
Publication | Publication Date | Title |
---|---|---|
JP6034420B2 (en) | Method and apparatus for generating 3D video data stream in which additional information for playback of 3D video is inserted and apparatus thereof, and method and apparatus for receiving 3D video data stream in which additional information for playback of 3D video is inserted | |
JP5575949B2 (en) | Broadcast data transmission method and apparatus | |
KR101683119B1 (en) | Broadcast transmitter, Broadcast receiver and 3D video processing method thereof | |
JP5785193B2 (en) | Data stream generating method and apparatus for providing 3D multimedia service, data stream receiving method and apparatus for providing 3D multimedia service | |
KR100864826B1 (en) | Method and Apparatus for 3D still image service over digital broadcasting | |
US9055280B2 (en) | Method and apparatus for transmitting digital broadcasting stream using linking information about multi-view video stream, and method and apparatus for receiving the same | |
EP2744214A2 (en) | Transmitting device, receiving device, and transceiving method thereof | |
US9210354B2 (en) | Method and apparatus for reception and transmission | |
KR101856093B1 (en) | Content providing apparatus and method, and content reproduction apparatus and method for synchronization between left and right stream in the stationary-mobile hybrid 3dtv broadcast | |
KR101699367B1 (en) | Method for 3dtv multiplexing and apparatus thereof | |
US9516086B2 (en) | Transmitting device, receiving device, and transceiving method thereof | |
WO2012081874A2 (en) | Signaling method for a stereoscopic video service and apparatus using the method | |
KR20150004318A (en) | Signal processing device and method for 3d service | |
WO2013011834A1 (en) | Transmitter, transmission method and receiver | |
KR20150057149A (en) | System and method for providing 3d broadcast service provision based on re-transmission broadcast networks | |
KR20110068821A (en) | Method and apparatus for receiving and transmitting | |
KR101191498B1 (en) | System and Method for synchronization of 3D broadcasting service using real-time broadcasting and non-real time additional broadcasting data | |
KR20140053777A (en) | Method and apparatus for decoder buffering in hybrid coded video system | |
Lee et al. | Delivery system and receiver for service-compatible 3DTV broadcasting | |
KR20150006340A (en) | Method and apparatus for providing three-dimensional video | |
KR101203483B1 (en) | The method to transmit 3 dimensional broadcasting, and the receiver | |
KR20140053938A (en) | Method for transmitting a signal | |
KR20140080701A (en) | Stereoscopic 3dtv re-synchronizing method and its apparatus using caption data |
Legal Events
Date | Code | Title | Description |
---|---|---|---|
A201 | Request for examination | ||
E902 | Notification of reason for refusal | ||
E701 | Decision to grant or registration of patent right | ||
GRNT | Written decision to grant |