WO2012070716A1 - Method for service compatibility-type transmitting in digital broadcast - Google Patents
Method for service compatibility-type transmitting in digital broadcast
- Publication number
- WO2012070716A1 (PCT/KR2011/000362)
- Authority
- WO
- WIPO (PCT)
- Prior art keywords
- image
- broadcasting
- information
- service
- video
- Prior art date
Classifications
-
- H—ELECTRICITY
- H04—ELECTRIC COMMUNICATION TECHNIQUE
- H04N—PICTORIAL COMMUNICATION, e.g. TELEVISION
- H04N13/00—Stereoscopic video systems; Multi-view video systems; Details thereof
- H04N13/10—Processing, recording or transmission of stereoscopic or multi-view image signals
- H04N13/194—Transmission of image signals
-
- H—ELECTRICITY
- H04—ELECTRIC COMMUNICATION TECHNIQUE
- H04N—PICTORIAL COMMUNICATION, e.g. TELEVISION
- H04N13/00—Stereoscopic video systems; Multi-view video systems; Details thereof
- H04N13/10—Processing, recording or transmission of stereoscopic or multi-view image signals
- H04N13/106—Processing image signals
- H04N13/172—Processing image signals image signals comprising non-image signal components, e.g. headers or format information
-
- H—ELECTRICITY
- H04—ELECTRIC COMMUNICATION TECHNIQUE
- H04N—PICTORIAL COMMUNICATION, e.g. TELEVISION
- H04N21/00—Selective content distribution, e.g. interactive television or video on demand [VOD]
- H04N21/20—Servers specifically adapted for the distribution of content, e.g. VOD servers; Operations thereof
- H04N21/23—Processing of content or additional data; Elementary server operations; Server middleware
- H04N21/236—Assembling of a multiplex stream, e.g. transport stream, by combining a video stream with other content or additional data, e.g. inserting a URL [Uniform Resource Locator] into a video stream, multiplexing software data into a video stream; Remultiplexing of multiplex streams; Insertion of stuffing bits into the multiplex stream, e.g. to obtain a constant bit-rate; Assembling of a packetised elementary stream
-
- H—ELECTRICITY
- H04—ELECTRIC COMMUNICATION TECHNIQUE
- H04N—PICTORIAL COMMUNICATION, e.g. TELEVISION
- H04N21/00—Selective content distribution, e.g. interactive television or video on demand [VOD]
- H04N21/20—Servers specifically adapted for the distribution of content, e.g. VOD servers; Operations thereof
- H04N21/23—Processing of content or additional data; Elementary server operations; Server middleware
- H04N21/236—Assembling of a multiplex stream, e.g. transport stream, by combining a video stream with other content or additional data, e.g. inserting a URL [Uniform Resource Locator] into a video stream, multiplexing software data into a video stream; Remultiplexing of multiplex streams; Insertion of stuffing bits into the multiplex stream, e.g. to obtain a constant bit-rate; Assembling of a packetised elementary stream
- H04N21/23608—Remultiplexing multiplex streams, e.g. involving modifying time stamps or remapping the packet identifiers
-
- H—ELECTRICITY
- H04—ELECTRIC COMMUNICATION TECHNIQUE
- H04N—PICTORIAL COMMUNICATION, e.g. TELEVISION
- H04N21/00—Selective content distribution, e.g. interactive television or video on demand [VOD]
- H04N21/80—Generation or processing of content or additional data by content creator independently of the distribution process; Content per se
- H04N21/81—Monomedia components thereof
- H04N21/816—Monomedia components thereof involving special video data, e.g 3D video
-
- H—ELECTRICITY
- H04—ELECTRIC COMMUNICATION TECHNIQUE
- H04N—PICTORIAL COMMUNICATION, e.g. TELEVISION
- H04N2213/00—Details of stereoscopic systems
- H04N2213/007—Aspects relating to detection of stereoscopic image format, e.g. for adaptation to the display format
Definitions
- The present invention relates to a service-compatible method for 3D stereoscopic digital broadcasting in the broadcast MPEG-2 TS (Transport Stream) format used for transmitting and receiving digital TV.
- ATSC (the North American Advanced Television Systems Committee) is the committee, and the family of standards, for digital television broadcasting in the United States.
- The ATSC standard has been adopted as the national standard by the United States, Canada, Mexico, and Korea, and other countries, including several in South America, intend to adopt it as well.
- Other digital broadcasting standards include DVB, developed in Europe, and ISDB, used in Japan.
- The ATSC digital broadcast standard, which carries high-quality video, audio, and auxiliary data, supports a data rate of 19.39 Mbps on a 6 MHz terrestrial broadcast channel and about 38 Mbps on a cable TV channel.
- The video compression technology used in the ATSC system is the ISO/IEC 13818-2 MPEG-2 video standard, and the compression format is MPEG-2 MP@HL, that is, Main Profile at High Level, which defines the format and its constraints.
- The following describes the transmission modes available for carrying out a new broadcast, such as 3D stereoscopic broadcasting, UHD TV broadcasting, or multiview broadcasting, in the MPEG-2 TS format used for transmitting and receiving digital TV, while maintaining compatibility with existing broadcast channels.
- 3D stereoscopic broadcasting, UHD TV broadcasting, and multiview broadcasting will be collectively referred to as composite video broadcasting.
- For composite video broadcasting, two transmission modes usable in the MPEG-2 TS format are distinguished: a frame compatibility mode and a service compatibility mode. When two transmission modes can be used in digital broadcasting, the receiver needs to recognize which transmission mode the transmitter has used.
- An object of the present invention is to propose a method for transmitting detailed information of 3D broadcasting.
- Another object of the present invention is to propose a method of transmitting detailed information of a service compatibility scheme among transmission modes of 3D broadcasting.
- Another object of the present invention is to propose a method for transmitting detailed information that supports both a TS-level multiplexing method and an ES-level multiplexing method for the compressed left and right bitstreams when implementing the service compatibility scheme.
- Accordingly, the present invention proposes a detailed-information transmission scheme that supports both TS-level multiplexing and ES-level multiplexing of the compressed left and right bitstreams when the service compatibility scheme is implemented in 3D broadcasting.
- FIG. 1 illustrates a frame compatibility mode according to an embodiment of the present invention.
- FIG. 2 illustrates a service compatibility mode according to an embodiment of the present invention.
- FIG. 3 illustrates a multiplexing method of a TS level and a multiplexing method of an ES level according to an embodiment of the present invention.
- FIG. 4 illustrates a structure of a program map table (PMT) syntax according to an embodiment of the present invention.
- PMT program map table
- FIG. 5 illustrates a service_compatible_stereoscopic_video_descriptor according to an embodiment of the present invention.
- FIG. 6 illustrates a stereoscopic_stream_descriptor according to an embodiment of the present invention.
- FIG. 7 illustrates MPEG2_video_3d_frame_frame_packing_arrangement_descriptor according to an embodiment of the present invention.
- FIG. 8 is a flowchart illustrating a process of performing multiplexing in a service compatibility mode according to an embodiment of the present invention.
- FIG. 9 illustrates a process of separating bit strings of left and right images when MVC bit string assembling is performed in an ES-level bit string multiplexing method according to an embodiment of the present invention.
- digital broadcasting is classified into 3D stereoscopic broadcasting, ultra high definition (UHD) TV broadcasting, and multi-view broadcasting.
- Compared with conventional HD broadcasting, 3D stereoscopic broadcasting transmits two screens,
- UHD broadcasting comprises four screens' worth of pixels (4K), and
- multiview broadcasting comprises two or more screens.
- To transmit a 3D stereoscopic image in stereo format, a packet identifier (PID) of the MPEG-2 TS is assigned to each of the left and right images, and the two streams are multiplexed for transmission.
- UHD video generally has a horizontal pixel count on the order of 4000 (4K, 3840x2160) to 8000 (8K, 7680x4320).
- Since perceived sharpness depends on the number of pixels, 4K UHD video is four times sharper than HD (2K, 1920x1080), and at 8K the difference is up to sixteen times. The frame rate also increases: HD delivers 30 frames per second, whereas UHD at 60 Hz delivers 60 frames per second, so the picture appears much more natural and dynamic.
- Multi-view broadcasting allows a 3D stereoscopic image to be formed by combining two of the images captured from different angles (up, down, left, and right) according to the viewer's viewing position. If the television is equipped with a multi-view display, then when an actor appears on the screen, a viewer on the left sees the actor's left profile and a viewer on the right sees the actor's right profile, each as a 3D stereoscopic image. In other words, it is an advanced form of 3D stereoscopic broadcasting.
- The present invention proposes transmission and reception rules for selecting an appropriate transmission mode when carrying out a new broadcast, using any one of 3D stereoscopic broadcasting, UHD TV broadcasting, and multiview broadcasting, while maintaining compatibility with an existing broadcast channel.
- FIG. 1A illustrates a frame compatibility mode
- FIG. 1B illustrates an example of a method of synthesizing an image to configure a frame compatibility mode.
- FIGS. 1A and 1B are both examples of 3D stereoscopic broadcasting, and UHD broadcasting and multiview broadcasting may be extended in a similar manner.
- a frame compatibility mode will be described in detail with reference to FIGS. 1A and 1B.
- The frame compatibility mode transmits, in one transmission band, a single frame in which the left and right images are combined. It is therefore possible to keep the same transmission/reception chain as conventional HD broadcasting.
- Whereas a conventional HD broadcast transmits a single video over the entire frame area,
- a broadcast in frame compatibility mode transmits a synthesized video containing as many images as are combined. That is, as illustrated in FIG. 1B, the left image and the right image may be synthesized into one frame in various ways: the frame may be split in half with one image in each half, as shown in FIG. 1B(a), or the images may be interleaved pixel by pixel, as shown in FIG. 1B(b). Alternatively, the left and right images may be transmitted by alternating them in temporal order.
- Since several images are synthesized into one frame, a process of scaling down each image is necessary.
- It may also be necessary to increase or otherwise adjust the video compression bitrate.
- As shown in the example of FIG. 1B, when the images are synthesized for a 3D stereoscopic image, the positions of the left and right images may be swapped, or the images may be mixed in units of diagonal pixels.
- FIG. 2 shows the service compatibility mode.
- the service compatibility mode will be described in detail with reference to FIG. 2.
- FIG. 2 is a diagram illustrating 3D stereoscopic broadcasting, and UHD broadcasting and multiview broadcasting may be extended in a similar manner.
- In the service compatibility mode, the left image frame and the right image frame are compressed separately and transmitted in one transmission band instead of being synthesized into a single frame. That is, as shown in FIG. 2, the left and right image frames are each compressed with their own compression scheme, and the compressed frames are transmitted through one transmission band.
- One image is compressed so as to be compatible with existing HD broadcasting, while the other image is encoded and transmitted with a more efficient compression scheme.
- One of the left and right images is transmitted at high resolution, and the other is transmitted at a lower resolution.
- For example, the left image may be encoded and transmitted using the MPEG-2 Main Profile,
- while the right image is encoded and transmitted using the MPEG-4 AVC/H.264 High Profile.
- With this encoding, the left video stream may be transmitted at 1080i @ 60 Hz and the right video stream at 720p @ 60 Hz.
- Alternatively, the left image is left as it is, the right image is subsampled in the vertical or horizontal direction, and the receiver restores the subsampled right image to the resolution of the left image before forming one stereo image.
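- The relationship between the full-resolution and subsampled views can be pictured with the following minimal sketch (an illustration only, assuming factor-of-2 vertical subsampling with nearest-neighbor restoration; the text above does not prescribe a particular resampling filter):

```python
# Sketch of the resolution relationship described above (an assumption for
# illustration, not the normative algorithm): the right view is subsampled by 2
# in one direction at the transmitter, and the receiver restores it to the
# left view's resolution before pairing the two views.

def subsample_vertical(frame, factor=2):
    """Keep every `factor`-th row of a frame given as a list of rows."""
    return frame[::factor]

def restore_vertical(frame, factor=2):
    """Nearest-neighbor restoration: repeat each row `factor` times."""
    restored = []
    for row in frame:
        restored.extend([row] * factor)
    return restored

if __name__ == "__main__":
    left = [[y] * 8 for y in range(8)]        # stand-in for the full-resolution left view
    right_tx = subsample_vertical(left, 2)    # transmitted at half vertical resolution
    right_rx = restore_vertical(right_tx, 2)  # receiver upsamples back to the left view's size
    assert len(right_rx) == len(left)
    stereo_pair = (left, right_rx)            # one stereo image is then formed
```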
- The transmission mode of a composite broadcast is thus divided into a frame compatibility mode and a service compatibility mode, and a transmitter uses one of the two transmission modes.
- the compressed image is transmitted to the receiving end.
- the receiving end should recognize the transmission mode used by the transmitting end in order to decode the received compressed image.
- In a conventional broadcast reception system that cannot process composite video, only the primary view should be reproduced and the secondary view of the received composite video ignored. As a result, composite broadcasts can be received selectively while compatibility with existing broadcast channels is maintained.
- FIG. 3 illustrates a left and right image multiplexing method used in a service compatibility mode according to an embodiment of the present invention.
- the left and right image multiplexing method used in the service compatibility mode according to an embodiment of the present invention will be described with reference to FIG. 3.
- the multiplexing method used in the service compatibility mode is divided into a TS level multiplexing method and an ES level multiplexing method.
- The TS-level multiplexing method assigns a different PID to each packetized elementary stream (PES) that packetizes the elementary stream (ES) of the left or right video, and therefore needs to specify the PID of the reference video. That is, as shown in FIG. 3, different PIDs are allocated to the left image and to the right image.
- The ES-level multiplexing method merges the bitstreams obtained by compressing the left and right images into one elementary stream (ES) and transmits it using a single PID. The ES-level multiplexing method therefore needs a way to distinguish the compressed left and right video bitstreams within that single elementary stream (ES).
- For this purpose, a byte offset may be used. That is, FIG. 3 illustrates an example in which one PID is assigned to the left and right images together and an offset is described for distinguishing the left image from the right image.
- the ES-level multiplexing method combines the compressed bit strings of the left and right images into one compressed bit string (ES), and allocates and transmits one PID to the packetized PES.
- A description is needed for de-assembling, that is, for bitstream extraction. For example, MVC bitstream assembling may be used, or additional syntax such as a byte offset for separating the two images is required. FIGS. 9 and 10 illustrate the separation process and the combination and separation when MVC bitstream assembling is used.
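- As an illustration of the two schemes shown in FIG. 3, the sketch below shows how a receiver could recover the left and right data either by PID routing (TS level) or by a signaled byte offset within a single ES (ES level). The PID values and the offset parameter are assumptions made for this example, not values defined by this description or by any standard:

```python
# Illustrative sketch (not normative syntax): recovering the left and right
# data under the two multiplexing schemes of FIG. 3.

LEFT_PID, RIGHT_PID = 0x1011, 0x1012   # hypothetical PIDs for TS-level multiplexing

def demux_ts_level(packets):
    """TS level: each view has its own PID, so routing by PID is enough."""
    left, right = [], []
    for pid, payload in packets:
        if pid == LEFT_PID:
            left.append(payload)
        elif pid == RIGHT_PID:
            right.append(payload)
    return b"".join(left), b"".join(right)

def demux_es_level(es_bytes, byte_offset):
    """ES level: one PID carries a combined ES; a signaled byte offset marks
    where the left-view bitstream ends and the right-view bitstream begins."""
    return es_bytes[:byte_offset], es_bytes[byte_offset:]
```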
- Identification information that allows a reception system capable of processing 3D images to recognize that a 3D image is being received is included in the system information.
- Here the system information is referred to as PSI/PSIP (Program Specific Information / Program and System Information Protocol), but any protocol that transmits system information in a table format may be applicable to the present invention regardless of its name.
- PSI is a system standard of MPEG-2 defined for classifying channels and programs
- PSIP is an Advanced Television Systems Committee (ATSC) standard for classifying channels and programs.
- the PSI may include, as an example, a Program Association Table (PAT), a Conditional Access Table (CAT), a Program Map Table (PMT), and a Network Information Table (NIT).
- PAT is special information transmitted by a packet having a PID of '0', and transmits PID information of a corresponding PMT and PID information of a NIT for each program.
- the CAT transmits information about the pay broadcasting system used on the transmitting side.
- The PMT carries the program identification number, the PIDs of the transport stream packets that carry the individual bitstreams (such as video and audio) constituting the program, and the PID on which the PCR is delivered.
- The NIT carries information about the actual transmission network. For example, the PAT, carried with a PID of 0, is parsed to find the program number and the PID of the PMT; parsing the PMT obtained from the PAT then reveals the relationships between the components constituting the program.
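- The PAT lookup described above can be sketched as follows, following the ISO/IEC 13818-1 section layout (CRC checking and error handling are omitted for brevity):

```python
# Sketch of the PAT lookup: parse the PAT section (carried on PID 0) to map
# each program_number to the PID of its PMT.

def parse_pat_section(section: bytes) -> dict:
    section_length = ((section[1] & 0x0F) << 8) | section[2]
    programs = {}
    # Program entries start after the 8-byte section header and end before CRC_32.
    pos, end = 8, 3 + section_length - 4
    while pos + 4 <= end:
        program_number = (section[pos] << 8) | section[pos + 1]
        pid = ((section[pos + 2] & 0x1F) << 8) | section[pos + 3]
        if program_number == 0:
            programs["network_pid"] = pid      # points to the NIT
        else:
            programs[program_number] = pid     # points to that program's PMT
        pos += 4
    return programs
```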
- FIG. 4 illustrates a structure of a program map table (PMT) syntax according to an embodiment of the present invention.
- the table_id field is a table identifier, and an identifier for identifying the PMT may be set.
- the section_syntax_indicator field is an indicator that defines the section format of the PMT.
- the section_length field represents a section length of the PMT.
- the program_number field indicates information of a program as information corresponding to the PAT.
- the version_number field represents a version number of the PMT.
- the current_next_indicator field is an identifier indicating whether the current table section is applicable.
- the section_number field indicates the section number of the current PMT section when the PMT is transmitted divided into one or more sections.
- the last_section_number field represents the last section number of the PMT.
- the PCR_PID field indicates a PID of a packet carrying a program clock reference (PCR) of a current program.
- the program_info_length field represents descriptor length information immediately following the program_info_length field in number of bytes. That is, the length of the descriptors included in the first loop.
- the stream_type field indicates the type and encoding information of the element stream included in the packet having the PID value indicated by the following elementary_PID field.
- the elementary_PID field represents an identifier of the element stream, that is, a PID value of a packet including the corresponding element stream.
- the ES_Info_length field represents descriptor length information immediately after the ES_Info_length field in number of bytes. That is, the length of the descriptors included in the second loop.
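- Walking the two PMT loops described by these fields can be sketched as follows, again following the ISO/IEC 13818-1 layout and returning the descriptor bodies raw (CRC checking is omitted):

```python
# Sketch of parsing the two PMT loops: program-level descriptors after
# program_info_length, then one entry per elementary stream after each
# ES_info_length.

def parse_pmt_section(section: bytes):
    section_length = ((section[1] & 0x0F) << 8) | section[2]
    end = 3 + section_length - 4                      # stop before CRC_32
    pcr_pid = ((section[8] & 0x1F) << 8) | section[9]
    program_info_length = ((section[10] & 0x0F) << 8) | section[11]

    def read_descriptors(buf):
        out, pos = [], 0
        while pos + 2 <= len(buf):
            tag, length = buf[pos], buf[pos + 1]
            out.append((tag, buf[pos + 2:pos + 2 + length]))
            pos += 2 + length
        return out

    # First loop: program-level descriptors.
    program_descriptors = read_descriptors(section[12:12 + program_info_length])

    # Second loop: one entry per elementary stream.
    streams, pos = [], 12 + program_info_length
    while pos + 5 <= end:
        stream_type = section[pos]
        elementary_pid = ((section[pos + 1] & 0x1F) << 8) | section[pos + 2]
        es_info_length = ((section[pos + 3] & 0x0F) << 8) | section[pos + 4]
        es_descriptors = read_descriptors(section[pos + 5:pos + 5 + es_info_length])
        streams.append((stream_type, elementary_pid, es_descriptors))
        pos += 5 + es_info_length
    return pcr_pid, program_descriptors, streams
```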
- The descriptor carrying the composition information about the left and right images for a specific program number is placed among the descriptors that follow the program_info_length syntax.
- Descriptors related to the individual left and right image ESs are placed among the descriptors that follow the ES_info_length syntax.
- the composition information related descriptor for the left and right images is defined as service_compatible_stereoscopic_video_descriptor ().
- Information related to the frame packing arrangement is described using MPEG2_video_3d_frame_frame_packing_arrangement_descriptor(), which is defined for the frame compatibility scheme.
- The position of MPEG2_video_3d_frame_frame_packing_arrangement_descriptor() may instead be specified at the descriptor position under ES_info_length rather than at the position shown.
- In that case, the PID and the descriptor need to exist together.
- the descriptor related to the left and right images individual ES is defined by stereoscopic_stream_descriptor ().
- Although the positions of these descriptors are expressed directly in the syntax, note that in practice they are included selectively, like conventional descriptors.
- In FIG. 4, the composition-related descriptor for the left and right images of a specific program number is constructed first, followed by the descriptors related to the individual ESs, but the arrangement is not limited thereto. That is, just as the position of MPEG2_video_3d_frame_frame_packing_arrangement_descriptor() may be changed as described above, the position of each descriptor in FIG. 4 may vary depending on the situation.
- Table 1 below illustrates the stream_type values shown in FIG. 4.
- FIG. 5 illustrates a service_compatible_stereoscopic_video_descriptor according to an embodiment of the present invention.
- service_compatible_stereoscopic_video_descriptor according to an embodiment of the present invention will be described in detail with reference to FIG. 5.
- The service_compatible_stereoscopic_video_descriptor is the composition-information descriptor for the left and right images, and includes items such as the resolution of the left and right images, a secondary-view viewing restriction, whether the GOP structures are aligned, and whether the ES-level multiplexing method is used.
- FIG. 5A shows an example of not using a PID
- FIG. 5B shows an example of using a PID.
- If Resolution_Conversion_flag is 1, the Primary_Conversion_Type and Secondary_Conversion_Type syntax elements are present; if it is 0, the resolution of the left and right images is the same.
- Primary_Conversion_Type indicates how the primary image has been divided relative to its original video, as shown in Table 2 below. However, the values and contents of Table 2 are examples and may be changed, reduced, or extended as necessary.
- Secondary_Conversion_Type indicates how the secondary image has been divided relative to its original image, as shown in Table 3.
- Allowance_of_Secondary_View_Presentation_flag, when set to 1, indicates that presentation (for example, a 2D broadcast service) using the secondary view is permitted.
- An Alignment_of_GOP_Structure_flag of 1 means that the GOP structures of the left and right images match; when they do not match, this signals that additional processing for synchronization during presentation is required, depending on the GOP structure.
- Synchronization between the left and right images is basically achieved through the PTS, but this signaling is intended to let the receiver prepare the necessary processing in advance. For example, the overall delay may be adjusted to that of whichever of the left and right images has the larger delay.
- An ES_level_composition_flag of 1 indicates that the ES-level multiplexing method is used.
- Primary_PID_flag is a flag that exists only in the TS level multiplexing scheme. If it is 1, it means that there is a Primary_PID syntax. Otherwise, stereoscopic_stream_descriptor () exists to identify the PID of the main image.
- Primary_PID specifies the PID of the primary image, allowing the primary image to be identified among the PIDs included in the PMT.
- Right_Is_Primary_flag is a flag present only in the TS-level multiplexing scheme. When it is present, a value of 1 means that the primary image is the right image; otherwise the left image is the primary image.
- In that case the stereoscopic_stream_descriptor need not exist, and the primary image and the left image can be identified with this single descriptor.
- In a similar manner, a Left_PID may be signaled and Right_Is_Primary_flag designated, and so on, without departing from the spirit of the present invention.
- Alternatively, a stereoscopic_stream_descriptor may be present for each image to indicate whether that image is the primary view and whether it is the right image.
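- A sketch of reading the descriptor fields introduced above is given below. The normative bit layout is defined in FIG. 5, which is not reproduced here, so the packing assumed in this sketch (all flags in the first byte, one-byte conversion types, a 13-bit Primary_PID) is an illustrative assumption rather than the actual syntax:

```python
# Sketch of reading service_compatible_stereoscopic_video_descriptor fields.
# The bit packing below is an assumption made for illustration only.

def parse_service_compatible_stereoscopic_video_descriptor(body: bytes) -> dict:
    flags = body[0]
    d = {
        "resolution_conversion":   bool(flags & 0x80),  # Resolution_Conversion_flag
        "allow_secondary_view_2d": bool(flags & 0x40),  # Allowance_of_Secondary_View_Presentation_flag
        "gop_structures_aligned":  bool(flags & 0x20),  # Alignment_of_GOP_Structure_flag
        "es_level_composition":    bool(flags & 0x10),  # ES_level_composition_flag
        "primary_pid_present":     bool(flags & 0x08),  # Primary_PID_flag
    }
    pos = 1
    if d["resolution_conversion"]:
        d["primary_conversion_type"] = body[pos]        # see Table 2
        d["secondary_conversion_type"] = body[pos + 1]  # see Table 3
        pos += 2
    if d["primary_pid_present"]:
        d["primary_pid"] = ((body[pos] & 0x1F) << 8) | body[pos + 1]
        d["right_is_primary"] = bool(body[pos + 2] & 0x80)
        pos += 3
    return d
```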
- The stereoscopic_stream_descriptor is the descriptor related to the individual left and right image ESs, and serves to specify whether the current ES is the primary view of the stereoscopic image.
- In the TS-level multiplexing scheme each ES is described separately, while in the ES-level composition mode a single ES is described; this descriptor accommodates both cases.
- Primary_flag, when set to 1, means that the current ES is the primary view. When only one image is to be played, the primary image must be the one played.
- When left_flag is set to 1, it means that the current ES is the bitstream of the left image.
- First_Primary_flag, when set to 1, signals that when two video bitstreams are assembled (interleaved) in arbitrary units at the ES level, the first part is the bitstream corresponding to the primary image.
- First_Left_flag, when set to 1, signals that when two video bitstreams are assembled (interleaved) in arbitrary units at the ES level, the first part is the bitstream corresponding to the left image.
- Information related to the frame packing arrangement uses MPEG2_video_3d_frame_frame_packing_arrangement_descriptor() as defined in the frame compatibility mode, and its semantics may be the same as defined there.
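- A companion sketch for stereoscopic_stream_descriptor follows; here too the real bit layout is given in FIG. 6, so packing the four flags into the top bits of a single byte is only an assumption made to illustrate their meaning:

```python
# Sketch of reading stereoscopic_stream_descriptor flags (assumed packing).

def parse_stereoscopic_stream_descriptor(body: bytes) -> dict:
    flags = body[0]
    return {
        "primary":       bool(flags & 0x80),  # this ES is the primary view
        "left":          bool(flags & 0x40),  # this ES is the left-image bitstream
        "first_primary": bool(flags & 0x20),  # in ES-level interleaving, the first part is primary
        "first_left":    bool(flags & 0x10),  # in ES-level interleaving, the first part is left
    }
```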
- FIG. 8 is a flowchart illustrating a process of performing multiplexing in a service compatibility mode according to an embodiment of the present invention.
- a process of performing multiplexing in the service compatibility mode according to an embodiment of the present invention will be described with reference to FIG. 8.
- In step S800, the PMT length is checked using section_length.
- In step S802, the syntax including program_number is read.
- In step S804, the descriptor length is checked using program_info_length.
- In step S806, it is checked whether all descriptors related to the program info have been read. If all have been read, the process moves to step S820 via connector A; if not, it moves to step S808, where one descriptor related to the program info is read.
- In step S810, it is determined whether the descriptor is a service_compatible_stereoscopic_video_descriptor. If so, the process moves to step S812; otherwise it returns to step S806.
- In step S812, the syntax including ES_level_composition_flag is read and the composition information related to the left and right images is analyzed.
- In step S814, it is checked whether ES_level_composition_flag is set. If it is set, the process moves to step S816; otherwise it moves to step S818.
- Step S816 activates the ES-level multiplexing structure mode,
- and step S818 activates the TS-level multiplexing structure mode.
- In step S820, it is checked whether the entire PMT has been read. If so, the process moves to step S842; if not, it moves to step S822, where stream_type and elementary_PID are read.
- In step S824, the descriptor length is checked using ES_info_length.
- In step S826, it is checked whether all ES-info-related descriptors have been read. If all have been read, the process moves to step S820; if not, it moves to step S828.
- In step S828, one descriptor related to the ES info is read, and the process moves to step S830.
- In step S830, it is determined whether the descriptor is a stereoscopic_stream_descriptor. If so, the process moves to step S834; otherwise it moves to step S838.
- Step S838 reads First_Primary_flag and First_Left_flag, and it is then determined whether the video data located at the head of the data corresponding to the current elementary_PID corresponds to the primary video and to the left video.
- In step S842, the CRC_32 is read and the data is checked for errors.
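- The decision flow of FIG. 8 can be condensed into the following sketch, which reuses the descriptor parsers sketched earlier; the descriptor tag values and the shape of the pmt object are assumptions made for illustration, not values taken from this description:

```python
# Condensed sketch of the FIG. 8 flow: walk the PMT descriptors, and when
# service_compatible_stereoscopic_video_descriptor is found, choose the
# multiplexing structure mode from ES_level_composition_flag.

SERVICE_COMPATIBLE_3D_TAG = 0xF0   # hypothetical descriptor tag
STEREOSCOPIC_STREAM_TAG = 0xF1     # hypothetical descriptor tag

def select_multiplexing_mode(pmt) -> str:
    mode = "none"
    # First loop: program-level descriptors (after program_info_length).
    for desc in pmt.program_descriptors:
        if desc.tag == SERVICE_COMPATIBLE_3D_TAG:
            fields = parse_service_compatible_stereoscopic_video_descriptor(desc.body)
            mode = "ES-level" if fields["es_level_composition"] else "TS-level"
    # Second loop: per-ES descriptors (after each ES_info_length).
    for es in pmt.elementary_streams:
        for desc in es.descriptors:
            if desc.tag == STEREOSCOPIC_STREAM_TAG:
                es.stereo_info = parse_stereoscopic_stream_descriptor(desc.body)
    return mode
```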
- FIG. 9 is a flowchart illustrating a process of separating bit strings of an ES level multiplexing scheme.
- a process of performing bit string separation of an ES level multiplexing scheme in a service compatibility mode according to an embodiment of the present invention will be described with reference to FIG. 9.
- the following procedure assumes that the bit strings are mixed left and right, and one of ordinary skill in the art will understand that the bit strings may be mixed right and left.
- Step S900 parses the PMT.
- Step S901 parses service_compatible_stereoscopic_video_descriptor () in the PMT.
- In step S902, ES_level_composition_flag is checked to determine whether ES-level multiplexing is used.
- Step S903 checks MVC_bitstream_assembling_flag to determine whether the ES-level multiplexing of the bitstream follows the MVC bitstream assembling method (as defined for the Stereo High profile or the Multiview High profile).
- Step S904 parses stereoscopic_stream_descriptor() in the PMT.
- Step S905 checks First_Primary_flag.
- Step S906 checks First_Left_flag.
- Step S907 detects one AU from the received mixed bit stream.
- In step S908, whether the data at the head of the stream belongs to the left image is determined from the First_Left_flag identified in step S906.
- Step S909 determines whether the AU detected in step S907 is an odd-numbered AU; odd-numbered AUs proceed to step S910 and even-numbered AUs to step S911.
- In steps S910 and S911, the corresponding AU is classified as a left-image or right-image bit string using the information determined in step S909.
- In step S912, it is determined whether all AUs have been read; once all have been read, the process moves to step S913.
- In step S913, whether the data at the head of the stream belongs to the primary image is determined from the First_Primary_flag identified in step S905. If First_Primary_flag is 1, the process moves to step S914; if it is 0, the process moves to step S915.
- In steps S914 and S915, which of the separated bit strings is the primary image is determined using the information from step S913.
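- The separation logic of steps S907 to S915 can be sketched as follows, assuming the AUs have already been delimited (step S907):

```python
# Sketch of the FIG. 9 separation logic: AUs alternate between the two views,
# First_Left_flag says which view comes first, and First_Primary_flag then
# says which of the two separated streams is the primary one.

def split_interleaved_aus(aus, first_left_flag, first_primary_flag):
    left, right = [], []
    for index, au in enumerate(aus):                  # index 0 is the "first" AU
        first_view = (index % 2 == 0)
        goes_left = first_view if first_left_flag else not first_view
        (left if goes_left else right).append(au)
    # The first part is primary iff First_Primary_flag is set, so the primary
    # stream is the left one exactly when the two flags agree.
    primary = left if (bool(first_primary_flag) == bool(first_left_flag)) else right
    return left, right, primary
```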
- FIG. 10 is a diagram illustrating a bit string separation method according to the MVC Bitstream Extraction method corresponding to S907 to S912 of FIG. 9.
- When the mixed left and right bitstream is received, it is assembled in units of one AU as shown in FIG. 9. Since it is already known whether the first AU is the primary image or the left image, the bit strings of the left and right images can be separated as shown in FIG. 10 even without parsing the NAL header to check the view_id or anchor_pic_flag carried there.
- The present invention is not limited to 3D stereoscopic broadcasting; it is a technology that can equally be applied to other composite broadcasting such as UHD TV broadcasting and multi-view broadcasting.
Landscapes
- Engineering & Computer Science (AREA)
- Multimedia (AREA)
- Signal Processing (AREA)
- Testing, Inspecting, Measuring Of Stereoscopic Televisions And Televisions (AREA)
Claims (9)
- A method of receiving information on a service compatibility scheme of a digital broadcast program, the method comprising: setting an identifier indicating information related to the service compatibility scheme; and distinguishing the service compatibility scheme according to the identifier included in the received digital broadcast program.
- The method of claim 1, wherein the service compatibility scheme is a scheme in which a left image frame and a right image frame are each compressed and transmitted in one transmission band.
- The method of claim 2, wherein the information related to the service compatibility scheme is at least one of information on the resolution of the left and right images, secondary-view viewing restriction information, whether the GOP structures are aligned, and whether ES-level multiplexing is used.
- The method of claim 3, wherein the resolution of the left and right images includes an identifier indicating how the secondary image is divided relative to the original image.
- The method of claim 4, wherein the identifier indicating how the secondary image is divided relative to the original image includes an identifier indicating at least one of no size change, division into two horizontally, division into two vertically, and division into two both horizontally and vertically.
- The method of claim 3, wherein the secondary-view viewing restriction information includes an identifier indicating whether a 2D broadcast service using the secondary image is provided.
- The method of claim 1, wherein the digital broadcast includes at least one of 3D broadcasting, UHD TV broadcasting, and multiview broadcasting.
- A method of receiving information on a service compatibility scheme of a digital broadcast program, the method comprising: extracting an identifier, included in a received stream, indicating that the stream is the reference image; and obtaining the reference image according to the extracted identifier.
- The method of claim 8, wherein the stream further includes an identifier distinguishing whether the stream is the right image or the left image.
Priority Applications (5)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
CA2818932A CA2818932C (en) | 2010-11-27 | 2011-01-18 | Method for service compatibility-type transmitting in digital broadcast |
MX2013005885A MX2013005885A (es) | 2010-11-27 | 2011-01-18 | Method for service compatibility-type transmitting in digital broadcast |
US13/989,678 US8928733B2 (en) | 2010-11-27 | 2011-01-18 | Method for service compatibility-type transmitting in digital broadcast |
US14/104,868 US9204124B2 (en) | 2010-11-27 | 2013-12-12 | Method for service compatibility-type transmitting in digital broadcast |
US14/919,101 US9635344B2 (en) | 2010-11-27 | 2015-10-21 | Method for service compatibility-type transmitting in digital broadcast |
Applications Claiming Priority (2)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
KR10-2010-0119247 | 2010-11-27 | ||
KR1020100119247A KR20120058702A (ko) | 2010-11-27 | Method for service compatibility-type transmitting in digital broadcast
Related Child Applications (2)
Application Number | Title | Priority Date | Filing Date |
---|---|---|---|
US13/989,678 A-371-Of-International US8928733B2 (en) | 2010-11-27 | 2011-01-18 | Method for service compatibility-type transmitting in digital broadcast |
US14/104,868 Continuation US9204124B2 (en) | 2010-11-27 | 2013-12-12 | Method for service compatibility-type transmitting in digital broadcast |
Publications (1)
Publication Number | Publication Date |
---|---|
WO2012070716A1 true WO2012070716A1 (ko) | 2012-05-31 |
Family
ID=46146044
Family Applications (1)
Application Number | Title | Priority Date | Filing Date |
---|---|---|---|
PCT/KR2011/000362 WO2012070716A1 (ko) | 2010-11-27 | 2011-01-18 | Method for service compatibility-type transmitting in digital broadcast
Country Status (5)
Country | Link |
---|---|
US (3) | US8928733B2 (ko) |
KR (1) | KR20120058702A (ko) |
CA (1) | CA2818932C (ko) |
MX (1) | MX2013005885A (ko) |
WO (1) | WO2012070716A1 (ko) |
Cited By (1)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
WO2015037964A1 (ko) * | 2013-09-16 | 2015-03-19 | Samsung Electronics Co., Ltd. | Broadcast receiving apparatus and control method thereof |
Families Citing this family (5)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
US9288513B2 (en) | 2011-08-29 | 2016-03-15 | Aerovironment, Inc. | System and method of high-resolution digital data image transmission |
US8922587B2 (en) * | 2013-03-14 | 2014-12-30 | The United States Of America As Represented By The Secretary Of The Army | Crew shared video display system and method |
KR20160002679A (ko) * | 2013-05-01 | 2016-01-08 | LG Electronics Inc. | Signal transceiving apparatus and signal transceiving method |
CN103618913A (zh) * | 2013-12-13 | 2014-03-05 | Leshi Zhixin Electronic Technology (Tianjin) Co., Ltd. | Method and device for playing 3D video sources on a smart TV |
JP6507797B2 (ja) * | 2015-03-31 | 2019-05-08 | Toyoda Gosei Co., Ltd. | Knee-protecting airbag device |
Citations (4)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
JP2002218502A (ja) * | 2001-01-22 | 2002-08-02 | Nippon Television Network Corp | Method and system for transmitting a stereoscopic video signal |
WO2008069613A1 (en) * | 2006-12-08 | 2008-06-12 | Electronics And Telecommunications Research Institute | System for transmitting/receiving digital realistic broadcasting based on non-realtime and method therefor |
KR20090036080A (ko) * | 2007-10-08 | 2009-04-13 | LG Electronics Inc. | Method of configuring a MAF file format and apparatus for decoding a video signal using the same |
US20100134592A1 (en) * | 2008-11-28 | 2010-06-03 | Nac-Woo Kim | Method and apparatus for transceiving multi-view video |
Family Cites Families (27)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
US6055012A (en) * | 1995-12-29 | 2000-04-25 | Lucent Technologies Inc. | Digital multi-view video compression with complexity and compatibility constraints |
KR100657322B1 (ko) * | 2005-07-02 | 2006-12-14 | Samsung Electronics Co., Ltd. | Encoding/decoding method and apparatus for implementing local 3D video |
CN101371584B (zh) | 2006-01-09 | 2011-12-14 | Thomson Licensing | Method and apparatus for providing a reduced-resolution update mode for multi-view video coding |
KR100810318B1 (ko) * | 2006-02-08 | 2008-03-07 | Samsung Electronics Co., Ltd. | Digital multimedia broadcasting conditional access service system and method thereof |
CN101453662B (zh) * | 2007-12-03 | 2012-04-04 | Huawei Technologies Co., Ltd. | Stereoscopic video communication terminal, system, and method |
KR20090097015A (ko) * | 2008-03-10 | 2009-09-15 | Samsung Electronics Co., Ltd. | Scalable image encoding apparatus and scalable image decoding apparatus |
KR101506219B1 (ko) * | 2008-03-25 | 2015-03-27 | Samsung Electronics Co., Ltd. | Method of providing 3D image content, reproduction method, apparatus therefor, and recording medium |
JP5338166B2 (ja) * | 2008-07-16 | 2013-11-13 | Sony Corporation | Transmitting apparatus, stereoscopic image data transmitting method, receiving apparatus, and stereoscopic image data receiving method |
EP2395772A3 (en) * | 2008-09-30 | 2013-09-18 | Panasonic Corporation | Glasses and display device |
KR20100040640A (ko) * | 2008-10-10 | 2010-04-20 | LG Electronics Inc. | Receiving system and data processing method |
US8358331B2 (en) * | 2008-12-02 | 2013-01-22 | Lg Electronics Inc. | 3D caption display method and 3D display apparatus for implementing the same |
KR101591703B1 (ko) | 2009-02-13 | 2016-02-04 | Samsung Electronics Co., Ltd. | Method and apparatus for generating a 3D image data stream, and method and apparatus for receiving a 3D image data stream |
US8289998B2 (en) * | 2009-02-13 | 2012-10-16 | Samsung Electronics Co., Ltd. | Method and apparatus for generating three (3)-dimensional image data stream, and method and apparatus for receiving three (3)-dimensional image data stream |
WO2010095440A1 (ja) * | 2009-02-20 | 2010-08-26 | Panasonic Corporation | Recording medium, playback device, and integrated circuit |
CN102804785A (zh) | 2009-04-13 | 2012-11-28 | RealD Inc. | Encoding, decoding, and distributing enhanced-resolution stereoscopic video |
JP5627860B2 (ja) * | 2009-04-27 | 2014-11-19 | Mitsubishi Electric Corporation | Stereoscopic video distribution system, method, and apparatus, and stereoscopic video viewing system, method, and apparatus |
JP5448558B2 (ja) * | 2009-05-01 | 2014-03-19 | Sony Corporation | Transmitting apparatus, method for transmitting stereoscopic image data, receiving apparatus, method for receiving stereoscopic image data, relay apparatus, and method for relaying stereoscopic image data |
JP5463747B2 (ja) * | 2009-06-15 | 2014-04-09 | Sony Corporation | Receiving apparatus, transmitting apparatus, communication system, display control method, program, and data structure |
KR101372376B1 (ko) * | 2009-07-07 | 2014-03-14 | University-Industry Cooperation Group of Kyung Hee University | Method for receiving stereoscopic video in a digital broadcasting system |
US8493434B2 (en) * | 2009-07-14 | 2013-07-23 | Cable Television Laboratories, Inc. | Adaptive HDMI formatting system for 3D video transmission |
KR101694821B1 (ko) | 2010-01-28 | 2017-01-11 | Samsung Electronics Co., Ltd. | Method and apparatus for transmitting a digital data stream using link information for a multi-view video stream, and method and apparatus for receiving a digital data stream using the link information |
WO2011108903A2 (ko) * | 2010-03-05 | 2011-09-09 | Electronics and Telecommunications Research Institute | Transmitting and receiving methods and apparatuses for providing a 3DTV broadcast service linked across multiple transport layers |
US20130209063A1 (en) * | 2010-08-17 | 2013-08-15 | Lg Electronics Inc. | Digital receiver and content processing method in digital receiver |
WO2012026746A2 (en) | 2010-08-23 | 2012-03-01 | Lg Electronics Inc. | Method for providing 3d video data in a 3dtv |
CN103168473B (zh) * | 2010-10-16 | 2016-09-28 | LG Electronics Inc. | Digital receiver and method for processing 3D content in the digital receiver |
CN103202023A (zh) | 2010-10-25 | 2013-07-10 | Panasonic Corporation | Encoding method, display device, and decoding method |
WO2012121567A2 (ko) * | 2011-03-10 | 2012-09-13 | Electronics and Telecommunications Research Institute | Method and apparatus for synchronizing the reference image and an additional image of a real-time broadcast program |
-
2010
- 2010-11-27 KR KR1020100119247A patent/KR20120058702A/ko active Application Filing
-
2011
- 2011-01-18 US US13/989,678 patent/US8928733B2/en active Active
- 2011-01-18 MX MX2013005885A patent/MX2013005885A/es active IP Right Grant
- 2011-01-18 CA CA2818932A patent/CA2818932C/en active Active
- 2011-01-18 WO PCT/KR2011/000362 patent/WO2012070716A1/ko active Application Filing
-
2013
- 2013-12-12 US US14/104,868 patent/US9204124B2/en active Active
-
2015
- 2015-10-21 US US14/919,101 patent/US9635344B2/en active Active
Also Published As
Publication number | Publication date |
---|---|
US9204124B2 (en) | 2015-12-01 |
US20140168364A1 (en) | 2014-06-19 |
KR20120058702A (ko) | 2012-06-08 |
US20130250057A1 (en) | 2013-09-26 |
US8928733B2 (en) | 2015-01-06 |
CA2818932A1 (en) | 2012-05-31 |
US20160073086A1 (en) | 2016-03-10 |
CA2818932C (en) | 2017-11-07 |
MX2013005885A (es) | 2013-10-25 |
US9635344B2 (en) | 2017-04-25 |
Similar Documents
Publication | Publication Date | Title |
---|---|---|
KR101683119B1 (ko) | Broadcast transmitter, broadcast receiver, and 3D video data processing method | |
WO2012070715A1 (ko) | Method for providing and recognizing a transmission mode in digital broadcasting | |
WO2011108903A2 (ko) | Transmitting and receiving methods and apparatuses for providing a 3DTV broadcast service linked across multiple transport layers | |
WO2013019042A1 (ko) | Transmitting apparatus and method and receiving apparatus and method for providing a 3D service by linking a reference image transmitted in real time with an additional image and content transmitted separately | |
WO2010041905A2 (ko) | Receiving system and data processing method | |
WO2010117129A2 (en) | Broadcast transmitter, broadcast receiver and 3d video data processing method thereof | |
US10123069B2 (en) | Receiving apparatus, receiving method, and receiving display method for displaying images at specified display positions | |
EP2353299A2 (en) | Apparatus and method for synchronizing stereoscopic image, and apparatus and method for providing stereoscopic image based on the same | |
WO2012070716A1 (ko) | Method for service compatibility-type transmitting in digital broadcast | |
WO2011046271A1 (en) | Broadcast receiver and 3d video data processing method thereof | |
US20140327740A1 (en) | Transmission apparatus, transmisson method, receiver and receiving method | |
KR20140000136A (ko) | Image data transmitting apparatus, image data transmitting method, image data receiving apparatus, and image data receiving method | |
US20140232823A1 (en) | Transmission device, transmission method, reception device and reception method | |
KR101818141B1 (ko) | Method for service compatibility-type transmitting in digital broadcast | |
KR20110068821A (ko) | Transmitting and receiving apparatus and transmitting and receiving method | |
WO2012074331A2 (ko) | Method and apparatus for transmitting stereoscopic image information | |
WO2013055032A1 (ko) | Content providing apparatus and method for accessing a content stream in hybrid 3DTV, and content reproducing apparatus and method | |
WO2017164551A1 (ko) | Method and apparatus for transmitting and receiving broadcast signals | |
KR101779054B1 (ko) | Method for providing and recognizing a transmission mode in digital broadcasting | |
KR101277267B1 (ko) | Data codec method and apparatus for 3D broadcasting | |
WO2016036012A1 (ko) | Method and apparatus for transmitting and receiving broadcast signals | |
WO2013058455A1 (ko) | Apparatus and method for synchronizing images by adding synchronization information to the auxiliary data space of a video signal | |
KR20120139643A (ko) | Data codec method and apparatus for 3D broadcasting | |
KR20120087869A (ko) | Data codec method and apparatus for 3D broadcasting | |
Legal Events
Date | Code | Title | Description |
---|---|---|---|
121 | Ep: the epo has been informed by wipo that ep was designated in this application |
Ref document number: 11842604 Country of ref document: EP Kind code of ref document: A1 |
|
ENP | Entry into the national phase |
Ref document number: 2818932 Country of ref document: CA |
|
WWE | Wipo information: entry into national phase |
Ref document number: MX/A/2013/005885 Country of ref document: MX |
|
NENP | Non-entry into the national phase |
Ref country code: DE |
|
WWE | Wipo information: entry into national phase |
Ref document number: 13989678 Country of ref document: US |
|
122 | Ep: pct application non-entry in european phase |
Ref document number: 11842604 Country of ref document: EP Kind code of ref document: A1 |