WO2007069189A2 - Method and processing unit for processing a video transport stream - Google Patents


Info

Publication number
WO2007069189A2
Authority
WO
WIPO (PCT)
Prior art keywords
viewing position
metadata
video
transport stream
viewer
Prior art date
Application number
PCT/IB2006/054755
Other languages
French (fr)
Other versions
WO2007069189A3 (en)
Inventor
Yang Peng
Mo Li
Original Assignee
Koninklijke Philips Electronics N.V.
Priority date
Filing date
Publication date
Application filed by Koninklijke Philips Electronics N.V. filed Critical Koninklijke Philips Electronics N.V.
Publication of WO2007069189A2 publication Critical patent/WO2007069189A2/en
Publication of WO2007069189A3 publication Critical patent/WO2007069189A3/en

Classifications

    • HELECTRICITY
    • H04ELECTRIC COMMUNICATION TECHNIQUE
    • H04NPICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N21/00Selective content distribution, e.g. interactive television or video on demand [VOD]
    • H04N21/40Client devices specifically adapted for the reception of or interaction with content, e.g. set-top-box [STB]; Operations thereof
    • H04N21/43Processing of content or additional data, e.g. demultiplexing additional data from a digital video stream; Elementary client operations, e.g. monitoring of home network or synchronising decoder's clock; Client middleware
    • H04N21/435Processing of additional data, e.g. decrypting of additional data, reconstructing software from modules extracted from the transport stream
    • HELECTRICITY
    • H04ELECTRIC COMMUNICATION TECHNIQUE
    • H04NPICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N13/00Stereoscopic video systems; Multi-view video systems; Details thereof
    • H04N13/10Processing, recording or transmission of stereoscopic or multi-view image signals
    • H04N13/106Processing image signals
    • H04N13/172Processing image signals image signals comprising non-image signal components, e.g. headers or format information
    • H04N13/178Metadata, e.g. disparity information
    • HELECTRICITY
    • H04ELECTRIC COMMUNICATION TECHNIQUE
    • H04NPICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N13/00Stereoscopic video systems; Multi-view video systems; Details thereof
    • H04N13/30Image reproducers
    • H04N13/398Synchronisation thereof; Control thereof
    • HELECTRICITY
    • H04ELECTRIC COMMUNICATION TECHNIQUE
    • H04NPICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N21/00Selective content distribution, e.g. interactive television or video on demand [VOD]
    • H04N21/20Servers specifically adapted for the distribution of content, e.g. VOD servers; Operations thereof
    • H04N21/21Server components or server architectures
    • H04N21/218Source of audio or video content, e.g. local disk arrays
    • H04N21/21805Source of audio or video content, e.g. local disk arrays enabling multiple viewpoints, e.g. using a plurality of cameras
    • HELECTRICITY
    • H04ELECTRIC COMMUNICATION TECHNIQUE
    • H04NPICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N21/00Selective content distribution, e.g. interactive television or video on demand [VOD]
    • H04N21/20Servers specifically adapted for the distribution of content, e.g. VOD servers; Operations thereof
    • H04N21/23Processing of content or additional data; Elementary server operations; Server middleware
    • H04N21/235Processing of additional data, e.g. scrambling of additional data or processing content descriptors
    • HELECTRICITY
    • H04ELECTRIC COMMUNICATION TECHNIQUE
    • H04NPICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N21/00Selective content distribution, e.g. interactive television or video on demand [VOD]
    • H04N21/40Client devices specifically adapted for the reception of or interaction with content, e.g. set-top-box [STB]; Operations thereof
    • H04N21/45Management operations performed by the client for facilitating the reception of or the interaction with the content or administrating data related to the end-user or to the client device itself, e.g. learning user preferences for recommending movies, resolving scheduling conflicts
    • H04N21/4508Management of client data or end-user data
    • H04N21/4524Management of client data or end-user data involving the geographical location of the client
    • HELECTRICITY
    • H04ELECTRIC COMMUNICATION TECHNIQUE
    • H04NPICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N13/00Stereoscopic video systems; Multi-view video systems; Details thereof
    • H04N13/10Processing, recording or transmission of stereoscopic or multi-view image signals
    • H04N13/194Transmission of image signals
    • HELECTRICITY
    • H04ELECTRIC COMMUNICATION TECHNIQUE
    • H04NPICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N13/00Stereoscopic video systems; Multi-view video systems; Details thereof
    • H04N13/30Image reproducers
    • H04N13/366Image reproducers using viewer tracking
    • HELECTRICITY
    • H04ELECTRIC COMMUNICATION TECHNIQUE
    • H04NPICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N19/00Methods or arrangements for coding, decoding, compressing or decompressing digital video signals
    • H04N19/50Methods or arrangements for coding, decoding, compressing or decompressing digital video signals using predictive coding
    • H04N19/597Methods or arrangements for coding, decoding, compressing or decompressing digital video signals using predictive coding specially adapted for multi-view video sequence encoding

Landscapes

  • Engineering & Computer Science (AREA)
  • Multimedia (AREA)
  • Signal Processing (AREA)
  • Databases & Information Systems (AREA)
  • Library & Information Science (AREA)
  • Two-Way Televisions, Distribution Of Moving Picture Or The Like (AREA)

Abstract

The invention relates to a system and method of processing a video transport stream comprising a set of metadata, the metadata each comprising a video signal and viewing position parameters, said viewing position parameters indicating the relative position between a viewer and a video display intended to present video signals, said method comprising the steps of: - receiving (410) viewing position information of the viewer, - selecting (420) in said transport stream the metadata that have viewing position parameters corresponding to said viewing position information, - outputting (430) to the video display the video signal comprised in the selected metadata.

Description

METHOD AND PROCESSING UNIT FOR PROCESSING A VIDEO TRANSPORT
STREAM
FIELD OF THE INVENTION
The invention relates to a method and processing unit for processing a video transport stream.
BACKGROUND OF THE INVENTION
Modern video processing units incorporate many technologies and methods for providing high quality video to viewers. The video may come from broadcast or be stored on an optical disc, such as a BD (Blu-ray Disc) or HD-DVD (High Definition Digital Versatile Disc). The video can be in 3D (three-dimensional) or 2D format. The technologies and methods can rely on Cathode Ray Tubes (CRT), Liquid Crystal Displays (LCD), lasers, or so-called Digital Micromirror Devices (DMD).
Most technologies and methods for providing high quality video are designed to let a viewer who watches video programs from different positions around a display experience stereoscopic video.
However, with these conventional technologies and methods, a viewer has to wear special glasses that use polarization or light shutters to filter out the unwanted video signals while letting the one video signal corresponding to the viewer's position pass to the viewer's eyes. Such technologies have limitations, since they do not provide a comfortable environment for the viewers.
OBJECT AND SUMMARY OF THE INVENTION
It is an object of the invention to provide an improved method and processing unit for processing a video transport stream.
The method according to the invention aims at processing a video transport stream comprising a set of metadata, the metadata each comprising a video signal and viewing position parameters. The viewing position parameters indicate the relative position between a viewer and a video display which is intended to present the video signals. The method comprises the steps of:
- receiving viewing position information of the viewer,
- selecting in said transport stream the metadata that have viewing position parameters corresponding to said viewing position information, and
- outputting to the video display the video signal comprised in the selected metadata.
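These three steps operate on metadata entries that each pair viewing position parameters with a video signal. A minimal sketch of that data model follows; the Python names `Metadata` and `viewing_position` and the sample values are illustrative assumptions, not taken from the patent:

```python
from dataclasses import dataclass

@dataclass
class Metadata:
    """One metadata entry of the transport stream (illustrative model).

    viewing_position is the parameter (e.g. a viewing angle in degrees,
    or a viewing distance) at which video_signal is best watched.
    """
    viewing_position: float
    video_signal: bytes

# A transport stream then carries a set of such entries, one per view:
stream = [
    Metadata(viewing_position=-30.0, video_signal=b"left view"),
    Metadata(viewing_position=0.0,   video_signal=b"front view"),
    Metadata(viewing_position=30.0,  video_signal=b"right view"),
]
```

In the description that follows, the viewing position parameters of such entries are stored in a PID table or a ClipMark.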
The invention also proposes a processing unit for implementing the different steps of said method according to the invention.
The advantage is that the proposed method and processing unit can dynamically select different video signals according to the viewer's current viewing position information, in order to present to the viewer the video that is most suitable for his position relative to the display. In other words, the displayed video signal adapts to the viewer's position.
BRIEF DESCRIPTION OF THE DRAWINGS

These and other aspects of the method and processing unit for processing a video transport stream according to the invention will become apparent from and will be elucidated with respect to the implementations and embodiments described hereinafter and with reference to the accompanying drawings, wherein:
Fig. 1 is a schematic diagram illustrating a system for displaying a video signal to a viewer who is positioned at different viewing positions around a display;
Fig. 2 is a schematic block diagram illustrating a processing unit for processing a video transport stream according to the invention;
Fig. 3 is a schematic diagram illustrating a structure of a clip of AV streams;
Fig. 4 is a flow chart diagram illustrating a method of processing a video transport stream according to the invention.
Same reference numerals are used to denote similar parts throughout the figures.
DETAILED DESCRIPTION OF THE INVENTION

Fig. 1 is a schematic diagram illustrating a system for displaying a video signal to a viewer who is located at different viewing positions around a display 100. A processing unit 200 is used to receive viewing position information of a viewer and to output to the display 100 a video signal that corresponds to that viewing position information. The display 100 presents the video signal to the viewer. The viewing position corresponds to the position of the viewer around the display 100: e.g. the viewer can be located in viewing position 1, viewing position 2, or viewing position 3 to watch the content of a video signal.
The viewing position information indicating the relative position between the viewer and the display 100 may correspond to viewing angles, such as angles a1, a2 or a3, or to viewing distances, such as distances d1, d2 or d3.
The processing unit 200 may be integrated in an optical disc player (e.g. a computer disc drive, a standalone apparatus...) or a digital TV set-top box (STB). The display 100 may correspond, for example, to a TV or a computer display.
Fig. 2 is a schematic block diagram illustrating a processing unit 200 for processing a video transport stream according to the invention. The video transport stream comprises a set of metadata, and the metadata each comprise a video signal and viewing position parameters. The viewing position parameters indicate the relative position between a viewer and a video display intended to present the video signals; they can correspond to viewing angles or viewing distances, as previously described. The viewing position parameters can be pre-stored in a data structure and indicate the position, relative to the display, from which a viewer can best watch the content of the corresponding video signal. The processing unit (200) comprises means for:
- receiving (210) viewing position information of the viewer. The receiving means (210) may correspond to a buffer. The viewing position information can be detected by a number of sensors or a camera, or input directly by the viewer using a remote control or keyboard (not shown).
- selecting (220) in said transport stream the metadata that have viewing position parameters that correspond to said viewing position information. The selecting means (220) may comprise means for comparing the viewing position information with the viewing position parameters, and means for choosing the metadata that have viewing position parameters closest to said viewing position information.
- outputting (230) to the video display the video signal included in the selected metadata.
For example, when the viewer moves from the viewing position 1 to the viewing position 2 (as shown in Fig. 1), the receiving means (210) will receive viewing position information corresponding to the viewing position 2 from the sensors, camera, remote control, or keyboard.
If the viewing position information is expressed by angle a2, the selecting means (220) will select the metadata in the video transport stream that have position parameters which are closest to the angle a2.
If the viewing position information is expressed by distance d2, the selecting means (220) will select the metadata in the video transport stream that have position parameters which are closest to the distance d2.
If the video transport stream is multiplexed, the processing unit 200 may comprise a de-multiplexer for de-multiplexing the video transport stream. If the video transport stream is encoded, the processing unit 200 may also comprise a decoder for decoding the video signal before outputting the video signal. For example, if the video transport stream is encoded according to the MPEG standard, the decoder is an MPEG decoder.
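The optional de-multiplexing and decoding stages can be sketched as a small pipeline. This is only an illustrative skeleton: the `demultiplex` and `decode` callables are hypothetical stand-ins for a real de-multiplexer and a real (e.g. MPEG) decoder, which the patent does not specify.

```python
def process(transport_stream, is_multiplexed, is_encoded,
            demultiplex, decode):
    """Apply the optional stages in order: de-multiplex, then decode.

    demultiplex and decode are injected callables standing in for a
    real de-multiplexer and a real (e.g. MPEG) decoder.
    """
    if is_multiplexed:
        transport_stream = demultiplex(transport_stream)
    if is_encoded:
        transport_stream = decode(transport_stream)
    return transport_stream

# Example with trivial stand-in stages:
signal = process("ts", is_multiplexed=True, is_encoded=True,
                 demultiplex=lambda s: s + ":demuxed",
                 decode=lambda s: s + ":decoded")
# signal == "ts:demuxed:decoded"
```

Injecting the stages as callables simply mirrors the text: each stage is present only when the stream actually needs it.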
The data structure containing the viewing position parameters of the metadata may be a PID (Program identifier) table, or a database referring to a PID table or ClipMark.

Use of a PID table
PID table: the PID table can be provided together with a video transport stream (encoded or not), or be provided as a separate file. The PID table is similar to the ClipInfo (Clip Information) file defined in the BD (Blu-ray Disc) A/V application format. The ClipInfo file is used to specify attributes of the metadata, such as the time stamps of the access points in the corresponding metadata. Two solutions may be provided for the PID table:
Solution 1: The PID table contains the entry point (i.e. the start mark of the metadata), attribute information (e.g. time stamps, format, definition, connection with other metadata), and the viewing position parameters of the metadata. The processing unit 200 can then select viewing position parameters in the PID table according to the current viewing position information, such as a viewing angle or viewing distance.
Solution 2: The PID table does not contain the viewing position parameters of the metadata; instead, it may comprise the entry point, the attribute information, and a metadata name used as a reference. As shown in Table 1 below, the metadata name [angle_id] is used as a reference (in bold in Table 1), so that the processing unit 200 can select a video signal according to the current viewing position information of the viewer, such as a viewing angle (or viewing distance).
[Table 1 — reproduced as images in the original publication: a PID table listing entry points, attribute information and the metadata name [angle_id] used as a reference]
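The two PID-table solutions can be sketched as follows. All field names (`entry_point`, `attributes`, `viewing_position`, `angle_id`) and values are illustrative assumptions; the patent shows the actual layout only in Table 1.

```python
# Solution 1: the PID table itself carries the viewing position parameters.
pid_table_solution1 = {
    0x100: {"entry_point": 0,    "attributes": {"format": "HD"},
            "viewing_position": -30.0},
    0x101: {"entry_point": 1200, "attributes": {"format": "HD"},
            "viewing_position": 0.0},
}

# Solution 2: the PID table stores only a metadata name such as [angle_id];
# a separate database maps that name to the viewing position parameters.
pid_table_solution2 = {
    0x100: {"entry_point": 0,    "attributes": {"format": "HD"},
            "angle_id": "angle_1"},
    0x101: {"entry_point": 1200, "attributes": {"format": "HD"},
            "angle_id": "angle_2"},
}
angle_database = {"angle_1": -30.0, "angle_2": 0.0}

def viewing_position(pid, table, database=None):
    """Resolve the viewing position parameter for a PID under either layout."""
    entry = table[pid]
    if "viewing_position" in entry:      # solution 1: parameter stored inline
        return entry["viewing_position"]
    return database[entry["angle_id"]]   # solution 2: indirection via name
```

Under either layout the processing unit ends up with the same parameter, so the selection logic downstream does not change.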
Use of ClipMark
Fig. 3 is a schematic diagram illustrating a structure of a clip AV stream. The viewing position parameters of the metadata can be pre-defined in a ClipMark. The ClipMark is defined in the Blu-ray Disc application format, and is pre-defined between two adjacent metadata for storing the entry point (start mark of metadata) and attribute information of the metadata. In this example of the invention, the ClipMark is further used to contain the viewing position parameters of the metadata.

Fig. 4 is a flow chart illustrating a method of processing a video transport stream according to the invention. The video transport stream comprises a set of metadata, and the metadata each comprise a video signal and viewing position parameters, as described in Fig. 2. The method comprises the steps of:
- receiving (410) viewing position information of the viewer.
- selecting (420) in said transport stream the metadata that have viewing position parameters corresponding to said viewing position information. The selecting (420) is intended to compare said viewing position information with said viewing position parameters and choose the metadata that have viewing position parameters closest to said viewing position information.
- outputting (430) to said video display the video signal comprised in the selected metadata.
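Steps 410-430, with the closest-parameter rule of the selecting step (420), reduce to a minimum-distance search. A sketch follows, assuming each metadata entry is represented as a `(viewing_position_parameter, video_signal)` pair (an illustrative simplification):

```python
def process_transport_stream(metadata_set, viewer_position):
    """Steps 410-430 sketched: receive the viewer's position, select the
    metadata whose viewing position parameter is closest to it, and
    return that metadata's video signal for output to the display.

    metadata_set: iterable of (viewing_position_parameter, video_signal)
    viewer_position: a viewing angle or distance from sensors, a camera,
    or the viewer's remote control / keyboard (step 410).
    """
    # Step 420: compare the viewer's position with each entry's parameter
    # and choose the metadata whose parameter is closest.
    selected = min(metadata_set, key=lambda m: abs(m[0] - viewer_position))
    # Step 430: output the selected video signal to the display.
    return selected[1]

views = [(-30.0, "left view"), (0.0, "front view"), (30.0, "right view")]
assert process_transport_stream(views, 27.5) == "right view"
assert process_transport_stream(views, -4.0) == "front view"
```

The same search works whether the parameters are viewing angles or viewing distances, since only the absolute difference to the viewer's reported position matters.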
It should be noted that the above-mentioned embodiments illustrate rather than limit the invention and that those skilled in the art will be able to design alternative embodiments without departing from the scope of the appended claims. In the claims, any reference signs placed between parentheses shall not be construed as limiting the claim. The word 'comprising' does not exclude the presence of elements or steps not listed in a claim. The word 'a' or 'an' preceding an element does not exclude the presence of a plurality of such elements. The invention can be implemented by means of hardware comprising several distinct elements and by means of a suitably programmed computer. In claims enumerating several means, several of these means can be embodied by one and the same item of hardware. The use of the words first, second and third, etcetera, does not indicate any ordering; these words are to be interpreted as names.

Claims

1. A method of processing a video transport stream comprising a set of metadata, the metadata each comprising a video signal and viewing position parameters, said viewing position parameters indicating the relative position between a viewer and a video display intended to present video signals, said method comprising the steps of:
- receiving (410) viewing position information of the viewer,
- selecting (420) in said transport stream the metadata that have viewing position parameters corresponding to said viewing position information, and
- outputting (430) to said video display the video signal comprised in the selected metadata.
2. A method as claimed in claim 1, wherein said selecting step (420) comprises the steps of:
- comparing said viewing position information with said viewing position parameters, and
- choosing the metadata that have viewing position parameters closest to said viewing position information.
3. A processing unit for processing a video transport stream comprising a set of metadata, the metadata each comprising a video signal and viewing position parameters, said viewing position parameters indicating the relative position between a viewer and a video display intended to present video signals, said processing unit comprising means for: - receiving (210) viewing position information of the viewer,
- selecting (220) in said transport stream the metadata that have viewing position parameters corresponding to said viewing position information, and
- outputting (230) to said video display the video signal comprised in the selected metadata.
4. A processing unit as claimed in claim 3, wherein the selecting (220) means comprises means for: - comparing said viewing position information with said viewing position parameters, and
- choosing the metadata that have viewing position parameters closest to said viewing position information.
5. An apparatus for playing a video signal, said apparatus comprising a processing unit (200) as claimed in claim 3 or 4.
PCT/IB2006/054755 2005-12-15 2006-12-12 Method and processing unit for processing a video transport stream WO2007069189A2 (en)

Applications Claiming Priority (2)

Application Number Priority Date Filing Date Title
CNA2005101317965A CN1984295A (en) 2005-12-15 2005-12-15 Method and unit for processing video transmission stream
CN200510131796.5 2005-12-15

Publications (2)

Publication Number Publication Date
WO2007069189A2 true WO2007069189A2 (en) 2007-06-21
WO2007069189A3 WO2007069189A3 (en) 2007-09-13

Family

ID=38020071

Family Applications (1)

Application Number Title Priority Date Filing Date
PCT/IB2006/054755 WO2007069189A2 (en) 2005-12-15 2006-12-12 Method and processing unit for processing a video transport stream

Country Status (3)

Country Link
CN (1) CN1984295A (en)
TW (1) TW200826682A (en)
WO (1) WO2007069189A2 (en)


Families Citing this family (2)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
JP5299018B2 (en) * 2009-03-26 2013-09-25 ソニー株式会社 Information processing apparatus, content processing method, and program
JP4915459B2 (en) * 2009-04-03 2012-04-11 ソニー株式会社 Information processing apparatus, information processing method, and program

Citations (6)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
JP2001272941A (en) * 2000-03-23 2001-10-05 Olympus Optical Co Ltd Video display device
EP1389020A1 (en) * 2002-08-07 2004-02-11 Electronics and Telecommunications Research Institute Method and apparatus for multiplexing multi-view three-dimensional moving picture
WO2004044904A1 (en) * 2002-11-12 2004-05-27 Lg Electronics Inc. Recording medium having data structure for managing reproduction of multiple reproduction path video data recorded thereon and recording and reproducing methods and apparatuses
US20040104935A1 (en) * 2001-01-26 2004-06-03 Todd Williamson Virtual reality immersion system
WO2004107752A1 (en) * 2003-05-31 2004-12-09 Koninklijke Philips Electronics N.V. Multi-programme recording in dvd compliant format
JP2005159592A (en) * 2003-11-25 2005-06-16 Nippon Hoso Kyokai <Nhk> Contents transmission apparatus and contents receiving apparatus


Cited By (4)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
WO2009101998A1 (en) * 2008-02-14 2009-08-20 Sony Corporation Broadcast system, transmission device, transmission method, reception device, reception method, presentation device, presentation method, program, and recording medium
JP2009194595A (en) * 2008-02-14 2009-08-27 Sony Corp Broadcast system, transmitter, transmission method, receiver, reception method, exhibition device, exhibition method, program, and recording medium
US8743178B2 (en) 2010-01-05 2014-06-03 Dolby Laboratories Licensing Corporation Multi-view video format control
US10116911B2 (en) 2012-12-18 2018-10-30 Qualcomm Incorporated Realistic point of view video method and apparatus

Also Published As

Publication number Publication date
WO2007069189A3 (en) 2007-09-13
CN1984295A (en) 2007-06-20
TW200826682A (en) 2008-06-16

Similar Documents

Publication Publication Date Title
US10158841B2 (en) Method and device for overlaying 3D graphics over 3D video
CN102137270B (en) 3D display handling of subtitles
US8269821B2 (en) Systems and methods for providing closed captioning in three-dimensional imagery
RU2547706C2 (en) Switching between three-dimensional and two-dimensional video images
CN102224737B (en) Combining 3D video and auxiliary data
US8854546B2 (en) Method and apparatus for displaying data content
US20110090304A1 (en) Method for indicating a 3d contents and apparatus for processing a signal
US20110293240A1 (en) Method and system for transmitting over a video interface and for compositing 3d video and 3d overlays
WO2007069189A2 (en) Method and processing unit for processing a video transport stream
US8542241B2 (en) Stereoscopic content auto-judging mechanism
US8704876B2 (en) 3D video processor and 3D video processing method
EP2693767A2 (en) Image processing apparatus and image processing method thereof
US20120154383A1 (en) Image processing apparatus and image processing method
EP2384008A1 (en) Stereoscopic content auto-judging mechanism
TWI273547B (en) Method and device of automatic detection and modification of subtitle position
CN102420999A (en) Electronic apparatus, reproduction system, reproduction method, and program
AU2011202552A1 (en) 3D display handling of subtitles

Legal Events

Date Code Title Description
121 Ep: the epo has been informed by wipo that ep was designated in this application
NENP Non-entry into the national phase in:

Ref country code: DE

122 Ep: pct application non-entry in european phase

Ref document number: 06842440

Country of ref document: EP

Kind code of ref document: A2