WO2012130650A2 - Scene graph for defining a stereoscopic graphical object - Google Patents


Info

Publication number
WO2012130650A2
Authority
WO
WIPO (PCT)
Prior art keywords
image
stereoscopic
rendering
graphical object
dependent
Prior art date
Application number
PCT/EP2012/054761
Other languages
English (en)
French (fr)
Other versions
WO2012130650A3 (en)
Inventor
Jobst Hoerentrup
Original Assignee
Thomson Licensing
Priority date
Filing date
Publication date
Application filed by Thomson Licensing filed Critical Thomson Licensing
Priority to US14/004,704 (US10332299B2)
Priority to CN201280015813.7A (CN103460292B)
Priority to KR1020137025645A (KR20140010120A)
Priority to JP2014501528A (JP6113142B2)
Priority to EP12709112.2A (EP2691956A2)
Publication of WO2012130650A2
Publication of WO2012130650A3


Classifications

    • H: ELECTRICITY
    • H04: ELECTRIC COMMUNICATION TECHNIQUE
    • H04N: PICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N 13/00: Stereoscopic video systems; Multi-view video systems; Details thereof
    • G: PHYSICS
    • G06: COMPUTING; CALCULATING OR COUNTING
    • G06T: IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T 15/00: 3D [Three Dimensional] image rendering
    • G06T 17/00: Three dimensional [3D] modelling, e.g. data description of 3D objects
    • G06T 17/005: Tree description, e.g. octree, quadtree
    • G06T 2210/00: Indexing scheme for image generation or computer graphics
    • G06T 2210/61: Scene description
    • G11: INFORMATION STORAGE
    • G11B: INFORMATION STORAGE BASED ON RELATIVE MOVEMENT BETWEEN RECORD CARRIER AND TRANSDUCER
    • G11B 27/00: Editing; Indexing; Addressing; Timing or synchronising; Monitoring; Measuring tape travel
    • G11B 27/10: Indexing; Addressing; Timing or synchronising; Measuring tape travel
    • G11B 27/19: Indexing; Addressing; Timing or synchronising; Measuring tape travel by using information detectable on the record carrier
    • G11B 27/28: Indexing; Addressing; Timing or synchronising; Measuring tape travel by using information detectable on the record carrier by using information signals recorded by the same method as the main recording
    • G11B 27/32: Indexing; Addressing; Timing or synchronising; Measuring tape travel by using information detectable on the record carrier by using information signals recorded by the same method as the main recording on separate auxiliary tracks of the same or an auxiliary record carrier
    • G11B 27/322: Indexing; Addressing; Timing or synchronising; Measuring tape travel by using information detectable on the record carrier by using information signals recorded by the same method as the main recording on separate auxiliary tracks of the same or an auxiliary record carrier, wherein the used signal is digitally coded
    • G11B 2220/00: Record carriers by type
    • G11B 2220/20: Disc-shaped record carriers
    • G11B 2220/25: Disc-shaped record carriers characterised in that the disc is based on a specific recording technology
    • G11B 2220/2537: Optical discs
    • G11B 2220/2541: Blu-ray discs; Blue laser DVR discs

Definitions

  • The present invention relates to a scene graph suitable for defining stereoscopic graphical objects, to a method and an apparatus for creating such scene graphs, and to a method and an apparatus for rendering graphical objects based on such scene graphs. Furthermore, the invention relates to a storage medium comprising a scene graph for defining a stereoscopic graphical object. Finally, the invention relates to a method for initializing a rendering module and to a storage medium comprising an application adapted to perform this method.
  • In general, such a scene graph describes, usually in a hierarchical manner, e.g. a tree structure, how graphical elements are arranged in space and time to compose scenes.
  • A prominent example of a scene graph is the Hypertext Markup Language (HTML), as used for web pages on the internet.
  • A scene graph can be created with the help of visual design software.
  • Software capable of interpreting the scene graph renders it on the screen. For example, in the case of HTML, rendering is performed by the web browser.
  • The scene graph or the user interface can be generated relatively easily with the help of visual design software tools.
  • The scene graph is usually platform-independent; only the rendering software depends on the platform.
  • The Blu-ray Disc Association has published stereoscopic 3D extensions to its pre-recorded format.
  • The format not only allows stereoscopic 3D video to be stored on the Blu-ray disc, it also supports the creation of stereoscopic 3D user interfaces, for example 3D pop-up menus.
  • The Blu-ray 3D format is designed to enable backwards compatibility. The goal is that, when authored properly, the same Blu-ray disc should be playable on both 2D-only and 3D-capable players.
  • The first problem is how to efficiently create such stereoscopic user interfaces.
  • The second problem is how to efficiently create the user interfaces for both stereoscopic 3D mode and 2D mode.
  • A further option would be to make use of modeled virtual 3D worlds in the form of full-fledged three-dimensional scene graphs, as known from computer games. Such an approach is capable of rendering both a 2D version and a stereoscopic version of the scene.
  • However, the computational cost of such an approach is quite high and usually requires hardware acceleration support, e.g. the 3D graphics acceleration of modern computer graphics adapters. Such processing power is typically not available on consumer electronics devices.
  • According to the invention, a scene graph for a stereoscopic graphical object, the scene graph describing a spatial and/or temporal arrangement of the stereoscopic graphical object, comprises information about image data for a base image for the stereoscopic graphical object, image data for a dependent image for the graphical object, and the spatial and/or temporal arrangement of the base image and the dependent image.
  • Similarly, a storage medium comprises a scene graph for a stereoscopic graphical object, the scene graph describing a spatial and/or temporal arrangement of the stereoscopic graphical object. The scene graph comprises information about image data for a base image for the stereoscopic graphical object, image data for a dependent image for the graphical object, and the spatial and/or temporal arrangement of the base image and the dependent image.
  • In a stereoscopic rendering mode the graphical object is composed of the base image and the dependent image, whereas in a monoscopic 2D rendering mode the graphical object is composed of the base image only.
  • A first aspect of the invention is the definition of a 'stereoscopic' scene graph, i.e. a scene graph that is suitable for defining stereoscopic graphical objects, e.g. stereoscopic user interfaces. Such stereoscopic graphical objects may be provided, for example, on Blu-ray 3D discs.
  • The proposed scene graph is particularly suited for automatically deriving a 2D representation from a stereoscopic graphical object by using only the specified base image. This makes the scene graph useful for heterogeneous 3D/2D playback systems and for 3D/2D compatible content, e.g. a 3D Blu-ray disc.
  • The content author is freed from creating one graphical object for stereoscopic 3D mode and another graphical object for monoscopic 2D mode.
  • An authoring system will provide a graphical user interface for defining or generating the base image and the dependent image.
  • Advantageously, the image data for the base image is contained in a base image mosaic and/or the image data for the dependent image is contained in a dependent image mosaic.
  • Image mosaics allow loading times to be reduced and thus improve the performance of the player.
  • When the image mosaics contain either only image data for base images or only image data for dependent images, it is guaranteed that no resources are loaded that, to at least some extent, contain images that are not needed for the current rendering mode.
  • The image mosaics will generally be generated by dedicated software, which optimizes the distribution of the selected images into different image mosaics.
  • A method for rendering a stereoscopic graphical object comprises the steps of:
  • retrieving a scene graph which comprises information about image data for a base image and image data for a dependent image for the stereoscopic graphical object, as well as a spatial and/or temporal arrangement of the base image and the dependent image;
  • Correspondingly, an apparatus for rendering a stereoscopic graphical object comprises:
  • means for retrieving a scene graph which comprises information about image data for a base image and image data for a dependent image for the stereoscopic graphical object, as well as a spatial and/or temporal arrangement of the base image and the dependent image.
  • The second aspect of the invention relates to a rendering method and apparatus, which make use of the scene graph, the base image, and the dependent image.
  • An important advantage of the proposed scene graph is that it is capable of being efficiently rendered either in stereo 3D mode or in monoscopic 2D mode. This makes the scene graph very useful for systems like Blu-ray 3D discs, where for example a stereo 3D user interface needs to be generated in stereo 3D mode when executing on a 3D player, and a monoscopic user interface when running on a system which is 2D-capable only.
  • In the case of a 2D rendering mode only the image data for the base images need to be retrieved and rendered, whereas in the case of a 3D rendering mode the image data for the base images as well as the image data for the dependent images are retrieved and rendered.
  • A transducer will retrieve the necessary images from a storage medium, e.g. an optical pickup in the case of optical storage media or a reading head in the case of a hard disk.
  • Advantageously, the image data for the dependent image are also retrieved in the case of rendering a monoscopic version of the stereoscopic graphical object.
  • If a 3D rendering mode is possible, i.e. if a 3D rendering mode is supported by a rendering device, the image data for the base image and the image data for the dependent image specified by the scene graph are retrieved irrespective of the actual rendering mode.
  • In other words, instead of the currently active rendering mode, it is checked whether a 3D rendering mode would actually be possible.
  • For example, a 3D player connected to a 3D display may be set to a 2D rendering mode, though a 3D rendering mode would be possible.
  • In this case the image data for the dependent images are favorably loaded even though they are not needed for the current rendering mode.
  • When the user decides to switch to the 3D rendering mode, all necessary images are already available. Hence, switching from a 2D rendering mode to a 3D rendering mode is accomplished very quickly.
  • However, the dependent images have not necessarily been retrieved in advance.
  • Therefore, upon a transition from a 2D rendering mode to a 3D rendering mode it is determined whether specified image data for a dependent image has already been retrieved. If the image data for the dependent image has not yet been retrieved, it is subsequently retrieved in order to enable rendering of the graphical object in a 3D rendering mode. In this way it is ensured that in addition to the base images all necessary dependent images are available for rendering.
  • A method for initializing a rendering module, which is switchable between a 3D rendering mode and a 2D rendering mode and whose output is fed to a graphics subsystem, comprises the steps of determining whether the graphics subsystem is capable of rendering in stereoscopic mode, and initializing the switchable rendering module irrespective of the actual current rendering mode in case the graphics subsystem is capable of rendering in stereoscopic mode.
  • An application adapted to perform the necessary steps is advantageously stored on a storage medium.
  • Favorably, the application also includes the rendering module, which is switchable between the 3D rendering mode and the 2D rendering mode.
  • Favorably, a 2D rendering module is available in addition to the switchable rendering module.
  • In case the graphics subsystem is only capable of 2D rendering, the 2D rendering module is initialized.
  • The switchable rendering module is initialized irrespective of the actual current rendering mode in case the graphics subsystem is capable of rendering in stereoscopic mode. In this way it is ensured that when switching from the 2D mode to the 3D mode, 3D rendering can start immediately without the need to first terminate the 2D rendering module and initialize the switchable rendering module.
  • Fig. 3 shows a process of image loading during regular operation.
  • Fig. 4 depicts an exemplary process of image loading upon switching of the rendering mode.
  • In order to define graphical objects that can be efficiently rendered in stereo 3D mode and in 2D mode, a 'stereoscopic' scene graph is defined.
  • The proposed scene graph specifies and uses stereo pairs of images. Such stereo pairs form the basis of stereoscopic 3D.
  • One image is intended for the left eye and the other image is intended for the right eye.
  • One image of a stereo pair represents a 'base image' (BI).
  • The other image represents a 'dependent image' (DI).
  • The base image is mandatory, whereas the dependent image is optional.
  • Example 1
  • Example 1 illustrates a first exemplary definition of a stereo pair, which uses XML as a declarative language.
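  • A minimal sketch of such a definition is shown below; the file names are chosen for illustration only, and 'ibgd' is assumed to be the base image while 'ibgd_r' is assumed to be the associated dependent image:

      <!-- base image of the stereo pair; stereo_idref links it to the dependent image -->
      <img id="ibgd" stereo_idref="ibgd_r">
          <file>background.png</file>
      </img>
      <!-- dependent image, used for the right channel when rendering in stereo 3D mode -->
      <img id="ibgd_r">
          <file>background_r.png</file>
      </img>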
  • The <img> element defines an image element. Each such image element carries a unique identifier, specified in the 'id' attribute of the <img> element.
  • The <file> element defines the source of the image data. Among others, this can simply identify a file on some local file system.
  • The above example defines two images named 'ibgd' and 'ibgd_r'.
  • 'ibgd' carries a 'stereo_idref' attribute. This attribute links the two images to form a stereo pair.
  • Such a stereo pair can then be conveniently used within the scene graph by referencing its identifier.
  • Example 2 illustrates a second exemplary definition of a stereo pair, which also uses XML as a declarative language.
  • Example 2
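  • A minimal sketch of this second form is shown below, again with illustrative file names; any element content beyond the attributes described here is an assumption:

      <!-- a single element declares both images of the stereo pair -->
      <img id="ibgd" src="background.png" src_r="background_r.png"/>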
  • The <img> element defines an image element. It carries a unique identifier, specified in the 'id' attribute.
  • The 'src' attribute defines the source image data for one image of the stereo pair.
  • The 'src_r' attribute defines the source image data for the other image of the stereo pair.
  • Example 3 illustrates a third exemplary definition of a stereo pair. This example uses HTML as a declarative language.
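  • A minimal sketch of such an HTML-based definition is shown below; the file name, the 'id' attribute, and the 'alt' text are illustrative only:

      <!-- standard HTML img element extended with a second data source for the right-eye image -->
      <img id="ibgd" src="background.png" src_r="background_r.png" alt="background">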
  • The 'src' attribute of an <img> element defines the data source for the image.
  • A new attribute 'src_r' defines the data source of the associated second image to form a stereo pair.
  • In a first decision step 1 it is detected whether 2D output or stereo 3D output is to be generated.
  • In the case of 2D output, the mandatory base image BI is used for compositing 2 the graphical object.
  • In the case of stereo 3D output, a second decision step 3 detects whether the image to be rendered is a stereo image, i.e. whether, in addition to the mandatory base image BI, a dependent image DI is declared.
  • If no dependent image DI is declared, the base image BI is used for compositing 4 the left channel output as well as the right channel output.
  • If a dependent image DI is declared, the base image BI is used for compositing 5 the left channel output and the dependent image DI is used for compositing the right channel output.
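  • The compositing decision described above can be sketched in Java as follows; the ImageElement type, its accessors, and the use of java.awt are illustrative assumptions rather than part of the specification:

      import java.awt.Graphics;
      import java.awt.Image;

      // Minimal sketch of the compositing decision described above.
      final class StereoCompositor {
          interface ImageElement {
              Image baseImage();        // mandatory base image BI
              Image dependentImage();   // optional dependent image DI, may be null
          }

          static void compose(ImageElement e, Graphics left, Graphics right, boolean stereoOutput) {
              if (!stereoOutput) {                       // decision step 1: 2D output
                  left.drawImage(e.baseImage(), 0, 0, null);        // step 2: BI only
              } else if (e.dependentImage() == null) {   // decision step 3: no DI declared
                  left.drawImage(e.baseImage(), 0, 0, null);        // step 4: BI feeds both channels
                  right.drawImage(e.baseImage(), 0, 0, null);
              } else {
                  left.drawImage(e.baseImage(), 0, 0, null);        // step 5: BI to left, DI to right
                  right.drawImage(e.dependentImage(), 0, 0, null);
              }
          }
      }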
  • Preferably, the scene graph makes use of so-called 'image mosaics' (IM).
  • Such image mosaics IM combine a number of elementary images into a larger image. This is exemplarily illustrated in Fig. 2. This technique decreases the time required to decode all the images at runtime, because each image decoding process includes some runtime overhead to set up the image decoder. Combining elementary images into an image mosaic IM avoids frequent set-ups of the image decoder and the associated extra time.
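  • The description does not prescribe how an elementary image is addressed within a mosaic; the following sketch merely illustrates the general idea, assuming the mosaic is decoded once and elementary images are cut out by pixel coordinates (file name and coordinates are illustrative):

      import java.awt.image.BufferedImage;
      import java.io.File;
      import java.io.IOException;
      import javax.imageio.ImageIO;

      // Decode the mosaic once, then cut out elementary images without further decoder set-ups.
      final class MosaicExample {
          public static void main(String[] args) throws IOException {
              BufferedImage mosaic = ImageIO.read(new File("base_mosaic.png")); // assumed file name
              // x, y, width, height of one elementary image inside the mosaic (illustrative values)
              BufferedImage button = mosaic.getSubimage(0, 0, 128, 64);
              System.out.println("elementary image: " + button.getWidth() + "x" + button.getHeight());
          }
      }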
  • Preferably, one particular image mosaic IM does not combine one or more base images BI with one or more dependent images DI.
  • Instead, the set of base images BI is preferably combined into one or more image mosaics IM, so-called base image mosaics (BI-IM),
  • whereas the set of dependent images DI is combined separately into one or more different image mosaics IM, so-called dependent image mosaics (DI-IM).
  • The separation of images into base images BI and dependent images DI, or into base image mosaics BI-IM and dependent image mosaics DI-IM, allows beneficial rules to be implemented for an application capable of rendering the scene graph in either stereo 3D mode or 2D mode.
  • When the application detects at runtime that rendering in 3D mode is not possible in the execution environment, e.g. because the application is running on a 2D Blu-ray player, or because the application is running on a 3D Blu-ray player but only a non-3D screen is connected to the player, then, when images are to be loaded, any dependent image DI or dependent image mosaic DI-IM is ignored.
  • When the application detects that stereo 3D mode is possible, the following two modes of loading image resources are preferably supported.
  • In the first mode, base images BI or base image mosaics BI-IM as well as dependent images DI or dependent image mosaics DI-IM are loaded into memory, irrespective of the current rendering mode.
  • This mode includes the case where 3D mode would be possible in the execution environment, but the application is currently configured or signaled to render in 2D mode.
  • This mode has the advantage that no dependent images DI or dependent image mosaics DI-IM need to be loaded when transitioning from 2D rendering mode to 3D rendering mode, so a transition is comparatively fast.
  • This mode could be described as an 'eager loading' mode.
  • In the second mode, the application detects whether the current rendering mode is 3D or 2D.
  • In the case of a 3D rendering mode, base images BI or base image mosaics BI-IM as well as dependent images DI or dependent image mosaics DI-IM are loaded into memory.
  • In the case of a 2D mode, only base images BI or base image mosaics BI-IM are loaded.
  • In this mode, when transitioning from 2D rendering mode to 3D rendering mode, the necessary dependent images DI or dependent image mosaics DI-IM are loaded as needed. This mode could be described as a 'lazy loading' mode.
  • An exemplary process of image loading during regular operation, i.e. not caused by any mode change, is depicted in Fig. 3.
  • Upon an image load request 7, it is first determined whether a 3D rendering mode is possible. If this is not the case, only a base image BI or a base image mosaic BI-IM is loaded 9 and any dependent image DI or dependent image mosaic DI-IM is ignored. If, however, a 3D rendering mode is possible, the further process depends on the loading mode. Therefore, in a further decision step 10 the loading mode is determined. Of course, the further decision step 10 can be omitted when the loading mode is fixed for a specific implementation.
  • In the case of an eager loading mode, a base image BI or a base image mosaic BI-IM as well as a dependent image DI or dependent image mosaic DI-IM is loaded 11.
  • An exemplary process of image loading upon a switch of the rendering mode is depicted in Fig. 4. Upon a mode change request 13 it is determined 14 whether a 3D rendering mode is currently possible. If this is not the case, no further steps need to be performed. If, however, a 3D rendering mode is possible, the further process depends on the transition type. Therefore, in a further decision step 15 the transition type is determined. In the case of a transition from 3D rendering mode to 2D rendering mode, no further steps need to be performed, as all necessary base images BI or base image mosaics BI-IM have already been loaded for 3D rendering. In the case of a transition from 2D rendering mode to 3D rendering mode, the further process depends on the loading mode that was used in response to the image load request 7 of Fig. 3. Consequently, in yet a further decision step 16 the previously used loading mode is determined. Of course, decision step 16 can be omitted when the loading mode is fixed for a specific implementation. In case an eager loading mode has been used, no further steps need to be performed, as all necessary dependent images DI or dependent image mosaics DI-IM have already been loaded. In case a lazy loading mode has been used, the necessary dependent images DI or dependent image mosaics DI-IM are loaded 17.
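  • The loading rules of Figs. 3 and 4 can be summarized in a simplified sketch; the class and method names below are illustrative assumptions:

      // Simplified sketch of the image loading rules of Figs. 3 and 4.
      final class ImageLoader {
          enum LoadingMode { EAGER, LAZY }

          private final LoadingMode mode;
          private final boolean stereoPossible;   // is 3D rendering possible in this environment?
          private boolean dependentLoaded;

          ImageLoader(LoadingMode mode, boolean stereoPossible) {
              this.mode = mode;
              this.stereoPossible = stereoPossible;
          }

          // Regular image load request (Fig. 3).
          void onImageLoadRequest(boolean currentlyRendering3d) {
              loadBaseImages();                                     // BI / BI-IM are always needed
              if (!stereoPossible) return;                          // DI / DI-IM are ignored
              if (mode == LoadingMode.EAGER || currentlyRendering3d) {
                  loadDependentImages();                            // eager: load DI irrespective of mode
              }
          }

          // Rendering mode change request (Fig. 4).
          void onModeChangeRequest(boolean switchingTo3d) {
              if (!stereoPossible || !switchingTo3d) return;        // 3D to 2D: nothing to do
              if (!dependentLoaded) loadDependentImages();          // lazy: fetch missing DI / DI-IM now
          }

          private void loadBaseImages()      { /* retrieve BI or BI-IM from the storage medium */ }
          private void loadDependentImages() { dependentLoaded = true; /* retrieve DI or DI-IM */ }
      }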
  • The part of an application which is capable of rendering a scene graph is a graphics rendering module.
  • Such an application is typically provided on a storage medium together with the content to be reproduced.
  • A typical task of the graphics rendering module is 'double buffering', i.e. a technique where an application draws the next composition into an invisible back buffer, while the current composition is stored in a front buffer connected to the display. When requested, the graphics rendering module copies the content of the back buffer into the front buffer.
  • The double-buffering technique prevents intermediate compositions from becoming visible on the display, which would potentially cause flicker.
  • For stereo 3D rendering, such a graphics rendering module needs two pipelines, each connected to one back buffer.
  • One back buffer is needed for compositing the left channel output, whereas the other back buffer is needed for compositing the right channel output.
  • Such a stereo graphics rendering module can be designed to support stereoscopic 3D rendering as well as 2D rendering. In the latter case, one of the two pipelines is used to generate the 2D output while the other pipeline remains unused. Further, the stereo 3D graphics rendering module can be designed to support dynamic mode switches between stereo 3D rendering and monoscopic 2D rendering. This means that a stereo 3D graphics renderer as outlined above is very flexible.
  • However, a stereo 3D rendering module allocates two back buffers, each occupying a considerable amount of the image memory.
  • In the case of a Blu-ray player, such a back buffer is rather large. Since image memory is usually limited, preferably the following beneficial rules are applied for an application capable of rendering the stereoscopic scene graph in either stereo 3D mode or 2D mode.
  • When the application detects at runtime that rendering in 3D mode is not possible in the execution environment, e.g. because the application is running on a 2D Blu-ray player or on a 3D Blu-ray player with only a non-3D screen connected to the player, the application creates and activates a graphics renderer implementation which is capable of rendering in 2D mode only.
  • Such an implementation embeds a single pipeline only, and hence allocates memory to hold only a single back buffer.
  • When the application detects that stereo 3D mode is possible in the execution environment, the application creates and activates a graphics renderer implementation which is capable of rendering in 3D mode and in 2D mode.
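  • A minimal sketch of this rule is shown below; the class names and the full-HD ARGB back-buffer size are illustrative assumptions:

      import java.awt.image.BufferedImage;

      // Sketch of selecting a renderer implementation based on the detected capability.
      interface GraphicsRenderer { void render(); }

      final class MonoRenderer implements GraphicsRenderer {
          // single pipeline: one back buffer only
          private final BufferedImage backBuffer =
                  new BufferedImage(1920, 1080, BufferedImage.TYPE_INT_ARGB);
          public void render() { /* compose 2D output into backBuffer, then copy to the front buffer */ }
      }

      final class StereoRenderer implements GraphicsRenderer {
          // two pipelines: one back buffer per channel; can also run in 2D mode using only the left one
          private final BufferedImage leftBackBuffer =
                  new BufferedImage(1920, 1080, BufferedImage.TYPE_INT_ARGB);
          private final BufferedImage rightBackBuffer =
                  new BufferedImage(1920, 1080, BufferedImage.TYPE_INT_ARGB);
          public void render() { /* compose left/right output, then copy both to the front buffers */ }
      }

      final class RendererFactory {
          static GraphicsRenderer create(boolean stereoPossible) {
              // 2D-only environment: save image memory by allocating a single back buffer
              return stereoPossible ? new StereoRenderer() : new MonoRenderer();
          }
      }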
  • Within the scene graph, 'x' determines the position in the horizontal direction,
  • 'y' determines the position in the vertical direction,
  • and 'z' usually only specifies in-front-of/behind relationships between individual elements, i.e. the composition order.
  • In other words, 'z' is used to determine the composition order of individual elements in the scene graph.
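  • As an illustration of using 'z' as the composition order, elements can simply be sorted by their 'z' value before being composited; the element type below and the convention that a larger 'z' is drawn later (and thus appears in front) are assumptions:

      import java.util.Comparator;
      import java.util.List;

      // Composition order sketch: draw elements with smaller 'z' first so that
      // elements with larger 'z' end up in front.
      final class ZOrderExample {
          record Element(String id, int x, int y, int z) { }

          static void composite(List<Element> elements) {
              elements.stream()
                      .sorted(Comparator.comparingInt(Element::z))
                      .forEach(e -> System.out.println("draw " + e.id() + " at (" + e.x() + "," + e.y() + ")"));
          }
      }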

Landscapes

  • Engineering & Computer Science (AREA)
  • Physics & Mathematics (AREA)
  • General Physics & Mathematics (AREA)
  • Computer Graphics (AREA)
  • Theoretical Computer Science (AREA)
  • Geometry (AREA)
  • Software Systems (AREA)
  • Multimedia (AREA)
  • Signal Processing (AREA)
  • Processing Or Creating Images (AREA)
  • Testing, Inspecting, Measuring Of Stereoscopic Televisions And Televisions (AREA)
  • Image Generation (AREA)
  • Signal Processing For Digital Recording And Reproducing (AREA)
  • Indexing, Searching, Synchronizing, And The Amount Of Synchronization Travel Of Record Carriers (AREA)
PCT/EP2012/054761 2011-03-31 2012-03-19 Scene graph for defining a stereoscopic graphical object WO2012130650A2 (en)

Priority Applications (5)

Application Number Priority Date Filing Date Title
US14/004,704 US10332299B2 (en) 2011-03-31 2012-03-19 Scene graph for defining a stereoscopic graphical object
CN201280015813.7A CN103460292B (zh) 2011-03-31 2012-03-19 Scene graph for defining a stereoscopic graphical object
KR1020137025645A KR20140010120A (ko) 2011-03-31 2012-03-19 Scene graph for defining a stereoscopic graphical object
JP2014501528A JP6113142B2 (ja) 2011-03-31 2012-03-19 Scene graph for defining a stereoscopic graphical object
EP12709112.2A EP2691956A2 (de) 2011-03-31 2012-03-19 Scene graph for defining a stereoscopic graphical object

Applications Claiming Priority (2)

Application Number Priority Date Filing Date Title
EP11305373.0 2011-03-31
EP11305373A EP2506263A1 (de) 2011-03-31 2011-03-31 Stereoscopic scene graph for defining 3D- and 2D-compatible graphical objects

Publications (2)

Publication Number Publication Date
WO2012130650A2 true WO2012130650A2 (en) 2012-10-04
WO2012130650A3 WO2012130650A3 (en) 2013-01-10

Family

ID=44041523

Family Applications (1)

Application Number Title Priority Date Filing Date
PCT/EP2012/054761 WO2012130650A2 (en) 2011-03-31 2012-03-19 Scene graph for defining a stereoscopic graphical object

Country Status (6)

Country Link
US (1) US10332299B2 (de)
EP (2) EP2506263A1 (de)
JP (1) JP6113142B2 (de)
KR (1) KR20140010120A (de)
CN (1) CN103460292B (de)
WO (1) WO2012130650A2 (de)

Cited By (1)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US9092658B2 (en) 2013-04-25 2015-07-28 Nvidia Corporation Automatic detection of stereoscopic content in video/image data

Families Citing this family (4)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US20140300611A1 (en) * 2013-03-15 2014-10-09 Trigger Happy, Ltd. Web and native code environment modular player and modular rendering system
US10861359B2 (en) * 2017-05-16 2020-12-08 Texas Instruments Incorporated Surround-view with seamless transition to 3D view system and method
CN109658516B (zh) * 2018-12-11 2022-08-30 State Grid Jiangsu Electric Power Co., Ltd. Changzhou Power Supply Branch Method for creating a VR training scene, VR training system, and computer-readable storage medium
CN110717963B (zh) * 2019-08-30 2023-08-11 Hangzhou Qunhe Information Technology Co., Ltd. WebGL-based hybrid rendering and display method for replaceable models, system and storage medium

Family Cites Families (24)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US6590573B1 (en) 1983-05-09 2003-07-08 David Michael Geshwind Interactive computer system for creating three-dimensional image information and for converting two-dimensional image information for three-dimensional display systems
JP2001273520A (ja) 2000-03-23 2001-10-05 Famotik Ltd Multimedia document integrated display system
FR2828055B1 (fr) 2001-07-27 2003-11-28 Thomson Licensing Sa Method and device for coding a mosaic of images
CA2380105A1 (en) * 2002-04-09 2003-10-09 Nicholas Routhier Process and system for encoding and playback of stereoscopic video sequences
JP2004102526A (ja) 2002-09-06 2004-04-02 Sony Corp Stereoscopic image display device, display processing method, and processing program
KR100488804B1 (ko) * 2002-10-07 2005-05-12 Electronics and Telecommunications Research Institute MPEG-4 based binocular 3D video data processing system and method therefor
AU2002368317A1 (en) 2002-11-01 2004-06-07 Sony Electronics Inc. A unified surface model for image based and geometric scene composition
US7417645B2 (en) 2003-03-27 2008-08-26 Microsoft Corporation Markup language and object model for vector graphics
JP4490074B2 (ja) 2003-04-17 2010-06-23 Sony Corporation Stereoscopic image processing device, stereoscopic image display device, stereoscopic image providing method, and stereoscopic image processing system
EP2105032A2 (de) * 2006-10-11 2009-09-30 Koninklijke Philips Electronics N.V. Creating three dimensional graphics data
KR100826519B1 (ko) * 2007-05-29 2008-04-30 Electronics and Telecommunications Research Institute Object description and multiplexing method for a DMB-based 3D still image service, and decoding apparatus and method therefor
US20080303832A1 (en) 2007-06-11 2008-12-11 Samsung Electronics Co., Ltd. Method of generating two-dimensional/three-dimensional convertible stereoscopic image bitstream and method and apparatus for displaying the same
GB0806183D0 (en) 2008-04-04 2008-05-14 Picsel Res Ltd Presentation of objects in 3D displays
KR20110036882A (ko) * 2008-06-24 2011-04-12 Panasonic Corporation Recording medium, playback device, integrated circuit, playback method, and program
EP3197155A1 (de) * 2008-07-20 2017-07-26 Dolby Laboratories Licensing Corp. Compatible stereoscopic video delivery
KR100984816B1 (ko) 2008-08-20 2010-10-01 Company One Hundred Co., Ltd. Method for composing a mobile application based on a 3D graphics user interface modeling language
WO2010052857A1 (ja) * 2008-11-06 2010-05-14 Panasonic Corporation Playback device, playback method, playback program, and integrated circuit
US8933987B2 (en) * 2008-12-01 2015-01-13 Sharp Kabushiki Kaisha Content reproducing apparatus and recording medium for switching graphics and video images from 2D to 3D
KR101659576B1 (ko) * 2009-02-17 2016-09-30 Samsung Electronics Co., Ltd. Image processing method and apparatus
DK2400774T3 (en) 2009-02-19 2015-10-19 Panasonic Ip Man Co Ltd Recording medium and playback devices
JP5510700B2 (ja) * 2009-04-03 2014-06-04 Sony Corporation Information processing device, information processing method, and program
WO2010143439A1 (ja) 2009-06-12 2010-12-16 Panasonic Corporation Playback device, integrated circuit, and recording medium
US20110013888A1 (en) * 2009-06-18 2011-01-20 Taiji Sasaki Information recording medium and playback device for playing back 3d images
US9307224B2 (en) * 2009-11-23 2016-04-05 Samsung Electronics Co., Ltd. GUI providing method, and display apparatus and 3D image providing system using the same

Non-Patent Citations (1)

* Cited by examiner, † Cited by third party
Title
None


Also Published As

Publication number Publication date
US20140002451A1 (en) 2014-01-02
EP2691956A2 (de) 2014-02-05
US10332299B2 (en) 2019-06-25
JP2014516432A (ja) 2014-07-10
WO2012130650A3 (en) 2013-01-10
CN103460292B (zh) 2017-02-15
KR20140010120A (ko) 2014-01-23
JP6113142B2 (ja) 2017-04-12
EP2506263A1 (de) 2012-10-03
CN103460292A (zh) 2013-12-18

Similar Documents

Publication Publication Date Title
CN102027749B (zh) Playback device, integrated circuit, and playback method considering special playback
JP4959695B2 (ja) Synchronization of interactive multimedia presentation management
KR101606949B1 (ko) Multi-layered slide transitions
US7801409B2 (en) Glitch-free realtime playback
US10332299B2 (en) Scene graph for defining a stereoscopic graphical object
KR101317204B1 (ko) Method for generating frame information of moving images and apparatus using the same
US20090289941A1 (en) Composite transition nodes for use in 3d data generation
KR101183383B1 (ko) Synchronization aspects of interactive multimedia presentation management
JP2008545335A5 (de)
KR20080023318A (ko) Aspects of media content rendering
US8938093B2 (en) Addition of immersive interaction capabilities to otherwise unmodified 3D graphics applications
CN110599396A (zh) Information processing method and apparatus
JP2009508214A (ja) Photo mantel view and animation
CN109327698B (zh) Method, system, medium and electronic device for generating dynamic preview images
EP4088451A1 (de) Method and apparatus for describing media scenes
EP4089515A2 (de) User interface process for computing user interfaces in the cloud
Sauer et al. U-create: Creative authoring tools for edutainment applications
US20150002516A1 (en) Choreography of animated crowds
US20050021552A1 (en) Video playback image processing
US20130232144A1 (en) Managing storyboards
KR100803216B1 (ko) Method and apparatus for authoring 3D graphics data
KR100711194B1 (ko) Method for generating and playing back a multi-layer-based image and audio avatar file
KR20070098364A (ko) Apparatus and method for coding and storing three-dimensional image data
JP2021093718A (ja) Information processing device and program
JP2021093618A (ja) Information processing device and program

Legal Events

Date Code Title Description
121 Ep: the epo has been informed by wipo that ep was designated in this application

Ref document number: 12709112

Country of ref document: EP

Kind code of ref document: A2

WWE Wipo information: entry into national phase

Ref document number: 14004704

Country of ref document: US

ENP Entry into the national phase

Ref document number: 20137025645

Country of ref document: KR

Kind code of ref document: A

ENP Entry into the national phase

Ref document number: 2014501528

Country of ref document: JP

Kind code of ref document: A

NENP Non-entry into the national phase

Ref country code: DE