WO2009083863A1 - Rendering and overlaying 3D graphics onto 3D video - Google Patents

Rendering and overlaying 3D graphics onto 3D video

Info

Publication number
WO2009083863A1
Authority
WO
WIPO (PCT)
Prior art keywords
information
stream
graphics
depth
video
Prior art date
Application number
PCT/IB2008/055338
Other languages
English (en)
Inventor
Francesco Scalori
Philip S. Newton
Original Assignee
Koninklijke Philips Electronics N.V.
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Koninklijke Philips Electronics N.V. filed Critical Koninklijke Philips Electronics N.V.
Publication of WO2009083863A1

Classifications

    • H - ELECTRICITY
    • H04 - ELECTRIC COMMUNICATION TECHNIQUE
    • H04N - PICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N13/00 - Stereoscopic video systems; Multi-view video systems; Details thereof
    • H04N13/30 - Image reproducers
    • H04N13/398 - Synchronisation thereof; Control thereof
    • H - ELECTRICITY
    • H04 - ELECTRIC COMMUNICATION TECHNIQUE
    • H04N - PICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N13/00 - Stereoscopic video systems; Multi-view video systems; Details thereof
    • H04N13/10 - Processing, recording or transmission of stereoscopic or multi-view image signals
    • H - ELECTRICITY
    • H04 - ELECTRIC COMMUNICATION TECHNIQUE
    • H04N - PICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N13/00 - Stereoscopic video systems; Multi-view video systems; Details thereof
    • H04N13/10 - Processing, recording or transmission of stereoscopic or multi-view image signals
    • H04N13/106 - Processing image signals
    • H04N13/172 - Image signals comprising non-image signal components, e.g. headers or format information
    • H04N13/183 - On-screen display [OSD] information, e.g. subtitles or menus
    • H - ELECTRICITY
    • H04 - ELECTRIC COMMUNICATION TECHNIQUE
    • H04N - PICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N13/00 - Stereoscopic video systems; Multi-view video systems; Details thereof
    • H04N13/30 - Image reproducers
    • H04N13/327 - Calibration thereof
    • H - ELECTRICITY
    • H04 - ELECTRIC COMMUNICATION TECHNIQUE
    • H04N - PICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N13/00 - Stereoscopic video systems; Multi-view video systems; Details thereof
    • H04N13/30 - Image reproducers
    • H04N13/361 - Reproducing mixed stereoscopic images; Reproducing mixed monoscopic and stereoscopic images, e.g. a stereoscopic image overlay window on a monoscopic image background

Definitions

  • the present invention relates to a method of playback of an information stream suitable to be played back on a three-dimensional (3D) display, the information stream comprising a main video stream and auxiliary video information comprising video depth information for enabling three-dimensional (3D) display of the main video stream; at least one graphics stream and at least one auxiliary graphics information associated therewith, the at least one auxiliary graphics information comprising at least one graphics depth information for enabling three-dimensional (3D) display of the at least one graphics stream.
  • the invention also relates to an apparatus for playback of the information stream as described hereinabove and to a signal comprising the information stream as described hereinabove.
  • with the introduction of new 3D displays, there is an opportunity for 3D video to break through to the mass consumer market. Such 3D displays are able to handle both 3D display and 2D display.
  • introducing 3D video does not only relate to introducing new displays capable of 3D display; it also has an impact on the whole content production and delivery chain.
  • the production of 3D video content is at an embryonic technology stage and various formats are proposed to be used each with their own advantages and disadvantages.
  • new coding methods were introduced for coding 3D content and new formats were proposed to include the 3D video stream in MPEG streams.
  • a known fact is that the introduction of new formats is usually slow, and a desired feature when introducing a new format is backward playback compatibility with the installed player base.
  • a missing area has been the carriage of 3D video content in a content distribution or publishing format such as Digital Video Broadcasting (DVB) or DVD, or a high definition format such as Blu-ray Disc (BD) or HD-DVD, while maintaining backwards compatibility with the installed player base.
  • An important feature of high definition publishing formats is the ability of content providers to provide multiple video streams, such as picture-in-picture, graphics and interactive streams. For example, in the case of BD, DVD and HD-DVD it is known that such systems allow playback of video and graphics (e.g. subtitles, navigation buttons) at the same time. Usually graphics streams such as subtitles should always appear in front of the main video and are therefore added later to the final picture to be displayed.
  • every pixel, belonging to either the video or a graphics stream, has a depth relative to the display. This depth is either directly associated with the pixel, if 2D + depth coding of the 3D streams is used, or can be directly inferred from other coding systems, such as 2D + parallax information.
  • Figure 1a illustrates the known overlaying of video and graphics streams, in the particular case of BD systems.
  • in such systems there exists a main movie plane, a presentation plane comprising static graphic objects, and an interactive plane comprising interactive objects.
  • the three planes are overlaid on each other: the main movie plane in the background, the presentation plane on top of the main movie, and the interactive plane most forward.
  • the right image in Fig. 1a indicates the output image with the three planes overlaid.
  • Figure 1b is a 2D representation of how such planes might intertwine in the case of 3D display of each stream. Due to depth, some parts of the main movie plane may have a depth closer to the viewer than that of the graphics items. In such parts, the foreground graphic objects are punctured and text becomes difficult to read, while the general aspect of the displayed image is broken and unpleasing. In the case of graphics streams this is particularly problematic, as the graphics may appear at any location in the video and are dependent on input from the user.
  • the object of the invention is reached by a method according to claim 1 for playback of an information stream suitable to be played back on a three-dimensional (3D) display, the information stream comprising a main video stream and auxiliary video information comprising video depth information for enabling three-dimensional (3D) display of the main video stream; at least one graphics stream and at least one auxiliary graphics information associated therewith, the at least one auxiliary graphics information comprising at least one graphics depth information for enabling three-dimensional (3D) display of the at least one graphics stream.
  • the method comprises reading or receiving the information stream; determining an available depth range for 3D display, attributing corresponding non-overlapping depth ranges to each of the main video stream and of the at least one graphics stream, scaling each of the video depth information and the at least one graphics depth information to the corresponding depth range and using the scaled video depth information and the at least one graphics depth information for three-dimensional (3D) display of the information stream.
  • This is based on the insight that the occlusion problem between different planes is solved by segmenting the interval of possible depth values which can be displayed by a display into non-overlapping ranges and assigning them to the existing presentation planes, such as the video and graphics planes, followed by rescaling of the depth of each stream to its assigned range.
  • the highest depth of a pixel of one plane is smaller than the lowest depth of a pixel in the next plane (going in the direction of increasing depth).
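The rescaling described above can be sketched as follows; the 8-bit depth values and the particular half/half split are illustrative assumptions (the text leaves the actual ranges to the content author):

```python
def scale_depth(depth_values, lo, hi, src_max=255):
    """Linearly rescale raw depth values from [0, src_max] into [lo, hi]."""
    return [lo + d * (hi - lo) / src_max for d in depth_values]

# Illustrative split of an 8-bit display depth range into two
# non-overlapping sub-ranges: the video plane gets [0, 127] and the
# graphics plane gets [128, 255].
video = scale_depth([0, 128, 255], 0, 127)
graphics = scale_depth([0, 128, 255], 128, 255)

# The key property: the highest depth of one plane is smaller than the
# lowest depth of the next plane, so the planes can never interpenetrate.
assert max(video) < min(graphics)
```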
  • this concept is applicable to any playback system that displays at least two overlapping 3D graphics streams or non-moving streams of pictures, such as slideshows, or two overlapping video streams.
  • the invention is also applicable to displaying 3D secondary video on top of 3D video or to displaying rendered 3D graphics objects on top of 3D graphic backgrounds.
  • the information stream further comprises overlay information for overlaying the at least one graphics stream onto the main video stream, the overlay information comprising the non-overlapping depth ranges wherein the non-overlapping depth ranges are preferably defined as depth percentages of the available depth range.
  • the range limits could be defined while authoring the content, hence giving authors freedom and control of assigning a bigger range to a plane (e.g. interactive graphics) and a smaller range to another one (e.g. subtitles).
  • preferably, a non-absolute range indication is used, e.g. a percentage relative to the maximum depth value of the target screen.
  • the information stream is BD compatible and it comprises a video stream, a graphics stream and an interactive graphics stream, the interactive graphics stream being displayed in front of the graphics stream, which is displayed in front of the main video stream.
  • optimal values for the depth ranges correspond to the depth ranges of the main movie stream, graphics stream and interactive graphics stream being in the ratio 5:3:2.
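Such a 5:3:2 split of an available depth range into contiguous, non-overlapping sub-ranges could be computed as sketched below; the helper name and the 8-bit range are assumptions for illustration:

```python
def split_depth_range(total_depth, ratios):
    """Split an available depth range into contiguous, non-overlapping
    sub-ranges proportional to the given ratios."""
    total = sum(ratios)
    ranges, start = [], 0.0
    for r in ratios:
        end = start + total_depth * r / total
        ranges.append((start, end))
        start = end
    return ranges

# 5:3:2 split over main movie, graphics and interactive graphics planes:
main, graphics, interactive = split_depth_range(255, [5, 3, 2])
# main        -> (0.0, 127.5)   i.e. 50% of the range
# graphics    -> (127.5, 204.0) i.e. 30%
# interactive -> (204.0, 255.0) i.e. 20%
```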
  • the invention is also related to an apparatus for playback of an information stream suitable to be played back on a three-dimensional (3D) display as defined in claim 7, and to a signal as defined in claim 11.
  • Fig. 1a illustrates the known overlaying of video and graphics streams, in the particular case of BD systems
  • Fig. 1b is a 2D representation of occlusion of the graphics stream by the video stream when both streams are displayed in 3D.
  • Fig. 2 illustrates schematically a playback device wherein the invention is practiced
  • Fig. 3 illustrates schematically various presentation planes and the associated depth ranges according to an embodiment of the invention
  • Fig. 4 illustrates schematically an embodiment according to the invention of the video processing unit and the rendering stage.
  • Fig. 2 illustrates schematically a playback device wherein the invention is practiced. It is duly noted that this describes a particular embodiment corresponding to playback from optical discs, but the source of the information stream is irrelevant: it may be provided locally on recorded media such as optical media, a hard disc or solid-state memory, or it can be received by broadcasting via wired or wireless transmission systems, including the internet.
  • the invention may be implemented in any device for playback of video information, including, among others, hard-disc recorders, set top boxes (STB) and digital (satellite/terrestrial/cable) receivers.
  • Optical discs have a track, the track being the position of the series of prerecorded marks representing information, arranged in accordance with a single spiral pattern constituting substantially parallel tracks on an information layer.
  • the optical disc may comprise one or more information layers of a recordable type.
  • examples of prerecorded optical discs are CD-ROM, DVD-ROM, or high-density discs such as HD DVD-ROM or BD-ROM.
  • CD-ROM and DVD-ROM optical discs are described in references ECMA-130 and ECMA-267 (ISO/IEC 16449).
  • the information is represented on the information layer by optically detectable marks along the track.
  • the track 12 on the optical disc is indicated by a pre-embossed track structure provided during manufacture of the blank optical disc.
  • the track structure is constituted, for example, by a pregroove, which enables a read/write head to follow the track during scanning.
  • the optical disc is intended for carrying user information according to a standardized format, to be playable on standardized playback devices.
  • the recording format includes the way information is recorded, encoded and logically mapped onto the recording space provided by the track.
  • the recordable space is usually subdivided into a lead-in area (LI) 31, a data zone (DZ) for recording the information and a lead-out area (LO).
  • the lead-in area (LI) usually comprises basic disc management information and information how to physically access the data zone (DZ).
  • said basic disc management information corresponds to the table of contents in CD systems or the formatting disc control blocks (FDCB) in DVD systems.
  • the user information recorded in the data zone (DZ) is further arranged according to an application format, for example comprising a predefined structure of files and directories.
  • the user information in the data zone is arranged according to a file system comprising file management information, such as ISO 9660 used in CD systems, available as ECMA-119, or UDF used in DVD systems, available as ECMA- 167.
  • the playback device is provided with scanning means for scanning the track of the optical disc, the scanning means comprising a drive unit 16 for rotating the optical disc 11, a head 18, a positioning unit 21 for coarsely positioning the head 18 in the radial direction on the track, and a control unit 17.
  • the head 18 comprises an optical system of a known type for generating a radiation beam 20 guided through optical elements for focusing said radiation beam 20 to a radiation spot 19 on the track 12 of the optical disc 11.
  • the radiation beam 20 is generated by a radiation source, e.g.
  • the head further comprises (not shown) a focusing actuator for moving the focus of the radiation beam 20 along the optical axis of said beam and a tracking actuator for fine positioning of the radiation spot 19 in a radial direction on the center of the track.
  • the tracking actuator may comprise coils for radially moving an optical element or may alternatively be arranged for changing the angle of a reflecting element.
  • the radiation reflected by the information layer is detected by a detector of a usual type, e.g. a four-quadrant diode, in the head 18 for generating a read signal and further detector signals, such as a tracking error and a focusing error signal for controlling said tracking and focusing actuators.
  • the control unit 17 controls the retrieving of information from the optical disc 11, and may be arranged for receiving commands from a user or from a host computer. To this end, the control unit 17 may comprise control circuitry, for example a microprocessor, a program memory and control gates, for performing the procedures described hereinafter.
  • the control unit 17 may also be implemented as a state machine in logic circuits.
  • the read signal is processed by a read processing unit comprising a demodulator 26, a de-formatter 27 and an output unit 28 for processing the information and outputting said information to suitable means, such as a display and speakers.
  • the functioning of the demodulator 26, the de-formatter 27 and the output unit 28 is controlled by the controller 17.
  • retrieving means for reading information include the drive unit 16, the head 18, the positioning unit 21 and the read processing unit.
  • the demodulator 26 is responsible for de-modulating a data signal from the channel signal, by using suitable channel decoder, e.g. as disclosed in US 5,920,272 or US 5,477,222.
  • the de-formatter 27 is responsible for using error correction codes and/or de-interleaving for extracting the information signal from the data signal.
  • the output unit 28, under the control of the control unit 17, is responsible for processing the information signal at the logical level. Furthermore, it is noted that the information signal may be arranged according to a playback format, which may prescribe that management information is associated with the audio-video information. Hence the output unit is responsible for separating management information from the audio-video information, and for de-multiplexing and/or decoding the audio and/or video information. Suitable compression/de-compression means are described for audio in WO 98/16014-A1 (PHN 16452), and for video in the MPEG2 standard (ISO-IEC 13818). The recording format in which the user information is to be recorded prescribes that management information for managing the recorded user information is also recorded onto the optical disc.
  • the video and audio information generated by the output unit 28 is sent to suitable means, such as a suitable display for the video information.
  • 3D displays are known, one of them being described in US 6,069,650.
  • the display device comprises an LCD display comprising an actively switchable Liquid Crystal lenticular lens. Depending on the image content, a defined set of locations on the display can be switched to either 2D or 3D mode.
  • each plane is linked to the output of a dedicated decoder.
  • on the Primary video plane, moving or still picture data from the Primary video decoder is presented.
  • on the Secondary video plane, moving picture data from the Secondary video decoder is presented.
  • on the Presentation Graphics plane, graphic data from either the Presentation Graphics decoder or the Text subtitle decoder is presented. The data on these two planes are first overlaid to make interim video data; the transparency ratio between the two planes is defined as the alpha value in the CLUT of the Presentation Graphics plane.
  • the multiple views necessary for a 3D display can be computed based on a 2D picture and an additional picture, a so-called depth map, as described in Oliver Sheer, "3D Video Communication", Wiley, 2005, pages 29-34.
  • the depth map conveys information about the depth of objects in the 2D image.
  • the grey scale values in the depth map indicate the depth of the associated pixel in the 2D image.
  • a stereo display can calculate the additional view required for stereo by using the depth value from the depth map and by calculating the required pixel transformation.
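A deliberately naive one-row sketch of this view calculation is given below; the linear shift model, the rounding, and the hole filling are illustrative assumptions, not the method of the cited reference:

```python
def synthesize_second_view(row, depth_row, max_shift=8, depth_max=255):
    """Shift each pixel of one image row horizontally by a disparity
    proportional to its depth-map value (brighter = nearer = larger
    shift), then fill the holes left behind with the nearest pixel."""
    out = [None] * len(row)
    for x, (pixel, depth) in enumerate(zip(row, depth_row)):
        shift = round(depth * max_shift / depth_max)
        target = x - shift
        if 0 <= target < len(out):
            out[target] = pixel
    filler = row[0]
    for i, value in enumerate(out):
        if value is None:
            out[i] = filler  # hole left by the shift: repeat a neighbour
        else:
            filler = value
    return out

# With a flat (all-zero) depth map the second view equals the input row:
assert synthesize_second_view([1, 2, 3], [0, 0, 0]) == [1, 2, 3]
```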
  • an MPEG 3D video stream would comprise a 2D video stream (as either a one-program or an elementary video transport stream) and, multiplexed with the 2D video stream, an auxiliary stream comprising additional information to enable 3D display (such as a depth map stream).
  • although the 2D video + depth map was described as the preferred format for implementing the invention, it is not the only format that can be supported.
  • the 2D video + depth map may be extended by adding background de-occlusion information and transparency information, or stereo + depth may be used as input format.
  • the multiple views may be used as input signal and mapped directly onto the display (sub) pixels.
  • in the format 2D + depth as previously described, the full-resolution image is divided into four quadrants; one is used for the 2D content while another carries the depth information.
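The quadrant packing can be sketched as below; which quadrant carries the 2D content and which carries the depth map is an assumption here, as the text does not fix the layout:

```python
def split_quadrants(frame):
    """Split a full-resolution frame (a list of pixel rows) and return the
    top-left quadrant (assumed: 2D content) and the top-right quadrant
    (assumed: per-pixel depth map)."""
    h, w = len(frame), len(frame[0])
    content = [row[: w // 2] for row in frame[: h // 2]]
    depth = [row[w // 2:] for row in frame[: h // 2]]
    return content, depth

frame = [
    [11, 12, 91, 92],  # top half: 2D content | depth map
    [13, 14, 93, 94],
    [0, 0, 0, 0],      # bottom half: unused in this sketch
    [0, 0, 0, 0],
]
content, depth = split_quadrants(frame)
# content -> [[11, 12], [13, 14]], depth -> [[91, 92], [93, 94]]
```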
  • the depth ranges concern the depth each plane has before the planes are composed together into the final image that will be shown on the screen.
  • the inventors had the insight that the occlusion problem between different planes is solved by segmenting the interval of possible depth values which can be displayed by a display into non-overlapping ranges and assigning them to the existing presentation planes, such as the video and graphics planes.
  • this concept is applicable to any playback system that displays at least two overlapping 3D (static) streams of pictures, such as slideshows, or two overlapping video streams.
  • the invention is also applicable to displaying 3D secondary video on top of 3D video or to displaying rendered 3D objects on top of 3D backgrounds.
  • the range limits could be defined while authoring the content, in order to give authors freedom and control of assigning a bigger range to a plane (e.g. interactive graphics) and a smaller range to another one (e.g. subtitles).
  • a method of playback in a basic embodiment of the invention, wherein 3D objects are overlaid over a 3D video stream, comprises the steps of: reading or receiving the information stream; determining an available depth range for 3D display; attributing corresponding non-overlapping depth ranges to the video stream and to the 3D objects; scaling the respective depth information to the corresponding depth range; and using the scaled depth information for three-dimensional (3D) display.
  • in a second embodiment of the invention, illustrated in Fig. 3, this is extended to three planes, such as a video plane and two graphics planes as used in BD systems.
  • reference numerals 35, 36 and 37 indicate the relative depth of each of the Main Movie plane, Presentation plane and Interactive plane.
  • depth is illustrated as increasing in the direction away from the viewer.
  • a preferred choice of ranges is 50% to the main movie plane, 30% to the presentation plane and 20% to the interactive plane.
  • a further problem addressed by the inventors is how to provide such depth range choice needed to avoid the occlusion problems in a way that maintains backward compatibility with known systems.
  • in BD systems it is known that three types of graphics segments exist: Object Definition Segments, which store the bitmap values of a certain graphics object; Palette Definition Segments, which provide the mapping between those values and real colours; and Presentation and Interactive Composition Segments, which provide information on the way in which the current graphics element should be added to the graphics plane.
  • when implementing 3D objects, it is expected that two extra types of segments, namely the Depth Map Object Definition Segment and the Depth Map Palette Segment, exist along with the previous ones.
  • a new data field, called depth_percentage is added to the Composition Segment structure. Since Composition Segments also hold a reference to the Depth map Palette Segment to be used with a certain graphics object, this allows rescaling that depth map according to the expressed percentage. Therefore this enables the association of a different portion of depth to different graphics planes, for example 20% to subtitles and 30% of the whole depth to interactive menus.
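A hedged sketch of how such a depth_percentage field could drive the rescaling of a referenced depth-map palette is given below; the segment layout and helper names are hypothetical, as real BD segments are binary structures with many more fields:

```python
from dataclasses import dataclass

@dataclass
class CompositionSegment:
    # Hypothetical model: only the palette reference and the proposed
    # depth_percentage field are represented here.
    depth_map_palette_id: int
    depth_percentage: int  # share of the whole depth range, 0..100

def rescale_depth_palette(palette, segment, range_start, depth_max=255):
    """Rescale a depth-map palette so that every entry falls inside the
    depth sub-range assigned to this plane by depth_percentage."""
    share = depth_max * segment.depth_percentage / 100
    lo, hi = min(palette), max(palette)
    span = (hi - lo) or 1  # avoid division by zero for flat palettes
    return [range_start + (d - lo) * share / span for d in palette]

# Subtitles assigned 20% of the whole depth, starting at depth 0:
subtitles = CompositionSegment(depth_map_palette_id=1, depth_percentage=20)
scaled = rescale_depth_palette([0, 128, 255], subtitles, range_start=0)
# scaled spans [0.0, 51.0], i.e. 20% of the 0..255 depth range
```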
  • the same effect may be achieved if the depth_percentage field is included directly in the Depth Map Palette Segment definition.
  • the information stream processed by the output unit 28 is provided to a video processing unit 31, responsible for implementing the known function of the player model, such as buffering, demultiplexing, processing each elementary stream, executing received commands.
  • the video processing unit 31 is usually implemented as a combination of software and hardware.
  • the processed video information stream is provided to a rendering unit, which is responsible for processing the video information into a signal suitable for 3D display. It is noted that the rendering unit may also be implemented in the 3D display itself.
  • the function of the control unit 17, of the video processing unit 31 and of the rendering stage 32 may be implemented in a device by the same hardware and/or software block.
  • the control unit 17 is adapted to determine an available depth range for the 3D display, to attribute corresponding non-overlapping depth ranges to each of the main video stream and the at least one graphics stream, and to scale each of the video depth information and the at least one graphics depth information to the corresponding depth range, while the rendering unit is adapted to use the scaled video depth information and the at least one graphics depth information for three-dimensional (3D) rendering of the information stream.
  • depth determination means 29 are provided in the control unit, preferably implemented as firmware or as embedded software.
  • two additional functions need to be performed within the functional block comprising the video processing unit 31 and the rendering stage 32, a specific embodiment of which is illustrated in more detail in Fig. 4.
  • the received stream is demultiplexed and buffered in unit 41, and the graphics stream is sent to the Stream graphics processor 42.
  • for generating the graphics images to be displayed for each of the right and left views, two buffers 43 and 44 are provided under the control of a graphics controller.
  • the two buffers 43 and 44 supply the two graphics plane processors 45 and 46.
  • both graphics decoders (43, 45 and, respectively, 44, 46) are adapted to take into account the value of depth_percentage present in the Composition Segments.
  • the depth map palette has to be adapted to the depth_percentage value, i.e. the minimum and maximum depth contained in the palette have to be within that percentage of the whole range of possible depth values.
  • BD and HD-DVD also support a secondary video plane for PIP.
  • the video in the PIP may appear side by side with the main video or in a quarter of the screen.
  • the primary video cannot be scaled so the secondary video always covers part of the primary video.
  • the secondary video is either fully transparent or fully opaque. So it is advantageous that, when the secondary video is fully opaque, the primary video does not punch through the other video during rendering of the 2D + depth information in the display, similarly to what can happen with the graphics.
  • an alternative solution is possible in the case of overlaying two video streams.
  • the primary video and the secondary video are combined both for the 2D and for the depth information.
  • the data from the secondary video plane simply overwrites the pixels of the primary video on the same presentation plane completely.
  • Such a solution is possible in view of the fact that there is no semi-transparency and there is no strong requirement for occlusion information of the primary video that is concealed by the secondary video.
  • This is in contrast to overlaying a graphics stream, as a graphics stream may be semi-transparent and may overlay the video stream in any shape and at any location.
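The opaque overwrite composition described above for the picture-in-picture case can be sketched as follows (the data layout and names are illustrative):

```python
def overlay_pip(primary, primary_depth, secondary, secondary_depth, x0, y0):
    """Fully opaque PIP composition: the secondary video's colour AND
    depth samples simply overwrite the primary's inside the PIP window,
    so no depth-range segmentation is needed in this case."""
    for dy, (srow, sdrow) in enumerate(zip(secondary, secondary_depth)):
        for dx, (pix, dep) in enumerate(zip(srow, sdrow)):
            primary[y0 + dy][x0 + dx] = pix
            primary_depth[y0 + dy][x0 + dx] = dep
    return primary, primary_depth

pixels = [[0, 0], [0, 0]]
depths = [[0, 0], [0, 0]]
overlay_pip(pixels, depths, [[9]], [[5]], x0=1, y0=0)
# pixels -> [[0, 9], [0, 0]], depths -> [[0, 5], [0, 0]]
```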
  • the exemplary embodiments of the invention hereinabove were described with reference to a playback device for playback of information from an optical disc. It is noted that the source of the information is irrelevant: it may be provided locally on recorded media such as optical media, a hard disc or solid-state memory, or it can be received by broadcasting via wired or wireless transmission systems, including the internet.
  • the invention relates to three-dimensional (3D) display of an information stream comprising a main video stream and auxiliary video information comprising video depth information for enabling three-dimensional (3D) display of the main video stream; at least one graphics stream and at least one auxiliary graphics information associated therewith, the at least one auxiliary graphics information comprising at least one graphics depth information for enabling three-dimensional (3D) display of the at least one graphics stream.
  • a method comprises reading or receiving the information stream; determining an available depth range for 3D display; attributing corresponding non-overlapping depth ranges to each of the main video stream and of the at least one graphics stream; scaling each of the video depth information and the at least one graphics depth information to the corresponding depth range and using the scaled video depth information and the at least one graphics depth information for three-dimensional (3D) display of the information stream.
  • the invention enables overlaying 3D graphics onto 3D video without unwanted occlusion problems.
  • a computer program may be stored/distributed on a suitable medium, such as optical storage, or supplied together with hardware parts, but may also be distributed in other forms, such as via the Internet or wired or wireless telecommunication systems.
  • in a system/device/apparatus claim enumerating several means, several of these means may be embodied by one and the same item of hardware or software. The mere fact that certain measures are recited in mutually different dependent claims does not indicate that a combination of these measures cannot be used to advantage.

Landscapes

  • Engineering & Computer Science (AREA)
  • Multimedia (AREA)
  • Signal Processing (AREA)
  • Human Computer Interaction (AREA)
  • Signal Processing For Digital Recording And Reproducing (AREA)
  • Testing, Inspecting, Measuring Of Stereoscopic Televisions And Televisions (AREA)

Abstract

The invention relates to three-dimensional (3D) display of an information stream comprising a main video stream and auxiliary video information comprising video depth information for enabling three-dimensional (3D) display of the main video stream; at least one graphics stream and at least one piece of auxiliary graphics information associated therewith, the auxiliary graphics information comprising at least one piece of graphics depth information for enabling three-dimensional (3D) display of the graphics stream(s). A method according to the invention comprises the steps of: reading or receiving the information stream; determining an available depth range for 3D display; attributing corresponding non-overlapping depth ranges to the main video stream and to each graphics stream; scaling the video depth information and the graphics depth information to the corresponding depth range; and using the scaled video depth information and graphics depth information for three-dimensional (3D) display of the information stream. The invention enables overlaying 3D graphics onto 3D video without unwanted occlusion problems.
PCT/IB2008/055338 2007-12-20 2008-12-16 Rendering and overlaying 3D graphics onto 3D video WO2009083863A1 (fr)

Applications Claiming Priority (2)

Application Number Priority Date Filing Date Title
EP07123796.0 2007-12-20
EP07123796 2007-12-20

Publications (1)

Publication Number Publication Date
WO2009083863A1 true WO2009083863A1 (fr) 2009-07-09

Family

ID=40394513

Family Applications (1)

Application Number Title Priority Date Filing Date
PCT/IB2008/055338 WO2009083863A1 (fr) 2007-12-20 2008-12-16 Rendering and overlaying 3D graphics onto 3D video

Country Status (2)

Country Link
TW (1) TW200935873A (fr)
WO (1) WO2009083863A1 (fr)

Cited By (20)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
WO2010058368A1 (fr) * 2008-11-24 2010-05-27 Koninklijke Philips Electronics N.V. Combining 3D video data and auxiliary data
WO2010058362A1 (fr) * 2008-11-24 2010-05-27 Koninklijke Philips Electronics N.V. Extending 2D graphics in a 3D graphical user interface
EP2309765A1 (fr) * 2009-09-11 2011-04-13 Disney Enterprises, Inc. System and method for a three-dimensional video capture workflow for dynamic rendering
EP2309463A2 (fr) 2009-10-07 2011-04-13 Thomson Licensing Method for displaying 3D video with insertion of a graphic object, and terminal for implementing the method
EP2320667A1 (fr) * 2009-10-20 2011-05-11 Koninklijke Philips Electronics N.V. Combining 3D video auxiliary data
EP2337368A1 (fr) * 2009-08-18 2011-06-22 Sony Corporation Reproduction device, reproduction method, data structure, recording medium, recording device, recording method, and program
CN102164257A (zh) * 2010-02-05 2011-08-24 LG Electronics Inc. Electronic device and method for providing a graphical user interface for broadcast information
EP2467831A2 (fr) * 2009-08-17 2012-06-27 Samsung Electronics Co., Ltd. Signal processing method and apparatus for three-dimensional reproduction of additional data
EP2495979A1 (fr) 2011-03-01 2012-09-05 Thomson Licensing Method, reproduction apparatus and system for displaying stereoscopic 3D video information
EP2502424A2 (fr) * 2009-11-16 2012-09-26 LG Electronics Inc. Image display device and operating method thereof
EP2525580A3 (fr) * 2011-05-20 2013-05-15 EchoStar Technologies L.L.C. Dynamically configurable 3D display
CN103155577A (zh) * 2010-10-01 2013-06-12 Samsung Electronics Co., Ltd. Display apparatus and signal processing apparatus, and methods thereof
EP2312859A3 (fr) * 2009-10-13 2013-06-26 Broadcom Corporation Method and system for communicating 3D video via a wireless communication link
WO2013120742A1 (fr) * 2012-02-13 2013-08-22 Thomson Licensing Method and device for inserting a three-dimensional (3D) graphics animation into 3D stereoscopic content
EP2630803A2 (fr) * 2010-10-18 2013-08-28 Silicon Image, Inc. Combining video data streams of differing dimensionality for concurrent display
US8786673B2 (en) 2011-01-07 2014-07-22 Cyberlink Corp. Systems and methods for performing video conversion based on non-linear stretch information
EP2453661A4 (fr) * 2009-07-10 2016-03-30 Panasonic Ip Man Co Ltd Recording medium, reproduction device, and integrated circuit
US9600923B2 (en) 2011-05-26 2017-03-21 Thomson Licensing Scale-independent maps
US9699438B2 (en) 2010-07-02 2017-07-04 Disney Enterprises, Inc. 3D graphic insertion for live action stereoscopic video
US10791314B2 (en) 2010-03-31 2020-09-29 Interdigital Ce Patent Holdings, Sas 3D disparity maps

Families Citing this family (5)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
TWI462567B (zh) * 2010-06-18 2014-11-21 Realtek Semiconductor Corp Three-dimensional processing circuit and processing method
US9571811B2 (en) 2010-07-28 2017-02-14 S.I.Sv.El. Societa' Italiana Per Lo Sviluppo Dell'elettronica S.P.A. Method and device for multiplexing and demultiplexing composite images relating to a three-dimensional content
IT1401367B1 (it) 2010-07-28 2013-07-18 Sisvel Technology Srl Method for combining images relating to three-dimensional content.
US20150245063A1 (en) * 2012-10-09 2015-08-27 Nokia Technologies Oy Method and apparatus for video coding
TWI510071B (zh) * 2013-09-18 2015-11-21 Vivotek Inc Pre-processing method for playing video data and playback interface device

Citations (4)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
EP0905988A1 (fr) * 1997-09-30 1999-03-31 Kabushiki Kaisha Toshiba Three-dimensional image display apparatus
JP2004274125A (ja) * 2003-03-05 2004-09-30 Sony Corp Image processing apparatus and method
WO2008038205A2 (fr) * 2006-09-28 2008-04-03 Koninklijke Philips Electronics N.V. 3D menu display
WO2008115222A1 (fr) * 2007-03-16 2008-09-25 Thomson Licensing System and method for combining text with three-dimensional content

Cited By (37)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
WO2010058362A1 (fr) * 2008-11-24 2010-05-27 Koninklijke Philips Electronics N.V. Extending 2D graphics in a 3D graphical user interface
WO2010058368A1 (fr) * 2008-11-24 2010-05-27 Koninklijke Philips Electronics N.V. Combining 3D video data and auxiliary data
EP2453661A4 (fr) * 2009-07-10 2016-03-30 Panasonic Ip Man Co Ltd Recording medium, reproduction device, and integrated circuit
EP2467831A2 (fr) * 2009-08-17 2012-06-27 Samsung Electronics Co., Ltd. Signal processing method and apparatus for three-dimensional reproduction of additional data
EP2467831A4 (fr) * 2009-08-17 2013-04-17 Samsung Electronics Co Ltd Signal processing method and apparatus for three-dimensional reproduction of additional data
CN103024412A (zh) * 2009-08-18 2013-04-03 Sony Corporation Reproducing device and reproducing method, and recording device and recording method
US8488950B2 (en) 2009-08-18 2013-07-16 Sony Corporation Reproducing apparatus and reproducing method, data structure, recording medium, recording apparatus and recording method, and program
EP2337368A4 (fr) * 2009-08-18 2013-06-12 Sony Corp Reproduction device, reproduction method, data structure, recording medium, recording device, recording method, and program
EP2337368A1 (fr) * 2009-08-18 2011-06-22 Sony Corporation Reproduction device, reproduction method, data structure, recording medium, recording device, recording method, and program
CN103024412B (zh) * 2009-08-18 2015-10-28 Sony Corporation Reproducing device and reproducing method, and recording device and recording method
US8614737B2 (en) 2009-09-11 2013-12-24 Disney Enterprises, Inc. System and method for three-dimensional video capture workflow for dynamic rendering
EP2309765A1 (fr) * 2009-09-11 2011-04-13 Disney Enterprises, Inc. System and method for a three-dimensional video capture workflow for dynamic rendering
EP2309463A3 (fr) * 2009-10-07 2011-07-27 Thomson Licensing Method for displaying 3D video with insertion of a graphic object, and terminal for implementing the method
WO2011042479A1 (fr) 2009-10-07 2011-04-14 Thomson Licensing Method for displaying a 3D video with insertion of a graphic element, and terminal for implementing the method
EP2309463A2 (fr) 2009-10-07 2011-04-13 Thomson Licensing Method for displaying 3D video with insertion of a graphic object, and terminal for implementing the method
EP2312859A3 (fr) * 2009-10-13 2013-06-26 Broadcom Corporation Method and system for communicating 3D video via a wireless communication link
EP2320667A1 (fr) * 2009-10-20 2011-05-11 Koninklijke Philips Electronics N.V. Combining 3D video auxiliary data
EP2502424A4 (fr) * 2009-11-16 2014-08-27 Lg Electronics Inc Image display device and operating method thereof
EP2502424A2 (fr) * 2009-11-16 2012-09-26 LG Electronics Inc. Image display device and operating method thereof
CN102164257A (zh) * 2010-02-05 2011-08-24 LG Electronics Inc. Electronic device and method for providing a graphical user interface for broadcast information
EP2355495A3 (fr) * 2010-02-05 2012-05-30 Lg Electronics Inc. Electronic device and method for providing a graphical user interface for broadcast information
US10791314B2 (en) 2010-03-31 2020-09-29 Interdigital Ce Patent Holdings, Sas 3D disparity maps
US9699438B2 (en) 2010-07-02 2017-07-04 Disney Enterprises, Inc. 3D graphic insertion for live action stereoscopic video
EP2624571A2 (fr) * 2010-10-01 2013-08-07 Samsung Electronics Co., Ltd Display device, signal processing device, and corresponding methods
EP2624571A4 (fr) * 2010-10-01 2014-06-04 Samsung Electronics Co Ltd Display device, signal processing device, and corresponding methods
CN103155577A (zh) * 2010-10-01 2013-06-12 Samsung Electronics Co., Ltd. Display apparatus and signal processing apparatus, and methods thereof
EP2630803A4 (fr) * 2010-10-18 2014-10-08 Silicon Image Inc Combining video data streams of differing dimensionality for concurrent display
EP2630803A2 (fr) * 2010-10-18 2013-08-28 Silicon Image, Inc. Combining video data streams of differing dimensionality for concurrent display
US8786673B2 (en) 2011-01-07 2014-07-22 Cyberlink Corp. Systems and methods for performing video conversion based on non-linear stretch information
EP2495979A1 (fr) 2011-03-01 2012-09-05 Thomson Licensing Method, reproduction apparatus and system for displaying stereoscopic 3D video information
WO2012116900A1 (fr) 2011-03-01 2012-09-07 Thomson Licensing Method and apparatus for creating stereoscopic 3D video information, and method and apparatus for displaying such stereoscopic 3D video information
US9547928B2 (en) 2011-03-01 2017-01-17 Thomson Licensing Method and apparatus for authoring stereoscopic 3D video information, and method and apparatus for displaying such stereoscopic 3D video information
US8923686B2 (en) 2011-05-20 2014-12-30 Echostar Technologies L.L.C. Dynamically configurable 3D display
EP2525580A3 (fr) * 2011-05-20 2013-05-15 EchoStar Technologies L.L.C. Dynamically configurable 3D display
US9600923B2 (en) 2011-05-26 2017-03-21 Thomson Licensing Scale-independent maps
US9685006B2 (en) 2012-02-13 2017-06-20 Thomson Licensing Dtv Method and device for inserting a 3D graphics animation in a 3D stereo content
WO2013120742A1 (fr) * 2012-02-13 2013-08-22 Thomson Licensing Method and device for inserting a three-dimensional (3D) graphics animation into 3D stereoscopic content

Also Published As

Publication number Publication date
TW200935873A (en) 2009-08-16

Similar Documents

Publication Publication Date Title
WO2009083863A1 (fr) Playback and overlaying of 3D graphics onto 3D video
US9338428B2 (en) 3D mode selection mechanism for video playback
US11277600B2 (en) Switching between 3D video and 2D video
JP5859309B2 (ja) Combination of 3D video and auxiliary data
CA2691727C (fr) Support d'enregistrement, appareil de lecture, systeme lsi, methode de lecture, verres et dispositif d'affichage destines aux images 3d
JP5497679B2 (ja) Semiconductor integrated circuit
SG175863A1 (en) Entry points for 3d trickplay
KR20070014963A (ko) Recording medium, data reproducing method and data reproducing apparatus, and data recording method and data recording apparatus
KR101596832B1 (ko) Recording medium, data recording/reproducing method, and data recording/reproducing apparatus
EP2320667A1 (fr) Combining 3D video auxiliary data
KR101537615B1 (ko) Recording medium, data recording/reproducing method, and data recording/reproducing apparatus
KR101648450B1 (ko) Data reproducing method and reproducing apparatus
KR20080033404A (ko) Recording medium, data reproducing method/apparatus, and data recording method/apparatus

Legal Events

Date Code Title Description
121 Ep: the epo has been informed by wipo that ep was designated in this application

Ref document number: 08867777

Country of ref document: EP

Kind code of ref document: A1

NENP Non-entry into the national phase

Ref country code: DE

122 Ep: pct application non-entry in european phase

Ref document number: 08867777

Country of ref document: EP

Kind code of ref document: A1