WO2010058977A2 - Recording medium, data recording/reproducing method and data recording/reproducing apparatus - Google Patents

Recording medium, data recording/reproducing method and data recording/reproducing apparatus

Info

Publication number
WO2010058977A2
Authority
WO
WIPO (PCT)
Prior art keywords
information
data
graphic stream
graphic
display information
Application number
PCT/KR2009/006826
Other languages
French (fr)
Other versions
WO2010058977A3 (en)
Inventor
Jong Yeul Suh
Original Assignee
Lg Electronics Inc.
Application filed by Lg Electronics Inc. filed Critical Lg Electronics Inc.
Publication of WO2010058977A2 publication Critical patent/WO2010058977A2/en
Publication of WO2010058977A3 publication Critical patent/WO2010058977A3/en

Classifications

    • GPHYSICS
    • G11INFORMATION STORAGE
    • G11BINFORMATION STORAGE BASED ON RELATIVE MOVEMENT BETWEEN RECORD CARRIER AND TRANSDUCER
    • G11B27/00Editing; Indexing; Addressing; Timing or synchronising; Monitoring; Measuring tape travel
    • G11B27/02Editing, e.g. varying the order of information signals recorded on, or reproduced from, record carriers
    • G11B27/031Electronic editing of digitised analogue information signals, e.g. audio or video signals
    • G11B27/034Electronic editing of digitised analogue information signals, e.g. audio or video signals on discs
    • GPHYSICS
    • G11INFORMATION STORAGE
    • G11BINFORMATION STORAGE BASED ON RELATIVE MOVEMENT BETWEEN RECORD CARRIER AND TRANSDUCER
    • G11B27/00Editing; Indexing; Addressing; Timing or synchronising; Monitoring; Measuring tape travel
    • G11B27/10Indexing; Addressing; Timing or synchronising; Measuring tape travel
    • G11B27/102Programmed access in sequence to addressed parts of tracks of operating record carriers
    • G11B27/105Programmed access in sequence to addressed parts of tracks of operating record carriers of operating discs
    • GPHYSICS
    • G11INFORMATION STORAGE
    • G11BINFORMATION STORAGE BASED ON RELATIVE MOVEMENT BETWEEN RECORD CARRIER AND TRANSDUCER
    • G11B2220/00Record carriers by type
    • G11B2220/20Disc-shaped record carriers
    • G11B2220/25Disc-shaped record carriers characterised in that the disc is based on a specific recording technology
    • G11B2220/2537Optical discs
    • G11B2220/2541Blu-ray discs; Blue laser DVR discs

Definitions

  • the present invention relates to a recording medium, a data recording/reproducing method and a data recording/reproducing apparatus, and more particularly, to an apparatus and method for reproducing a graphic stream reproduced along with a video stream as a three-dimensional image and a recording medium for recording/reproducing the stream using the apparatus and method.
  • Such recording media include a Blu-ray disc, a near field recording medium and the like.
  • An object of the present invention devised to solve the problem lies in an apparatus and method for reproducing a graphic stream reproduced along with a video stream as a three-dimensional image and a recording medium for recording/reproducing the stream using the apparatus and method.
  • the object of the present invention can be achieved by providing a recording medium including a data zone including: a video stream including image data; a graphic stream reproduced along with the video stream; and 3D display information for displaying the graphic stream as a three-dimensional image.
  • a data recording method including: recording a video stream including image data and a graphic stream reproduced along with the video stream in a data zone of a recording medium; and recording 3D display information for displaying the graphic stream as a three-dimensional image in the data zone.
  • a data reproducing method including: reading a video stream including image data and a graphic stream reproduced along with the video stream from a data zone of a recording medium; reading 3D display information for displaying the graphic stream as a three-dimensional image from the data zone; and reproducing the graphic stream as the three-dimensional image according to the 3D display information.
  • a data recording apparatus including: a recording unit configured to record data on a recording medium; and a control unit configured to control the recording unit to record a video stream including image data and a graphic stream reproduced along with the video stream in a data zone of the recording medium and to record 3D display information for displaying the graphic stream as a three-dimensional image in the data zone.
  • a data reproducing apparatus including: a reproducing unit configured to read data from a recording medium; a decoder configured to convert a graphic stream reproduced along with a video stream including image data into image data which is able to be output on a screen; and a control unit configured to control the reproducing unit to read the video stream and the graphic stream from a data zone of the recording medium and to read 3D display information for displaying the graphic stream as a three-dimensional image from the data zone, and to control the decoder to convert the graphic stream into the three-dimensional image according to the 3D display information and to reproduce the three-dimensional image.
  • According to a recording medium, a data recording/reproducing method and a data recording/reproducing apparatus of the present invention, it is possible to reproduce a graphic stream recorded/reproduced on/from the recording medium as a stereoscopic three-dimensional image.
  • FIG. 1 is a diagram showing the combined use of a recording medium, a data reproducing apparatus and a peripheral device according to an embodiment of the present invention.
  • FIG. 2 is a conceptual diagram of a 3D effect and 3D display information according to an embodiment of the present invention.
  • FIG. 3 is a schematic diagram showing the structure of a recording medium according to an embodiment of the present invention.
  • FIG. 4 is a conceptual diagram illustrating decoding of an AV stream into an image according to an embodiment of the present invention.
  • FIG. 5 is a structural diagram of a graphic segment according to an embodiment of the present invention.
  • FIG. 6 is a structural diagram of a graphic segment type according to an embodiment of the present invention.
  • FIG. 7 is a structural diagram of an Object Definition Segment (ODS) according to an embodiment of the present invention.
  • FIG. 8 is a structural diagram of an object_data_3D field according to an embodiment of the present invention.
  • FIG. 9 is a structural diagram of a Presentation Composition Segment (PCS) according to an embodiment of the present invention.
  • FIG. 10 is a structural diagram of a composition_object_3D field according to an embodiment of the present invention.
  • FIG. 11 is a conceptual diagram of a 3D object, to which an object skew vector is applied, according to an embodiment of the present invention.
  • FIG. 12 is a structural diagram of a Window Definition Segment (WDS) according to an embodiment of the present invention.
  • FIG. 13 is a structural diagram of a window_3D field according to an embodiment of the present invention.
  • FIG. 14 is a conceptual diagram of a 3D window, to which a window tilt vector is applied, according to an embodiment of the present invention.
  • FIG. 15 is a structural diagram of a button_3D field in an Interactive Composition Segment (ICS) according to an embodiment of the present invention.
  • FIG. 16 is a structural diagram of a graphic segment type in a stereoscopic mode according to an embodiment of the present invention.
  • FIG. 17 is a structural diagram of a Window Definition Segment (WDS) for a stereoscopic display in a stereoscopic mode according to an embodiment of the present invention.
  • FIG. 18 is a structural diagram of a Window_SC field according to an embodiment of the present invention.
  • FIG. 19 is a conceptual diagram of a projection of a window of a stereoscopic mode onto a graphic plane according to an embodiment of the present invention.
  • FIG. 20 is a structural diagram of a page_SC field in an ICS in a stereoscopic mode according to an embodiment of the present invention.
  • FIG. 21 is a structural diagram of a button_SC field in a stereoscopic mode according to an embodiment of the present invention.
  • FIG. 22 is a conceptual diagram of projection of a button onto a graphic plane in a stereoscopic mode according to an embodiment of the present invention.
  • FIG. 23 is a structural diagram of a Depth Definition Segment (DDS) type according to an embodiment of the present invention.
  • FIG. 24 is a structural diagram of a DDS according to an embodiment of the present invention.
  • FIG. 25 is a structural diagram of an example of a Graphic Depth Look-Up Table (GDLUT) according to an embodiment of the present invention.
  • FIG. 26 is a conceptual diagram illustrating decoding of an AV stream into a three-dimensional image using a DDS according to an embodiment of the present invention.
  • FIG. 27 is a block diagram of an AV decoder including a Depth Conversion System (DCS) according to an embodiment of the present invention.
  • FIG. 28 is a block diagram of a data recording/reproducing apparatus according to an embodiment of the present invention.
  • A recording medium in the present invention includes all media on which data is recorded or will be recorded, such as an optical disc or a magnetic tape.
  • Hereinafter, an optical disc such as a Blu-ray Disc (BD) will be described as an example.
  • the stereoscopic mode uses binocular disparity to obtain a 3D effect by enabling left and right eyes of a user to view the same object in different directions.
  • In the stereoscopic mode, a two-dimensional image having binocular disparity is split and output to the left eye and the right eye, and a three-dimensional image is provided to the user by alternately exposing a left image and a right image to the left eye and the right eye, respectively, via special glasses such as polarization filter glasses.
  • the autostereoscopic mode includes a volumetric mode for forming an actual three-dimensional image in a space and a holographic mode for reconstructing light diffused from an object and displaying a three-dimensional image according to its actual state.
  • a recording medium 10, a data recording/reproducing method and a data recording/reproducing apparatus according to the present invention will be described based on the volumetric mode and the holographic mode belonging to the autostereoscopic mode, and the stereoscopic mode will be described separately.
  • FIG. 1 is a diagram showing the combined use of a recording medium 10, a data reproducing apparatus 11 and a peripheral device according to an embodiment of the present invention.
  • the recording medium and the data reproducing apparatus according to the embodiment of the present invention have a main function for providing a three-dimensional image to a user.
  • Three-dimensional image data of an autostereoscopic or stereoscopic mode is recorded on the recording medium 10 shown in FIG. 1.
  • graphic data reproduced along with the three-dimensional image data is recorded on the recording medium 10.
  • the data reproducing apparatus 11 reads and decodes the three-dimensional image data recorded on the loaded recording medium 10. For example, if three-dimensional image data of a stereoscopic mode is recorded on the recording medium 10, the data reproducing apparatus 11 sequentially reads and decodes image data of a left view and image data of a right view into one piece of stereoscopic three-dimensional image data. In addition, the data reproducing apparatus 11 decodes graphic data into a three-dimensional image. The decoded three-dimensional image data and graphic data are provided to an output device 12.
  • the output device 12 renders, outputs and provides the three-dimensional image data and graphic data decoded by the data reproducing apparatus 11 to a user. If the three-dimensional image data of the stereoscopic mode and graphic data are output, the user wears special glasses 13 such as polarization filter glasses so as to view a three-dimensional image.
  • FIG. 2 is a conceptual diagram of a 3D effect and 3D display information according to an embodiment of the present invention.
  • the 3D effect of the three-dimensional image indicates that a three-dimensional image is displayed so as to appear to be located at a certain distance from the output device 12 toward the user.
  • a large 3D effect indicates that the three-dimensional image is displayed so as to appear to be close to the user and a small 3D effect indicates that the three-dimensional image is displayed so as to appear to be far from the user.
  • the recording medium 10 should include information indicating at which distance from the output device 12 the three-dimensional image data and the graphic data (a subtitle, a menu, etc.) are displayed so as to be located in the user direction.
  • this information is referred to as 3D display information.
  • FIG. 3 is a schematic diagram showing the structure of a recording medium 10 according to an embodiment of the present invention.
  • FIG. 3 shows a single-layer recording medium 10 composed of one layer.
  • the present invention is not limited thereto and is applicable to all recording media having two or more layers. Since the layers can be configured to have the same structure, the present invention will be described based on a single layer.
  • the recording medium 10 according to the present invention includes an inner zone 210, a data zone 220 and an outer zone 230.
  • the inner zone 210 is located at the inner circumference side of the recording medium 10 and the outer zone 230 is located at the outer circumference side of the recording medium 10.
  • a variety of information for controlling the recording medium 10 is stored in the inner zone 210 and the outer zone 230.
  • Data which a user desires to record is stored in the data zone 220.
  • the portion between the inner zone 210 and the outer zone 230 on the recording medium 10, that is, the data zone 220, is the zone in which data is actually recorded.
  • the data recorded in the data zone 220 may be composed of, for example, a multiplexed AV stream of a specific movie title.
  • the AV stream includes a video stream which is image data, and a graphic stream which is image data reproduced along with the video stream, such as a menu or a subtitle.
  • the video stream is composed of two-dimensional image data or three-dimensional image data.
  • a video stream of an autostereoscopic mode may be composed of one three-dimensional video stream.
  • a video stream of a stereoscopic mode may be composed of a two-dimensional video stream of a left view and a two-dimensional video stream of a right view.
  • the data reproducing apparatus 11 reads and decodes both the two-dimensional video stream of the left view and the two-dimensional video stream of the right view into one three-dimensional image and reproduces the three-dimensional image.
  • the graphic stream is image data reproduced along with the video stream, and includes a presentation graphic stream and an interactive graphic stream.
  • the presentation graphic stream is fixed image data and includes, for example, a subtitle, a picture, and the like.
  • the interactive graphic stream is image data representing a predetermined effect according to the selection of a user and includes, for example, a menu and the like.
  • the graphic stream is composed of two-dimensional image data or three-dimensional image data.
  • a graphic stream of an autostereoscopic mode may be composed of one three-dimensional graphic stream.
  • a graphic stream of a stereoscopic mode may be composed of a two-dimensional graphic stream of a left view and a two-dimensional graphic stream of a right view.
  • the data reproducing apparatus 11 reads and decodes both the two-dimensional graphic stream of the left view and the two-dimensional graphic stream of the right view into one three-dimensional image and reproduces the three-dimensional image.
  • FIG. 4 is a conceptual diagram illustrating decoding of an AV stream into an image according to an embodiment of the present invention.
  • the data reproducing apparatus 11 generates a video plane onto which a video stream is projected and a graphic plane onto which a graphic stream is projected.
  • the data reproducing apparatus 11 overlays the graphic plane upon the video plane. Accordingly, a graphic image is displayed and output in front of a video image.
  • the data reproducing apparatus 11 decodes the overlaid video plane and graphic plane, that is, one image obtained by synthesizing the video image and the graphic image, and outputs the image to the user.
  • the data reproducing apparatus 11 generates a graphic plane, onto which a three-dimensional image is projected, using the above-described 3D display information, in order to reproduce the graphic stream as a three-dimensional image.
  • Information for displaying the graphic stream as the three-dimensional image, such as 3D display information, is recorded in a graphic segment.
  • the graphic segment is a data structure, in which display control information of a graphic stream such as a subtitle or a menu is stored, and is stored in the data zone 220 of the recording medium 10.
  • the graphic segment includes an Object Definition Segment (ODS) in which display control information of an object is recorded, a Presentation Composition Segment (PCS) in which display control information of a presentation graphic stream is recorded, an Interactive Composition Segment (ICS) in which display control information of an interactive graphic stream is recorded, and a Window Definition Segment (WDS) in which display control information of a window, onto which a presentation graphic stream is projected, is recorded.
  • For a detailed description of the ODS, PCS, ICS and WDS, refer to US Laid-open Publication No. US 2006/0222334 A1 and the BD Technical White Paper published by the Blu-ray Disc Association (BDA).
  • FIG. 5 is a structural diagram of a graphic segment according to an embodiment of the present invention.
  • the graphic segment includes a segment_descriptor_3D field and a segment_data_3D field.
  • the segment_descriptor_3D field includes type information of the graphic segment and the segment_data_3D field includes various segments (ODS, PCS, ICS, WDS, etc.) in which the display control information of the graphic stream is recorded.
  • FIG. 6 is a structural diagram of a graphic segment type according to an embodiment of the present invention.
  • the type information of the graphic segment is included in the segment_descriptor_3D field of the graphic segment and is recorded in a segment_type_3D field.
  • the type information of the graphic segment includes type information of segments including 3D display information of the graphic stream.
  • For example, an ODS (ODS for 3D display) in which the display control information of a three-dimensional object is recorded has a value of 0x19 as type information, a PCS (PCS for 3D display) in which the display control information of a three-dimensional presentation graphic stream is recorded has a value of 0x1A, a WDS (WDS for 3D display) in which the display control information of a three-dimensional window is recorded has a value of 0x1B, and an ICS (ICS for 3D display) in which the display control information of a three-dimensional interactive graphic stream is recorded has a value of 0x1C.
  • That is, the type information of a graphic segment identifies the corresponding segment.
  • the data reproducing apparatus 11 identifies the corresponding segment using the type information recorded in the segment.
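For illustration only, the sketch below shows how a player might map the segment_type_3D value read from a segment_descriptor_3D field to the segment it identifies. Only the type codes come from the text above; the one-byte width of the type field and the helper function are assumptions.

```python
# Hypothetical dispatch on segment_type_3D; the type codes are those listed above,
# the 1-byte field width is an assumption.
SEGMENT_TYPES_3D = {
    0x19: "ODS for 3D display",   # object definition segment
    0x1A: "PCS for 3D display",   # presentation composition segment
    0x1B: "WDS for 3D display",   # window definition segment
    0x1C: "ICS for 3D display",   # interactive composition segment
}

def identify_segment(segment_descriptor_3d: bytes) -> str:
    """Return the segment name for the type value in the first (assumed) byte."""
    return SEGMENT_TYPES_3D.get(segment_descriptor_3d[0], "unknown segment type")
```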
  • FIG. 7 is a structural diagram of an ODS according to an embodiment of the present invention.
  • the ODS is the display control information of an object configuring the graphic stream.
  • the 3D display information for three-dimensionally displaying an object is recorded in the ODS of the present invention.
  • the ODS includes a segment_descriptor_3D field and an object_data_fragment_3D field.
  • the type information of the ODS is recorded in the segment_descriptor_3D field. As described above, for example, a value of 0x19 may be recorded.
  • the display control information of the object configuring the graphic stream is recorded in the object_data_fragment_3D field
  • the display control information of the object included in one graphic stream is recorded in a data structure composed of an object_data_3D field
  • the object_data_3D field is recorded in the object_data_fragment_3D field. That is, the object_data_fragment_3D field includes various object_data_3D fields.
  • FIG. 8 is a structural diagram of an object_data_3D field according to an embodiment of the present invention.
  • the object_data_3D field includes an object_data_length_2D field, an object_data_length_3D field, and an rle_coded_line_3D field.
  • Size information of a data structure including the display control information of the object configured two-dimensionally or three-dimensionally is recorded in the object_data_length_2D field and the object_data_length_3D field.
  • the number of bytes of the data structure may be recorded in the object_data_length_2D field and the object_data_length_3D field.
  • the 3D display information of the object configured three-dimensionally is recorded in the rle_coded_line_3D field. That is, information indicating with what degree of 3D effect a three-dimensional object is displayed to the user, or at what distance from the output device 12 toward the user the three-dimensional object should appear to be located, is recorded.
  • the 3D display information of the present invention may be recorded as inches, number of pixels, or the like.
  • Run-Length Encoding data may be recorded in the rle_coded_line_3D field.
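As a sketch only, the object_data_3D fields named above could be modeled as follows. The field widths, byte order and run-length encoding scheme are not specified in the text and are therefore assumptions.

```python
from dataclasses import dataclass

@dataclass
class ObjectData3D:
    """Sketch of an object_data_3D entry (field names from the text, sizes assumed)."""
    object_data_length_2D: int  # byte size of the two-dimensional object data structure
    object_data_length_3D: int  # byte size of the three-dimensional object data structure
    rle_coded_line_3D: bytes    # run-length-encoded 3D display information of the object
```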
  • FIG. 9 is a structural diagram of a PCS according to an embodiment of the present invention.
  • the PCS is the display control information of the presentation graphic stream.
  • the PCS includes a segment_descriptor_3D field and a composition_object_3D field.
  • the type information of the PCS is recorded in the segment_descriptor_3D field. As described above, for example, a value of 0x1A may be recorded.
  • the display control information of the object configuring the presentation graphic stream is recorded in the composition_object_3D field.
  • the object of the presentation graphic stream is projected onto a window, which is a two-dimensional screen defined in the WDS, and positional information of the object displayed in the window and the like is recorded in the composition_object_3D field.
  • the 3D display information of the three-dimensional presentation graphic stream is included in the composition_object_3D field.
  • FIG. 10 is a structural diagram of a composition_object_3D field according to an embodiment of the present invention.
  • the composition_object_3D field includes a composition_object_depth_position field, a cropping_cube field and an object_skew field.
  • the 3D display information of the three-dimensional object is recorded in the composition_object_depth_position field.
  • the 3D display information of the three-dimensional object is distance information corresponding to the height of the three-dimensional object in a z direction, if a vertically upward direction of the window is set to the z direction.
  • the 3D display information may be recorded as inches, number of pixels or the like.
  • Information representing a cropped portion in the three-dimensional object is recorded in the cropping_cube field, and includes an object_cropping_depth_position field and an object_cropping_depth field. Positional information of a point where the cropped portion begins in the z direction is recorded in the object_cropping_depth_position field, and size information of the cropped portion in the z direction is recorded in the object_cropping_depth field.
  • Information representing the skewed shape of the object is recorded in the object_skew field, and includes a horizontal_vector field, a vertical_vector field and a depth_vector field. If the surface of the window is defined by an x direction and a y direction, information indicating the skewed degree of the three-dimensional object in the x direction is recorded in the horizontal_vector field, information indicating the skewed degree of the three-dimensional object in the y direction is recorded in the vertical_vector field, and information indicating the skewed degree of the three-dimensional object in the z direction is recorded in the depth_vector field.
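A hypothetical model of the composition_object_3D structure described above; only the field names come from the text, while the grouping into small classes and the integer types are assumptions made for readability.

```python
from dataclasses import dataclass

@dataclass
class CroppingCube:
    object_cropping_depth_position: int  # z position where the cropped portion begins
    object_cropping_depth: int           # size of the cropped portion in the z direction

@dataclass
class ObjectSkew:
    horizontal_vector: int  # degree of skew of the object in the x direction
    vertical_vector: int    # degree of skew of the object in the y direction
    depth_vector: int       # degree of skew of the object in the z direction

@dataclass
class CompositionObject3D:
    composition_object_depth_position: int  # height of the object in the z direction
    cropping_cube: CroppingCube
    object_skew: ObjectSkew
```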
  • FIG. 11 is a conceptual diagram of a 3D object, to which an object skew vector is applied, according to an embodiment of the present invention.
  • the data reproducing apparatus 11 projects the three-dimensional object of the presentation graphic stream onto the window which is the two-dimensional screen.
  • the three-dimensional object projected onto the window is displayed as a three-dimensional image having a height, in the z direction, corresponding to the value of the composition_object_depth_position field.
  • the data reproducing apparatus 11 may display a three-dimensional image, in which a three-dimensional object is skewed, using an object_skew_flag field. As shown in the drawing, the data reproducing apparatus 11 displays the three- dimensional object in the window in a state of being skewed by the value of the horizontal_vector field in the x direction, by the value of the vertical_vector field in the y direction, and by the value of the depth_vector field in the z direction.
  • FIG. 12 is a structural diagram of a WDS according to an embodiment of the present invention.
  • the WDS is the display control information of the window.
  • the window is a two-dimensional screen onto which the object of the presentation graphic stream is projected and is projected onto the graphic plane.
  • the 3D display information of the window is recorded in the WDS.
  • the WDS includes a segment_descriptor_3D field and a window_3D field.
  • the type information of the WDS is recorded in the segment_descriptor_3D field. As described above, for example, a value of 0x1B may be recorded.
  • FIG. 13 is a structural diagram of a window_3D field according to an embodiment of the present invention.
  • the window_3D field includes a window_depth_position field, a window_tilt_flag field, a horizontal_vector field, a vertical_vector field and a depth_vector field.
  • the 3D display information of the window is recorded in the window_depth_position field.
  • the 3D display information of the window is distance information from the graphic plane to the window in a z direction, if a vertically upward direction of the graphic plane is set to the z direction.
  • the 3D display information of the window may be recorded as inches, number of pixels or the like.
  • Information indicating whether the window is displayed so as to be tilted is stored in the window_tilt_flag field. For example, if a value of 0 is recorded in the window_tilt_flag field, it is indicated that the window is located in parallel to the graphic plane and, if a value of 1 is recorded, the window is located so as to be tilted from the graphic plane by the below-described window tilt vector.
  • the window tilt vector representing information about a tilted degree of the window in the x direction is recorded in the horizontal_vector field
  • the window tilt vector representing information about a tilted degree of the window in the y direction is recorded in the vertical_vector field
  • the window tilt vector representing information about a tilted degree of the window in the z direction is recorded in the depth_vector field.
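A minimal sketch of the window_3D structure, assuming integer fields; units (inches or pixels) follow whatever the 3D display information of the title uses.

```python
from dataclasses import dataclass

@dataclass
class Window3D:
    window_depth_position: int  # distance from the graphic plane to the window in the z direction
    window_tilt_flag: int       # 0: window parallel to the graphic plane, 1: tilted by the vectors below
    horizontal_vector: int      # tilt of the window in the x direction
    vertical_vector: int        # tilt of the window in the y direction
    depth_vector: int           # tilt of the window in the z direction
```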
  • FIG. 14 is a conceptual diagram of a 3D window, to which a window tilt vector is applied, according to an embodiment of the present invention.
  • the data reproducing apparatus 11 projects the window onto the graphic plane.
  • the data reproducing apparatus 11 displays the window so as to be located apart from the graphic plane by the value of the window_depth_position field in the z direction.
  • the data reproducing apparatus 11 may display the window so as to be tilted using the window_tilt_flag field. As shown in the drawing, the data reproducing apparatus 11 displays the window to be tilted by the value of the horizontal_vector field in the x direction, by the value of the vertical_vector field in the y direction, and by the value of the depth_vector field in the z direction, when a value of 1 is recorded in the window_tilt_flag field.
  • FIG. 15 is a structural diagram of a button_3D field in an ICS according to an embodiment of the present invention.
  • the ICS is the display control information of the interactive graphic stream.
  • the 3D display information of the interactive graphic stream is recorded in the ICS.
  • the interactive graphic stream represents a predetermined effect according to the selection of the user, such as a menu or a button.
  • the interactive graphic stream is projected onto the below-described page.
  • the display control information of the button is recorded in the button_3D field included in the ICS.
  • the 3D display information of a three-dimensional button is recorded in the button_3D field.
  • the button_3D field includes a button_depth_position field.
  • the 3D display information of the three-dimensional button is recorded in the button_depth_position field.
  • the 3D display information of the three-dimensional button is distance information corresponding to the height of the button in a z direction, if a vertically upward direction of the page onto which the three-dimensional button is projected is set to the z direction.
  • the 3D display information of the three-dimensional button may be recorded as inches, number of pixels or the like.
  • the graphic stream configured in the stereoscopic mode includes the two-dimensional graphic stream of the left view and the two-dimensional graphic stream of the right view. Accordingly, the object configuring the graphic stream of the stereoscopic mode requires target information which is information indicating to which of the left view and the right view the graphic stream corresponds.
  • FIG. 16 is a structural diagram of a graphic segment type in a stereoscopic mode according to an embodiment of the present invention.
  • segments including the display control information of the window and the interactive graphic stream configured in the stereoscopic mode may be defined.
  • the segments have type information such as the graphic segment described with reference to FIG. 5.
  • the type information of the WDS (WDS for stereoscopic display) including the display control information of the window of the stereoscopic mode has a value of 0x1D and the type information of the ICS (ICS for stereoscopic display) including the display control information of the interactive graphic stream of the stereoscopic mode has a value of 0x1E.
  • FIG. 17 is a structural diagram of a WDS (WDS for stereoscopic mode) in a stereoscopic mode according to an embodiment of the present invention.
  • the WDS is the display control information of the window configured in the stereoscopic mode.
  • the window is a two-dimensional screen onto which the object of the presentation graphic stream is projected, and is projected onto the graphic plane.
  • the target information of the window is recorded in the WDS.
  • the WDS includes a segment_descriptor_3D field and a window_SC field.
  • the type information of the WDS is recorded in the segment_descriptor_3D field. As described above, for example, a value of 0x1D may be recorded.
  • FIG. 18 is a structural diagram of a window_SC field according to an embodiment of the present invention.
  • the window_SC field includes a target_view field.
  • Target information indicating onto which graphic plane the window configured in the stereoscopic mode is projected is recorded in the target_view field. For example, if the target_view field has a value of 0, the window is projected onto the graphic plane of the left view and, if the target_view field has a value of 1, the window is projected onto the graphic plane of the right view. Accordingly, the data reproducing apparatus 11 projects the window onto the graphic plane of one view using the value of the target_view field.
  • Since the window configured in the stereoscopic mode is displayed two-dimensionally, the 3D display information of the window may not be recorded in the WDS, unlike in the above-described autostereoscopic mode.
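A sketch, under the assumption that the two graphic planes are available as separate objects, of how the target_view value selects the plane onto which a stereoscopic window is projected.

```python
def select_graphic_plane(target_view: int, left_plane: object, right_plane: object) -> object:
    """Per the text, 0 selects the left-view graphic plane and 1 the right-view graphic plane."""
    return left_plane if target_view == 0 else right_plane
```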
  • FIG. 19 is a conceptual diagram of projection of a window of a stereoscopic mode onto a graphic plane according to an embodiment of the present invention.
  • the data reproducing apparatus 11 projects the object of the presentation graphic stream defined in the PCS onto a corresponding window.
  • Target information, which is information indicating onto which window the object is projected, may be included in the PCS.
  • Object 1 is projected onto Window 1
  • Object 2 is projected onto Window 2
  • Object 3 is projected onto Window 3.
  • the data reproducing apparatus 11 projects the window onto the graphic plane using the value of the target_view field included in the WDS. That is, the data reproducing apparatus 11 projects the window onto the graphic plane of the left view or the graphic plane of the right view using the value of the target_view field.
  • FIG. 20 is a structural diagram of a page_SC field in an ICS in a stereoscopic mode according to an embodiment of the present invention.
  • the presentation graphic stream is projected onto the window and the interactive graphic stream is projected onto the page.
  • the display control information of the page configured in the stereoscopic mode is recorded in the page_SC field in the ICS.
  • the ICS in which the display control information of the interactive graphic stream of the stereoscopic mode is recorded has a value of 0x1E as type information.
  • the display control information of the page onto which the interactive graphic stream is projected is recorded in the page_SC field.
  • the page_SC field includes a target_view_id field, and target information indicating onto which graphic plane the page is projected is recorded in the target_view_id field. For example, if a value of 0 is recorded in the target_view_id field, the page is projected onto the graphic plane of the left view and, if a value of 1 is recorded, the page is projected onto the graphic plane of the right view. Accordingly, the data reproducing apparatus 11 projects the page onto the graphic plane of one view using the value of the target_view_id field.
  • the page_SC field includes a default_selected_button_id_ref field and a default_activated_button_id_ref field, in which the identification information of the button projected onto the page may be recorded.
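A hypothetical model of the page_SC structure; only target_view_id and the two default button references are named in the text, and the list of button identifiers is an assumed representation of the buttons projected onto the page.

```python
from dataclasses import dataclass
from typing import List

@dataclass
class PageSC:
    target_view_id: int                   # 0: left-view graphic plane, 1: right-view graphic plane
    default_selected_button_id_ref: int   # button shown as selected when the page is displayed
    default_activated_button_id_ref: int  # button shown as activated when the page is displayed
    button_ids: List[int]                 # identifiers of the buttons projected onto the page (assumed field)
```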
  • FIG. 21 is a structural diagram of a button_SC field in a stereoscopic mode according to an embodiment of the present invention.
  • the display control information of the button of the stereoscopic mode is recorded in the button_SC field.
  • the identification information of the button and the identification information of objects configuring the button are recorded in the button_SC field.
  • the button_SC field includes a button_id field, and the identification information of the button is recorded in the button_id field.
  • the button_SC field includes a normal_left_start_object_id_ref field, a normal_left_end_object_id_ref field, a normal_right_start_object_id_ref field, and a normal_right_end_object_id_ref field.
  • the identification information of the object displayed when the button is in a normal state is recorded in the above-described fields.
  • the button of the stereoscopic mode is composed of a button of a left view and a button of a right view. Accordingly, both the identification information of the object configuring the button of the left view and the identification information of the object configuring the button of the right view should be included in the button_SC field.
  • the identification information of the start object projected onto the button of the left view in the normal state is recorded in the normal_left_start_object_id_ref field, and the identification information of the end object projected onto the button of the left view in the normal state is recorded in the normal_left_end_object_id_ref field.
  • the identification information of the start object projected onto the button of the right view in the normal state is recorded in the normal_right_start_object_id_ref field
  • the identification information of the end object projected onto the button of the right view in the normal state is recorded in the normal_right_end_object_id_ref field.
  • the identification information of the object projected onto the button in a state in which the button is selected by the user and the identification information of the object projected in a state in which the button is activated may be configured similar to the identification information of the object in the normal state and may be recorded in the button_SC field. Accordingly, the data reproducing apparatus 11 generates buttons using the identification information of the object.
  • the button of the left view and the button of the right view are divided for a three-dimensional display and should be configured so as to perform the same function.
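A sketch of the button_SC structure, listing only the normal-state object references named above; the selected-state and activated-state references would follow the same left/right pattern.

```python
from dataclasses import dataclass

@dataclass
class ButtonSC:
    button_id: int                         # identification information of the button
    normal_left_start_object_id_ref: int   # first object of the left-view button in the normal state
    normal_left_end_object_id_ref: int     # last object of the left-view button in the normal state
    normal_right_start_object_id_ref: int  # first object of the right-view button in the normal state
    normal_right_end_object_id_ref: int    # last object of the right-view button in the normal state
```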
  • FIG. 22 is a conceptual diagram of projection of a button onto a graphic plane in a stereoscopic mode according to an embodiment of the present invention.
  • the button_SC field includes the identification information of the object configuring the button.
  • the identification information of the object includes the identification information of the object projected onto the button in a normal state, a selected state and an activated state, and the identification information of the object projected onto the button of the left view or the button of the right view. Accordingly, the data reproducing apparatus 11 projects the object onto the button of one view, that is, the button of one state, using the identification information of the object included in the button_SC field.
  • the data reproducing apparatus 11 projects the button, onto which the object is projected, onto the page using the identification information of the button included in the page_SC field.
  • Button_L1 is projected onto Page_L1
  • Button_R1 is projected onto Page_R1.
  • the data reproducing apparatus 11 projects the page, onto which the button is projected, on the graphic plane using the value of the target_view_id field included in the page_SC field. That is, the data reproducing apparatus 11 projects the page onto the graphic plane of the left view or the graphic plane of the right view using the value of the target_view_id field.
  • In addition, a Depth Definition Segment (DDS), which is a segment including the 3D display information of the graphic stream, may be defined.
  • FIG. 23 is a structural diagram of a DDS type according to an embodiment of the present invention.
  • the DDS has type information such as the graphic segment described with reference to FIG. 5; for example, a value of 0x20 may be assigned as the type information of the DDS.
  • FIG. 24 is a structural diagram of a DDS according to an embodiment of the present invention.
  • the DDS which is the segment including the 3D display information of the graphic stream includes a segment_descriptor field, a depth_id field, a depth_version_number field and a depth_conversion_entry field.
  • the type information of the DDS is recorded in the segment_descriptor field. As described above, for example, a value of 0x20 may be recorded.
  • the identification information of the DDS in which the 3D display information is recorded is recorded in the depth_id field. Accordingly, the data reproducing apparatus 11 can identify the DDS used by a graphic stream using the value of the depth_id field.
  • the segment such as the PCS or the ICS of the present invention further includes a depth_id_ref field in which the identification information of the DDS used by the graphic stream is recorded.
  • Version information which is changed whenever the information about the DDS is updated is recorded in the depth_version_number field.
  • the 3D display information of the graphic stream is recorded in the depth_conversion_entry field, and the depth_conversion_entry field includes a coded_depth_value field, an output_number_value field and an output_den_value field.
  • Coded 3D display information is recorded in the coded_depth_value field.
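A sketch of the DDS layout described above; the 0x20 type value is from the text, while the integer widths and the grouping of the three conversion values into an entry class are assumptions.

```python
from dataclasses import dataclass
from typing import List

@dataclass
class DepthConversionEntry:
    coded_depth_value: int    # coded 3D display information
    output_number_value: int  # numerator used by the Depth Conversion System
    output_den_value: int     # denominator used by the Depth Conversion System

@dataclass
class DepthDefinitionSegment:
    segment_descriptor: int    # type information of the DDS, e.g. 0x20
    depth_id: int              # identification information referenced by depth_id_ref in a PCS or ICS
    depth_version_number: int  # changed whenever the DDS is updated
    depth_conversion_entries: List[DepthConversionEntry]
```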
  • a reproduction environment such as the screen size of the output device 12 for outputting the graphic stream as a three-dimensional image or a distance between the screen and the user may vary according to users. Since the reproduction environment varies according to users, a graphic stream having the same 3D display information may be displayed to different users with different 3D effects.
  • a Depth Conversion System (DCS) for performing the above operation may be defined.
  • the DCS according to a first embodiment of the present invention maps the 3D display information recorded in the DDS to distance information corresponding to the size of the output device 12. Accordingly, the DCS according to the first embodiment of the present invention may include a table for mapping the 3D display information to the distance information corresponding to the size of the output device 12. In the present invention, the above table is referred to as a Graphic Depth Look-Up Table (GDLUT).
  • FIG. 25 is a structural diagram of an example of a GDLUT according to an embodiment of the present invention.
  • the GDLUT includes the 3D display information recorded in the coded_depth_value field and a converted depth value which is a distance information value actually and physically represented by the output device 12 corresponding to the value of the 3D display information.
  • the DCS converts the 3D display information recorded in the coded_depth_value field of the DDS into the converted depth value corresponding thereto, using the GDLUT.
  • the 3D display information corresponds to the distance information associated with the width of the output device 12.
  • the GDLUT of the present invention may be configured using information about the height and the resolution of the output device 12 and each DDS may be configured using an independent GDLUT.
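A hypothetical GDLUT, purely for illustration: the coded depth values and the converted depth values below are made-up numbers, since the text does not give concrete table contents.

```python
# Made-up mapping from coded_depth_value to the distance (here in inches)
# physically represented by a particular output device.
GDLUT = {
    0x00: 0.0,
    0x10: 1.5,
    0x20: 3.0,
    0x30: 4.5,
}

def convert_depth(coded_depth_value: int) -> float:
    """Look up the converted depth value for a coded_depth_value from the DDS."""
    return GDLUT[coded_depth_value]
```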
  • a DCS according to a second embodiment of the present invention directly calculates a distance information value physically represented by the output device 12 using the 3D display information recorded in the DDS.
  • the DCS calculates the 3D display information suitable for the reproduction environment of the user using the size information of the output device 12 and the 3D display information recorded in the DDS.
  • the DCS may use the values of the output_number_value field, the output_den_value field and the coded_depth_value field recorded in the DDS, in order to calculate the distance information value physically represented by the output device 12.
  • the DCS may calculate the distance information value physically represented by the output device 12 as (output_number_value field value / output_den_value field value) × coded_depth_value field value × width value of the output device 12.
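The calculation of the second embodiment, written out as a small function; the example values at the end are made up only to show the arithmetic.

```python
def dcs_converted_depth(output_number_value: int,
                        output_den_value: int,
                        coded_depth_value: int,
                        display_width: float) -> float:
    """(output_number_value / output_den_value) * coded_depth_value * width of the output device."""
    return (output_number_value / output_den_value) * coded_depth_value * display_width

# Example with made-up numbers: a ratio of 1/100, a coded depth of 8 and a 40-inch-wide
# screen give a converted depth of 3.2 inches.
depth = dcs_converted_depth(1, 100, 8, 40.0)
```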
  • FIG. 26 is a conceptual diagram of decoding of an AV stream into a three-dimensional image using a DDS according to an embodiment of the present invention.
  • the data reproducing apparatus 11 decodes a video stream and a graphic stream.
  • a primary video plane and a secondary video plane may be generated by the decoded video stream and a presentation graphics plane and an interactive graphics plane may be generated by the decoded graphic stream.
  • the data reproducing apparatus 11 overlays the graphic plane upon the video plane. Accordingly, a three-dimensional graphic image is displayed and output in front of a three-dimensional video image.
  • the data reproducing apparatus 11 decodes the overlaid video plane and graphic plane, that is, one image obtained by synthesizing the video image and the graphic image, and outputs the image to the user.
  • the data reproducing apparatus 11 outputs the video stream and the graphic stream respectively projected onto the video plane and the graphic plane as a three-dimensional image using the 3D display information recorded in the DDS.
  • the data reproducing apparatus 11 searches for the 3D display information Depth_PV of the primary video stream projected onto the primary video plane using the DDS.
  • the data reproducing apparatus 11 outputs the primary video stream as the three-dimensional image using Depth_PV.
  • the data reproducing apparatus 11 searches for the 3D display information of the stream projected onto the secondary video plane, the presentation graphics plane and the interactive graphics plane using the DDS, similar to the primary video stream, and then outputs the stream as the three-dimensional image using the searched 3D display information.
  • FIG. 27 is a block diagram of an AV decoder 30 including a DCS according to an embodiment of the present invention.
  • the AV decoder 30 of the present invention includes a coded data buffer 310, a graphics processor 320, an object buffer 330, a composition buffer 340, a graphics controller 350, a Color Look-up Table (CLUT) 360, and a GDLUT 370.
  • the coded data buffer 310 sends a graphic stream to the graphics processor 320 according to a reproduction time of the graphic stream such that decoding begins.
  • the graphics processor 320 sends image data of an object to be output to a screen in the graphic stream to the object buffer 330 and sends a segment, which is information for controlling the image of the object, to the composition buffer 340.
  • the graphics controller 350 controls the graphics plane 360 such that the graphic plane of the graphic stream is generated by the image data of the object stored in the object buffer 330 according to the information about the segment stored in the composition buffer 340.
  • the graphics plane 360 generates the graphic plane which is the image data of the graphic stream under the control of the graphics controller 350, and the CLUT 360 controls the color and the transparency of the generated graphic plane under the control of the graphics controller 350 and sends the graphic plane to the GDLUT 370.
  • the GDLUT 370 controls the generated graphic plane to be displayed as the three-dimensional image using the 3D display information.
  • the graphics controller 350 searches for the 3D display information of the graphic stream projected onto the graphic plane using the DDS stored in the composition buffer 340.
  • the DCS converts the 3D display information of the graphic stream into the distance information value physically represented by the output device 12.
  • the GDLUT 370 displays the graphic stream projected onto the graphic plane as the three-dimensional image using the converted 3D display information.
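A function-style sketch of the data flow through the decoder of FIG. 27; the real components are hardware or firmware blocks, so the callables and their signatures here are assumptions made only to show the order of operations.

```python
def decode_graphic_stream(coded_data, graphics_processor, graphics_controller, clut, gdlut):
    """Sketch: coded data buffer -> graphics processor -> object/composition buffers ->
    graphics controller -> graphic plane -> CLUT -> GDLUT -> 3D output."""
    object_buffer, composition_buffer = [], []
    object_image, segment = graphics_processor(coded_data)  # split image data and control segments
    object_buffer.append(object_image)                      # image data of objects to be output
    composition_buffer.append(segment)                      # display control information (incl. DDS)
    plane = graphics_controller(object_buffer, composition_buffer)  # compose the graphic plane
    plane = clut(plane)                                      # apply colour and transparency
    return gdlut(plane)                                      # apply 3D display information
```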
  • FIG. 28 is a block diagram of a data recording/reproducing apparatus according to an embodiment of the present invention.
  • a recording/reproducing unit 70 records or reads data on or from the recording medium 10.
  • the recording/reproducing unit 70 may be composed of a pickup 70.
  • the pickup 70 includes a laser diode mounted therein so as to record data on the surface of the recording medium 10 or read a signal reflected from the recording medium 10.
  • a servo 90 controls a tracking and focusing operation of the pickup 70 and an operation of a spindle motor 110.
  • An R/F unit 80 generates a focus error signal and a tracking error signal, which are signals for detecting focus deviation and track deviation of a laser beam, using a signal output from the pickup 70.
  • the spindle motor 110 rotates a disc mounted in the data recording/reproducing apparatus.
  • a motor driving unit 100 drives the pickup 70 and the spindle motor 110 under the control of the servo 90.
  • a signal processing unit 40 restores an RF signal received from the R/F unit 80 to a desired reproduction signal value, modulates a data signal to be recorded in a format recordable in the recording medium 10, and outputs the modulated signal.
  • a bit encoder 50 converts a recorded signal output from the signal processing unit 40 into a bit stream
  • a pickup driving unit 60 converts the bit stream generated by the bit encoder 50 into an optical signal to be stored in the recording medium 10.
  • a memory 150 performs a buffer function for temporarily storing information associated with the recording medium 10, such as defect information of the recording medium 10, or temporarily storing data which will be recorded on or reproduced from the recording medium 10.
  • a microcomputer 120 is configured to control the signal processing unit 40, the servo 90 and the memory 150 and to control a drive including components to perform a data recording or reproducing operation.
  • a 3D control unit 130 generates a segment including 3D display information for displaying a graphic stream as a three-dimensional image.
  • the segment includes an ODS, a PCS, an ICS, a WDS, or a DDS.
  • the structures of the segments were described above.
  • the 3D control unit 130 sends the generated segment to the signal processing unit 40 or the microcomputer 120 such that the segment is recorded on the recording medium 10 along with the graphic stream.
  • the 3D control unit 130 is configured by one component, but the present invention is not limited thereto.
  • the function of the 3D control unit 130 may be performed by interlocking several components with each other, and the 3D control unit 130 may be implemented in a state of being integrated with the microcomputer 120 and/or a host 140.
  • the 3D control unit 130 may be further connected to the signal processing unit 40, the bit encoder 50, the pickup driving unit 60, the pickup 70, the servo 90, the motor driving unit 100, the spindle motor 110, the microcomputer 120 and a drive including the memory 150 so as to be updated.
  • the host 140 controls all the components included in the data recording/reproducing apparatus and controls the recording or the reproducing of the recording medium 10 by interfacing with the user.
  • the host 140 sends a command for enabling the data recording/reproducing apparatus of the present invention to perform a specific function to the microcomputer 120, and the microcomputer 120 controls the components interlocked with each other in the data recording/reproducing apparatus according to the command.
  • The microcomputer 120 and the host 140 may be separately included and operated or the functions of the microcomputer 120, the 3D control unit 130 and the host 140 may be combined so as to be operated as one control unit.
  • the host 140 may be a main controller of a computer, a server, an audio device or a video device. That is, the recording/reproducing apparatus of the present invention may be an optical drive provided to a Personal Computer (PC) or a player which is not mounted in a PC.
  • the data recording/reproducing apparatus is applicable to a drive which is mounted in a PC or a player used as an independent product.
  • An AV encoder 20 converts an input signal into a signal in a specific format recordable in the recording medium 10 and transmits the converted signal to the signal processing unit 40, in order to record data on the recording medium 10.
  • the AV encoder 20 may encode the input signal to an MPEG-format signal.
  • the signal processing unit 40 adds an error correction code (ECC) to the signal encoded by the AV encoder 20, converts the signal into a format recordable in the recording medium 10, and sends the converted signal to the bit encoder 50.
  • An AV decoder 30 finally decodes a reproduced signal of the recording medium 10 received from the signal processing unit 40 and provides output data such as a video signal, an audio signal and a graphic signal to the user.
  • the graphic signal may be composed of a signal of a graphic stream.
  • the AV decoder 30 decodes the graphic stream into a three-dimensional image using 3D display information for displaying the graphic stream as the three-dimensional image and outputs the three-dimensional image.
  • the configuration of the AV decoder 30 for decoding the graphic stream into the three-dimensional image using the DDS was described above with reference to FIG. 27.
  • the components of the data recording/reproducing apparatus according to the present invention may be implemented by software or hardware so as to perform the respective functions or may be implemented by interlocking hardware and software with each other.
  • According to a recording medium, a data recording/reproducing method and a data recording/reproducing apparatus of the present invention, it is possible to reproduce a graphic stream recorded/reproduced on/from the recording medium as a stereoscopic three-dimensional image.

Landscapes

  • Engineering & Computer Science (AREA)
  • Multimedia (AREA)
  • Testing, Inspecting, Measuring Of Stereoscopic Televisions And Televisions (AREA)

Abstract

A method and apparatus for reproducing a graphic stream reproduced along with a video stream as a three-dimensional image when recording/reproducing data and a recording medium for recording/reproducing the stream using the method and apparatus are disclosed. The recording medium includes a video stream including image data, a graphic stream reproduced along with the video stream, and a data zone in which stereoscopic display information for displaying the graphic stream as a three-dimensional image is recorded. Accordingly, it is possible to reproduce the graphic stream recorded/reproduced on/from the recording medium as a stereoscopic three-dimensional image.

Description

RECORDING MEDIUM, DATA RECORDING/REPRODUCING METHOD AND DATA RECORDING/REPRODUCING APPARATUS
The present invention relates to a recording medium, a data recording/reproducing method and a data recording/reproducing apparatus, and more particularly, to an apparatus and method for reproducing a graphic stream reproduced along with a video stream as a three-dimensional image and a recording medium for recording/reproducing the stream using the apparatus and method.
Recently, in accordance with rapid technological development, high-capacity recording media have appeared. Such recording media include a Blu-ray disc, a near field recording medium and the like.
Early recording media mainly performed a function for recording/reproducing a two-dimensional image with low capacity. However, as high-capacity recording media have appeared, a function for recording/reproducing a three-dimensional image with high capacity is required. Accordingly, recording media having three-dimensional image data recorded thereon and apparatuses for recording/reproducing the data are currently being developed.
In addition, there is a need for a method and apparatus for displaying a graphic stream reproduced along with a video stream which is three-dimensional image data, such as a menu or a subtitle, as a three-dimensional image.
An object of the present invention devised to solve the problem lies in an apparatus and method for reproducing a graphic stream reproduced along with a video stream as a three-dimensional image and a recording medium for recording/reproducing the stream using the apparatus and method.
The object of the present invention can be achieved by providing a recording medium including a data zone including: a video stream including image data; a graphic stream reproduced along with the video stream; and 3D display information for displaying the graphic stream as a three-dimensional image.
In another aspect of the present invention, provided herein is a data recording method, the method including: recording a video stream including image data and a graphic stream reproduced along with the video stream in a data zone of a recording medium; and recording 3D display information for displaying the graphic stream as a three-dimensional image in the data zone.
In another aspect of the present invention, provided herein is a data reproducing method, the method including: reading a video stream including image data and a graphic stream reproduced along with the video stream from a data zone of a recording medium; reading 3D display information for displaying the graphic stream as a three-dimensional image from the data zone; and reproducing the graphic stream as the three-dimensional image according to the 3D display information.
In another aspect of the present invention, provided herein is a data recording apparatus including: a recording unit configured to record data on a recording medium; and a control unit configured to control the recording unit to record a video stream including image data and a graphic stream reproduced along with the video stream in a data zone of the recording medium and to record 3D display information for displaying the graphic stream as a three-dimensional image in the data zone.
In another aspect of the present invention, provided herein is a data reproducing apparatus including: a reproducing unit configured to read data from a recording medium; a decoder configured to convert a graphic stream reproduced along with a video stream including image data into image data which is able to be output on a screen; and a control unit configured to control the reproducing unit to read the video stream and the graphic stream from a data zone of the recording medium and to read 3D display information for displaying the graphic stream as a three-dimensional image from the data zone, and to control the decoder to convert the graphic stream into the three-dimensional image according to the 3D display information and to reproduce the three-dimensional image.
According to a recording medium, a data recording/reproducing method and a data recording/reproducing apparatus of the present invention, it is possible to reproduce a graphic stream recorded/reproduced on/from the recording medium as a stereoscopic three-dimensional image. In addition, it is possible to efficiently manage 3D display information for displaying the graphic stream as the three-dimensional image. In addition, it is possible to provide a decoder for efficiently decoding the graphic stream.
The accompanying drawings, which are included to provide a further understanding of the invention, illustrate embodiments of the invention and along with the description serve to explain the principle of the invention.
In the drawings:
FIG. 1 is a diagram showing the combined use of a recording medium, a data reproducing apparatus and a peripheral device according to an embodiment of the present invention.
FIG. 2 is a conceptual diagram of a 3D effect and 3D display information according to an embodiment of the present invention.
FIG. 3 is a schematic diagram showing the structure of a recording medium according to an embodiment of the present invention.
FIG. 4 is a conceptual diagram illustrating decoding of an AV stream into an image according to an embodiment of the present invention.
FIG. 5 is a structural diagram of a graphic segment according to an embodiment of the present invention.
FIG. 6 is a structural diagram of a graphic segment type according to an embodiment of the present invention.
FIG. 7 is a structural diagram of an Object Definition Segment (ODS) according to an embodiment of the present invention.
FIG. 8 is a structural diagram of an object_data_3D field according to an embodiment of the present invention.
FIG. 9 is a structural diagram of a Presentation Composition Segment (PCS) according to an embodiment of the present invention.
FIG. 10 is a structural diagram of a composition_object_3D field according to an embodiment of the present invention.
FIG. 11 is a conceptual diagram of a 3D object, to which an object skew vector is applied, according to an embodiment of the present invention.
FIG. 12 is a structural diagram of a Window Definition Segment (WDS) according to an embodiment of the present invention.
FIG. 13 is a structural diagram of a window_3D field according to an embodiment of the present invention.
FIG. 14 is a conceptual diagram of a 3D window, to which a window tilt vector is applied, according to an embodiment of the present invention.
FIG. 15 is a structural diagram of a button_3D field in an Interactive Composition Segment (ICS) according to an embodiment of the present invention.
FIG. 16 is a structural diagram of a graphic segment type in a stereoscopic mode according to an embodiment of the present invention.
FIG. 17 is a structural diagram of a Window Definition Segment (WDS) for a stereoscopic display in a stereoscopic mode according to an embodiment of the present invention.
FIG. 18 is a structural diagram of a Window_SC field according to an embodiment of the present invention.
FIG. 19 is a conceptual diagram of a projection of a window of a stereoscopic mode onto a graphic plane according to an embodiment of the present invention.
FIG. 20 is a structural diagram of a page_SC field in an ICS in a stereoscopic mode according to an embodiment of the present invention.
FIG. 21 is a structural diagram of a button_SC field in a stereoscopic mode according to an embodiment of the present invention.
FIG. 22 is a conceptual diagram of projection of a button onto a graphic plane in a stereoscopic mode according to an embodiment of the present invention.
FIG. 23 is a structural diagram of a Depth Definition Segment (DDS) type according to an embodiment of the present invention.
FIG. 24 is a structural diagram of a DDS according to an embodiment of the present invention.
FIG. 25 is a structural diagram of an example of a Graphic Depth Look-Up Table (GDLUT) according to an embodiment of the present invention.
FIG. 26 is a conceptual diagram illustrating decoding of an AV stream into a three-dimensional image using a DDS according to an embodiment of the present invention.
FIG. 27 is a block diagram of an AV decoder including a Depth Conversion System (DCS) according to an embodiment of the present invention.
FIG. 28 is a block diagram of a data recording/reproducing apparatus according to an embodiment of the present invention.
Reference will now be made in detail to the preferred embodiments of the present invention, examples of which are illustrated in the accompanying drawings. In the following description, the same components are denoted by the same terms and the same reference numerals, for convenience of description.
In addition, although the terms used in the present invention are selected from generally known and used terms, some of the terms mentioned in the description of the present invention have been selected by the applicant at his or her discretion, the detailed meanings of which are described in relevant parts of the description herein. Furthermore, the present invention should be understood not simply by the actual terms used but by the meanings of each term lying within.
Hereinafter, a recording medium in the present invention includes all media on which data is recorded or will be recorded, such as an optical disc or a magnetic tape. For convenience of description and better understanding of the present invention, an optical disc, such as a Blu-ray Disc (BD), will hereinafter be exemplarily used as a recording medium in the present invention. It should be noted that the technical ideas of the present invention are applicable to other recording media without departing from the scope and spirit of the invention.
As a method for providing a three-dimensional image to a user, there are a stereoscopic mode which requires special glasses and an autostereoscopic mode which does not require special glasses.
The stereoscopic mode uses binocular disparity to obtain a 3D effect by enabling left and right eyes of a user to view the same object in different directions. A two-dimensional image having binocular disparity is split and output to the left eye and the right eye, and a three-dimensional image is displayed and provided to the user in a manner in which a left image and a right image are respectively and alternately exposed to the left eye and the right eye via special glasses such as polarization filter glasses.
The autostereoscopic mode includes a volumetric mode for forming an actual three-dimensional image on a space and a holographic mode for reconstructing light diffused from an object and displaying a three-dimensional image according to an actual state.
A recording medium 10, a data recording/reproducing method and a data recording/reproducing apparatus according to the present invention will be described based on the volumetric mode and the holographic mode belonging to the autostereoscopic mode, and the stereoscopic mode will be separately described.
FIG. 1 is a diagram showing the combined use of a recording medium 10, a data reproducing apparatus 11 and a peripheral device according to an embodiment of the present invention. The recording medium and the data reproducing apparatus according to the embodiment of the present invention have a main function for providing a three-dimensional image to a user.
Three-dimensional image data of an autostereoscopic or stereoscopic mode is recorded on the recording medium 10 shown in FIG. 1. In addition, graphic data reproduced along with the three-dimensional image data, such as a menu and a subtitle, is recorded on the recording medium 10.
The data reproducing apparatus 11 reads and decodes the three-dimensional image data recorded on the loaded recording medium 10. For example, if three-dimensional image data of a stereoscopic mode is recorded on the recording medium 10, the data reproducing apparatus 11 sequentially reads and decodes image data of a left view and image data of a right view into one piece of stereoscopic three-dimensional image data. In addition, the data reproducing apparatus 11 decodes graphic data into a three-dimensional image. The decoded three-dimensional image data and graphic data are provided to an output device 12.
The output device 12 renders, outputs and provides the three-dimensional image data and graphic data decoded by the data reproducing apparatus 11 to a user. If the three-dimensional image data of the stereoscopic mode and graphic data are output, the user wears special glasses 13 such as polarization filter glasses so as to view a three-dimensional image.
FIG. 2 is a conceptual diagram of a 3D effect and 3D display information according to an embodiment of the present invention.
The 3D effect of the three-dimensional image indicates that a three-dimensional image is displayed so as to be located at a constant distance from the output device 12 in a user direction. A large 3D effect indicates that the three-dimensional image is displayed so as to appear to be close to the user and a small 3D effect indicates that the three-dimensional image is displayed so as to appear to be far from the user.
In order to implement the 3D effect of the three-dimensional image, the recording medium 10 should include information indicating at which distance from the output device 12 the three-dimensional image data and the graphic data (a subtitle, a menu, etc.) are displayed so as to be located in the user direction. In the present invention, this information is referred to as 3D display information.
In the drawing, the 3D display information of the 3D menu determines at which distance from the output device 12 the 3D menu is located such that its 3D effect is displayed, and the 3D display information of the 3D subtitle determines at which distance from the output device 12 the 3D subtitle is located such that its 3D effect is displayed.
Hereinafter, 3D display information of a graphic stream which is reproduced along with a video stream of image data, such as a subtitle or a menu, will be described.
First, FIG. 3 is a schematic diagram showing the structure of a recording medium 10 according to an embodiment of the present invention.
FIG. 3 shows a single-layer recording medium 10 composed of one layer. However, the present invention is not limited thereto and is applicable to all recording media having two or more layers. Since the layers can be configured with the same structure, the present invention will be described based on a single layer.
The recording medium 10 according to the present invention includes an inner zone 210, an outer zone 230, and a data zone 220.
The inner zone 210 is located at the inner circumference side of the recording medium 10 and the outer zone 230 is located at the outer circumference side of the recording medium 10. A variety of information for controlling the recording medium 10 is stored in the inner zone 210 and the outer zone 230.
Data which a user desires to record is stored in the data zone 220. A portion surrounded by the inner zone 210 and the outer zone 230 on the recording medium 10 is a zone in which data is actually recorded.
The data recorded in the data zone 220 may be composed of, for example, a multiplexed AV stream of a specific movie title. The AV stream includes a video stream which is image data, and a graphic stream which is image data reproduced along with the video stream, such as a menu or a subtitle.
The video stream is composed of two-dimensional image data or three-dimensional image data. As the three-dimensional image data, a video stream of an autostereoscopic mode may be composed of one three-dimensional video stream, and a video stream of a stereoscopic mode may be composed of a two-dimensional video stream of a left view and a two-dimensional video stream of a right view. In the stereoscopic mode, the data reproducing apparatus 11 reads and decodes both the two-dimensional video stream of the left view and the two-dimensional video stream of the right view into one three-dimensional image and reproduces the three-dimensional image.
The graphic stream is image data reproduced along with the video stream, and includes a presentation graphic stream and an interactive graphic stream.
The presentation graphic stream is fixed image data and includes, for example, a subtitle, a picture, and the like. The interactive graphic stream is image data representing a predetermined effect according to the selection of a user and includes, for example, a menu and the like.
The graphic stream is composed of two-dimensional image data or three-dimensional image data. As the three-dimensional image data, a graphic stream of an autostereoscopic mode may be composed of one three-dimensional graphic stream, and a graphic stream of a stereoscopic mode may be composed of a two-dimensional graphic stream of a left view and a two-dimensional graphic stream of a right view. In the stereoscopic mode, the data reproducing apparatus 11 reads and decodes both the two-dimensional graphic stream of the left view and the two-dimensional graphic stream of the right view into one three-dimensional image and reproduces the three-dimensional image.
FIG. 4 is a conceptual diagram illustrating decoding of an AV stream into an image according to an embodiment of the present invention.
The data reproducing apparatus 11 generates a video plane onto which a video stream is projected and a graphic plane onto which a graphic stream is projected. The data reproducing apparatus 11 overlays the graphic plane upon the video plane. Accordingly, a graphic image is displayed and output in front of a video image. The data reproducing apparatus 11 decodes the overlaid video plane and graphic plane, that is, one image obtained by synthesizing the video image and the graphic image, and outputs the image to the user.
The data reproducing apparatus 11 generates a graphic plane, onto which a three-dimensional image is projected, using the above-described 3D display information, in order to reproduce the graphic stream as a three-dimensional image.
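Purely as an illustration of this overlay step, the following Python sketch composites a graphic plane over a video plane pixel by pixel; the list-of-lists plane representation, the transparency convention and the function name are assumptions made for the example and are not taken from the disclosure.

```python
def overlay_planes(video_plane, graphic_plane, transparent=0):
    """Composite a graphic plane over a video plane (hypothetical sketch).

    Both planes are 2-D lists of pixel values; pixels in the graphic
    plane equal to `transparent` let the video plane show through.
    """
    out = []
    for v_row, g_row in zip(video_plane, graphic_plane):
        out.append([v if g == transparent else g for v, g in zip(v_row, g_row)])
    return out

# Example: a 2x4 video plane with a small graphic overlaid on top of it.
video = [[1, 1, 1, 1], [1, 1, 1, 1]]
graphic = [[0, 9, 9, 0], [0, 0, 0, 0]]   # 9 = graphic pixel, 0 = transparent
print(overlay_planes(video, graphic))     # [[1, 9, 9, 1], [1, 1, 1, 1]]
```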
Information for displaying the graphic stream as the three-dimensional image, such as 3D display information, is recorded in a graphic segment. The graphic segment is a data structure, in which display control information of a graphic stream such as a subtitle or a menu is stored, and is stored in the data zone 220 of the recording medium 10.
The graphic segment includes an Object Definition Segment (ODS) in which display control information of an object is recorded, a Presentation Composition Segment (PCS) in which display control information of a presentation graphic stream is recorded, an Interactive Composition Segment (ICS) in which display control information of an interactive graphic stream is recorded, and a Window Definition Segment (WDS) in which display control information of a window, onto which a presentation graphic stream is projected, is recorded. The details of the segments will be described later.
In the present invention, for the general contents of the ODS, the PCS, the ICS and the WDS, reference is made to US Laid-open Publication No. US 2006/0222334 A1 and the BD Technical White Paper published by the Blu-ray Disc Association (BDA).
Hereinafter, the graphic segment including the 3D display information of the graphic stream will be described.
FIG. 5 is a structural diagram of a graphic segment according to an embodiment of the present invention.
The graphic segment includes a segment_descriptor_3D field and a segment_data_3D field. The segment_descriptor_3D field includes type information of the graphic segment and the segment_data_3D field includes various segments (ODS, PCS, ICS, WDS, etc.) in which the display control information of the graphic stream is recorded.
FIG. 6 is a structural diagram of a graphic segment type according to an embodiment of the present invention.
The type information of the graphic segment is included in the segment_descriptor_3D field of the graphic segment and is recorded in a segment_type_3D field.
The type information of the graphic segment includes type information of segments including 3D display information of the graphic stream. In the type information of the segments including the 3D display information of the graphic stream, an ODS (ODS for 3D display) in which the display control information of a three-dimensional object is recorded has a value of 0x19 as type information. A PCS (PCS for 3D display) in which the display control information of a three-dimensional presentation graphic stream is recorded has a value of 0x1A as type information. A WDS (WDS for 3D display) in which the display control information of a three-dimensional window is recorded has a value of 0x1B as type information. An ICS (ICS for 3D display) in which the display control information of a three-dimensional interactive graphic stream is recorded has a value of 0x1C as type information.
The type information of a graphic segment indicates the type of the corresponding segment. The data reproducing apparatus 11 identifies the corresponding segment using the type information recorded in the segment.
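As a rough sketch of how a player might dispatch on this type information, the 0x19 to 0x1C values listed above can be collected in a small table; the enum and function names are hypothetical and only the numeric values come from the description. The stereoscopic and depth segment types introduced later (0x1D, 0x1E and 0x20) would extend the same table.

```python
from enum import IntEnum

class SegmentType3D(IntEnum):
    # Type values for 3D display segments, as listed in the description.
    ODS_FOR_3D = 0x19   # Object Definition Segment for 3D display
    PCS_FOR_3D = 0x1A   # Presentation Composition Segment for 3D display
    WDS_FOR_3D = 0x1B   # Window Definition Segment for 3D display
    ICS_FOR_3D = 0x1C   # Interactive Composition Segment for 3D display

def identify_segment(segment_type_3d: int) -> str:
    """Return a readable name for the type recorded in segment_descriptor_3D."""
    try:
        return SegmentType3D(segment_type_3d).name
    except ValueError:
        return "UNKNOWN"

print(identify_segment(0x1A))  # PCS_FOR_3D
```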
FIG. 7 is a structural diagram of an ODS according to an embodiment of the present invention.
The ODS is the display control information of an object configuring the graphic stream. The 3D display information for three-dimensionally displaying an object is recorded in the ODS of the present invention. The ODS includes a segment_descriptor_3D field and an object_data_fragment_3D field.
The type information of the ODS is recorded in the segment_descriptor_3D field. As described above, for example, a value of 0x19 may be recorded.
The display control information of the objects configuring the graphic stream is recorded in the object_data_fragment_3D field. The display control information of each object included in one graphic stream is recorded in a data structure composed of an object_data_3D field, and the object_data_3D field is recorded in the object_data_fragment_3D field. That is, the object_data_fragment_3D field includes various object_data_3D fields.
FIG. 8 is a structural diagram of an object_data_3D field according to an embodiment of the present invention.
The object_data_3D field includes an object_data_length_2D field, an object_data_length_3D field, and an rle_coded_line_3D field.
Size information of a data structure including the display control information of the object configured two-dimensionally or three-dimensionally is recorded in the object_data_length_2D field and the object_data_length_3D field. For example, the number of bytes of the data structure may be recorded in the object_data_length_2D field and the object_data_length_3D field.
The 3D display information of the object configured three-dimensionally is recorded in the rle_coded_line_3D field. That is, information indicating with which degree of 3D effect a three-dimensional object is displayed to the user, or at which distance from the output device 12 in the user direction the three-dimensional object should appear to be located, is recorded. The 3D display information of the present invention may be recorded as inches, number of pixels, or the like. For example, Run-Length Encoding data may be recorded in the rle_coded_line_3D field.
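The description says only that the per-line depth data may be run-length encoded; the exact byte layout of the rle_coded_line_3D field is not given. The sketch below therefore assumes, purely for illustration, that each entry is a (run length, depth value) pair.

```python
def decode_rle_depth_line(entries):
    """Expand a run-length-encoded depth line into per-pixel depth values.

    `entries` is assumed to be a list of (run_length, depth_value) pairs;
    the real rle_coded_line_3D byte format is not given in the description.
    """
    line = []
    for run_length, depth_value in entries:
        line.extend([depth_value] * run_length)
    return line

# Example: 3 pixels at depth 0, 5 pixels at depth 12, 2 pixels at depth 0.
print(decode_rle_depth_line([(3, 0), (5, 12), (2, 0)]))
```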
FIG. 9 is a structural diagram of a PCS according to an embodiment of the present invention.
The PCS is the display control information of the presentation graphic stream. The 3D display information of the presentation graphic stream is recorded in the PCS. The PCS includes a segment_descriptor_3D field and a composition_object_3D field.
The type information of the PCS is recorded in the segment_descriptor_3D field. As described above, for example, a value of 0x1A may be recorded.
The display control information of the object configuring the presentation graphic stream is recorded in the composition_object_3D field. As described below, the object of the presentation graphic stream is projected onto a window which is a two-dimensional screen defined in the WDS and positional information of the object displayed in the window or the like is recorded in the composition_object_3D field. In addition, the 3D display information of the three-dimensional presentation graphic stream is included in the composition_object_3D field.
FIG. 10 is a structural diagram of a composition_object_3D field according to an embodiment of the present invention.
The composition_object_3D field includes a composition_object_depth_position field, a cropping_cube field and an object_skew field.
The 3D display information of the three-dimensional object is recorded in the composition_object_depth_position field. The 3D display information of the three-dimensional object is distance information corresponding to the height of the three-dimensional object in a z direction, if a vertically upward direction of the window is set to the z direction. The 3D display information may be recorded as inches, number of pixels or the like.
Information representing a cropped portion in the three-dimensional object is recorded in the cropping_cube field, and includes an object_cropping_depth_position field and an object_cropping_depth field. Positional information of a point where the cropped portion begins in the z direction is recorded in the object_cropping_depth_position field, and size information of the cropped portion in the z direction is recorded in the object_cropping_depth field.
Information representing the skewed shape of the object is recorded in the object_skew field, and includes a horizontal_vector field, a vertical_vector field and a depth_vector field. If the surface of the window is defined by an x direction and a y direction, information indicating the skewed degree of the three-dimensional object in the x direction is recorded in the horizontal_vector field, information indicating the skewed degree of the three-dimensional object in the y direction is recorded in the vertical_vector field, and information indicating the skewed degree of the three-dimensional object in the z direction is recorded in the depth_vector field.
FIG. 11 is a conceptual diagram of a 3D object, to which an object skew vector is applied, according to an embodiment of the present invention.
As described above, the data reproducing apparatus 11 projects the three-dimensional object of the presentation graphic stream onto the window which is the two-dimensional screen. The three-dimensional object projected onto the window is displayed as a three-dimensional image having a height, in the z direction, corresponding to the value of the composition_object_depth_position field.
In addition, the data reproducing apparatus 11 may display a three-dimensional image, in which a three-dimensional object is skewed, using an object_skew_flag field. As shown in the drawing, the data reproducing apparatus 11 displays the three-dimensional object in the window in a state of being skewed by the value of the horizontal_vector field in the x direction, by the value of the vertical_vector field in the y direction, and by the value of the depth_vector field in the z direction.
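A minimal sketch of one possible reading of this placement: the object is raised to its composition_object_depth_position and a corner is displaced by the skew vectors. The dataclass, the helper function and the per-corner interpretation are assumptions made for illustration, not the normative behaviour.

```python
from dataclasses import dataclass

@dataclass
class CompositionObject3D:
    """Illustrative container for the composition_object_3D fields named above."""
    depth_position: int         # composition_object_depth_position
    horizontal_vector: int = 0  # object_skew: shift in the x direction
    vertical_vector: int = 0    # object_skew: shift in the y direction
    depth_vector: int = 0       # object_skew: shift in the z direction

def place_skewed_corner(obj, x, y):
    """Hypothetical placement of one object corner: the corner is lifted to the
    object's depth position and displaced by the skew vectors (see FIG. 11)."""
    return (x + obj.horizontal_vector,
            y + obj.vertical_vector,
            obj.depth_position + obj.depth_vector)

menu = CompositionObject3D(depth_position=20, horizontal_vector=5, depth_vector=3)
print(place_skewed_corner(menu, x=100, y=50))  # (105, 50, 23)
```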
FIG. 12 is a structural diagram of a WDS according to an embodiment of the present invention.
The WDS is the display control information of the window. The window is a two-dimensional screen onto which the object of the presentation graphic stream is projected and is projected onto the graphic plane. The 3D display information of the window is recorded in the WDS. The WDS includes a segment_descriptor_3D field and a window_3D field.
The type information of the WDS is recorded in the segment_descriptor_3D field. As described above, for example, a value of 0x1B may be recorded.
The display control information of the window is recorded in the window_3D field.
FIG. 13 is a structural diagram of a window_3D field according to an embodiment of the present invention.
The window_3D field includes a window_depth_position field, a window_tilt_flag field, a horizontal_vector field, a vertical_vector field and a depth_vector field.
The 3D display information of the window is recorded in the window_depth_position field. The 3D display information of the window is distance information from the graphic plane to the window in a z direction, if a vertically upward direction of the graphic plane is set to the z direction. The 3D display information of the window may be recorded as inches, number of pixels or the like.
Information indicating whether the window is displayed so as to be tilted is stored in the window_tilt_flag field. For example, if a value of 0 is recorded in the window_tilt_flag field, it is indicated that the window is located in parallel to the graphic plane and, if a value of 1 is recorded, the window is located so as to be tilted from the graphic plane by the below-described window tilt vector.
If the surface of the graphic plane is defined by an x direction and a y direction, the window tilt vector representing information about a tilted degree of the window in the x direction is recorded in the horizontal_vector field, the window tilt vector representing information about a tilted degree of the window in the y direction is recorded in the vertical_vector field, and the window tilt vector representing information about a tilted degree of the window in the z direction is recorded in the depth_vector field.
FIG. 14 is a conceptual diagram of a 3D window, to which a window tilt vector is applied, according to an embodiment of the present invention.
As described above, the data reproducing apparatus 11 projects the window onto the graphic plane. The data reproducing apparatus 11 displays the window so as to be located apart from the graphic plane, in the z direction, by the value of the window_depth_position field.
The data reproducing apparatus 11 may display the window so as to be tilted using the window_tilt_flag field. As shown in the drawing, the data reproducing apparatus 11 displays the window to be tilted by the value of the horizontal_vector field in the x direction, by the value of the vertical_vector field in the y direction, and by the value of the depth_vector field in the z direction, when a value of 1 is recorded in the window_tilt_flag field.
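The window placement can be pictured in the same way. The sketch below assumes, as one reading of FIG. 14, that the window floats above the graphic plane by window_depth_position and that, when window_tilt_flag is 1, its far edge is additionally displaced by the tilt vectors while the near edge stays put; the function and the exact geometry are illustrative assumptions only.

```python
def tilted_window_edges(x, y, width, window_depth_position,
                        window_tilt_flag=0,
                        horizontal_vector=0, vertical_vector=0, depth_vector=0):
    """Hypothetical reading of FIG. 14: the window floats window_depth_position
    above the graphic plane; if window_tilt_flag is 1, its far edge is displaced
    by the tilt vectors while the near edge stays put, tilting the window."""
    near_edge = (x, y, window_depth_position)
    far_edge = (x + width, y, window_depth_position)
    if window_tilt_flag:
        far_edge = (far_edge[0] + horizontal_vector,
                    far_edge[1] + vertical_vector,
                    far_edge[2] + depth_vector)
    return near_edge, far_edge

print(tilted_window_edges(0, 0, 640, 15))                                   # flat window
print(tilted_window_edges(0, 0, 640, 15, window_tilt_flag=1, depth_vector=10))  # tilted window
```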
FIG. 15 is a structural diagram of a button_3D field in an ICS according to an embodiment of the present invention.
The ICS is the display control information of the interactive graphic stream. The 3D display information of the interactive graphic stream is recorded in the ICS. The interactive graphic stream represents a predetermined effect according to the selection of the user, such as a menu or a button. The interactive graphic stream is projected onto the below-described page.
In the interactive graphic stream, the display control information of the button is recorded in the button_3D field included in the ICS. In addition, the 3D display information of a three-dimensional button is recorded in the button_3D field. As shown in the drawing, the button_3D field includes a button_depth_position field.
The 3D display information of the three-dimensional button is recorded in the button_depth_position field. The 3D display information of the three-dimensional button is distance information corresponding to the height of the button in a z direction, if a vertically upward direction of the page onto which the three-dimensional button is projected is set to the z direction. The 3D display information of the three-dimensional button may be recorded as inches, number of pixels or the like.
As described above, unlike the graphic stream configured in the autostereoscopic mode, the graphic stream configured in the stereoscopic mode includes the two-dimensional graphic stream of the left view and the two-dimensional graphic stream of the right view. Accordingly, the object configuring the graphic stream of the stereoscopic mode requires target information which is information indicating to which of the left view and the right view the graphic stream corresponds.
FIG. 16 is a structural diagram of a graphic segment type in a stereoscopic mode according to an embodiment of the present invention.
In the present invention, segments including the display control information of the window and the interactive graphic stream configured in the stereoscopic mode may be defined. The segments have type information such as the graphic segment described with reference to FIG. 5.
As shown in the drawing, the type information of the WDS (WDS for stereoscopic display) including the display control information of the window of the stereoscopic mode has a value of 0x1D and the type information of the ICS (ICS for stereoscopic display) including the display control information of the interactive graphic stream of the stereoscopic mode has a value of 0x1E.
FIG. 17 is a structural diagram of a WDS (WDS for stereoscopic mode) in a stereoscopic mode according to an embodiment of the present invention.
In the stereoscopic mode, the WDS is the display control information of the window configured in the stereoscopic mode. As described above, the window is a two-dimensional screen onto which the object of the presentation graphic stream is projected, and is projected onto the graphic plane. The target information of the window is recorded in the WDS. The WDS includes a segment_descriptor_3D field and a window_SC field.
The type information of the WDS is recorded in the segment_descriptor_3D field. As described above, for example, a value of 0x1D may be recorded.
The display control information of the window is recorded in the window_SC field.
FIG. 18 is a structural diagram of a window_SC field according to an embodiment of the present invention.
The window_SC field includes a target_view field. Target information indicating onto which graphic plane the window configured in the stereoscopic mode is projected is recorded in the target_view field. For example, if the target_view field has a value of 0, the window is projected onto the graphic plane of the left view and, if the target_view field has a value of 1, the window is projected onto the graphic plane of the right view. Accordingly, the data reproducing apparatus 11 projects the window onto the graphic plane of one view using the value of the target_view field.
Since the window configured in the stereoscopic mode is displayed two-dimensionally, the 3D display information of the window may not be recorded in the WDS, unlike the above-described autostereoscopic mode.
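A stereoscopic player therefore only needs to route each window to one of the two graphic planes. A hedged sketch using the 0 = left / 1 = right convention stated above (the plane representation and function name are made up for the example):

```python
def route_window_to_plane(window, target_view, left_plane, right_plane):
    """Project a stereoscopic-mode window onto the left or right graphic plane,
    following the target_view convention described above (0 = left, 1 = right)."""
    plane = left_plane if target_view == 0 else right_plane
    plane.append(window)
    return plane

left_plane, right_plane = [], []
route_window_to_plane("Window 1", target_view=0, left_plane=left_plane, right_plane=right_plane)
route_window_to_plane("Window 2", target_view=1, left_plane=left_plane, right_plane=right_plane)
print(left_plane, right_plane)  # ['Window 1'] ['Window 2']
```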
FIG. 19 is a conceptual diagram of projection of a window of a stereoscopic mode onto a graphic plane according to an embodiment of the present invention.
First, the data reproducing apparatus 11 projects the object of the presentation graphic stream defined in the PCS onto a corresponding window. In the present invention, the target information, which is the information indicating onto which window the object is projected, may be included in the PCS. In the drawing, Object 1 is projected onto Window 1, Object 2 is projected onto Window 2, and Object 3 is projected onto Window 3.
The data reproducing apparatus 11 projects the window onto the graphic plane using the value of the target_view field included in the WDS. That is, the data reproducing apparatus 11 projects the window onto the graphic plane of the left view or the graphic plane of the right view using the value of the target_view field.
FIG. 20 is a structural diagram of a page_SC field in an ICS in a stereoscopic mode according to an embodiment of the present invention.
The presentation graphic stream is projected onto the window and the interactive graphic stream is projected onto the page. The display control information of the page configured in the stereoscopic mode is recorded in the page_SC field in the ICS. The ICS in which the display control information of the interactive graphic stream of the stereoscopic mode is recorded has a value of 0x1E as type information.
The display control information of the page onto which the interactive graphic stream is projected is recorded in the page_SC field. The page_SC field includes a target_view_id field, and target information indicating onto which graphic plane the page is projected is recorded in the target_view_id field. For example, if a value of 0 is recorded in the target_view_id field, the page is projected onto the graphic plane of the left view and, if a value of 1 is recorded, the page is projected onto the graphic plane of the right view. Accordingly, the data reproducing apparatus 11 projects the page onto the graphic plane of one view using the value of the target_view_id field.
The page_SC field includes a default_selected_button_id_ref field and a default_activated_button_id_ref field, in which the identification information of the button projected onto the page may be recorded.
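Illustratively, these page-level fields could be grouped as below; the attribute names mirror the field names in the description, but the grouping into a class and the example values are assumptions.

```python
from dataclasses import dataclass
from typing import Optional

@dataclass
class PageSC:
    """Illustrative grouping of the page_SC fields described above."""
    target_view_id: int                                  # 0 = left-view plane, 1 = right-view plane
    default_selected_button_id_ref: Optional[int] = None
    default_activated_button_id_ref: Optional[int] = None

page = PageSC(target_view_id=1, default_selected_button_id_ref=3)
plane = "right" if page.target_view_id == 1 else "left"
print(f"page goes to the {plane}-view graphic plane, "
      f"button {page.default_selected_button_id_ref} pre-selected")
```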
FIG. 21 is a structural diagram of a button_SC field in a stereoscopic mode according to an embodiment of the present invention.
The display control information of the button of the stereoscopic mode is recorded in the button_SC field. As the display control information of the button, the identification information of the button and the identification information of objects configuring the button are recorded in the button_SC field.
The button_SC field includes a button_id field, and the identification information of the button is recorded in the button_id field.
In addition, the button_SC field includes a normal_left_start_object_id_ref field, a normal_left_end_object_id_ref field, a normal_right_start_object_id_ref field, and a normal_right_end_object_id_ref field.
The identification information of the object displayed when the button is in a normal state, that is, when the button is not selected by the user, is recorded in the above-described fields. The button of the stereoscopic mode is composed of a button of a left view and a button of a right view. Accordingly, both the identification information of the object configuring the button of the left view and the identification information of the object configuring the button of the right view should be included in the button_SC field.
Accordingly, the identification information of the start object projected onto the button of the left view in the normal state is recorded in the normal_left_start_object_id_ref field, and the identification information of the end object projected onto the button of the left view in the normal state is recorded in the normal_left_end_object_id_ref field. Similarly, the identification information of the start object projected onto the button of the right view in the normal state is recorded in the normal_right_start_object_id_ref field, and the identification information of the end object projected onto the button of the right view in the normal state is recorded in the normal_right_end_object_id_ref field.
In addition, the identification information of the object projected onto the button in a state in which the button is selected by the user and the identification information of the object projected in a state in which the button is activated may be configured similar to the identification information of the object in the normal state and may be recorded in the button_SC field. Accordingly, the data reproducing apparatus 11 generates buttons using the identification information of the object.
The button of the left view and the button of the right view are divided for a three-dimensional display and should be configured so as to perform the same function.
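The per-state, per-view object references could be resolved roughly as follows; the dictionary keys and the assumption that the object ids between the start and end references are used in order are illustrative only.

```python
def button_objects(button_sc, view, state):
    """Return the object ids used to draw one button for one view and one state.

    `button_sc` is assumed to map (state, view) to (start_object_id, end_object_id),
    mirroring fields such as normal_left_start_object_id_ref /
    normal_left_end_object_id_ref; ids between start and end are assumed inclusive.
    """
    start_id, end_id = button_sc[(state, view)]
    return list(range(start_id, end_id + 1))

button_sc = {
    ("normal", "left"):  (10, 12),   # normal_left_start/end_object_id_ref
    ("normal", "right"): (20, 22),   # normal_right_start/end_object_id_ref
}
print(button_objects(button_sc, view="left", state="normal"))   # [10, 11, 12]
print(button_objects(button_sc, view="right", state="normal"))  # [20, 21, 22]
```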
FIG. 22 is a conceptual diagram of projection of a button onto a graphic plane in a stereoscopic mode according to an embodiment of the present invention.
First, the button_SC field includes the identification information of the objects configuring the button. This includes the identification information of the object projected onto the button in a normal state, a selected state and an activated state, as well as the identification information of the object projected onto the button of the left view or the button of the right view. Accordingly, the data reproducing apparatus 11 projects the object onto the button of one view and one state using the identification information of the object included in the button_SC field.
The data reproducing apparatus 11 projects the button, onto which the object is projected, onto the page using the identification information of the button included in the page_SC field. In the drawing, Button_L1 is projected onto Page_L1 and Button_R1 is projected onto Page_R1.
Finally, the data reproducing apparatus 11 projects the page, onto which the button is projected, on the graphic plane using the value of the target_view_id field included in the page_SC field. That is, the data reproducing apparatus 11 projects the page onto the graphic plane of the left view or the graphic plane of the right view using the value of the target_view_id field.
Apart from the PCS and the ICS, in the present invention, a separate segment including the 3D display information of the graphic stream may be defined. A Depth Definition Segment (DDS), which is a segment including the 3D display information of the graphic stream, and an AV decoder 30 for decoding the graphic stream using the DDS will be described with reference to FIG. 23 and the subsequent figures. In the recording medium 10 including the DDS recorded therein, the 3D display information of the graphic stream need not be recorded in the PCS and the ICS.
FIG. 23 is a structural diagram of a DDS type according to an embodiment of the present invention.
The DDS has type information of the graphic segment described with reference to FIG. 5. As shown in the drawing, the type information of the DDS has a value of 0x20.
FIG. 24 is a structural diagram of a DDS according to an embodiment of the present invention.
The DDS which is the segment including the 3D display information of the graphic stream includes a segment_descriptor field, a depth_id field, a depth_version_number field and a depth_conversion_entry field.
The type information of the DDS is recorded in the segment_descriptor field. As described above, for example, a value of 0x20 may be recorded.
The identification information of the DDS in which the 3D display information is recorded is recorded in the depth_id field. Accordingly, the data reproducing apparatus 11 identifies the graphic stream which uses the 3D display information of the DDS, using the depth_id field.
Accordingly, the segment such as the PCS or the ICS of the present invention further includes a depth_id_ref field in which the identification information of the DDS used by the graphic stream is recorded.
Version information which is changed whenever the information about the DDS is updated is recorded in the depth_version_number field.
The 3D display information of the graphic stream is recorded in the depth_conversion_entry field, and the depth_conversion_entry field includes a coded_depth_value field, an output_number_value field and an output_den_value field.
Coded 3D display information is recorded in the coded_depth_value field.
The reproduction environment, such as the screen size of the output device 12 for outputting the graphic stream as a three-dimensional image or the distance between the screen and the user, may vary according to users. Since the reproduction environment varies according to users, a graphic stream having the same 3D display information may be displayed to some users with a different 3D effect.
Accordingly, an operation for converting the 3D display information so as to be suitable for the reproduction environment is required, such that the 3D display information recorded in the DDS is displayed with the same 3D effect in various reproduction environments. In the present invention, a Depth Conversion System (DCS) for performing the above operation may be defined.
The DCS according to a first embodiment of the present invention maps the 3D display information recorded in the DDS to distance information corresponding to the size of the output device 12. Accordingly, the DCS according to the first embodiment of the present invention may include a table for mapping the 3D display information to the distance information corresponding to the size of the output device 12. In the present invention, the above table is referred to as a Graphic Depth Look-Up Table (GDLUT).
FIG. 25 is a structural diagram of an example of a GDLUT according to an embodiment of the present invention.
The GDLUT includes the 3D display information recorded in the coded_depth_value field and a converted depth value which is a distance information value actually and physically represented by the output device 12 corresponding to the value of the 3D display information. The DCS converts the 3D display information recorded in the coded_depth_value field of the DDS into the converted depth value corresponding thereto, using the GDLUT.
In the drawing, as the example of the GDLUT, the 3D display information corresponds to the distance information associated with the width of the output device 12. Further, the GDLUT of the present invention may be configured using information about the height and the resolution of the output device 12 and each DDS may be configured using an independent GDLUT.
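A GDLUT lookup can be pictured as a simple mapping from the coded depth value to a converted, physically meaningful depth; the table entries below are invented for illustration and do not come from FIG. 25.

```python
# Hypothetical GDLUT: coded_depth_value -> converted depth expressed as a fraction
# of the output device width; the entries are invented for illustration.
gdlut = {
    0x00: 0.00,
    0x10: 0.02,
    0x20: 0.05,
    0x30: 0.10,
}

def convert_depth(coded_depth_value, display_width_inches):
    """Map a coded depth value from the DDS to a physical distance on this display."""
    return gdlut[coded_depth_value] * display_width_inches

print(convert_depth(0x20, display_width_inches=40))  # 2.0 inches in front of the screen
```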
A DCS according to a second embodiment of the present invention directly calculates a distance information value physically represented by the output device 12 using the 3D display information recorded in the DDS. The DCS calculates the 3D display information suitable for the reproduction environment of the user using the size information of the output device 12 and the 3D display information recorded in the DDS. The DCS may use the values of the output_number_value field, the output_den_value field and the coded_depth_value field recorded in the DDS, in order to calculate the distance information value physically represented by the output device 12. For example, the DCS may calculate the distance information value physically represented by the output device 12 by ((output_number_value field value/output_den_value field value)*coded_depth_value field value*width value of output device 12).
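With the formula quoted above, the second-embodiment conversion is a one-line calculation; only the numeric values in the example are made up.

```python
def dcs_converted_depth(output_number_value, output_den_value,
                        coded_depth_value, display_width):
    """Distance physically represented by the output device, per the formula
    (output_number_value / output_den_value) * coded_depth_value * display width."""
    return (output_number_value / output_den_value) * coded_depth_value * display_width

# Example with made-up values: scale factor 1/255, coded depth 128, 40-inch-wide display.
print(dcs_converted_depth(1, 255, 128, 40))  # about 20.1 inches
```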
FIG. 26 is a conceptual diagram of decoding of an AV stream into a three-dimensional image using a DDS according to an embodiment of the present invention.
In order to decode the AV stream into the three-dimensional image, the data reproducing apparatus 11 decodes a video stream and a graphic stream. A primary video plane and a secondary video plane may be generated by the decoded video stream and a presentation graphics plane and an interactive graphics plane may be generated by the decoded graphic stream.
The data reproducing apparatus 11 overlays the graphic plane upon the video plane. Accordingly, a three-dimensional graphic image is displayed and output in front of a three-dimensional video image. The data reproducing apparatus 11 decodes the overlaid video plane and graphic plane, that is, one image obtained by synthesizing the video image and the graphic image, and outputs the image to the user.
The data reproducing apparatus 11 outputs the video stream and the graphic stream respectively projected onto the video plane and the graphic plane as a three-dimensional image using the 3D display information recorded in the DDS. In the drawing, the data reproducing apparatus 11 searches for the 3D display information Depth_PV of the primary video stream projected onto the primary video plane using the DDS. The data reproducing apparatus 11 outputs the primary video stream as the three-dimensional image using Depth_PV.
The data reproducing apparatus 11 searches for the 3D display information of the streams projected onto the secondary video plane, the presentation graphics plane and the interactive graphics plane using the DDS, similar to the primary video stream, and then outputs each stream as a three-dimensional image using the retrieved 3D display information.
FIG. 27 is a block diagram of an AV decoder 30 including a DCS according to an embodiment of the present invention.
The AV decoder 30 of the present invention includes a coded data buffer 310, a graphics processor 320, an object buffer 330, a composition buffer 340, a graphics controller 350, a Color Look-up Table (CLUT) 360, and a GDLUT 370.
The coded data buffer 310 sends a graphic stream to the graphics processor 320 according to a reproduction time of the graphic stream such that decoding begins.
The graphics processor 320 sends image data of an object to be output to a screen in the graphic stream to the object buffer 330 and sends a segment, which is information for controlling the image of the object, to the composition buffer 340.
The graphics controller 350 controls the graphics plane 360 such that the graphic plane of the graphic stream is generated by the image data of the object stored in the object buffer 330 according to the information about the segment stored in the composition buffer 340.
The graphics plane 360 generates the graphic plane which is the image data of the graphic stream under the control of the graphics controller 350, and the CLUT 360 controls the color and the transparency of the generated graphic plane under the control of the graphics controller 350 and sends the graphic plane to the GDLUT 370.
The GDLUT 370 controls the generated graphic plane to be displayed as the three-dimensional image using the 3D display information. In detail, the graphics controller 350 searches for the 3D display information of the graphic stream projected onto the graphic plane using the DDS stored in the composition buffer 340. The DCS converts the 3D display information of the graphic stream into the distance information value physically represented by the output device 12. Using the converted 3D display information, the GDLUT 370 displays the graphic stream projected onto the graphic plane as the three-dimensional image.
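Putting the pieces of FIG. 27 together, the final step can be sketched as follows: look up the DDS referenced by the composition segment through depth_id_ref, convert its coded depth with the DCS/GDLUT, and attach the resulting physical distance to the graphic plane. All names and the dictionary-based structures are assumptions made for illustration.

```python
def apply_depth_to_plane(graphic_plane, composition_segment, dds_table, convert):
    """Illustrative final stage of FIG. 27: find the DDS referenced by the
    composition segment (depth_id_ref), convert its coded depth to a physical
    distance, and record that distance on the graphic plane."""
    dds = dds_table[composition_segment["depth_id_ref"]]
    graphic_plane["depth"] = convert(dds["coded_depth_value"])
    return graphic_plane

dds_table = {7: {"coded_depth_value": 0x20}}   # one DDS with id 7 (made-up values)
segment = {"depth_id_ref": 7}                  # PCS/ICS referencing DDS 7
plane = {"pixels": "...", "depth": None}
print(apply_depth_to_plane(plane, segment, dds_table, convert=lambda c: c * 0.05))
```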
FIG. 28 is a block diagram of a data recording/reproducing apparatus according to an embodiment of the present invention.
In the overall configuration of the data recording/reproducing apparatus of the present invention, first, a recording/reproducing unit 70 records or reads data on or from the recording medium 10. In the embodiment of the present invention, the recording/reproducing unit 70 may be composed of a pickup 70. The pickup 70 includes a laser diode mounted therein so as to record data on the surface of the recording medium 10 or read a signal reflected from the recording medium 10.
A servo 90 controls a tracking and focusing operation of the pickup 70 and an operation of a spindle motor 110.
An R/F unit 80 generates a focus error signal and a tracking error signal, which are signals for detecting focus deviation and track deviation of a laser beam, using a signal output from the pickup 70.
The spindle motor 110 rotates a disc mounted in the data recording/reproducing apparatus.
A motor driving unit 100 drives the pickup 70 and the spindle motor 110 under the control of the servo 90.
A signal processing unit 40 restores an RF signal received from the R/F unit 80 to a desired reproduction signal value, modulates a data signal to be recorded in a format recordable in the recording medium 10, and outputs the modulated signal.
A bit encoder 50 converts a recorded signal output from the signal processing unit 40 into a bit stream, and a pickup driving unit 60 converts the bit stream generated by the bit encoder 50 into an optical signal to be stored in the recording medium 10.
A memory 150 performs a buffer function for temporarily storing information associated with the recording medium 10, such as defect information of the recording medium 10, or temporarily storing data which will be recorded on or reproduced from the recording medium 10.
A microcomputer 120 is configured to control the signal processing unit 40, the servo 90 and the memory 150 and to control a drive including components to perform a data recording or reproducing operation.
A 3D control unit 130 generates a segment including 3D display information for displaying a graphic stream as a three-dimensional image. The segment includes an ODS, a PCS, an ICS, a WDS, or a DDS. The structures of the segments were described above.
The 3D control unit 130 sends the generated segment to the signal processing unit 40 or the microcomputer 120 such that the segment is recorded on the recording medium 10 along with the graphic stream.
The 3D control unit 130 is configured by one component, but the present invention is not limited thereto. The function of the 3D control unit 130 may be performed by interlocking several components with each other, and the 3D control unit 130 may be implemented in a state of being integrated with the microcomputer 120 and/or a host 140. The 3D control unit 130 may be further connected to the signal processing unit 40, the bit encoder 50, the pickup driving unit 60, the pickup 70, the servo 90, the motor driving unit 100, the spindle motor 110, the microcomputer 120 and a drive including the memory 150 so as to be updated.
The host 140 controls all the components included in the data recording/reproducing apparatus and controls the recording or the reproducing of the recording medium 10 by interfacing with the user. In detail, the host 140 sends a command for enabling the data recording/reproducing apparatus of the present invention to perform a specific function to the microcomputer 120, and the microcomputer 120 controls the components interlocked with each other in the data recording/reproducing apparatus according to the command.
The microcomputer 120 and the host 140 may be separately included and operated or the functions of the microcomputer 120, the 3D control unit 130 and the host 140 may be combined so as to be operated as one control unit.
The host 140 may be a main controller of a computer, a server, an audio device or a video device. That is, the recording/reproducing apparatus of the present invention may be an optical drive provided to a Personal Computer (PC) or a player which is not mounted in a PC.
Accordingly, the data recording/reproducing apparatus according to the present invention is applicable to a drive which is mounted in a PC or a player used as an independent product.
An AV encoder 20 converts an input signal into a signal in a specific format recordable in the recording medium 10 and transmits the converted signal to the signal processing unit 40, in order to record data on the recording medium 10. For example, the AV encoder 20 may encode the input signal to an MPEG-format signal. The signal processing unit 40 adds an error correction code (ECC) to the signal encoded by the AV encoder 20, converts the signal into a format recordable in the recording medium 10, and sends the converted signal to the bit encoder 50.
An AV decoder 30 finally decodes a reproduced signal of the recording medium 10 received from the signal processing unit 40 and provides output data such as a video signal, an audio signal and a graphic signal to the user. The graphic signal may be composed of a signal of a graphic stream. The AV decoder 30 decodes the graphic stream into a three-dimensional image using 3D display information for displaying the graphic stream as the three-dimensional image and outputs the three-dimensional image. The configuration of the AV decoder 30 for decoding the graphic stream into the three-dimensional image using the DDS was described above with reference to FIG. 27.
The components of the data recording/reproducing apparatus according to the present invention may be implemented by software or hardware so as to perform the respective functions or may be implemented by interlocking hardware and software with each other.
Various embodiments have been described in the best mode for carrying out the invention.
According to a recording medium, a data recording/reproducing method and a data recording/reproducing apparatus of the present invention, it is possible to reproduce a graphic stream recorded/reproduced on/from the recording medium as a stereoscopic three-dimensional image. In addition, it is possible to efficiently manage 3D display information for displaying the graphic stream as the three-dimensional image. In addition, it is possible to provide a decoder for efficiently decoding the graphic stream.
It will be apparent to those skilled in the art that various modifications and variations can be made in the present invention without departing from the spirit or scope of the invention. Thus, it is intended that the present invention cover the modifications and variations of this invention provided they come within the scope of the appended claims and their equivalents.

Claims (37)

  1. A recording medium comprising:
    a data zone including:
    a video stream including image data;
    a graphic stream reproduced along with the video stream; and
    a 3D display information for displaying the graphic stream as a three-dimensional image.
  2. The recording medium according to claim 1, wherein:
    the graphic stream includes a presentation graphic stream which is a fixed image data, and
    the 3D display information is recorded in a Presentation Composition Segment (PCS) including a display control information of the presentation graphic stream.
  3. The recording medium according to claim 2, wherein the 3D display information includes information for displaying the presentation graphic stream as a skewed three-dimensional image.
  4. The recording medium according to claim 2, wherein the 3D display information includes a positional information of a cropped three-dimensional image in the presentation graphic stream.
  5. The recording medium according to claim 1, wherein:
    the graphic stream includes an interactive graphic stream including image data representing a specific effect according to selection of a user, and
    the 3D display information is recorded in an Interactive Composition Segment (ICS) including a display control information of the interactive graphic stream.
  6. The recording medium according to claim 5, wherein:
    the interactive graphic stream includes a button image data and a page image data onto which the button image data is projected, and
    the 3D display information includes an identification information of the button image data onto which an image of an object is projected and an identification information of a graphic plane on which the page image data is projected.
  7. The recording medium according to claim 1, wherein:
    the 3D display information includes information for displaying a window, onto which an image of the graphic stream is projected, as the three-dimensional image, and an identification information of a graphic plane onto which an image of the window is projected, and
    the 3D display information is recorded in a Window Definition Segment (WDS) which is a segment including display control information of the window.
  8. The recording medium according to claim 7, wherein the 3D display information includes information for displaying the window as a skewed three-dimensional image.
  9. The recording medium according to claim 1, wherein the 3D display information is recorded in a Depth Definition Segment (DDS) which is a segment composed of the 3D display information of the graphic stream.
  10. The recording medium according to claim 1, wherein the data zone includes a Graphic Depth Look-Up Table (GDLUT) which includes information for converting the 3D display information into a distance information actually displayed by an output device.
  11. A data recording method, the method comprising:
    recording a video stream including image data and a graphic stream reproduced along with the video stream in a data zone of a recording medium; and
    recording a 3D display information for displaying the graphic stream as a three-dimensional image in the data zone.
  12. The data recording method according to claim 11, wherein:
    the graphic stream includes a presentation graphic stream which is a fixed image data, and
    the 3D display information includes information for displaying the presentation graphic stream as a skewed three-dimensional image and the 3D display information is recorded in a Presentation Composition Segment (PCS) including a display control information of the presentation graphic stream.
  13. The data recording method according to claim 11, wherein:
    the graphic stream includes an interactive graphic stream including an image data representing a specific effect according to selection of a user, and
    the 3D display information is recorded in an Interactive Composition Segment (ICS) including a display control information of the interactive graphic stream.
  14. The data recording method according to claim 13, wherein:
    the interactive graphic stream includes a button image data and a page image data onto which the button image data is projected, and
    the 3D display information includes an identification information of the button image data onto which an image of an object is projected and an identification information of a graphic plane onto which the page image data is projected.
  15. The data recording method according to claim 11, wherein:
    the 3D display information includes information for displaying a window, onto which an image of the graphic stream is projected, as the three-dimensional image, an identification information of a graphic plane onto which an image of the window is projected, and information for displaying the window as a skewed three-dimensional image, and
    the 3D display information is recorded in a Window Definition Segment (WDS) which is a segment including display control information of the window.
  16. The data recording method according to claim 11, wherein:
    the 3D display information is recorded in a Depth Definition Segment (DDS) which is a segment composed of the 3D display information of the graphic stream, and
    the data zone includes a Graphic Depth Look-Up Table (GDLUT) which is information for converting the 3D display information into a distance information actually displayed by an output device.
  17. A data reproducing method, the method comprising:
    reading a video stream including image data and a graphic stream reproduced along with the video stream from a data zone of a recording medium;
    reading a 3D display information for displaying the graphic stream as a three-dimensional image from the data zone; and
    reproducing the graphic stream as the three-dimensional image according to the 3D display information.
  18. The data reproducing method according to claim 17, wherein:
    the graphic stream includes a presentation graphic stream which is a fixed image data, and
    the 3D display information includes information for displaying the presentation graphic stream as a skewed three-dimensional image and is recorded in a Presentation Composition Segment (PCS) including a display control information of the presentation graphic stream.
  19. The data reproducing method according to claim 17, wherein:
    the graphic stream includes an interactive graphic stream including an image data representing a specific effect according to selection of a user, and
    the 3D display information is recorded in an Interactive Composition Segment (ICS) including a display control information of the interactive graphic stream.
  20. The data reproducing method according to claim 19, wherein:
    the interactive graphic stream includes a button image data and a page image data onto which the button image data is projected, and
    the 3D display information includes an identification information of the button image data onto which an image of an object is projected and an identification information of a graphic plane on which an image of a page is projected.
  21. The data reproducing method according to claim 17, wherein:
    the 3D display information includes information for displaying a window, onto which an image of the graphic stream is projected, as the three-dimensional image, an identification information of a graphic plane onto which an image of the window is projected, and information for displaying the window as a skewed three-dimensional image, and
    the 3D display information is recorded in a Window Definition Segment (WDS) which is a segment including a display control information of the window.
  22. The data reproducing method according to claim 17, wherein:
    the 3D display information is recorded in a Depth Definition Segment (DDS) which is a segment composed of the 3D display information of the graphic stream, and
    the data zone includes a Graphic Depth Look-Up Table (GDLUT) which is information for converting the 3D display information into a distance information actually displayed by an output device.
  23. A data recording apparatus comprising:
    a recording unit configured to record data on a recording medium; and
    a control unit configured to control the recording unit to record a video stream including image data and a graphic stream reproduced along with the video stream in a data zone of the recording medium and to record a 3D display information for displaying the graphic stream as a three-dimensional image in the data zone.
  24. The data recording apparatus according to claim 23, wherein:
    the graphic stream includes a presentation graphic stream which is a fixed image data, and
    the 3D display information includes information for displaying the presentation graphic stream as a skewed three-dimensional image and is recorded in a Presentation Composition Segment (PCS) including a display control information of the presentation graphic stream.
  25. The data recording apparatus according to claim 23, wherein:
    the graphic stream includes an interactive graphic stream including an image data representing a specific effect according to selection of a user, and
    the 3D display information is recorded in an Interactive Composition Segment (ICS) including a display control information of the interactive graphic stream.
  26. The data recording apparatus according to claim 25, wherein:
    the interactive graphic stream includes a button image data and a page image data onto which the button image data is projected, and
    the 3D display information includes an identification information of the button image data onto which an image of an object is projected and an identification information of a graphic plane on which an image of a page is projected.
  27. The data recording apparatus according to claim 23, wherein:
    the 3D display information includes information for displaying a window, onto which an image of the graphic stream is projected, as the three-dimensional image, an identification information of a graphic plane onto which an image of the window is projected, and information for displaying the window as a skewed three-dimensional image, and
    the 3D display information is recorded in a Window Definition Segment (WDS) which is a segment including a display control information of the window.
  28. The data recording apparatus according to claim 23, wherein:
    the 3D display information is recorded in a Depth Definition Segment (DDS) which is a segment composed of the 3D display information of the graphic stream, and
    the data zone includes a Graphic Depth Look-Up Table (GDLUT) which is information for converting the 3D display information into a distance information actually displayed by an output device.
  29. A data reproducing apparatus comprising:
    a reproducing unit configured to read data from a recording medium;
    a decoder configured to convert a graphic stream reproduced along with a video stream including image data into image data which is able to be output on a screen; and
    a control unit configured to control the reproducing unit to read the video stream and the graphic stream from a data zone of the recording medium and to read a 3D display information for displaying the graphic stream as a three-dimensional image from the data zone, and to control the decoder to convert the graphic stream into the three-dimensional image according to the 3D display information and to reproduce the three-dimensional image.
  30. The data reproducing apparatus according to claim 29, wherein:
    the graphic stream includes a presentation graphic stream which is a fixed image data, and
    the 3D display information includes information for displaying the presentation graphic stream as a skewed three-dimensional image and is recorded in a Presentation Composition Segment (PCS) including a display control information of the presentation graphic stream.
  31. The data reproducing apparatus according to claim 29, wherein:
    the graphic stream includes an interactive graphic stream including an image data representing a specific effect according to selection of a user, and
    the 3D display information is recorded in an Interactive Composition Segment (ICS) including a display control information of the interactive graphic stream.
  32. The data reproducing apparatus according to claim 31, wherein:
    the interactive graphic stream includes a button image data and a page image data onto which the button image data is projected, and
    the 3D display information includes an identification information of the button image data onto which an image of an object is projected and an identification information of a graphic plane on which an image of a page is projected.
  33. The data reproducing apparatus according to claim 29, wherein:
    the 3D display information includes information for displaying a window, onto which an image of the graphic stream is projected, as the three-dimensional image, an identification information of a graphic plane onto which an image of the window is projected, and information for displaying the window as a skewed three-dimensional image, and
    the 3D display information is recorded in a Window Definition Segment (WDS) which is a segment including a display control information of the window.
  34. The data reproducing apparatus according to claim 29, wherein:
    the 3D display information is recorded in a Depth Definition Segment (DDS) which is a segment composed of the 3D display information of the graphic stream, and
    the data zone includes a Graphic Depth Look-Up Table (GDLUT) which is information for converting the 3D display information into a distance information actually displayed by an output device.
  35. The data reproducing apparatus according to claim 34, wherein the decoder includes a GDLUT unit configured to convert the 3D display information into the distance information actually displayed by the output device.
  36. The data reproducing apparatus according to claim 35, wherein the GDLUT unit converts the 3D display information into the distance information actually displayed by the output device using the GDLUT.
  37. The data reproducing apparatus according to claim 35, wherein the GDLUT unit calculates the distance information actually displayed by the output device using the 3D display information and a user reproduction environment information.
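To make the depth-to-distance conversion recited in claims 10, 16, 22, 28 and 34 to 37 concrete, the sketch below shows one way a decoder's GDLUT unit could resolve a coded depth level into the distance actually displayed by the output device. The table contents, the linear interpolation and the rescaling by the user reproduction environment information are assumptions made only for illustration, not the algorithm specified by the disclosure.

```c
#include <stddef.h>
#include <stdint.h>

/* One GDLUT entry: a coded depth level and the distance (millimetres in
 * front of the screen) intended for a reference viewing environment.
 * Entries are assumed to be sorted by ascending depth_level. */
typedef struct {
    uint8_t depth_level;
    double  distance_mm;
} gdlut_entry;

/* Hypothetical user reproduction environment information. */
typedef struct {
    double viewing_distance_mm;           /* actual viewer-to-screen distance    */
    double reference_viewing_distance_mm; /* distance the GDLUT was authored for */
} reproduction_env;

/* Look the coded depth level up in the GDLUT (with linear interpolation
 * between neighbouring entries) and rescale the result for the user's
 * actual viewing distance. */
double gdlut_to_distance(const gdlut_entry *lut, size_t n,
                         uint8_t depth_level, const reproduction_env *env)
{
    double base;

    if (n == 0)
        return 0.0;

    if (depth_level <= lut[0].depth_level) {
        base = lut[0].distance_mm;
    } else if (depth_level >= lut[n - 1].depth_level) {
        base = lut[n - 1].distance_mm;
    } else {
        base = lut[n - 1].distance_mm;
        for (size_t i = 1; i < n; i++) {
            if (depth_level <= lut[i].depth_level) {
                double t = (double)(depth_level - lut[i - 1].depth_level) /
                           (double)(lut[i].depth_level - lut[i - 1].depth_level);
                base = lut[i - 1].distance_mm +
                       t * (lut[i].distance_mm - lut[i - 1].distance_mm);
                break;
            }
        }
    }

    /* Assumed rescaling: the displayed distance scales with the ratio of the
     * actual viewing distance to the reference viewing distance, per the
     * user reproduction environment information of claim 37. */
    return base * (env->viewing_distance_mm / env->reference_viewing_distance_mm);
}
```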
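The claims do not fix how a displayed distance is turned into the left-eye and right-eye presentation on the graphic planes. As a general stereoscopic technique, offered only for orientation and not as the method of the disclosure, the distance can be converted into a horizontal pixel offset applied symmetrically to the two graphic planes:

```c
/* Convert a displayed distance (millimetres in front of the screen) into a
 * horizontal pixel offset for the left and right graphic planes. Standard
 * stereoscopic geometry; all names and values here are illustrative. */
typedef struct {
    double viewing_distance_mm; /* viewer-to-screen distance              */
    double eye_separation_mm;   /* interocular distance, typically ~63 mm */
    double screen_width_mm;     /* physical width of the display          */
    int    screen_width_px;     /* horizontal resolution in pixels        */
} viewing_geometry;

/* Returns the total crossed disparity in pixels: shift the left-eye image
 * right and the right-eye image left by half this amount each so that the
 * object appears the requested distance in front of the screen. The
 * distance must be smaller than the viewing distance. */
double disparity_px(double distance_in_front_mm, const viewing_geometry *g)
{
    double disparity_mm = g->eye_separation_mm * distance_in_front_mm /
                          (g->viewing_distance_mm - distance_in_front_mm);
    return disparity_mm * (double)g->screen_width_px / g->screen_width_mm;
}
```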
PCT/KR2009/006826 2008-11-21 2009-11-19 Recording medium, data recording/reproducing method and data recording/reproducing apparatus WO2010058977A2 (en)

Applications Claiming Priority (8)

Application Number Priority Date Filing Date Title
US11664308P 2008-11-21 2008-11-21
US61/116,643 2008-11-21
US11724408P 2008-11-24 2008-11-24
US61/117,244 2008-11-24
US11792408P 2008-11-25 2008-11-25
US11793608P 2008-11-25 2008-11-25
US61/117,924 2008-11-25
US61/117,936 2008-11-25

Publications (2)

Publication Number Publication Date
WO2010058977A2 true WO2010058977A2 (en) 2010-05-27
WO2010058977A3 WO2010058977A3 (en) 2010-09-10

Family

ID=42198671

Family Applications (1)

Application Number Title Priority Date Filing Date
PCT/KR2009/006826 WO2010058977A2 (en) 2008-11-21 2009-11-19 Recording medium, data recording/reproducing method and data recording/reproducing apparatus

Country Status (1)

Country Link
WO (1) WO2010058977A2 (en)

Citations (3)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
WO2003039698A1 (en) * 2001-11-02 2003-05-15 Atlantis Cyberspace, Inc. Virtual reality game system with pseudo 3d display driver & mission control
WO2008044191A2 (en) * 2006-10-11 2008-04-17 Koninklijke Philips Electronics N.V. Creating three dimensional graphics data
WO2008115222A1 (en) * 2007-03-16 2008-09-25 Thomson Licensing System and method for combining text with three-dimensional content

Also Published As

Publication number Publication date
WO2010058977A3 (en) 2010-09-10

Similar Documents

Publication Publication Date Title
WO2014054845A1 (en) Content processing apparatus for processing high resolution content and method thereof
WO2014054847A1 (en) Content processing apparatus for processing high resolution content and content processing method thereof
WO2012044128A4 (en) Display device, signal-processing device, and methods therefor
WO2013172667A1 (en) Recording medium, reproducing device for performing trick play for data of the recording medium, and method thereof
WO2020017840A1 (en) Method and device for inter predicting on basis of dmvr
WO2013111994A1 (en) Image processing method and apparatus for 3d video
WO2014092509A1 (en) Glasses apparatus and method for controlling glasses apparatus, audio apparatus and method for providing audio signal and display apparatus
WO2011159128A2 (en) Method and apparatus for providing digital broadcasting service with 3-dimensional subtitle
WO2020076058A1 (en) Method and apparatus for generating media file comprising 3-dimensional video content, and method and apparatus for replaying 3-dimensional video content
WO2018030567A1 (en) Hmd and control method therefor
WO2014010920A1 (en) Enhanced 3d audio/video processing apparatus and method
WO2010058977A2 (en) Recording medium, data recording/reproducing method and data recording/reproducing apparatus
WO2010062104A2 (en) Recording medium, method for recording/playing data, and device for recording/playing data
WO2011002153A2 (en) Recording medium, data recording/reproducing method, and data recording/reproducing apparatus
JPS58161468A (en) Copy transmitting device
WO2019203533A1 (en) Inter-prediction method in accordance with multiple motion model, and device thereof
JP2005070858A (en) Device, system, method and program for information processing, storage medium storing the program, and player
WO2019160289A1 (en) Electronic device and control method therefor
CA1212177A (en) Image storage system
WO2020009365A1 (en) Display apparatus and control method thereof and recording medium
KR100566970B1 (en) Method for copying video record products and Personal Video Recorder which is embodied in it
WO2024010356A1 (en) Image encoding/decoding method and apparatus, and recording medium having bitstream stored therein
WO2024117306A1 (en) Display device and operation method therefor
WO2024005545A1 (en) Image encoding/decoding method and apparatus, and recording medium having bitstream stored therein
WO2023068731A1 (en) Image decoding method and apparatus therefor

Legal Events

Date Code Title Description
121 Ep: the epo has been informed by wipo that ep was designated in this application

Ref document number: 09827742

Country of ref document: EP

Kind code of ref document: A2

NENP Non-entry into the national phase

Ref country code: DE

122 Ep: pct application non-entry in european phase

Ref document number: 09827742

Country of ref document: EP

Kind code of ref document: A2