WO2010119814A1 - Data structure, recording medium, playback device and playback method, and program - Google Patents
Data structure, recording medium, playback device and playback method, and program
- Publication number
- WO2010119814A1 (PCT/JP2010/056418)
- Authority
- WO
- WIPO (PCT)
- Prior art keywords
- image
- data
- offset
- eye
- menu
- Prior art date
Classifications
-
- G—PHYSICS
- G11—INFORMATION STORAGE
- G11B—INFORMATION STORAGE BASED ON RELATIVE MOVEMENT BETWEEN RECORD CARRIER AND TRANSDUCER
- G11B20/00—Signal processing not specific to the method of recording or reproducing; Circuits therefor
- G11B20/10—Digital recording or reproducing
- G11B20/12—Formatting, e.g. arrangement of data block or words on the record carriers
- G11B20/1217—Formatting, e.g. arrangement of data block or words on the record carriers on discs
-
- G—PHYSICS
- G11—INFORMATION STORAGE
- G11B—INFORMATION STORAGE BASED ON RELATIVE MOVEMENT BETWEEN RECORD CARRIER AND TRANSDUCER
- G11B27/00—Editing; Indexing; Addressing; Timing or synchronising; Monitoring; Measuring tape travel
- G11B27/10—Indexing; Addressing; Timing or synchronising; Measuring tape travel
- G11B27/102—Programmed access in sequence to addressed parts of tracks of operating record carriers
- G11B27/105—Programmed access in sequence to addressed parts of tracks of operating record carriers of operating discs
-
- G—PHYSICS
- G11—INFORMATION STORAGE
- G11B—INFORMATION STORAGE BASED ON RELATIVE MOVEMENT BETWEEN RECORD CARRIER AND TRANSDUCER
- G11B27/00—Editing; Indexing; Addressing; Timing or synchronising; Monitoring; Measuring tape travel
- G11B27/10—Indexing; Addressing; Timing or synchronising; Measuring tape travel
- G11B27/19—Indexing; Addressing; Timing or synchronising; Measuring tape travel by using information detectable on the record carrier
- G11B27/28—Indexing; Addressing; Timing or synchronising; Measuring tape travel by using information detectable on the record carrier by using information signals recorded by the same method as the main recording
- G11B27/32—Indexing; Addressing; Timing or synchronising; Measuring tape travel by using information detectable on the record carrier by using information signals recorded by the same method as the main recording on separate auxiliary tracks of the same or an auxiliary record carrier
- G11B27/322—Indexing; Addressing; Timing or synchronising; Measuring tape travel by using information detectable on the record carrier by using information signals recorded by the same method as the main recording on separate auxiliary tracks of the same or an auxiliary record carrier used signal is digitally coded
-
- H—ELECTRICITY
- H04—ELECTRIC COMMUNICATION TECHNIQUE
- H04N—PICTORIAL COMMUNICATION, e.g. TELEVISION
- H04N13/00—Stereoscopic video systems; Multi-view video systems; Details thereof
- H04N13/10—Processing, recording or transmission of stereoscopic or multi-view image signals
- H04N13/106—Processing image signals
- H04N13/156—Mixing image signals
-
- H—ELECTRICITY
- H04—ELECTRIC COMMUNICATION TECHNIQUE
- H04N—PICTORIAL COMMUNICATION, e.g. TELEVISION
- H04N13/00—Stereoscopic video systems; Multi-view video systems; Details thereof
- H04N13/10—Processing, recording or transmission of stereoscopic or multi-view image signals
- H04N13/106—Processing image signals
- H04N13/161—Encoding, multiplexing or demultiplexing different image signal components
-
- H—ELECTRICITY
- H04—ELECTRIC COMMUNICATION TECHNIQUE
- H04N—PICTORIAL COMMUNICATION, e.g. TELEVISION
- H04N13/00—Stereoscopic video systems; Multi-view video systems; Details thereof
- H04N13/10—Processing, recording or transmission of stereoscopic or multi-view image signals
- H04N13/106—Processing image signals
- H04N13/172—Processing image signals image signals comprising non-image signal components, e.g. headers or format information
- H04N13/178—Metadata, e.g. disparity information
-
- H—ELECTRICITY
- H04—ELECTRIC COMMUNICATION TECHNIQUE
- H04N—PICTORIAL COMMUNICATION, e.g. TELEVISION
- H04N13/00—Stereoscopic video systems; Multi-view video systems; Details thereof
- H04N13/10—Processing, recording or transmission of stereoscopic or multi-view image signals
- H04N13/106—Processing image signals
- H04N13/172—Processing image signals image signals comprising non-image signal components, e.g. headers or format information
- H04N13/183—On-screen display [OSD] information, e.g. subtitles or menus
-
- H—ELECTRICITY
- H04—ELECTRIC COMMUNICATION TECHNIQUE
- H04N—PICTORIAL COMMUNICATION, e.g. TELEVISION
- H04N13/00—Stereoscopic video systems; Multi-view video systems; Details thereof
- H04N13/10—Processing, recording or transmission of stereoscopic or multi-view image signals
- H04N13/189—Recording image signals; Reproducing recorded image signals
-
- H—ELECTRICITY
- H04—ELECTRIC COMMUNICATION TECHNIQUE
- H04N—PICTORIAL COMMUNICATION, e.g. TELEVISION
- H04N21/00—Selective content distribution, e.g. interactive television or video on demand [VOD]
- H04N21/20—Servers specifically adapted for the distribution of content, e.g. VOD servers; Operations thereof
- H04N21/23—Processing of content or additional data; Elementary server operations; Server middleware
- H04N21/235—Processing of additional data, e.g. scrambling of additional data or processing content descriptors
-
- H—ELECTRICITY
- H04—ELECTRIC COMMUNICATION TECHNIQUE
- H04N—PICTORIAL COMMUNICATION, e.g. TELEVISION
- H04N21/00—Selective content distribution, e.g. interactive television or video on demand [VOD]
- H04N21/20—Servers specifically adapted for the distribution of content, e.g. VOD servers; Operations thereof
- H04N21/23—Processing of content or additional data; Elementary server operations; Server middleware
- H04N21/236—Assembling of a multiplex stream, e.g. transport stream, by combining a video stream with other content or additional data, e.g. inserting a URL [Uniform Resource Locator] into a video stream, multiplexing software data into a video stream; Remultiplexing of multiplex streams; Insertion of stuffing bits into the multiplex stream, e.g. to obtain a constant bit-rate; Assembling of a packetised elementary stream
- H04N21/23614—Multiplexing of additional data and video streams
-
- H—ELECTRICITY
- H04—ELECTRIC COMMUNICATION TECHNIQUE
- H04N—PICTORIAL COMMUNICATION, e.g. TELEVISION
- H04N21/00—Selective content distribution, e.g. interactive television or video on demand [VOD]
- H04N21/40—Client devices specifically adapted for the reception of or interaction with content, e.g. set-top-box [STB]; Operations thereof
- H04N21/43—Processing of content or additional data, e.g. demultiplexing additional data from a digital video stream; Elementary client operations, e.g. monitoring of home network or synchronising decoder's clock; Client middleware
- H04N21/431—Generation of visual interfaces for content selection or interaction; Content or additional data rendering
- H04N21/4312—Generation of visual interfaces for content selection or interaction; Content or additional data rendering involving specific graphical features, e.g. screen layout, special fonts or colors, blinking icons, highlights or animations
- H04N21/4316—Generation of visual interfaces for content selection or interaction; Content or additional data rendering involving specific graphical features, e.g. screen layout, special fonts or colors, blinking icons, highlights or animations for displaying supplemental content in a region of the screen, e.g. an advertisement in a separate window
-
- H—ELECTRICITY
- H04—ELECTRIC COMMUNICATION TECHNIQUE
- H04N—PICTORIAL COMMUNICATION, e.g. TELEVISION
- H04N21/00—Selective content distribution, e.g. interactive television or video on demand [VOD]
- H04N21/40—Client devices specifically adapted for the reception of or interaction with content, e.g. set-top-box [STB]; Operations thereof
- H04N21/43—Processing of content or additional data, e.g. demultiplexing additional data from a digital video stream; Elementary client operations, e.g. monitoring of home network or synchronising decoder's clock; Client middleware
- H04N21/434—Disassembling of a multiplex stream, e.g. demultiplexing audio and video streams, extraction of additional data from a video stream; Remultiplexing of multiplex streams; Extraction or processing of SI; Disassembling of packetised elementary stream
- H04N21/4348—Demultiplexing of additional data and video streams
-
- H—ELECTRICITY
- H04—ELECTRIC COMMUNICATION TECHNIQUE
- H04N—PICTORIAL COMMUNICATION, e.g. TELEVISION
- H04N21/00—Selective content distribution, e.g. interactive television or video on demand [VOD]
- H04N21/40—Client devices specifically adapted for the reception of or interaction with content, e.g. set-top-box [STB]; Operations thereof
- H04N21/43—Processing of content or additional data, e.g. demultiplexing additional data from a digital video stream; Elementary client operations, e.g. monitoring of home network or synchronising decoder's clock; Client middleware
- H04N21/435—Processing of additional data, e.g. decrypting of additional data, reconstructing software from modules extracted from the transport stream
-
- H—ELECTRICITY
- H04—ELECTRIC COMMUNICATION TECHNIQUE
- H04N—PICTORIAL COMMUNICATION, e.g. TELEVISION
- H04N21/00—Selective content distribution, e.g. interactive television or video on demand [VOD]
- H04N21/40—Client devices specifically adapted for the reception of or interaction with content, e.g. set-top-box [STB]; Operations thereof
- H04N21/43—Processing of content or additional data, e.g. demultiplexing additional data from a digital video stream; Elementary client operations, e.g. monitoring of home network or synchronising decoder's clock; Client middleware
- H04N21/44—Processing of video elementary streams, e.g. splicing a video clip retrieved from local storage with an incoming video stream or rendering scenes according to encoded video stream scene graphs
- H04N21/4402—Processing of video elementary streams, e.g. splicing a video clip retrieved from local storage with an incoming video stream or rendering scenes according to encoded video stream scene graphs involving reformatting operations of video signals for household redistribution, storage or real-time display
- H04N21/440218—Processing of video elementary streams, e.g. splicing a video clip retrieved from local storage with an incoming video stream or rendering scenes according to encoded video stream scene graphs involving reformatting operations of video signals for household redistribution, storage or real-time display by transcoding between formats or standards, e.g. from MPEG-2 to MPEG-4
-
- H—ELECTRICITY
- H04—ELECTRIC COMMUNICATION TECHNIQUE
- H04N—PICTORIAL COMMUNICATION, e.g. TELEVISION
- H04N21/00—Selective content distribution, e.g. interactive television or video on demand [VOD]
- H04N21/40—Client devices specifically adapted for the reception of or interaction with content, e.g. set-top-box [STB]; Operations thereof
- H04N21/45—Management operations performed by the client for facilitating the reception of or the interaction with the content or administrating data related to the end-user or to the client device itself, e.g. learning user preferences for recommending movies, resolving scheduling conflicts
- H04N21/462—Content or additional data management, e.g. creating a master electronic program guide from data received from the Internet and a Head-end, controlling the complexity of a video stream by scaling the resolution or bit-rate based on the client capabilities
- H04N21/4621—Controlling the complexity of the content stream or additional data, e.g. lowering the resolution or bit-rate of the video stream for a mobile client with a small screen
-
- H—ELECTRICITY
- H04—ELECTRIC COMMUNICATION TECHNIQUE
- H04N—PICTORIAL COMMUNICATION, e.g. TELEVISION
- H04N21/00—Selective content distribution, e.g. interactive television or video on demand [VOD]
- H04N21/40—Client devices specifically adapted for the reception of or interaction with content, e.g. set-top-box [STB]; Operations thereof
- H04N21/47—End-user applications
-
- H—ELECTRICITY
- H04—ELECTRIC COMMUNICATION TECHNIQUE
- H04N—PICTORIAL COMMUNICATION, e.g. TELEVISION
- H04N5/00—Details of television systems
- H04N5/44—Receiver circuitry for the reception of television signals according to analogue transmission standards
- H04N5/445—Receiver circuitry for the reception of television signals according to analogue transmission standards for displaying additional information
- H04N5/45—Picture in picture, e.g. displaying simultaneously another television channel in a region of the screen
-
- H—ELECTRICITY
- H04—ELECTRIC COMMUNICATION TECHNIQUE
- H04N—PICTORIAL COMMUNICATION, e.g. TELEVISION
- H04N9/00—Details of colour television systems
- H04N9/79—Processing of colour television signals in connection with recording
- H04N9/80—Transformation of the television signal for recording, e.g. modulation, frequency changing; Inverse transformation for playback
- H04N9/82—Transformation of the television signal for recording, e.g. modulation, frequency changing; Inverse transformation for playback the individual colour picture signal components being recorded simultaneously only
- H04N9/8205—Transformation of the television signal for recording, e.g. modulation, frequency changing; Inverse transformation for playback the individual colour picture signal components being recorded simultaneously only involving the multiplexing of an additional signal and the colour video signal
-
- G—PHYSICS
- G11—INFORMATION STORAGE
- G11B—INFORMATION STORAGE BASED ON RELATIVE MOVEMENT BETWEEN RECORD CARRIER AND TRANSDUCER
- G11B20/00—Signal processing not specific to the method of recording or reproducing; Circuits therefor
- G11B20/10—Digital recording or reproducing
- G11B20/10527—Audio or video recording; Data buffering arrangements
- G11B2020/10537—Audio or video recording
-
- G—PHYSICS
- G11—INFORMATION STORAGE
- G11B—INFORMATION STORAGE BASED ON RELATIVE MOVEMENT BETWEEN RECORD CARRIER AND TRANSDUCER
- G11B20/00—Signal processing not specific to the method of recording or reproducing; Circuits therefor
- G11B20/10—Digital recording or reproducing
- G11B20/12—Formatting, e.g. arrangement of data block or words on the record carriers
- G11B2020/1264—Formatting, e.g. arrangement of data block or words on the record carriers wherein the formatting concerns a specific kind of data
- G11B2020/1289—Formatting of user data
-
- G—PHYSICS
- G11—INFORMATION STORAGE
- G11B—INFORMATION STORAGE BASED ON RELATIVE MOVEMENT BETWEEN RECORD CARRIER AND TRANSDUCER
- G11B2220/00—Record carriers by type
- G11B2220/20—Disc-shaped record carriers
- G11B2220/25—Disc-shaped record carriers characterised in that the disc is based on a specific recording technology
- G11B2220/2537—Optical discs
-
- G—PHYSICS
- G11—INFORMATION STORAGE
- G11B—INFORMATION STORAGE BASED ON RELATIVE MOVEMENT BETWEEN RECORD CARRIER AND TRANSDUCER
- G11B2220/00—Record carriers by type
- G11B2220/20—Disc-shaped record carriers
- G11B2220/25—Disc-shaped record carriers characterised in that the disc is based on a specific recording technology
- G11B2220/2537—Optical discs
- G11B2220/2541—Blu-ray discs; Blue laser DVR discs
Definitions
- the present invention relates to a data structure, a recording medium, a playback device, a playback method, and a program, and more particularly to a data structure, a recording medium, a playback device, a playback method, and a program that can provide a 3D video format suitable for 3D display of subtitles and menu buttons.
- there are various types of displays having a 3D (3 Dimensional) image display function (hereinafter referred to as 3D displays), and there are various types of video formats for 3D display (hereinafter referred to as 3D video formats).
- examples of the 3D video format include a method using images of three or more viewpoints (multi-view), and specifically, for example, a 3D video format using a two-dimensional image and a depth image suitable for a so-called lenticular 3D display.
- as a disc-shaped recording medium on which data such as images is recorded, there is a read-only optical disc based on the Blu-ray Disc (registered trademark) standard.
- subtitles are displayed on a plane different from the plane on which the moving image is displayed, and the subtitle plane and the moving image plane are combined so that the subtitle and the moving image are superimposed and displayed.
- as a result, image data of one screen in which the subtitle is superimposed on the moving image is displayed.
- for example, Patent Document 1 describes a technology that provides a moving image plane and a caption plane based on the Blu-ray Disc standard, which is a recording/playback standard, and displays a moving image based on video data and a caption based on caption image data on one screen.
- the present invention has been made in view of such a situation, and is intended to provide a 3D video format suitable for 3D display of subtitles and menu buttons.
- the data structure or the recording medium according to the first aspect of the present invention is a data structure including image data of a sub-image composed of subtitles or menu buttons used for 2D (2 Dimensional) display of the sub-image, and offset information in units of a screen corresponding to the image data, the offset information including an offset direction indicating a shift direction, with respect to the sub-image of the screen unit, of a left-eye L image and a right-eye R image used for 3D display of the sub-image, and an offset value indicating a shift amount, or a recording medium on which data having the data structure is recorded.
- the playback device according to the first aspect of the present invention is a playback device that reads the image data included in data having a data structure including image data of a sub-image composed of subtitles or menu buttons used for 2D (2 Dimensional) display of the sub-image, and offset information in units of a screen corresponding to the image data, the offset information including an offset direction indicating a shift direction, with respect to the sub-image of the screen unit, of a left-eye L image and a right-eye R image used for 3D display of the sub-image, and an offset value indicating a shift amount; generates image data of the L image and the R image in units of a screen from the image data of the screen unit based on the offset information; and outputs the image data in units of a screen of the L image and the R image.
- the reproduction method and program according to the first aspect of the present invention correspond to the reproduction apparatus according to the first aspect of the present invention described above.
- in the first aspect of the present invention, data having a data structure including image data of a sub-image composed of subtitles or menu buttons used for 2D display of the sub-image, and offset information in units of a screen corresponding to the image data, the offset information including an offset direction indicating a shift direction of a left-eye L image and a right-eye R image used for 3D display of the sub-image and an offset value indicating a shift amount with respect to the sub-image of the screen unit, is reproduced as follows. That is, the image data included in the data is read out; based on the offset information, image data of the L image and the R image in units of a screen is generated from the image data of the screen unit; and the image data in units of a screen of the L image and the R image is output.
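- The first aspect can be pictured with a minimal sketch in Python; the structures and field names below are hypothetical illustrations of the data structure described above, not a format defined by this specification. A 2D sub-image of one screen is held together with its per-screen offset information, and L and R image data are generated by shifting the sub-image in opposite directions.

```python
from dataclasses import dataclass
from typing import List, Tuple

@dataclass
class OffsetInfo:
    # Hypothetical representation of the offset information: a shift
    # direction and a shift amount (offset value) in pixels.
    direction: int   # +1 or -1; the direction applied to the left-eye (L) image
    value: int       # shift amount in pixels

@dataclass
class SubImageScreen:
    # Image data of one screen of a sub-image (subtitle or menu button),
    # stored here as rows of palette indices (0 = transparent).
    pixels: List[List[int]]
    offset: OffsetInfo

def shift_rows(rows: List[List[int]], dx: int) -> List[List[int]]:
    """Shift every row horizontally by dx pixels, filling exposed pixels with 0."""
    width = len(rows[0])
    return [[row[x - dx] if 0 <= x - dx < width else 0 for x in range(width)]
            for row in rows]

def generate_lr(screen: SubImageScreen) -> Tuple[List[List[int]], List[List[int]]]:
    """Generate left-eye and right-eye image data of the screen unit from the
    2D image data, shifting the L and R images in opposite directions."""
    dx = screen.offset.direction * screen.offset.value
    return shift_rows(screen.pixels, dx), shift_rows(screen.pixels, -dx)
```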
- the data structure or the recording medium according to the second aspect of the present invention is a data structure including image data of a left-eye L image and a right-eye R image of a menu button used for 3D (3 Dimensional) display of the menu button, and a set offset command for setting offset information that includes an offset direction indicating a shift direction in units of a screen with respect to each of the image data of the L image and the image data of the R image, and an offset value indicating a shift amount, or a recording medium on which data having the data structure is recorded.
- that is, the data having the data structure according to the second aspect includes the image data of the left-eye L image and the right-eye R image of the menu button used for 3D (3 Dimensional) display of the menu button, and the set offset command for setting the offset information including the offset direction indicating the shift direction in units of a screen with respect to each of the image data of the L image and the image data of the R image and the offset value indicating the shift amount.
- the playback device according to the second aspect of the present invention reproduces data having a data structure including image data of a left-eye L image and a right-eye R image of a menu button used for 3D (3 Dimensional) display of the menu button, and a set offset command for setting offset information that includes an offset direction indicating a shift direction in units of a screen with respect to each of the image data of the L image and the image data of the R image, and an offset value indicating a shift amount.
- reproduction method and program according to the second aspect of the present invention correspond to the reproduction apparatus according to the second aspect of the present invention described above.
- in the second aspect of the present invention, data having a data structure including image data of the left-eye L image and the right-eye R image of a menu button used for 3D (3 Dimensional) display of the menu button, and a set offset command for setting offset information that includes an offset direction indicating a shift direction in units of a screen with respect to each of the image data of the L image and the image data of the R image, and an offset value indicating a shift amount, is reproduced as follows. That is, the image data in units of a screen of the L image and the R image included in the data is read and output; the image data of the L image and the image data of the R image are updated in units of a screen based on the offset information set by the set offset command; and the updated image data in units of a screen of the L image and the R image is output.
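- The second aspect differs in that left-eye and right-eye image data of the menu button are already present and a set offset command updates them in units of a screen. The following is a minimal sketch with hypothetical names:

```python
def apply_set_offset(left_rows, right_rows, direction, value):
    """Update left-eye and right-eye menu-button images in units of a screen:
    shift them in opposite horizontal directions by `value` pixels, with
    `direction` (+1/-1) giving the direction applied to the left-eye image."""
    def shift(rows, dx):
        width = len(rows[0])
        return [[row[x - dx] if 0 <= x - dx < width else 0 for x in range(width)]
                for row in rows]
    dx = direction * value
    return shift(left_rows, dx), shift(right_rows, -dx)

# Example: a 1x8 "menu button" updated by a set offset command
# with direction +1 and value 2.
left  = [[0, 1, 1, 1, 0, 0, 0, 0]]
right = [[0, 1, 1, 1, 0, 0, 0, 0]]
new_left, new_right = apply_set_offset(left, right, +1, 2)
print(new_left)   # [[0, 0, 0, 1, 1, 1, 0, 0]]
print(new_right)  # [[1, 1, 0, 0, 0, 0, 0, 0]]
```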
- according to the present invention, 3D display of subtitles and menu buttons can be performed.
- FIG. 35 is a block diagram illustrating a detailed configuration example of a caption generation unit in FIG. 34.
- FIG. 35 is a flowchart for explaining caption generation processing of the playback apparatus of FIG. 34.
- It is a figure which shows the example of the subtitles displayed in 3D on the display unit.
- FIG. 40 is a block diagram illustrating a detailed configuration example of a menu generation unit 331 in FIG. 39. It is a flowchart explaining the menu button offset change process by the menu generation unit.
- FIG. 51 is a flowchart for describing OSD display processing by the playback apparatus of FIG. 50.
- FIG. 56 It is a figure explaining the determination method of offset information. It is a figure which shows the structural example of the epoch of the caption data in the 8th Embodiment of the disk to which this invention is applied. It is a figure explaining the window of subtitle data. It is a block diagram which shows the structural example of the reproducing apparatus.
- FIG. 60 is a flowchart for explaining details of a caption generation process in FIG. 59.
- FIG. 61 is a flowchart for describing details of a right-eye caption object generation process of FIG. 60. It is a figure which shows the structural example of the epoch of the caption data in the 9th Embodiment of the disk to which this invention is applied. It is a block diagram which shows the structural example of the reproducing apparatus.
- FIG. 64 is a block diagram illustrating a detailed configuration example of a caption generation unit in FIG. 63.
- FIG. 65 is a flowchart for describing caption offset change processing by the caption generation unit in FIG. 64.
- FIG. 64 is a diagram illustrating an example of a caption displayed in 3D on the display unit in FIG. 63.
- FIG. 69 is a block diagram illustrating a detailed configuration example of a menu generation unit in FIG. 68.
- FIG. 70 is a flowchart illustrating menu button offset change processing by the menu generation unit of FIG. 69.
- FIG. 69 is a diagram illustrating an example of menu buttons displayed in 3D on the display unit in FIG. 68. It is a figure which shows the structural example of the epoch of subtitle data in 11th Embodiment of the disk to which this invention is applied.
- FIG. 74 is a block diagram illustrating a detailed configuration example of a caption generation unit in FIG. 73. It is a flowchart explaining the subtitle display change process by the subtitle generation unit of FIG. 74.
- FIG. 74 is a diagram illustrating another detailed configuration example of the caption generation unit in FIG. 73.
- FIG. 77 is a diagram for describing a method for generating common caption data for both eyes by the 2D conversion unit in FIG. 76.
- FIG. 16 is a block diagram illustrating a configuration example of a personal computer. It is a figure which shows the example of the syntax of PCS. It is a figure which shows the example of the syntax of ICS.
- FIG. 84 is a block diagram illustrating a detailed configuration example of a caption generation unit in FIG. 83. It is a block diagram which shows the detailed structural example of the 3D display data generation unit.
- FIG. 1 is a diagram showing a configuration example of a first embodiment of a disk to which the present invention is applied.
- the disk 11 in FIG. 1 is configured by a BD-ROM (Blu-ray Disc Read-Only Memory) or the like, and an index file (index.bdmv) and a movie object file (MovieObject.bdmv) are recorded on the disk 11.
- also recorded on the disc 11 are a playlist file (PLAYLIST/XXXXX.mpls), a clip information file (CLIPINF/XXXXX.clpi), and a stream file (STREAM/XXXXX.m2ts), where X is an arbitrary number from 0 to 9.
- the stream file is a TS (Transport Stream) file in which video data of a main image such as a movie, audio data, and the like are multiplexed in accordance with ISO13818-2.
- this TS is referred to as an AV stream.
- FIG. 2 is a diagram illustrating a detailed configuration example of the index file.
- in the index file, for example, a list of title numbers recorded on the disc 11 and the types and numbers of the objects executed corresponding to the title numbers are described.
- in the index file of FIG. 2, “MovieObject#1”, “MovieObject#2”, and “MovieObject#M” are described corresponding to “First Play”, “Top menu”, and “Title#N”, respectively. Also, “BD-J Object#1” is described corresponding to “Title#1”.
- MovieObject#i and BD-J Object#i indicate that the type of the object is a movie object or a BD-J object, respectively, and that the number of the object is i.
- Title#i indicates that the title number is i.
- since the BD-J object is not particularly related to the present invention, description of the BD-J object is omitted.
- the above index file is also called an index table.
- FIG. 3 is a diagram illustrating a detailed configuration example of a movie object file.
- a plurality of movie objects are described in the movie object file.
- M movie objects to which numbers from 0 to M are assigned are described in the movie object file.
- a command is described in the movie object, and a playback device that plays back the disc 11 sequentially executes the command.
- FIG. 4 is a diagram showing a detailed configuration example of a movie object.
- commands “EQ(GPR#0, 1)”, “PlayPlayList(02000)”, “PlayPlayList(01000)”, and “JumpTitle#1” are described in movie object #1.
- these commands cause the playback device to play back the playlist file (PLAYLIST/02000.mpls) if the value of GPR#0 is 1, and to play back the playlist file (PLAYLIST/01000.mpls) otherwise. Thereafter, the playback device transitions to title #1.
- as a result, the playback device executes BD-J Object#1, which is described in the index file corresponding to Title#1.
- the commands “PlayPlayList(02000)” and “JumpTitle#5” are described in movie object #2.
- in this case, the playback device plays back the playlist file (PLAYLIST/02000.mpls). Thereafter, the playback device executes the object of the type and number described in the index file corresponding to title number 5.
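- The command sequences of movie object #1 and movie object #2 can be sketched as follows. This is only an illustration: the register dictionary and the callbacks stand in for the player's actual navigation command interpreter, and this is not the BD-MV command format itself.

```python
def execute_movie_object_1(gpr, play_playlist, jump_title):
    """Movie object #1: if the value of GPR#0 is 1, play playlist 02000;
    otherwise play playlist 01000; then transition to title #1."""
    if gpr.get(0) == 1:
        play_playlist("02000")
    else:
        play_playlist("01000")
    jump_title(1)

def execute_movie_object_2(gpr, play_playlist, jump_title):
    """Movie object #2: play playlist 02000, then transition to title #5,
    whose object type and number are looked up in the index file."""
    play_playlist("02000")
    jump_title(5)

# Trivial wiring for illustration.
execute_movie_object_1({0: 1},
                       lambda p: print("play PLAYLIST/%s.mpls" % p),
                       lambda t: print("jump to title #%d" % t))
```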
- FIG. 5 is a diagram illustrating a detailed configuration example of a playlist file.
- a playlist file is a file that is played back only by a movie object or a BD-J object, and describes information about an AV stream that is played back by one command described in these objects.
- the playlist file is composed of a plurality of play items.
- Each play item describes information specifying a clip information file corresponding to an AV stream to be played back and time information indicating a playback section of the AV stream.
- FIG. 6 is a diagram illustrating a detailed configuration example of a clip information file.
- the playback device can recognize the packet number of the AV stream to be played back corresponding to each play item by referring to the clip information file.
- when the playback device reads play item #0 of the playlist (PlayItem#0), it reads the clip information file (01000.clpi) specified by that play item. The playback device then refers to the clip information file (01000.clpi), recognizes the packet number of the playback section corresponding to the time information described in play item #0, and plays back the AV stream corresponding to that packet number. As a result, playback is started from the packet whose packet number is 100.
- similarly, when the playback device reads play item #1 (PlayItem#1), it reads the clip information file (02000.clpi) specified by that play item, refers to it, recognizes the packet number of the playback section corresponding to the time information described in play item #1, and plays back the AV stream corresponding to that packet number. As a result, playback is started from the packet whose packet number is 500.
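- The relationship between a play item, its clip information file, and the packet number at which playback starts can be sketched as below. The structures are simplified stand-ins: a real clip information file maps many pieces of time information to packet numbers, whereas here only the values used in the example above appear.

```python
# Hypothetical, simplified time-to-packet maps standing in for the
# clip information files.
CLIP_INFO = {
    "01000.clpi": {0: 100},   # time information 0 -> packet number 100
    "02000.clpi": {0: 500},   # time information 0 -> packet number 500
}

PLAY_ITEMS = [
    {"clip_info": "01000.clpi", "in_time": 0},   # PlayItem#0
    {"clip_info": "02000.clpi", "in_time": 0},   # PlayItem#1
]

def start_packet(play_item):
    """Resolve a play item to the packet number where playback starts by
    looking up its time information in the referenced clip information file."""
    return CLIP_INFO[play_item["clip_info"]][play_item["in_time"]]

print(start_packet(PLAY_ITEMS[0]))  # 100
print(start_packet(PLAY_ITEMS[1]))  # 500
```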
- FIG. 7 is a diagram illustrating a detailed configuration example of a stream file.
- in the AV stream of a stream file, video data of a main image encoded in accordance with MPEG2, MPEG-4 AVC (Advanced Video Coding), VC1, or the like, and audio data are multiplexed.
- V video data
- A subtitle data
- I menu data
- the video data, caption data, and menu data are data for displaying the main image, caption, and menu button, respectively.
- the 100th packet of the AV stream of the stream file is video data
- the 101st packet is audio data
- the 500th packet is caption data
- the 800th packet is menu data.
- One packet consists of 192 bytes of data.
- FIG. 8 is a diagram for explaining extraction of a PES packet.
- when reproducing the stream file of the AV stream shown in FIG. 8A, the playback device extracts PES packets from the AV stream as shown in FIG. 8B.
- the 100th packet of the AV stream shown in FIG. 8A is video data
- the 101st packet is audio data
- the 500th, 550th, and 1050th packets are subtitle data
- the 800th packet is menu data.
- the playback device extracts the PES packet including the 500th packet, the 550th packet, and the 1050th packet.
- FIG. 9 is a diagram showing a detailed configuration example of the PES packet.
- the PES packet is composed of a PES packet header and a segment.
- the PES packet header describes a PTS (Presentation Time Stamp) indicating the display time, a DTS (Decoding Time Stamp) indicating the decoding time, and the like.
- Segments included in the PES packet of caption data are a PCS (Presentation Composition Segment), a WDS (Window Definition Segment), a PDS (Palette Definition Segment), an ODS (Object Definition Segment), and an END (End of Display Set Segment).
- the segments included in the PES packet of the menu data include ICS (Interactive Composition Segment), PDS, ODS, and END.
- in the PCS of the caption data, an ID (hereinafter referred to as a sub-image ID) assigned to the caption corresponding to each ODS, offset information in units of a screen for displaying the caption in 3D (details will be described later), and the like are described.
- in the WDS of the caption data, information such as the position and size of a window indicating the caption display range, an ID unique to the window (hereinafter referred to as a window ID), and the like are described.
- the PDS of caption data describes information on colors that can be used as caption colors.
- in the ODS of the caption data, information indicating the shape of the caption is described.
- END of caption data is a segment indicating the end of a display set (details will be described later).
- the video data, subtitle data, and menu data recorded on the disc 11 are data for 2D display, and by themselves do not provide both a left-eye image and a right-eye image. Therefore, in order to enable 3D display of an image, an offset direction (offset_flag) indicating the shift direction of the left-eye image and the right-eye image with respect to the 2D display image, and an offset value (offset_value) indicating the shift amount, are described as offset information.
- the left-eye offset direction and the right-eye offset direction are opposite directions.
- the offset value is expressed by the number of pixels, for example.
- in the ICS of the menu data, offset information in units of a screen for displaying the menu button in 3D, and button information including an ID unique to the menu button corresponding to each ODS (hereinafter referred to as a button ID) and menu control information such as commands executed by operation of the menu button, are described.
- the END of the menu data is a segment indicating the end of the display set.
- FIG. 10 is a diagram illustrating a configuration example of a display set including the above-described caption data segments
- FIG. 11 is a diagram illustrating a configuration example of a display set including the menu data segments.
- the display set of subtitle data is composed of PCS, WDS, PDS, ODS, and END, which are subtitle segments for one screen.
- the display set of caption data is composed of PCS, WDS, PDS, ODS, and END.
- the menu data display set is composed of ICS, PDS, ODS, and END, which are menu button segments for one screen.
- in the example of FIG. 11, the display set of menu data is composed of an ICS, PDS#1, PDS#2, an ODS, and an END.
- since there are two types of color information that can be used as menu button colors on one screen, two PDSs (PDS#1 and PDS#2) are arranged in the display set.
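- The grouping of segments into display sets, as in FIGs. 10 and 11, can be sketched by treating each display set as a run of segments terminated by an END. Segments are reduced to type strings here; real segments carry the fields described above.

```python
def split_display_sets(segments):
    """Group a flat sequence of segment types into display sets,
    each of which ends with an END segment."""
    display_sets, current = [], []
    for seg in segments:
        current.append(seg)
        if seg == "END":
            display_sets.append(current)
            current = []
    return display_sets

# The caption display set of FIG. 10 and the menu display set of FIG. 11.
print(split_display_sets(["PCS", "WDS", "PDS", "ODS", "END"]))
print(split_display_sets(["ICS", "PDS", "PDS", "ODS", "END"]))
```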
- FIG. 12 is a diagram showing a configuration example of an epoch composed of the display set as described above.
- the epoch is composed of an arbitrary number of display sets.
- in FIG. 12A, the epoch is composed of three display sets, and in FIG. 12B, the epoch is composed of two display sets.
- the playback device continuously displays subtitles and menu buttons corresponding to one epoch, and after temporarily interrupting the display, displays the subtitles and menu buttons corresponding to the next epoch. That is, an epoch is a unit of a display set of subtitles and menu buttons that can be displayed continuously.
- FIG. 13 is a block diagram illustrating a configuration example of the reproducing device 20 that reproduces the disk 11 described above.
- the playback device 20 in FIG. 13 includes an input unit 21, a control unit 22, and a playback unit 23.
- the input unit 21 includes a keyboard, a mouse, a microphone, and the like.
- the input unit 21 receives a command from the user and supplies it to the control unit 22.
- the control unit 22 controls the reproduction unit 23 according to a command from the input unit 21.
- the playback unit 23 includes a drive 31, a read buffer 32, a PID filter 33, a 3D video generation unit 34, a 3D graphics generation unit 35, a 3D display data generation unit 36, and an audio generation unit 37.
- the drive 31 drives the loaded disk 11 according to the control of the control unit 22. As a result, the drive 31 reads the index file, AV stream, etc. recorded on the disk 11. The drive 31 supplies the read index file and the like to the control unit 22. The drive 31 supplies the read AV stream to the read buffer 32.
- the read buffer 32 holds the AV stream supplied from the drive 31 or reads the held AV stream and supplies it to the PID filter 33 according to the control of the control unit 22.
- the PID filter 33 extracts video data, subtitle data, menu data, and audio data packets included in the AV stream based on the packet ID (PID) of each packet of the AV stream from the read buffer 32.
- the PID is a unique ID for each type of data constituting the packet, and is added to the packet.
- the PID filter 33 extracts PES packets from the extracted video data, caption data, menu data, and audio data packets. Then, the PID filter 33 supplies the PES packet of video data to the 3D video generation unit 34 and supplies the PES packet of caption data and menu data to the 3D graphics generation unit 35. Further, the PID filter 33 supplies the PES packet of audio data to the audio generation unit 37.
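- The routing performed by the PID filter 33 can be sketched as follows. The PID values are placeholders chosen for illustration only; the point is that each packet of the AV stream is routed by its PID to the video, caption, menu, or audio path.

```python
from collections import defaultdict

# Hypothetical PID assignments (illustration only).
PID_TO_KIND = {0x1011: "video", 0x1200: "caption", 0x1400: "menu", 0x1100: "audio"}

def pid_filter(packets):
    """Route (pid, payload) tuples to per-kind lists, mimicking how the
    PID filter 33 separates the video, caption, menu, and audio packets."""
    routed = defaultdict(list)
    for pid, payload in packets:
        kind = PID_TO_KIND.get(pid)
        if kind is not None:
            routed[kind].append(payload)
    return routed

routed = pid_filter([(0x1011, b"v0"), (0x1200, b"s0"), (0x1100, b"a0")])
print(sorted(routed))  # ['audio', 'caption', 'video']
```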
- the 3D video generation unit 34 generates video data for the right eye and video data for the left eye using the PES packet of the video data supplied from the PID filter 33.
- the 3D video generation unit 34 decodes the PES packet of the video data, and uses the resulting video data as video data for the left eye.
- the 3D video generation unit 34 generates video data of an image obtained by shifting a main image corresponding to the video data by a predetermined offset value in a predetermined offset direction as video data for the right eye. Then, the 3D video generation unit 34 supplies the left-eye video data and the right-eye video data to the 3D display data generation unit 36 as 3D video data.
- the 3D graphics generation unit 35 includes a caption generation unit 41 and a menu generation unit 42.
- the caption generation unit 41 generates right-eye caption data and left-eye caption data using the PES packet of the caption data supplied from the PID filter 33. Then, the caption generation unit 41 supplies the right-eye caption data and the left-eye caption data to the 3D display data generation unit 36 as 3D caption data. Details of the caption generation unit 41 will be described with reference to FIG.
- the menu generation unit 42 generates menu data for the right eye and menu data for the left eye using the PES packet of the menu data supplied from the PID filter 33. Then, the menu generation unit 42 supplies the right-eye menu data and the left-eye menu data to the 3D display data generation unit 36 as 3D menu data.
- the 3D display data generation unit 36 converts the 3D video data supplied from the 3D video generation unit 34 and the 3D subtitle data and 3D menu data supplied from the 3D graphics generation unit 35 into data for each of the left and right eyes. To synthesize. Specifically, the 3D display data generation unit 36 generates left-eye display data by combining the left-eye video data, the left-eye caption data, and the left-eye menu data. In addition, the 3D display data generation unit 36 combines the right-eye video data, the right-eye caption data, and the right-eye menu data to generate right-eye display data. The 3D display data generation unit 36 supplies the display data for the left eye and the display data for the right eye to the display unit 51 as 3D display data.
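- A minimal sketch of the per-eye combination done by the 3D display data generation unit 36: for each eye, the caption and menu planes are overlaid on the video plane. The pixel representation and the rule that value 0 is transparent are simplifications made here for illustration.

```python
def compose_eye(video, caption, menu, transparent=0):
    """Overlay the caption plane and then the menu plane on the video plane
    for one eye; all planes are equally sized 2D lists, and a plane pixel
    equal to `transparent` lets the plane underneath show through."""
    out = [row[:] for row in video]
    for plane in (caption, menu):
        for y, row in enumerate(plane):
            for x, v in enumerate(row):
                if v != transparent:
                    out[y][x] = v
    return out

def generate_3d_display_data(video_l, video_r, cap_l, cap_r, menu_l, menu_r):
    """Return (left-eye display data, right-eye display data)."""
    return (compose_eye(video_l, cap_l, menu_l),
            compose_eye(video_r, cap_r, menu_r))
```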
- the audio generation unit 37 decodes the PES packet of the audio data supplied from the PID filter 33 and supplies the audio data obtained as a result to the speaker 52.
- the display unit 51 includes a 3D display.
- the display unit 51 performs output based on the 3D display data supplied from the 3D display data generation unit 36. As a result, the user can see the 3D image.
- the speaker 52 outputs sound corresponding to the audio data supplied from the audio generation unit 37.
- FIG. 14 is a block diagram illustrating a detailed configuration example of the caption generation unit 41 in FIG. 13.
- the caption generation unit 41 includes an encoded data buffer 61, a stream graphics generation unit 62, an object buffer 63, and a 3D generation unit 64.
- the caption generation unit 41 includes a right-eye graphics plane 65, a left-eye graphics plane 66, a CLUT (Color Look Up Table) 67, a composition buffer 68, and a control unit 69.
- the encoded data buffer 61 holds a segment of the PES packet of the caption data supplied from the PID filter 33.
- the encoded data buffer 61 supplies the held ODS, WDS, and PCS to the stream graphics generation unit 62 at timings based on the DTSs included in the PES packet headers of the PES packets of the caption data.
- the encoded data buffer 61 immediately supplies the PDS of the caption data supplied from the PID filter 33 to the stream graphics generation unit 62.
- the stream graphics generation unit 62 decodes the ODS supplied from the encoded data buffer 61, and supplies uncompressed caption data (run-length data) composed of index colors obtained as a result to the object buffer 63 as a caption object. To do. Further, the stream graphics generation unit 62 supplies the PDS, PCS, and WDS supplied from the encoded data buffer 61 to the composition buffer 68.
- the object buffer 63 holds the caption object supplied from the stream graphics generation unit 62.
- the 3D generation unit 64 reads the caption object from the object buffer 63 in accordance with control from the control unit 69. Then, based on the offset information in units of a screen included in the PCS supplied from the control unit 69, the 3D generation unit 64 generates a right-eye caption object and a left-eye caption object from the caption objects corresponding to all the ODSs included in the same display set as that PCS.
- specifically, the 3D generation unit 64 generates, as the right-eye caption object and the left-eye caption object, caption objects of the screen unit obtained by shifting the captions of the screen unit corresponding to the caption objects by the offset value in the respective offset directions of the offset information.
- the 3D generation unit 64 supplies the right-eye caption object to the right-eye graphics plane 65. Further, the 3D generation unit 64 supplies the left-eye caption object to the left-eye graphics plane 66.
- the right-eye graphics plane 65 holds the right-eye caption object for one screen supplied from the 3D generation unit 64.
- the right-eye graphics plane 65 reads the retained right-eye caption object in accordance with an instruction from the control unit 69 and supplies the read right-eye caption object to the CLUT 67.
- the left-eye graphics plane 66 holds the left-eye caption object for one screen supplied from the 3D generation unit 64.
- the left-eye graphics plane 66 reads the stored left-eye caption object in accordance with an instruction from the control unit 69 and supplies the read-out left-eye caption object to the CLUT 67.
- the CLUT 67 stores a table in which index colors are associated with Y, Cr, and Cb values based on the PDS supplied from the control unit 69.
- the CLUT 67 converts the index color of the right-eye caption object supplied from the right-eye graphics plane 65 into image data composed of Y, Cr, and Cb values based on the stored table.
- the CLUT 67 similarly converts the left-eye caption object supplied from the left-eye graphics plane 66 into image data. Then, the CLUT 67 outputs the image data of the right-eye caption object to the 3D display data generation unit 36 as the right-eye caption data, and outputs the image data of the left-eye caption object to the 3D display data generation unit 36 as the left-eye caption data.
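- The conversion in the CLUT 67 from index colors to image data composed of Y, Cr, and Cb values can be sketched as a table lookup. The table contents below are placeholders; in the device the table is built from the PDS.

```python
# Hypothetical CLUT built from a PDS: index color -> (Y, Cr, Cb).
CLUT = {
    0: (16, 128, 128),   # placeholder: black
    1: (235, 128, 128),  # placeholder: white
    2: (81, 240, 90),    # placeholder: red
}

def clut_convert(indexed_rows):
    """Convert a caption object made of index colors into image data made
    of (Y, Cr, Cb) values, as the CLUT 67 does for each eye."""
    return [[CLUT[i] for i in row] for row in indexed_rows]

print(clut_convert([[0, 1, 2]]))
```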
- the composition buffer 68 holds the PDS, PCS, and WDS supplied from the stream graphics generation unit 62.
- the control unit 69 reads the offset information of the screen unit included in the PCS from the composition buffer 68 and supplies it to the 3D generation unit 64. Further, at a timing based on the PTS included in the PES packet header, the control unit 69 instructs the right-eye graphics plane 65 to transfer the right-eye caption object to the CLUT 67 and instructs the left-eye graphics plane 66 to transfer the left-eye caption object to the CLUT 67. Further, the control unit 69 reads the PDS from the composition buffer 68 and supplies it to the CLUT 67.
- control unit 69 controls each unit in accordance with a command from the control unit 22 (FIG. 13).
- since the menu generation unit 42 is configured in the same manner as the caption generation unit 41 of FIG. 14 except that the processing target is menu data instead of caption data, illustration thereof is omitted.
- the encoded data buffer of the menu generation unit 42 holds a segment of the PES packet of menu data, and the composition buffer holds ICS and PDS.
- the stream graphics generation unit decodes the ODS of the menu data, and supplies the uncompressed menu data composed of the index colors to the object buffer as a menu object to be held.
- the 3D generation unit of the menu generation unit 42 generates a right-eye menu object from the menu object from the object buffer, based on the offset information in units of a screen included in the ICS of the menu data, and stores it in the right-eye graphics plane. Similarly, it generates a left-eye menu object from the menu object based on the same offset information and stores it in the left-eye graphics plane.
- the CLUT converts the right-eye menu object into image data and outputs it to the 3D display data generation unit 36 as right-eye menu data, and converts the left-eye menu object into image data and outputs it to the 3D display data generation unit 36 as left-eye menu data.
- FIG. 15 is a flowchart for explaining the reproduction processing by the reproduction apparatus 20. This reproduction process is started, for example, when the disk 11 is loaded in the drive 31.
- in step S11, the drive 31 reads the index file from the disc 11 and supplies the index file to the control unit 22, in response to a command from the control unit 22.
- in step S12, the drive 31 reads a movie object file corresponding to the first play of the index file from the disc 11 and supplies it to the control unit 22, in response to a command from the control unit 22 based on the index file.
- the control unit 22 recognizes a command described in the movie object included in the movie object file, and instructs the drive 31 to read the playlist in accordance with the command.
- in step S13, the drive 31 reads the playlist specified by the command of the movie object from the disc 11 in accordance with a command from the control unit 22, and supplies the playlist to the control unit 22.
- in step S14, the drive 31 reads the clip information file specified by the playlist from the disc 11 and supplies it to the control unit 22, in response to a command from the control unit 22 based on the playlist.
- the control unit 22 recognizes the packet number of the AV stream to be reproduced based on the playlist and the clip information file. Then, the control unit 22 instructs the drive 31 to read an AV stream including a packet having a packet number to be reproduced.
- in step S15, the drive 31 reads the AV stream to be reproduced from the disc 11 and supplies it to the read buffer 32, in response to a command from the control unit 22.
- in step S16, the read buffer 32 holds the AV stream supplied from the drive 31.
- the read buffer 32 reads the held AV stream and supplies it to the PID filter 33.
- in step S17, the PID filter 33 extracts PES packets of video data, caption data, menu data, and audio data of the AV stream based on the PID of each packet of the AV stream from the read buffer 32. Then, the PID filter 33 supplies the PES packets of the video data to the 3D video generation unit 34 and supplies the PES packets of the caption data and the menu data to the 3D graphics generation unit 35. Further, the PID filter 33 supplies the PES packets of the audio data to the audio generation unit 37.
- in step S18, the 3D video generation unit 34 generates 3D video data using the PES packets of the video data supplied from the PID filter 33, and supplies the 3D video data to the 3D display data generation unit 36.
- in step S19, the 3D graphics generation unit 35 performs 3D graphics generation processing for generating 3D subtitle data and 3D menu data. Details of the 3D graphics generation processing will be described with reference to FIG. 16.
- in step S20, the 3D display data generation unit 36 combines the 3D video data from the 3D video generation unit 34 with the 3D subtitle data and 3D menu data from the 3D graphics generation unit 35 into data for each of the left and right eyes. Then, the 3D display data generation unit 36 supplies the left-eye display data and the right-eye display data obtained as a result of the combination to the display unit 51 as 3D display data.
- in step S21, the audio generation unit 37 decodes the PES packets of the audio data supplied from the PID filter 33 and generates audio data. Then, the audio generation unit 37 supplies the generated audio data to the speaker 52.
- in step S22, the display unit 51 alternately or simultaneously displays the left-eye image corresponding to the left-eye display data and the right-eye image corresponding to the right-eye display data, based on the 3D display data supplied from the 3D display data generation unit 36.
- in step S23, the speaker 52 outputs sound corresponding to the audio data supplied from the audio generation unit 37. Then, the process ends.
- the playback process immediately after the disc 11 is mounted has been described. However, when a title corresponding to a movie object file other than the first play is played after the disc 11 is mounted, the same playback process is performed. In this case, however, the movie object file read in step S12 is a movie object file corresponding to the title number of the title to be reproduced in the index file.
- for example, when the “Top menu” title is played back, the playback device 20 reads, in step S12, the movie object file of movie object #2 corresponding to the title number of “Top menu”, and performs the playback process.
- FIG. 16 is a flowchart for explaining the details of the 3D graphics generation processing in step S19 of FIG.
- in step S41 of FIG. 16, the caption generation unit 41 performs a caption generation process for generating 3D caption data. Details of the caption generation process will be described with reference to FIG. 17.
- in step S42, the menu generation unit 42 performs a menu generation process for generating 3D menu data, and the process returns to step S19 in FIG. 15. Then, the processes in and after step S20 are performed.
- FIG. 17 is a flowchart for explaining the details of the caption generation processing in step S41 of FIG.
- in step S61, the encoded data buffer 61 holds the segments of the PES packets of the caption data supplied from the PID filter 33.
- in step S62, the encoded data buffer 61 reads the held segments and supplies them to the stream graphics generation unit 62.
- in step S63, the stream graphics generation unit 62 supplies the PCS, PDS, and WDS supplied from the encoded data buffer 61 to the composition buffer 68, which holds them.
- in step S64, the stream graphics generation unit 62 decodes the ODS supplied from the encoded data buffer 61 and supplies the caption object obtained as a result to the object buffer 63.
- in step S65, the object buffer 63 holds the caption object supplied from the stream graphics generation unit 62.
- in step S66, the 3D generation unit 64 generates a right-eye caption object and a left-eye caption object from the caption objects corresponding to all the ODSs included in the same display set as the PCS, based on the offset information in units of a screen included in the PCS from the control unit 69.
- in step S67, the 3D generation unit 64 supplies the right-eye caption object to the right-eye graphics plane 65, which holds it.
- in step S68, the 3D generation unit 64 supplies the left-eye caption object to the left-eye graphics plane 66, which holds it.
- in step S69, the right-eye graphics plane 65 reads the stored right-eye caption object in accordance with an instruction from the control unit 69 and supplies it to the CLUT 67.
- in step S70, the left-eye graphics plane 66 reads the stored left-eye caption object in accordance with an instruction from the control unit 69 and supplies it to the CLUT 67.
- in step S71, the CLUT 67 converts the index colors of the right-eye caption object supplied from the right-eye graphics plane 65 into image data composed of Y, Cr, and Cb values based on the stored table.
- in step S72, the CLUT 67 converts the index colors of the left-eye caption object supplied from the left-eye graphics plane 66 into image data composed of Y, Cr, and Cb values based on the stored table.
- in step S73, the CLUT 67 outputs the image data of the right-eye caption object to the 3D display data generation unit 36 as right-eye caption data, and outputs the image data of the left-eye caption object to the 3D display data generation unit 36 as left-eye caption data. Then, the process returns to step S41 of FIG. 16 and proceeds to step S42.
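- Putting the steps of FIG. 17 together, the caption generation process can be pictured end to end. Everything here is schematic: the helpers are passed in as parameters and stand in for the decoder, the graphics planes, and the CLUT described above.

```python
def caption_generation(display_set, decode_ods, clut_convert, shift_rows):
    """Schematic version of steps S61 to S73: decode the ODS into a caption
    object, generate right-eye and left-eye caption objects from the
    per-screen offset information held in the PCS, and convert both through
    the CLUT into right-eye and left-eye caption data."""
    pcs = display_set["PCS"]                     # carries direction and value
    caption_object = decode_ods(display_set["ODS"])
    dx = pcs["direction"] * pcs["value"]
    left_eye = shift_rows(caption_object, dx)    # left-eye graphics plane
    right_eye = shift_rows(caption_object, -dx)  # right-eye graphics plane
    return clut_convert(right_eye), clut_convert(left_eye)
```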
- the menu generation process in step S42 of FIG. 16 is performed in the same manner as the caption generation process of FIG. 17 except that the processing target is menu data instead of caption data, and thus description thereof is omitted.
- FIG. 18 is a diagram illustrating an example of subtitles displayed in 3D on the display unit 51 of the playback device 20.
- based on the offset information in units of a screen included in the PCS, the playback device 20 generates, as a right-eye caption object and a left-eye caption object, caption objects obtained by shifting the captions of the screen unit corresponding to all the ODSs included in the same display set as the PCS in mutually opposite directions.
- subtitle #1 and subtitle #2 displayed as 3D images on one screen pop out or recede by the same length in the depth direction.
- the depth direction is a direction perpendicular to the display surface of the display unit 51. If the direction toward the front of the display surface is taken as the positive direction and the direction toward the back of the display surface as the negative direction, a subtitle appears to pop out when its position in the depth direction is positive, and appears to be set back when its position in the depth direction is negative.
- Subtitle #i represents the i-th subtitle displayed in one screen.
- on the disc 11, subtitle data and menu data for 2D display are recorded together with offset information in units of a screen. Therefore, the playback device 20 can display 3D captions and menu buttons by generating 3D caption data from the caption data and 3D menu data from the menu data based on the offset information in units of a screen.
- FIG. 19 is a diagram showing a configuration example of a display set of caption data in the second embodiment of the disc to which the present invention is applied
- FIG. 20 is a diagram showing a configuration example of a display set of menu data.
- In the display set of caption data in FIG. 19, offset information in units of ODS is described in the ODS instead of the PCS. Therefore, offset information can be set for each subtitle.
- The display set in FIG. 19 is an example of a display set for displaying two subtitles on one screen, and two ODSs, ODS #1 and ODS #2, are arranged in the display set.
- ODS # 1 and ODS # 2 describe offset information # 1 and offset information # 2 in units of ODS, respectively.
- Similarly, in the display set of menu data in FIG. 20, offset information in units of ODS is described in the ODS instead of the ICS. Therefore, offset information can be set for each menu button.
- the display set in FIG. 20 is a display set for displaying two menu buttons on one screen, and two ODSs, ODS # 1 and ODS # 2, are arranged in the display set.
- ODS # 1 and ODS # 2 describe offset information # 1 and offset information # 2 in units of ODS, respectively.
- FIG. 21 is a block diagram illustrating a configuration example of a playback device 90 that plays back the disk 81 described above.
- The configuration of the playback device 90 in FIG. 21 differs from the configuration of FIG. 13 mainly in that a playback unit 91 is provided instead of the playback unit 23.
- the configuration of the playback unit 91 is different from the configuration of FIG. 13 in that a 3D graphics generation unit 101 is provided instead of the 3D graphics generation unit 35.
- the 3D graphics generation unit 101 includes a caption generation unit 111 and a menu generation unit 112.
- The caption generation unit 111 generates right-eye caption data and left-eye caption data based on the ODS-unit offset information, using the PES packets of the caption data supplied from the PID filter 33. Then, the caption generation unit 111 supplies the right-eye caption data and the left-eye caption data to the 3D display data generation unit 36 as 3D caption data. Details of the caption generation unit 111 will be described with reference to FIG. 22.
- the menu generation unit 112 generates menu data for the right eye and menu data for the left eye based on offset information in units of ODS, using the PES packet of the menu data supplied from the PID filter 33. Then, the menu generation unit 112 supplies the menu data for the right eye and the menu data for the left eye to the 3D display data generation unit 36 as 3D menu data.
- FIG. 22 is a block diagram illustrating a detailed configuration example of the caption generation unit 111 of the playback device 90.
- The configuration of the caption generation unit 111 in FIG. 22 differs from the configuration of FIG. 14 mainly in that a 3D generation unit 121 is provided instead of the 3D generation unit 64 and a control unit 122 is provided instead of the control unit 69.
- the 3D generation unit 121 reads the caption object from the object buffer 63 according to the control from the control unit 122.
- the 3D generation unit 121 generates a right-eye caption object and a left-eye caption object from the caption object corresponding to the ODS based on the offset information in ODS units included in each ODS from the control unit 122.
- Specifically, the 3D generation unit 121 generates, as a right-eye caption object and a left-eye caption object, the caption objects of the subtitles obtained by shifting each subtitle in the screen corresponding to a caption object in opposite directions by the offset value, in the offset direction, of the ODS-unit offset information corresponding to that subtitle.
- the 3D generation unit 121 supplies the right-eye caption object to the right-eye graphics plane 65. In addition, the 3D generation unit 121 supplies the left-eye caption object to the left-eye graphics plane 66.
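- As a sketch of the difference from the screen-unit case, the fragment below (with illustrative names and signed scalar offsets as an assumption) computes a separate left/right shift for each ODS, so each subtitle can be placed at its own depth.

```python
# Hedged sketch: per-ODS offsets give each caption its own per-eye shift.
def per_caption_shifts(ods_offsets):
    """ods_offsets: {ods_id: (offset_value, offset_direction)} ->
    {ods_id: (right_eye_shift, left_eye_shift)} in pixels."""
    shifts = {}
    for ods_id, (value, direction) in ods_offsets.items():
        shift = value * direction
        shifts[ods_id] = (-shift, +shift)  # the two eyes move in opposite directions
    return shifts

# Example: ODS #1 shifted by 3 px per eye, ODS #2 by 1 px, so the two
# subtitles appear at different depths.
print(per_caption_shifts({1: (3, +1), 2: (1, +1)}))
```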
- The control unit 122 reads the ODS-unit offset information included in each ODS from the composition buffer 68 and supplies it to the 3D generation unit 121. Similarly to the control unit 69, the control unit 122 instructs the right-eye graphics plane 65 and the left-eye graphics plane 66 to transfer at a timing based on the PTS included in the PES packet header. Further, like the control unit 69, the control unit 122 reads the PDS from the composition buffer 68 and supplies it to the CLUT 67.
- In addition, the control unit 122 controls each unit in accordance with a command from the control unit 22 (FIG. 21).
- Since the menu generation unit 112 is configured in the same manner as the caption generation unit 111 in FIG. 22 except that the processing target is menu data instead of caption data, its illustration is omitted.
- FIG. 23 is a flowchart for explaining the details of the caption generation process in step S41 of FIG. 16 performed by the playback device 90.
- In step S86, the 3D generation unit 121 generates a right-eye caption object and a left-eye caption object from the caption object corresponding to each ODS, based on the ODS-unit offset information included in that ODS supplied from the control unit 122. Then, the process proceeds to step S87.
- The processing in steps S87 through S93 is the same as the processing in steps S67 through S73 in FIG. 17, and thus its description is omitted.
- The menu generation process in step S42 in FIG. 16 by the playback device 90 is performed in the same manner as the caption generation process in FIG. 23 except that the processing target is menu data instead of caption data, and thus its description is omitted.
- FIG. 24 is a diagram illustrating an example of subtitles displayed in 3D on the display unit 51 of the playback device 90.
- Based on the ODS-unit offset information included in each ODS, the playback device 90 generates, as a right-eye caption object and a left-eye caption object, the caption objects of the subtitles obtained by shifting the subtitle corresponding to that ODS in opposite directions.
- the positions in the depth direction of caption # 1 and caption # 2 as 3D images displayed on one screen can be made different.
- In the example of FIG. 24, subtitle #1 and subtitle #2 have the same direction in the depth direction, that is, both subtitle #1 and subtitle #2 pop out, but their directions may also be made different.
- Note that the right-eye and left-eye subtitles or menu buttons must not protrude from the plane (screen). Also, when multiple menu buttons exist in one screen and offset information is set for each menu button, that is, when offset information is described in units of ODS, the right-eye image and the left-eye image of a certain menu button must each not overlap with the right-eye image or the left-eye image of another menu button.
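- The constraints just described can be pictured with the following sketch, which checks that every per-eye rectangle stays on screen and that no eye image of one menu button overlaps any eye image of another; the rectangle representation and function names are assumptions used only for illustration.

```python
# Hedged sketch of the on-screen and non-overlap constraints.
def shifted_rect(rect, shift):
    x, y, w, h = rect
    return (x + shift, y, w, h)

def inside_screen(rect, screen_w, screen_h):
    x, y, w, h = rect
    return 0 <= x and 0 <= y and x + w <= screen_w and y + h <= screen_h

def overlaps(a, b):
    ax, ay, aw, ah = a
    bx, by, bw, bh = b
    return ax < bx + bw and bx < ax + aw and ay < by + bh and by < ay + ah

def offsets_valid(buttons, screen_w, screen_h):
    """buttons: list of (rect, shift) pairs, one per menu button."""
    pairs = [(shifted_rect(r, +s), shifted_rect(r, -s)) for r, s in buttons]
    for right, left in pairs:                      # no protrusion from the screen
        if not (inside_screen(right, screen_w, screen_h) and
                inside_screen(left, screen_w, screen_h)):
            return False
    for i in range(len(pairs)):                    # no overlap between different buttons
        for j in range(i + 1, len(pairs)):
            for a in pairs[i]:
                for b in pairs[j]:
                    if overlaps(a, b):
                        return False
    return True
```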
- As described above, caption data and menu data are recorded on the disk 81 together with ODS-unit offset information. Accordingly, the playback device 90 can display subtitles and menu buttons in 3D by generating 3D caption data from the caption data and 3D menu data from the menu data based on the ODS-unit offset information.
- FIG. 25 is a diagram showing a configuration example of a display set of caption data in the third embodiment of the disc to which the present invention is applied.
- FIG. 26 is a diagram showing a configuration example of a display set of menu data.
- The set offset command is a navigation command that contains offset change information indicating the changed screen-unit offset information of the subtitles or menu buttons and sets that offset change information.
- Information indicating the difference between the vector represented by the currently set offset information and the vector represented by the offset information after the change is used as the offset change information.
- When executing the set offset command, the playback device 160 (FIG. 27, described later) that plays back the disc 151 changes the screen-unit offset information of the subtitles or menu buttons based on the screen-unit offset change information of the subtitles or menu buttons described in the set offset command and the currently set screen-unit offset information of the subtitles or menu buttons.
- FIG. 27 is a block diagram illustrating a configuration example of the reproducing device 160 that reproduces the disk 151 described above.
- The configuration of the playback device 160 in FIG. 27 differs from the configuration of FIG. 13 mainly in that a control unit 161 is provided instead of the control unit 22 and a playback unit 162 is provided instead of the playback unit 23.
- the configuration of the playback unit 162 is different from the configuration of FIG. 13 in that a 3D graphics generation unit 171 is provided instead of the 3D graphics generation unit 35.
- The control unit 161 controls the playback unit 162 in accordance with a command from the input unit 21. Further, in response to a command corresponding to the operation of a menu button from the input unit 21, the control unit 161 requests the 3D graphics generation unit 171 for the command corresponding to that menu button. Then, the control unit 161 holds, in the built-in register 161A, the screen-unit offset change information of the subtitles or menu buttons described in the set offset command transmitted as a result, thereby setting it. The control unit 161 supplies the screen-unit offset change information of the subtitles or menu buttons held in the register 161A to the 3D graphics generation unit 171.
- The register 161A is constituted by registers called PSRs (Player Status Registers), which hold, for example, the setting status and playback status of the playback device.
- the register 161A holds offset change information for each screen of subtitles and menu buttons.
- The 3D graphics generation unit 171 includes a caption generation unit 181 and a menu generation unit 182. Similar to the caption generation unit 41 in FIG. 13, the caption generation unit 181 uses the PES packets of the caption data supplied from the PID filter 33 to generate right-eye caption data and left-eye caption data based on the screen-unit offset information. Then, the caption generation unit 181 supplies the right-eye caption data and the left-eye caption data to the 3D display data generation unit 36 as 3D caption data.
- the subtitle generating unit 181 updates the subtitle screen offset information based on the subtitle screen offset change information transmitted from the control unit 161 and the currently set offset information.
- the menu generation unit 182 uses the menu data PES packet supplied from the PID filter 33 in the same manner as the menu generation unit 42 of FIG. 13, and uses the right-eye menu data and the left-eye menu data based on the offset information for each screen. Generate menu data. Then, the menu generation unit 182 supplies the right-eye menu data and the left-eye menu data to the 3D display data generation unit 36 as 3D menu data.
- In addition, in response to a request from the control unit 161 for the command corresponding to the offset change button, which is a menu button for instructing a change of the offset, the menu generation unit 182 transmits the set offset command included in the ICS to the control unit 161. Then, the menu generation unit 182 updates the screen-unit offset information of the menu buttons based on the screen-unit offset change information of the menu buttons transmitted from the control unit 161 as a result and the currently set offset information.
- FIG. 28 is a block diagram illustrating a detailed configuration example of the caption generation unit 181 of the playback device 160.
- the configuration of the caption generation unit 181 in FIG. 28 is mainly different from the configuration in FIG. 14 in that a control unit 191 is provided instead of the control unit 69.
- the control unit 191 reads the offset information for each screen included in the PCS from the composition buffer 68 and supplies it to the 3D generation unit 64 in the same manner as the control unit 69. Similarly to the control unit 69, the control unit 191 instructs the right-eye graphics plane 65 and the left-eye graphics plane 66 to transfer at a timing based on the PTS included in the PES packet header. Further, like the control unit 69, the control unit 191 reads the PDS from the composition buffer 68 and supplies it to the CLUT 67.
- control unit 191 controls each unit in accordance with a command from the control unit 161 (FIG. 27).
- control unit 191 receives the offset change information for each subtitle screen stored in the register 161A, which is transmitted from the control unit 161.
- The control unit 191 adds the vector represented by the received screen-unit offset change information of the subtitles to the vector represented by the screen-unit offset information included in the PCS, and sets the screen-unit offset information represented by the resulting vector as the new screen-unit offset information. Then, the control unit 191 supplies that screen-unit offset information to the 3D generation unit 64.
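- Modelling the offset as a signed scalar (direction folded into the sign) is an assumption, but it makes the update performed by the control unit 191 easy to picture: the new screen-unit offset is simply the current offset plus the offset change carried by the set offset command.

```python
# Hedged sketch of the offset update on a set offset command.
def apply_set_offset(current_offset, offset_change):
    return current_offset + offset_change

# Example: captions currently at +4, the command carries a change of +2,
# so the new screen-unit offset is +6 and the captions pop out further.
print(apply_set_offset(4, 2))  # 6
```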
- the menu generation unit 182 of the playback device 160 is configured in the same manner as the caption generation unit 181 in FIG. 28 except that the processing target is not caption data but menu data. The illustration is omitted.
- the control unit of the menu generation unit 182 reads the set offset command included in the ICS from the composition buffer in response to a command request corresponding to the offset change button from the control unit 161, and transmits the set offset command to the control unit 161.
- The playback process, 3D graphics generation process, and caption generation process performed by the playback device 160 are the same as the playback process of FIG. 15, the 3D graphics generation process of FIG. 16, and the caption generation process of FIG. 17, and thus their description is omitted.
- FIG. 29 is a flowchart for explaining subtitle offset change processing by the subtitle generation unit 181 of the playback device 160.
- This subtitle offset change process is started when the control unit 161 transmits offset change information in response to a command corresponding to the operation of the offset change button from the input unit 21.
- In step S101, the control unit 191 receives from the control unit 161 the screen-unit offset change information of the subtitles held in the register 161A.
- In step S102, the control unit 191 sets new screen-unit offset information based on the screen-unit offset change information of the subtitles received from the control unit 161 and the screen-unit offset information included in the PCS. Then, the control unit 191 supplies the set screen-unit offset information to the 3D generation unit 64, and the process proceeds to step S103.
- In step S103, the 3D generation unit 64 generates a right-eye caption object and a left-eye caption object from the caption object based on the screen-unit offset information supplied from the control unit 191, and the process proceeds to step S104.
- The processing in steps S104 to S110 is the same as the processing in steps S67 to S73 in FIG. 17, and thus its description is omitted.
- The menu offset change process by the menu generation unit 182 is performed in the same manner as the caption offset change process of FIG. 29 except that the processing target is menu data instead of caption data, and thus its description is omitted.
- FIG. 30 is a flowchart for explaining the details of the offset control processing by the playback device 160.
- This offset control processing is started when the control unit 161 requests the menu generation unit 182 for a command corresponding to the offset change button in response to an offset change command from the input unit 21.
- In step S121 of FIG. 30, the control unit 161 determines whether or not the set offset command transmitted from the menu generation unit 182 in response to the request is a subtitle set offset command.
- If it is determined in step S121 that the command is a subtitle set offset command, in step S122 the control unit 161 stores the screen-unit offset change information of the subtitles described in the subtitle set offset command in the register 161A.
- In step S123, the control unit 161 transmits the screen-unit offset change information of the subtitles stored in the register 161A to the caption generation unit 181, and the process ends.
- On the other hand, if it is determined in step S121 that the command is not a subtitle set offset command, that is, if a menu button set offset command is transmitted from the menu generation unit 182, the process proceeds to step S124.
- In step S124, the control unit 161 stores, in the register 161A, the screen-unit offset change information of the menu buttons described in the menu button set offset command.
- In step S125, the control unit 161 transmits the screen-unit offset change information of the menu buttons stored in the register 161A to the menu generation unit 182, and the process ends.
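- The flow of FIG. 30 can be summarized by the sketch below; the command object, register dictionary, and receiver methods are illustrative assumptions, not interfaces defined by this disclosure.

```python
# Hedged sketch of the offset control dispatch in FIG. 30.
def handle_set_offset_command(command, register, caption_gen, menu_gen):
    if command["target"] == "subtitle":                      # step S121
        register["subtitle_change"] = command["change"]      # step S122
        caption_gen.receive_offset_change(command["change"])  # step S123
    else:                                                    # menu button command
        register["menu_change"] = command["change"]          # step S124
        menu_gen.receive_offset_change(command["change"])     # step S125
```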
- FIG. 31 is a diagram illustrating an example of subtitles displayed in 3D on the display unit 51 of the playback device 160.
- In the example of FIG. 31, an offset change button 195 as a 3D image having a predetermined length in a predetermined depth direction is displayed on the screen of the display unit 51 based on the screen-unit offset information included in the ICS.
- Subtitles #1 and #2 as 3D images having the same length in the same depth direction are also displayed on this screen based on the screen-unit offset information included in the PCS.
- The playback device 160 stores, in the register 161A, the screen-unit offset change information of the subtitles described in the set offset command included in the ICS corresponding to the offset change button 195. Then, the screen-unit offset information represented by the vector obtained by adding the vector represented by that screen-unit offset change information to the vector represented by the currently set screen-unit offset information is set as the new screen-unit offset information.
- the length in the depth direction of caption # 1 and caption # 2 increases in the depth direction by a length corresponding to the offset change information in units of screens.
- FIG. 32 is a diagram showing a configuration example of a display set of caption data in the fourth embodiment of the disc to which the present invention is applied.
- FIG. 33 is a diagram showing a configuration example of a display set of menu data.
- the disk 201 stores all information related to offset information recorded on the disks 11, 81, and 151.
- In the display set of caption data in FIG. 32, offset information for each screen is described in the PCS as in the disc 11. Further, as in the disc 81, offset information in units of ODS is described in the ODS.
- In the display set of menu data in FIG. 33, offset information for each screen is described in the ICS as in the disc 11, and a set offset command is described in the ICS as in the disc 151. Further, as in the disc 81, offset information in units of ODS is described in the ODS.
- FIG. 34 is a block diagram illustrating a configuration example of the playback device 210 that plays back the above-described disc 201.
- the playback device 210 in FIG. 34 includes an input unit 21, a display unit 51, a speaker 52, a control unit 161, and a playback unit 211.
- the same components as those in FIG. 27 are denoted by the same reference numerals. The overlapping description will be omitted as appropriate.
- the configuration of the playback unit 211 is mainly different from the configuration of FIG. 27 in that a 3D graphics generation unit 221 is provided instead of the 3D graphics generation unit 171.
- the 3D graphics generation unit 221 includes a caption generation unit 231 and a menu generation unit 232.
- The caption generation unit 231 uses the PES packets of the caption data supplied from the PID filter 33 to generate right-eye caption data and left-eye caption data based on the screen-unit offset information and the ODS-unit offset information. Then, the caption generation unit 231 supplies the right-eye caption data and the left-eye caption data to the 3D display data generation unit 36 as 3D caption data.
- In addition, the caption generation unit 231 updates the screen-unit offset information of the subtitles based on the screen-unit offset change information of the subtitles transmitted from the control unit 161 and the currently set offset information.
- The menu generation unit 232 uses the PES packets of the menu data supplied from the PID filter 33 to generate right-eye menu data and left-eye menu data based on the screen-unit offset information and the ODS-unit offset information. Then, the menu generation unit 232 supplies the right-eye menu data and the left-eye menu data to the 3D display data generation unit 36 as 3D menu data.
- Also, like the menu generation unit 182 in FIG. 27, the menu generation unit 232 transmits the set offset command included in the ICS to the control unit 161 in response to a request from the control unit 161 for the command corresponding to the offset change button 195. Then, like the menu generation unit 182, the menu generation unit 232 updates the screen-unit offset information of the menu buttons based on the screen-unit offset change information of the menu buttons transmitted from the control unit 161 and the currently set offset information.
- FIG. 35 is a block diagram illustrating a detailed configuration example of the caption generation unit 231 of the playback apparatus 210.
- The caption generation unit 231 in FIG. 35 has all the functions of the caption generation unit 41 in FIG. 14, the caption generation unit 111 in FIG. 22, and the caption generation unit 181 in FIG. 28.
- the caption generation unit 231 in FIG. 35 includes an encoded data buffer 61, a stream graphics generation unit 62, an object buffer 63, a right-eye graphics plane 65, and a left-eye graphics plane 66.
- the caption generation unit 231 includes a CLUT 67, a composition buffer 68, a 3D generation unit 251, and a control unit 252.
- Configurations that are the same as those in FIG. 28 are denoted by the same reference numerals, and overlapping description will be omitted as appropriate.
- The 3D generation unit 251 has the functions of both the 3D generation unit 64 in FIG. 14 and the 3D generation unit 121 in FIG. 22. Specifically, the 3D generation unit 251 reads a caption object from the object buffer 63 according to control from the control unit 252. The 3D generation unit 251 generates a right-eye caption object and a left-eye caption object from the caption objects corresponding to each ODS, based on the screen-unit and ODS-unit offset information from the control unit 252. Then, the 3D generation unit 251 supplies the right-eye caption object to the right-eye graphics plane 65 and supplies the left-eye caption object to the left-eye graphics plane 66.
- The control unit 252 reads the screen-unit offset information included in the PCS from the composition buffer 68 and supplies it to the 3D generation unit 251 in the same manner as the control unit 69 in FIG. 14. Similarly to the control unit 69, the control unit 252 instructs the right-eye graphics plane 65 and the left-eye graphics plane 66 to transfer at a timing based on the PTS included in the PES packet header. Further, like the control unit 69, the control unit 252 reads the PDS from the composition buffer 68 and supplies it to the CLUT 67.
- Also, the control unit 252 reads the ODS-unit offset information included in each ODS from the composition buffer 68 and supplies it to the 3D generation unit 251 in the same manner as the control unit 122 in FIG. 22.
- Further, the control unit 252 receives the screen-unit offset change information of the subtitles held in the register 161A and transmitted from the control unit 161. Like the control unit 191, the control unit 252 sets new screen-unit offset information based on the received screen-unit offset change information of the subtitles and the screen-unit offset information included in the PCS. Then, like the control unit 191, the control unit 252 supplies that screen-unit offset information to the 3D generation unit 251.
- the menu generation unit 232 of the playback device 210 is configured in the same manner as the caption generation unit 231 in FIG. 35 except that the processing target is not caption data but menu data. The illustration is omitted.
- the control unit of the menu generation unit 232 reads the set offset command included in the ICS from the composition buffer in response to a command request corresponding to the offset change button from the control unit 161, and transmits the set offset command to the control unit 161.
- FIG. 36 is a flowchart for explaining the details of the caption generation process in step S41 of FIG. 16 performed by the playback device 210.
- In step S146, the 3D generation unit 251 generates a right-eye caption object and a left-eye caption object from the caption objects corresponding to each ODS, based on the screen-unit offset information and the ODS-unit offset information from the control unit 252. Then, the process proceeds to step S147.
- The processing in steps S147 to S153 is the same as the processing in steps S67 to S73 in FIG. 17, and thus its description is omitted.
- The menu generation process in step S42 in FIG. 16 by the playback device 210 is performed in the same manner as the caption generation process in FIG. 36 except that the processing target is menu data instead of caption data, and thus its description is omitted.
- FIG. 37 is a diagram illustrating an example of subtitles displayed in 3D on the display unit 51 of the playback apparatus 210.
- The playback device 210 displays an offset change button 195 as a 3D image having a predetermined length in a predetermined depth direction on the screen of the display unit 51, based on the screen-unit offset information and the ODS-unit offset information.
- Also, the playback device 210 shifts the subtitles corresponding to each ODS in opposite directions based on the ODS-unit offset information, further shifts all the subtitles in the screen in opposite directions based on the screen-unit offset information described in the PCS, and generates the caption objects of the subtitles obtained as a result. Then, the playback device 210 sets those caption objects as the right-eye caption object and the left-eye caption object.
- subtitles # 1 and # 2 as 3D images having the same depth direction and different lengths in the depth direction are further displayed on the screen.
- Specifically, the length of caption #1 in the depth direction is the sum of the length in the depth direction corresponding to the ODS-unit offset information described in the ODS of caption #1 and the length in the depth direction corresponding to the screen-unit offset information described in the PCS of the screen including caption #1.
- Similarly, the length of caption #2 in the depth direction is the sum of the length in the depth direction corresponding to the ODS-unit offset information of caption #2 and the length in the depth direction corresponding to the screen-unit offset information of the screen including caption #2.
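- Treating each offset as a signed scalar (an assumption made only for illustration), the combined depth works out as a simple sum, for example:

```python
# Hedged sketch: total depth contribution of a caption on the disc 201.
def caption_depth(ods_offset, screen_offset):
    return ods_offset + screen_offset

print(caption_depth(3, 2))  # hypothetical caption #1: ODS-unit +3, screen-unit +2 -> +5
print(caption_depth(1, 2))  # hypothetical caption #2: ODS-unit +1, screen-unit +2 -> +3
```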
- The playback device 210 stores, in the register 161A, the screen-unit offset change information of the subtitles described in the set offset command included in the ICS corresponding to the offset change button 195. Then, the screen-unit offset information represented by the vector obtained by adding the vector represented by that screen-unit offset change information to the vector represented by the currently set screen-unit offset information is set as the new screen-unit offset information. As a result, the length in the depth direction of caption #1 and caption #2 increases by a length corresponding to the screen-unit offset change information.
- FIG. 38 is a diagram showing a configuration example of a display set of menu data in the fifth embodiment of the disc to which the present invention is applied.
- In the display set of menu data in FIG. 38, offset information in units of screens is described in the ICS as in the disc 11, and offset information in units of ODS is described in the ODS as in the disc 81.
- The button unit set offset command is a navigation command that contains offset change information for each menu button, that is, for each ODS, and sets that offset change information.
- the button unit set offset command describes a button ID and offset change information of the menu button specified by the button ID.
- the playback apparatus 310 (described later) that plays back the disc 301 can change the offset information in units of menu buttons.
- the configuration of the display set of subtitle data recorded on the disc 301 is the same as the configuration of the display set of subtitle data recorded on the disc 11 shown in FIG.
- FIG. 39 is a block diagram illustrating a configuration example of the playback device 310 that plays back the above-described disc 301.
- The configuration of the playback device 310 in FIG. 39 differs from the configuration of FIG. 34 mainly in that a control unit 311 is provided instead of the control unit 161 and a playback unit 312 is provided instead of the playback unit 211.
- the configuration of the playback unit 312 is different from the configuration of FIG. 34 in that a 3D graphics generation unit 321 is provided instead of the 3D graphics generation unit 221.
- the control unit 311 controls the reproduction unit 312 in accordance with a command from the input unit 21. Also, the control unit 311 requests the 3D graphics generation unit 321 for a set offset command corresponding to the menu button in response to a command corresponding to the operation of the menu button from the input unit 21. Then, the control unit 311 supplies the menu generation unit 331 with the menu button unit offset change information and the button ID described in the button unit set offset command transmitted from the menu generation unit 331 as a result.
- the 3D graphics generation unit 321 includes the caption generation unit 41 and the menu generation unit 331 shown in FIG. Similarly to the menu generation unit 232 of FIG. 34, the menu generation unit 331 uses the PES packet of the menu data supplied from the PID filter 33, and uses the right-eye offset information based on the offset information in units of screens and the offset information in units of ODS. Menu data and menu data for the left eye are generated. Then, the menu generation unit 331 supplies the right-eye menu data and the left-eye menu data to the 3D display data generation unit 36 as 3D menu data.
- the menu generation unit 331 transmits a button unit set offset command included in the ICS to the control unit 311 in response to a command request corresponding to the offset change button 195 from the control unit 311. Then, the menu generation unit 331 updates the offset information in ODS units of the menu button specified by the button ID based on the offset change information and button ID in the menu button unit transmitted from the control unit 311 as a result. .
- FIG. 40 is a block diagram illustrating a detailed configuration example of the menu generation unit 331 in FIG. 39.
- the menu generation unit 331 in FIG. 40 includes an encoded data buffer 341, a stream graphics generation unit 342, an object buffer 343, a 3D generation unit 344, a right-eye graphics plane 345, a left-eye graphics plane 346, a CLUT 347, and a composition buffer. 348 and a control unit 349.
- the control unit 349 reads the offset information of the screen unit included in the ICS from the composition buffer 348 and supplies it to the 3D generation unit 344. Also, the control unit 349 instructs the right-eye graphics plane 345 to transfer at the timing based on the PTS included in the PES packet header, and also instructs the left-eye graphics plane 346 to transfer. Further, the control unit 349 reads the PDS from the composition buffer 348 and supplies it to the CLUT 347.
- control unit 349 reads out offset information in units of ODS included in each ODS from the composition buffer 348 and supplies the information to the 3D generation unit 344.
- Further, the control unit 349 controls each unit in accordance with a command from the control unit 311 (FIG. 39).
- Specifically, in response to a request from the control unit 311 for the command corresponding to the offset change button 195, the control unit 349 reads the button unit set offset command included in the ICS from the composition buffer 348 and transmits it to the control unit 311. Further, the control unit 349 receives the offset change information for each menu button and the button ID transmitted from the control unit 311 as a result. The control unit 349 updates the ODS-unit offset information based on the received offset change information for each menu button and the currently set offset information of the ODS corresponding to the button ID transmitted together with it. Then, the control unit 349 supplies the ODS-unit offset information to the 3D generation unit 344.
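- A rough sketch of this per-button update is shown below; the dictionaries mapping button IDs to ODSs and ODSs to signed offsets are illustrative assumptions, not structures defined by this disclosure.

```python
# Hedged sketch: only the ODS tied to the button ID in the command is updated.
def update_button_offset(ods_offsets, button_to_ods, button_id, change):
    """ods_offsets: {ods_id: signed offset}; button_to_ods: {button_id: ods_id}."""
    ods_id = button_to_ods[button_id]
    ods_offsets[ods_id] += change
    return ods_offsets[ods_id]

# Example: button 2 maps to ODS 7; its offset grows from +1 to +3.
print(update_button_offset({7: 1}, {2: 7}, 2, 2))  # 3
```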
- The playback process and 3D graphics generation process by the playback device 310 are the same as the playback process of FIG. 15 and the 3D graphics generation process of FIG. 16, and thus their description is omitted.
- The caption generation process and the caption offset change process by the playback device 310 are the same as the caption generation process in FIG. 36 and the caption offset change process in FIG. 29, and thus their description is omitted.
- FIG. 41 is a flowchart for explaining menu button offset change processing by the menu generation unit 331 of the playback device 310.
- This menu button offset change process is started when the control unit 311 requests the menu generation unit 331 for the command corresponding to the offset change button 195 in response to a command corresponding to the operation of the offset change button 195 from the input unit 21.
- In step S171, in response to the request from the control unit 311 for the command corresponding to the offset change button 195, the control unit 349 reads the button unit set offset command included in the ICS from the composition buffer 348.
- In step S172, the control unit 349 transmits the button unit set offset command read in step S171 to the control unit 311.
- As a result, the control unit 311 transmits, to the control unit 349, the offset change information for each menu button and the button ID described in the button unit set offset command transmitted from the control unit 349.
- In step S173, the control unit 349 receives the offset change information for each menu button and the button ID from the control unit 311.
- The control unit 349 then recognizes the ODS corresponding to the received button ID based on the button IDs included in the ICS held in the composition buffer 348.
- In step S174, the control unit 349 sets new ODS-unit offset information based on the offset change information for each menu button received from the control unit 311 and the currently set offset information of the ODS corresponding to that menu button. Then, the control unit 349 supplies the ODS-unit offset information to the 3D generation unit 344.
- In step S175, the 3D generation unit 344 generates a right-eye menu object and a left-eye menu object from the menu object based on the ODS-unit offset information supplied from the control unit 349, and the process proceeds to step S176.
- The processing in steps S176 to S182 is the same as the processing in steps S67 to S73 in FIG. 17 except that the processing target is menu data instead of caption data, and thus its description is omitted.
- FIG. 42 is a flowchart for explaining offset control processing by the playback device 310.
- This offset control process is started when the control unit 311 requests the menu generation unit 331 for the command corresponding to the offset change button 195 in response to a command corresponding to the operation of the offset change button 195 from the input unit 21.
- In step S201, the control unit 311 determines whether or not a button unit set offset command has been transmitted from the menu generation unit 331 in response to the request.
- If it is determined in step S201 that a button unit set offset command has been transmitted, in step S202 the control unit 311 transmits the offset change information for each menu button and the button ID described in the button unit set offset command to the menu generation unit 331, and the process ends.
- On the other hand, if it is determined in step S201 that a button unit set offset command has not been transmitted, the process ends.
- FIG. 43 is a diagram illustrating an example of menu buttons displayed in 3D on the display unit 51 of the playback apparatus 310.
- The playback device 310 shifts the menu buttons corresponding to each ODS in opposite directions based on the ODS-unit offset information, further shifts all the menu buttons in the screen in opposite directions based on the screen-unit offset information, and generates the menu button objects of the menu buttons obtained as a result. Then, the playback device 310 sets those menu button objects as the right-eye menu button object and the left-eye menu button object.
- the menu button # 1, the menu button # 2, and the offset change button 195 as 3D images having the same depth direction and different lengths in the depth direction are displayed on the screen.
- Note that although the offset change button 195 is described separately here, the offset change button 195 is menu button #3.
- The length in the depth direction of menu button #1 is the sum of the length in the depth direction corresponding to the ODS-unit offset information of menu button #1 and the length in the depth direction corresponding to the screen-unit offset information of the screen including menu button #1.
- Similarly, the lengths in the depth direction of menu button #2 and the offset change button 195 are each the sum of the length corresponding to the ODS-unit offset information of menu button #2 or the offset change button 195 and the length corresponding to the screen-unit offset information of the screen including menu button #2 and the offset change button 195.
- The playback device 310 adds the vector represented by the offset change information for each menu button in the button unit set offset command to the vector represented by the currently set ODS-unit offset information. Then, the ODS-unit offset information represented by the vector obtained as a result of the addition is set as the new ODS-unit offset information. As a result, the lengths in the depth direction of menu button #1, menu button #2, and the offset change button 195 increase by a length corresponding to the offset change information for each menu button in the button unit set offset command.
- FIG. 44 is a diagram showing a configuration example of a display set of subtitle data in the sixth embodiment of the disc to which the present invention is applied.
- FIG. 45 is a diagram showing a configuration example of a display set of menu data.
- the 2D display command is a navigation command for changing a subtitle or menu button displayed in 3D to 2D display.
- a screen unit offset value of a caption or menu button is ignored by a 2D display command.
- Specifically, for subtitles, the offset value described in the PCS, the offset value for each plane set by a navigation command, and the offset value set for each ODS are ignored.
- Similarly, for menu buttons, the offset value described in the ICS, the offset value for each plane set by a navigation command, and the offset value set for each ODS are ignored. If an offset value is set for each menu button by a navigation command, that value is also ignored, and the playback device 410 (described later) displays the subtitles and menu buttons that would be displayed in 3D in 2D instead.
- Thereby, the playback device 410 can change the display of subtitles and menu buttons from 3D display to 2D display. Similarly, the display can be switched back from 2D display to 3D display as necessary.
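- Conceptually, executing a 2D display command amounts to treating every applicable offset as 0, so both eyes receive the same unshifted graphics; the sketch below illustrates this under that assumption.

```python
# Hedged sketch: with all offsets invalidated, the right-eye and left-eye
# planes are identical, which is what 2D display of the graphics amounts to.
def apply_2d_display_command(plane):
    right_eye = plane   # screen-unit, ODS-unit, and per-button offsets treated as 0
    left_eye = plane
    return right_eye, left_eye
```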
- FIG. 46 is a block diagram illustrating a configuration example of the playback device 410 that plays back the above-described disc 401.
- The configuration of the playback device 410 in FIG. 46 differs from the configuration of FIG. 13 mainly in that a control unit 411 is provided instead of the control unit 22 and a playback unit 412 is provided instead of the playback unit 23.
- the configuration of the playback unit 412 is different from the configuration of FIG. 13 in that a 3D graphics generation unit 421 is provided instead of the 3D graphics generation unit 35.
- the control unit 411 controls the playback unit 412 in accordance with a command from the input unit 21. Further, the control unit 411 requests the 3D graphics generation unit 421 for a command corresponding to the menu button in response to a command corresponding to the operation of the menu button from the input unit 21. Then, the control unit 411 supplies a command for invalidating the offset value to the 3D graphics generation unit 421 in accordance with the 2D display command transmitted as a result.
- the 3D graphics generation unit 421 includes a caption generation unit 431 and a menu generation unit 432.
- the caption generation unit 431 uses the PES packet of caption data supplied from the PID filter 33 to generate right-eye caption data and left-eye caption data based on offset information in units of screens. Then, the caption generation unit 431 supplies the right-eye caption data and the left-eye caption data to the 3D display data generation unit 36 as 3D caption data.
- Depending on the playback device, if the 2D display command has already been received at this point, the offset value is not reflected and the right-eye caption data and the left-eye caption data can be made the same to perform 2D processing.
- the following is an example in which the 2D processing is performed after the 3D processing is once performed.
- the subtitle generation unit 431 regards the offset value of the subtitle screen unit as 0 in accordance with the instruction transmitted from the control unit 411, and updates the offset value of the subtitle screen unit offset information.
- The menu generation unit 432 generates right-eye menu data and left-eye menu data based on the screen-unit offset information, using the PES packets of the menu data supplied from the PID filter 33. Then, the menu generation unit 432 supplies the right-eye menu data and the left-eye menu data to the 3D display data generation unit 36 as 3D menu data. Depending on the playback device, if the 2D display command has already been received at this point, the offset value is not reflected and the right-eye menu data and the left-eye menu data can be made the same to perform 2D processing.
- In addition, in response to a request from the control unit 411 for the command corresponding to the 2D display button, which is a menu button for instructing 2D display, the menu generation unit 432 transmits the 2D display command included in the ICS to the control unit 411. Then, the menu generation unit 432 regards the screen-unit offset value of the menu buttons as 0 in accordance with the command transmitted from the control unit 411 as a result, and updates the offset value of the screen-unit offset information of the menu buttons.
- FIG. 47 is a block diagram illustrating a detailed configuration example of the caption generation unit 431 of the playback apparatus 410.
- The configuration of the caption generation unit 431 in FIG. 47 is mainly different from the configuration of FIG. 14 in that a control unit 441 is provided instead of the control unit 69.
- the control unit 441 reads the offset information in increments of screens included in the PCS from the composition buffer 68 and supplies it to the 3D generation unit 64 in the same manner as the control unit 69. Similarly to the control unit 69, the control unit 441 instructs transfer to the right-eye graphics plane 65 and the left-eye graphics plane 66 at a timing based on the PTS included in the PES packet header. Further, like the control unit 69, the control unit 441 reads the PDS from the composition buffer 68 and supplies it to the CLUT 67.
- In addition, the control unit 441 controls each unit in accordance with a command from the control unit 411 (FIG. 46). Further, the control unit 441 receives the command for invalidating the offset value transmitted from the control unit 411. In accordance with the received command, the control unit 441 sets 0 as the new screen-unit offset value of the subtitles, and supplies the screen-unit offset information including that offset value to the 3D generation unit 64.
- Since the menu generation unit 432 of the playback device 410 is configured in the same manner as the caption generation unit 431 in FIG. 47 except that the processing target is menu data instead of caption data, its illustration is omitted.
- the control unit of the menu generation unit 432 reads the 2D display command included in the ICS from the composition buffer, and transmits the 2D display command to the control unit 411.
- The playback process, 3D graphics generation process, and caption generation process performed by the playback device 410 are the same as the playback process of FIG. 15, the 3D graphics generation process of FIG. 16, and the caption generation process of FIG. 17, and thus their description is omitted.
- FIG. 48 is a flowchart for explaining caption display change processing by the caption generation unit 431 of the playback apparatus 410.
- This caption display change process is started when the control unit 411 transmits a command to invalidate the offset value in response to a command corresponding to the operation of the 2D display button from the input unit 21.
- In step S231, the control unit 441 receives 0 as the screen-unit offset value of the subtitles from the control unit 411 (that is, receives the command for invalidating the offset value).
- In step S232, the control unit 441 regards the screen-unit offset value of the subtitles as 0 in accordance with the command received from the control unit 411, and updates the screen-unit offset information. Then, the control unit 441 supplies the updated offset information to the 3D generation unit 64, and the process proceeds to step S233.
- In step S233, the 3D generation unit 64 generates a right-eye caption object and a left-eye caption object from the caption object based on the screen-unit offset information supplied from the control unit 441, and the process proceeds to step S234.
- The processing in steps S234 to S240 is the same as the processing in steps S67 to S73 in FIG. 17, and thus its description is omitted.
- The menu display change process by the menu generation unit 432 is performed in the same manner as the caption display change process of FIG. 48 except that the processing target is menu data instead of caption data, and thus its description is omitted.
- FIG. 49 is a flowchart for explaining the details of the display control processing by the playback device 410.
- This display control processing is started when the control unit 411 requests a command corresponding to the 2D display button from the menu generation unit 432 in response to a command corresponding to the operation of the 2D display button from the input unit 21.
- In step S251, the control unit 411 determines whether or not the 2D display command transmitted from the menu generation unit 432 in response to the request is a subtitle 2D display command. If it is determined in step S251 that the command is a subtitle 2D display command, in step S252 the control unit 411 transmits 0 to the caption generation unit 431 as the screen-unit offset value of the subtitles described in the subtitle 2D display command. That is, the control unit 411 supplies the command for invalidating the offset value to the caption generation unit 431. Then, the process ends.
- On the other hand, if it is determined in step S251 that the command is not a subtitle 2D display command, that is, if a menu button 2D display command is transmitted from the menu generation unit 432, the process proceeds to step S253.
- In step S253, the control unit 411 transmits 0 to the menu generation unit 432 as the screen-unit offset value of the menu buttons described in the menu button 2D display command. That is, the control unit 411 supplies the command for invalidating the offset value to the menu generation unit 432. Then, the process ends.
- FIG. 50 is a block diagram showing a configuration example of a reproducing apparatus for reproducing a disc according to the seventh embodiment to which the present invention is applied.
- The configuration of the playback device 460 in FIG. 50 differs from the configuration of FIG. 13 mainly in that a control unit 461 is provided instead of the control unit 22, an OSD generation unit 462 is newly provided, and a playback unit 463 is provided instead of the playback unit 23.
- the configuration of the playback unit 463 is different from the configuration of FIG. 13 in that a 3D display data generation unit 471 is provided instead of the 3D display data generation unit 36.
- the playback device 460 is a playback device that plays back the disc 451.
- Among the offset information described on the disc 451, the offset information for which the 3D display based on that offset information is closest to the front is described in the index file of the disc 451 as maximum offset information.
- The playback device 460 displays an OSD (On Screen Display) image, such as a menu unique to the playback device 460, on the front side.
- control unit 461 controls the playback unit 463 in response to a command from the input unit 21.
- Also, in response to an OSD display command from the input unit 21, the control unit 461 controls the drive 31, reads the maximum offset information described in the index file of the disc 451, and supplies it to the OSD generation unit 462.
- The OSD generation unit 462 generates OSD image data from predetermined OSD image data stored in a memory (not shown) built into the playback device 460, based on the maximum offset information supplied from the control unit 461.
- the playback device 460 may hold image data for the right eye and the left eye in order to display the OSD in 3D in the storage area of the memory in the playback device 460.
- the following example shows a configuration for displaying the OSD in 3D.
- the OSD generation unit 462 sets predetermined OSD image data stored in a memory (not shown) as OSD image data for the left eye.
- Then, the OSD generation unit 462 generates the OSD image data of the OSD image obtained as a result of shifting the OSD image corresponding to the left-eye OSD image data by a value larger than the offset value, in the offset direction, of the maximum offset information.
- the OSD generation unit 462 sets the OSD image data as image data for the right eye.
- the OSD generation unit 462 supplies the OSD image data for the right eye and the OSD image data for the left eye to the 3D display data generation unit 471 of the playback unit 463 as 3DOSD image data.
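- A minimal sketch of this OSD handling, assuming a simple row-based pixel plane and a one-pixel margin beyond the maximum offset (both assumptions), is shown below.

```python
# Hedged sketch: the stored OSD image is used as the left-eye image, and the
# right-eye image is the same image shifted by more than the maximum offset
# in its offset direction, so the OSD appears in front of all other 3D images.
def shift_row(row, shift, fill=0):
    if shift >= 0:
        return [fill] * shift + row[:len(row) - shift]
    return row[-shift:] + [fill] * (-shift)

def make_osd_pair(osd_plane, max_offset_value, max_offset_direction):
    shift = (max_offset_value + 1) * max_offset_direction  # strictly larger margin
    left_eye = osd_plane
    right_eye = [shift_row(row, shift) for row in osd_plane]
    return right_eye, left_eye
```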
- The 3D display data generation unit 471 combines, for each of the left and right eyes, the 3D video data from the 3D video generation unit 34, the 3D caption data and 3D menu data from the 3D graphics generation unit 35, and the 3D OSD image data from the OSD generation unit 462.
- the 3D display data generation unit 471 supplies the display data for the left eye and the display data for the right eye obtained as a result of the synthesis to the display unit 51 as 3D display data.
- The playback process, 3D graphics generation process, and caption generation process by the playback device 460 are the same as the playback process of FIG. 15, the 3D graphics generation process of FIG. 16, and the caption generation process of FIG. 17, and thus their description is omitted.
- FIG. 51 is a flowchart for explaining OSD display processing by the playback device 460. This OSD display process is started when an OSD image display is instructed from the input unit 21.
- In step S271, the control unit 461 controls the drive 31, reads the maximum offset information from the index file on the disc 451, and supplies it to the OSD generation unit 462.
- In step S272, the OSD generation unit 462 reads predetermined OSD image data from the memory (not shown) as left-eye OSD image data.
- In step S273, the OSD generation unit 462 generates right-eye OSD image data from the left-eye OSD image data based on the maximum offset information.
- In step S274, the OSD generation unit 462 supplies the left-eye OSD image data and the right-eye OSD image data to the 3D display data generation unit 471 as 3D OSD image data.
- In step S275, the 3D display data generation unit 471 combines the 3D video data from the 3D video generation unit 34, the 3D caption data and 3D menu data from the 3D graphics generation unit 35, and the 3D OSD image data from the OSD generation unit 462. Then, the 3D display data generation unit 471 supplies the left-eye display data and the right-eye display data obtained as a result of the combination to the display unit 51 as 3D display data.
- In step S276, based on the 3D display data supplied from the 3D display data generation unit 471, the display unit 51 alternately displays the left-eye image corresponding to the left-eye display data and the right-eye image corresponding to the right-eye display data. Then, the process ends.
- As described above, the playback device 460 can display the OSD image at the very front based on the maximum offset information. Thereby, the user can reliably view the OSD image.
- In addition, since the maximum offset information is described in the index file of the disc 451, the display position of the OSD image in the depth direction can be made constant within one disc 451. As a result, it is possible to prevent user confusion caused by changes in the display position of the OSD image in the depth direction.
- Note that, in the index file, not the maximum offset information itself but an offset value based on the maximum offset information may be described.
- the index file may describe an offset value in which the offset direction is limited to the positive direction so that the display position is closer to the front side than the 3D display position corresponding to the maximum offset information.
- In this case, if the offset direction of the maximum offset information is the negative direction, 0 is described in the index file as the offset value.
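- Under the assumption that the maximum offset information is a value plus a direction, the variation just described can be pictured as follows (names are illustrative):

```python
# Hedged sketch: derive the offset value written in the index file.
def index_file_offset(max_offset_value, max_offset_direction):
    if max_offset_direction > 0:       # positive (pop-out) direction
        return max_offset_value
    return 0                           # negative direction -> 0 is described
```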
- FIG. 53 is a diagram showing a configuration example of epochs of caption data in the eighth embodiment of the disc to which the present invention is applied.
- On the disc 501, two AV streams, a left-eye AV stream and a right-eye AV stream, are recorded.
- the epoch structures of the AV stream for the left eye and the AV stream for the right eye that are reproduced simultaneously are the same. That is, the number of display sets for the left-eye epoch and the number of display sets for the right-eye epoch that are played back simultaneously are the same.
- the PTS of each segment is the same between the display set for the left eye and the display set for the right eye that are played back simultaneously. Thereby, the display timing of the subtitle for the left eye and the subtitle for the right eye can be made simultaneously.
- The PTS included in the PES packet header of the PCS is obtained based on the decoding time of the ODS corresponding to the PCS, the time required for drawing the subtitles corresponding to the ODS, and the time required for drawing the window. Therefore, between the left-eye display set and the right-eye display set that are played back simultaneously, the vertical and horizontal sizes of the subtitles corresponding to the ODSs with the same sub-image ID and the vertical and horizontal sizes of the windows with the same window ID are the same. As a result, the PTS included in the PES packet header of the PCS can be synchronized between the left-eye display set and the right-eye display set without any contradiction.
- the sub-image ID and window ID are the same between the left-eye display set and the right-eye display set that are played back simultaneously. As a result, images corresponding to the same subtitle are displayed at the same time, so that the user can view the 3D subtitle.
- Furthermore, between the left-eye display set and the right-eye display set that are played back simultaneously, the number of segments excluding the ODS is the same, and the DTS of each segment is the same.
- However, the subtitles or menu buttons corresponding to the same sub-image ID may be different between the left-eye display set and the right-eye display set, and the PDS may also be different.
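- The correspondence rules above can be summarized by the following sketch, which checks that simultaneously reproduced left-eye and right-eye display sets pair up one-to-one with matching PTS/DTS values, sub-image IDs, and window IDs; the dictionary-based data model is an assumption for illustration.

```python
# Hedged sketch of the left/right display set consistency rules.
def epochs_consistent(left_sets, right_sets):
    if len(left_sets) != len(right_sets):          # same number of display sets
        return False
    for l, r in zip(left_sets, right_sets):
        if l["pts"] != r["pts"] or l["dts"] != r["dts"]:
            return False                           # same display timing
        if l["sub_image_id"] != r["sub_image_id"]:
            return False                           # same subtitle/menu identity
        if l["window_id"] != r["window_id"]:
            return False                           # same window
    return True
```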
- The epoch structure of the menu data and the relationship between the left-eye display set and the right-eye display set that are played back simultaneously are the same as those described above except that the ICS takes the place of the PCS, and thus their description is omitted.
- the animation frame rate of the left-eye menu button and the right-eye menu button must be the same. Therefore, the field for determining the frame rate of the animation included in the ICS is the same between the display set for the left eye and the display set for the right eye corresponding to such a menu button. Thereby, the menu button for the left eye and the menu button for the right eye are always correspondingly animated at a constant frame rate, so that the user can see the 3D menu button animated at a constant frame rate. .
- Also, the number of frames and the interval of the effect animation need to be the same for the left-eye menu button and the right-eye menu button. Therefore, between the left-eye display set and the right-eye display set corresponding to such a menu button, the fields in the ICS that describe the number of frames and the interval of the effect animation are the same. Thereby, the left-eye menu button and the right-eye menu button are always given the effect in correspondence with each other, so that the user can see the 3D menu button with the effect applied.
- FIG. 54 is a diagram illustrating a window corresponding to a display set of subtitle data.
- FIG. 55 is a block diagram illustrating a configuration example of a playback device 510 that plays back the above-described disc 501.
- The configuration of the playback device 510 in FIG. 55 differs mainly in that a control unit 511 is provided instead of the control unit 22 and a playback unit 512 is provided instead of the playback unit 23.
- The configuration of the playback unit 512 differs in that a PID filter 521, a 3D video generation unit 522, and a 3D graphics generation unit 523 are provided instead of the PID filter 33, the 3D video generation unit 34, and the 3D graphics generation unit 35, respectively.
- The control unit 511 controls the playback unit 512 in accordance with commands from the input unit 21. For example, the control unit 511 controls the drive 31 of the playback unit 512 to read the index file, the movie object file, the playlist file, the clip information file, and the like from the disc 501. Further, the control unit 511 recognizes the packet numbers of the packets of the left-eye AV stream and the right-eye AV stream to be reproduced based on the read clip information file. Then, the control unit 511 controls the drive 31 to read out the left-eye AV stream and the right-eye AV stream composed of packets having those packet numbers.
- Based on the PID of each packet of the left-eye AV stream from the read buffer 32, the PID filter 521 extracts the PES packets of the left-eye video data and the left-eye caption data included in the left-eye AV stream. Also, the PID filter 521 extracts the PES packets of the left-eye menu data and the audio data included in the left-eye AV stream, based on the PID of each packet of the left-eye AV stream.
- The PID filter 521 also extracts the PES packets of the right-eye video data and the right-eye subtitle data included in the right-eye AV stream, based on the PID of each packet of the right-eye AV stream from the read buffer 32. Further, the PID filter 521 extracts the PES packet of the right-eye menu data included in the right-eye AV stream based on the PID of each packet of the right-eye AV stream.
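- The routing done by the PID filter 521 can be pictured as a lookup from PID to elementary-stream category; the PID values and attribute names in the sketch below are assumptions made for illustration, not values defined by the patent:

```python
from collections import defaultdict

# Hypothetical PID assignments used only for this sketch.
PID_MAP = {
    0x1011: "left_video",   0x1012: "left_caption",
    0x1013: "left_menu",    0x1100: "audio",
    0x1021: "right_video",  0x1022: "right_caption",
    0x1023: "right_menu",
}

def pid_filter(ts_packets):
    """Group PES packets of the left-eye and right-eye AV streams by category,
    based on the PID carried in each transport packet (illustrative)."""
    pes_by_category = defaultdict(list)
    for packet in ts_packets:                 # packet.pid, packet.pes_payload
        category = PID_MAP.get(packet.pid)
        if category is not None:
            pes_by_category[category].append(packet.pes_payload)
    return pes_by_category
```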
- the 3D video generation unit 522 decodes the PES packet of the left-eye video data and the PES packet of the right-eye video data supplied from the PID filter 521. Then, the 3D video generation unit 522 supplies the left-eye video data and the right-eye video data obtained as a result of decoding to the 3D display data generation unit 36 as 3D video data.
- the 3D graphics generation unit 523 includes a caption generation unit 531 and a menu generation unit 532.
- the caption generation unit 531 decodes the left-eye caption data and the right-eye caption data supplied from the PID filter 521. Then, the caption generation unit 531 supplies the left-eye caption data and the right-eye caption data obtained as a result of the decoding to the 3D display data generation unit 36 as 3D caption data.
- the menu generation unit 532 decodes the left-eye menu data and the right-eye menu data PES packet supplied from the PID filter 521. Then, the menu generation unit 532 supplies the left-eye menu data and the right-eye menu data obtained as a result of the decoding to the 3D display data generation unit 36 as 3D menu data.
- FIG. 56 is a block diagram illustrating a detailed configuration example of the caption generation unit 531 of FIG.
- The caption generation unit 531 includes a right-eye decoder 541-1, a left-eye decoder 541-2, a right-eye graphics plane 542-1, a left-eye graphics plane 542-2, a CLUT 543-1, and a CLUT 543-2.
- the right-eye decoder 541-1 includes an encoded data buffer 561-1, a stream graphics generation unit 562-1, an object buffer 563-1, a composition buffer 564-1, and a control unit 565-1.
- the encoded data buffer 561-1 holds a segment in the PES packet of the caption data for the right eye supplied from the PID filter 521.
- the encoded data buffer 561-1 reads the held segment and supplies it to the stream graphics generation unit 562-1.
- the stream graphics generation unit 562-1 decodes the ODS supplied from the encoded data buffer 561-1. Then, the stream graphics generation unit 562-1 supplies the uncompressed right-eye caption data including the index color obtained as a result to the object buffer 563-1 as the right-eye caption object. In addition, the stream graphics generation unit 562-1 supplies the PDS, PCS, and WDS supplied from the encoded data buffer 561-1 to the composition buffer 564-1.
- the object buffer 563-1 holds the subtitle object for the right eye supplied from the stream graphics generation unit 562-1.
- the object buffer 563-1 deletes the stored right-eye caption object for each epoch. Further, the object buffer 563-1 reads out the held right-eye caption object in accordance with the control from the control unit 565-1, and supplies it to the right-eye graphics plane 542-1.
- the composition buffer 564-1 holds the PDS, PCS, and WDS supplied from the stream graphics generation unit 562-1.
- The control unit 565-1 monitors the storage state of the right-eye caption object for one screen in the right-eye graphics plane 542-1 and, when the storage of the right-eye caption object for one screen is completed, notifies the control unit 565-2 of the completion. The control unit 565-1 instructs the right-eye graphics plane 542-1 to transfer, based on the PTS included in the PES packet header or on the notification of completion of storage of the left-eye caption object from the control unit 565-2. Further, the control unit 565-1 reads the PDS from the composition buffer 564-1 and supplies it to the CLUT 543-1.
- The control unit 565-1 also controls each unit in accordance with commands from the control unit 511 (FIG. 55).
- the left-eye decoder 541-2 includes an encoded data buffer 561-2, a stream graphics generation unit 562-2, an object buffer 563-2, a composition buffer 564-2, and a control unit 565-2.
- the left-eye decoder 541-2 is configured in the same manner as the right-eye decoder 541-1, and performs the same processing except that the processing target is subtitle data for the left eye.
- the right-eye graphics plane 542-1 holds the right-eye caption object for one screen supplied from the object buffer 563-1.
- the right-eye graphics plane 542-1 erases the retained right-eye caption object for each epoch.
- the right-eye graphics plane 542-1 reads the stored right-eye caption object in accordance with a transfer instruction from the control unit 565-1 and supplies the read right-eye caption object to the CLUT 543-1.
- the left-eye graphics plane 542-2 holds the left-eye caption object for one screen supplied from the object buffer 563-2.
- the left-eye graphics plane 542-2 erases the retained left-eye caption object in units of epochs.
- the left-eye graphics plane 542-2 reads the stored left-eye caption object and supplies it to the CLUT 543-2 in response to an instruction from the control unit 565-2.
- the CLUT 543-1 stores a table in which index colors are associated with Y, Cr, and Cb values based on the PDS supplied from the control unit 565-1. Based on the stored table, the CLUT 543-1 converts the index color of the right-eye caption object supplied from the right-eye graphics plane 542-1 into image data composed of Y, Cr, and Cb values. Then, the CLUT 543-1 supplies the image data to the 3D display data generation unit 36 as the right-eye caption data.
- the CLUT 543-2 stores a table in which index colors are associated with Y, Cr, and Cb values based on the PDS supplied from the control unit 565-2. Based on the stored table, the CLUT 543-2 converts the index color of the left-eye caption object supplied from the left-eye graphics plane 542-2 into image data including Y, Cr, and Cb values. Then, the CLUT 543-2 supplies the image data to the 3D display data generation unit 36 as the left-eye caption data.
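- The CLUT step amounts to a palette lookup: each pixel of a caption object is an index color, and the table built from the PDS maps that index to Y, Cr, and Cb values (an opacity value is also carried in practice but is omitted here). The structures below are assumptions for the sketch, not the PDS syntax itself:

```python
def apply_clut(index_plane, palette):
    """Convert an index-color caption object into YCrCb image data.

    index_plane: 2D list of palette indices (one per pixel).
    palette: dict mapping index -> (Y, Cr, Cb), built from the PDS entries.
    """
    return [[palette[idx] for idx in row] for row in index_plane]

# Example with an illustrative two-entry palette.
palette = {0: (16, 128, 128), 1: (235, 128, 128)}
image = apply_clut([[0, 1, 1, 0]], palette)   # -> one row of YCrCb triples
```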
- the caption generation unit 531 clears the object buffer 563-1, the object buffer 563-2, the right-eye graphics plane 542-1, and the left-eye graphics plane 542-2 in units of epochs.
- Since the number of display sets constituting an epoch is the same between the right-eye AV stream and the left-eye AV stream, a situation in which only one of the right-eye subtitle and the left-eye subtitle is displayed does not occur.
- As a result, subtitles can always be displayed in 3D.
- FIG. 57 is a diagram for explaining a transfer instruction based on completion notifications by the control units 565-1 and 565-2.
- the control unit 565-1 monitors the storage state of the right-eye caption object for one screen by the right-eye graphics plane 542-1. When storage of the right-eye caption object for one screen is completed in the right-eye graphics plane 542-1, the control unit 565-1 notifies the control unit 565-2 of the completion.
- The control unit 565-1 then waits for a completion notification from the control unit 565-2. That is, as shown in FIG. 57, it waits until the right-eye caption object for one screen and the left-eye caption object for one screen are ready in the right-eye graphics plane 542-1 and the left-eye graphics plane 542-2, respectively. Upon receiving the completion notification, the control unit 565-1 instructs the right-eye graphics plane 542-1 to transfer.
- Similarly, the control unit 565-2 monitors the storage state of the left-eye caption object for one screen in the left-eye graphics plane 542-2. When storage of the left-eye caption object for one screen is completed in the left-eye graphics plane 542-2, the control unit 565-2 notifies the control unit 565-1 of the completion.
- The control unit 565-2 likewise waits for a completion notification from the control unit 565-1. Upon receiving the completion notification, the control unit 565-2 instructs the left-eye graphics plane 542-2 to transfer.
- As a result, the right-eye caption object for one screen and the left-eye caption object for one screen are transferred only after both are ready in the right-eye graphics plane 542-1 and the left-eye graphics plane 542-2, respectively.
- In the above, the transfers from the right-eye graphics plane 542-1 and the left-eye graphics plane 542-2 are synchronized, but the outputs from the CLUTs 543-1 and 543-2 may be synchronized instead.
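- One way to picture this handshake is as two decoder threads that each signal their own completion and wait for the peer before issuing the transfer. The sketch below uses Python threading events and is an assumption about one possible implementation, not the patent's design:

```python
import threading

class EyeDecoder:
    def __init__(self, name):
        self.name = name
        self.done = threading.Event()   # set when one screen has been stored
        self.peer = None                # the decoder for the other eye

    def store_one_screen(self):
        # ... decode ODSs and fill the graphics plane for one screen ...
        self.done.set()                 # notify completion to the peer
        self.peer.done.wait()           # wait until the peer is also ready
        self.transfer_plane()           # both planes ready: instruct transfer

    def transfer_plane(self):
        print(f"{self.name}: transfer graphics plane")

left, right = EyeDecoder("left"), EyeDecoder("right")
left.peer, right.peer = right, left
threading.Thread(target=left.store_one_screen).start()
threading.Thread(target=right.store_one_screen).start()
```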
- The menu generation unit 532 is configured in the same manner as the caption generation unit 531 of FIG. 56 except that the processing target is not caption data but menu data.
- Therefore, the right-eye menu object for one screen and the left-eye menu object for one screen are transferred only after both are ready in the right-eye graphics plane and the left-eye graphics plane, respectively. As a result, 3D display of the menu buttons can always be performed.
- FIG. 58 is a flowchart for explaining playback processing by the playback device 510. This reproduction process is started, for example, when the disk 501 is loaded in the drive 31.
- After the processing in step S304, the control unit 511 recognizes the packet numbers of the left-eye AV stream and the right-eye AV stream to be reproduced based on the playlist and the clip information file. Then, the control unit 511 instructs the drive 31 to reproduce the left-eye AV stream and the right-eye AV stream composed of packets having those packet numbers.
- In step S305, the drive 31 reads the left-eye AV stream and the right-eye AV stream to be reproduced from the disc 501 in accordance with the command from the control unit 511, and supplies them to the read buffer 32.
- In step S306, the read buffer 32 holds the left-eye AV stream and the right-eye AV stream supplied from the drive 31.
- In step S307, the PID filter 521 extracts the PES packets based on the PID of each packet of the left-eye AV stream and the right-eye AV stream from the read buffer 32.
- Specifically, the PID filter 521 extracts the PES packets of the left-eye video data, the left-eye caption data, the left-eye menu data, and the audio data based on the PID of each packet of the left-eye AV stream. Further, the PID filter 521 extracts the PES packets of the right-eye video data, the right-eye caption data, and the right-eye menu data based on the PID of each packet of the right-eye AV stream.
- In step S308, the 3D video generation unit 522 decodes the left-eye video data and the right-eye video data supplied from the PID filter 521, and generates 3D video data.
- In step S309, the 3D graphics generation unit 523 performs 3D graphics generation processing that generates 3D caption data using the left-eye and right-eye caption data and generates 3D menu data using the left-eye and right-eye menu data. Details of the 3D graphics generation processing will be described later with reference to FIG. 59.
- After the processing of step S309, the process proceeds to step S310.
- the processing in steps S310 to S313 is the same as the processing in steps S20 to S23 in FIG.
- In the above, the playback process performed immediately after the disc 501 is mounted has been described; the same playback process is performed when a title corresponding to a movie object file other than the first play is played after the disc 501 is mounted. In this case, however, the movie object file read in step S302 is the movie object file corresponding, in the index file, to the title number of the title to be reproduced.
- FIG. 59 is a flowchart for explaining the details of the 3D graphics generation processing in step S309 of FIG.
- In step S341, the subtitle generation unit 531 performs subtitle generation processing that generates 3D subtitle data using the PES packets of the left-eye subtitle data and the right-eye subtitle data. Details of the subtitle generation processing will be described later with reference to FIG. 60.
- In step S342, the menu generation unit 532 performs menu generation processing that generates 3D menu data using the PES packets of the left-eye menu data and the right-eye menu data. Then, the processing returns to step S309 in FIG. 58, and the processing from step S310 onward is performed.
- FIG. 60 is a flowchart for explaining the details of the caption generation processing in step S341 of FIG. 59.
- In step S361 of FIG. 60, the right-eye decoder 541-1 performs right-eye caption object generation processing that generates a right-eye caption object using the PES packet of the right-eye caption data from the PID filter 521. Details of the right-eye caption object generation processing will be described later with reference to FIG. 61.
- In step S362, the left-eye decoder 541-2 performs left-eye caption object generation processing that generates a left-eye caption object using the PES packet of the left-eye caption data from the PID filter 521.
- In step S363, the control units 565-1 and 565-2 determine whether storage of the right-eye caption object and the left-eye caption object for one screen has been completed. Specifically, each of the control units 565-1 and 565-2 determines whether the object buffers 563-1 and 563-2 have completed storage for one screen and whether the completion of storage for one screen has been notified from the control unit 565-2 or 565-1, respectively.
- If it is determined in step S363 that the storage of the right-eye caption object and the left-eye caption object for one screen has not been completed, the process waits until the storage is completed.
- On the other hand, if it is determined in step S363 that the right-eye caption object and the left-eye caption object for one screen have been stored, in step S364 the control units 565-1 and 565-2 instruct the object buffers 563-1 and 563-2 to transfer.
- As a result, the right-eye caption object for one screen and the left-eye caption object for one screen held in the object buffers 563-1 and 563-2 are transferred to the right-eye graphics plane 542-1 and the left-eye graphics plane 542-2, respectively.
- In step S365, the CLUTs 543-1 and 543-2 convert the right-eye caption object from the right-eye graphics plane 542-1 and the left-eye caption object from the left-eye graphics plane 542-2 into image data, respectively.
- In step S366, the CLUT 543-1 outputs the right-eye caption data obtained as a result of the conversion in step S365 to the 3D display data generation unit 36, and the CLUT 543-2 likewise outputs the left-eye caption data to the 3D display data generation unit 36. Then, the process returns to step S341 in FIG. 59 and proceeds to step S342.
- The menu generation processing in step S342 of FIG. 59 is performed in the same manner as the caption generation processing of FIG. 60 except that the processing target is not caption data but menu data, and a description thereof is therefore omitted.
- FIG. 61 is a flowchart for explaining the details of the right-eye caption object generation processing in step S361 of FIG.
- In step S381 of FIG. 61, the encoded data buffer 561-1 holds a segment of the PES packet of the right-eye caption data supplied from the PID filter 521.
- In step S382, the encoded data buffer 561-1 reads the held segment and supplies it to the stream graphics generation unit 562-1.
- In step S383, the stream graphics generation unit 562-1 supplies the PCS, PDS, and WDS supplied from the encoded data buffer 561-1 to the composition buffer 564-1, which holds them.
- In step S384, the stream graphics generation unit 562-1 decodes the ODS supplied from the encoded data buffer 561-1. Then, the stream graphics generation unit 562-1 supplies the uncompressed right-eye caption data consisting of index colors obtained as a result to the object buffer 563-1 as the right-eye caption object. In step S385, the object buffer 563-1 holds the right-eye caption object supplied from the stream graphics generation unit 562-1.
- In step S386, the object buffer 563-1 reads the held right-eye caption object in accordance with the control from the control unit 565-1 and supplies it to the right-eye graphics plane 542-1. Then, the process returns to step S361 in FIG. 60 and proceeds to step S362.
- The left-eye caption object generation processing in step S362 of FIG. 60 is the same as the right-eye caption object generation processing of FIG. 61 except that the processing target is not the right-eye caption data but the left-eye caption data, and a description thereof is therefore omitted.
- In the above, different PDSs may be provided for the right-eye display set and the left-eye display set that are played back simultaneously, but the PDS may instead be the same between the right-eye display set and the left-eye display set that are played back simultaneously. In this case, since only one CLUT is required, the implementation load of the playback device can be reduced.
- FIG. 62 is a diagram showing a configuration example of the epoch of menu data in the ninth embodiment of the disk to which the present invention is applied.
- On the disc 601, two AV streams, a left-eye AV stream and a right-eye AV stream, are recorded in the same manner as on the disc 501.
- On the disc 601, as on the disc 501, the epoch structure of the left-eye menu data is the same as the epoch structure of the right-eye menu data.
- the relationship between the left-eye display set and the right-eye display set that are played back simultaneously on the disc 601 is that the PDS is the same, and that the set offset command is described only in the left-eye ICS. Except for this, it is the same as the disk 501.
- the offset information after the change is used as the offset change information.
- Accordingly, the playback device 610 (described later) that plays back the disc 601 can change the length in the depth direction of all the subtitles and menu buttons in the screen corresponding to the ICS.
- the structure of the epoch of subtitle data and the relationship between the left-eye display set and the right-eye display set that are played back simultaneously are the same as those of the disc 501 except that the PDS is the same.
- FIG. 63 is a block diagram illustrating a configuration example of a playback device 610 that plays back the disk 601 described above.
- The configuration of the playback device 610 in FIG. 63 differs from the configuration of the playback device 510 mainly in that a control unit 611 is provided instead of the control unit 511 and a playback unit 612 is provided instead of the playback unit 512.
- the configuration of the playback unit 612 is different from the configuration of FIG. 55 in that a 3D graphics generation unit 621 is provided instead of the 3D graphics generation unit 523.
- the control unit 611 controls the reproduction unit 612 according to the command from the input unit 21, similarly to the control unit 511. Further, the control unit 611 requests a command corresponding to the menu button from the 3D graphics generation unit 621 in response to a command corresponding to the operation of the menu button from the input unit 21. Then, the control unit 611 holds the offset change information in units of screens of subtitles and menu buttons described in the set offset command transmitted as a result in the built-in register 611A. The control unit 611 supplies the 3D graphics generation unit 621 with the offset change information in units of screens of subtitles and menu buttons held in the register 611A.
- the register 611A is composed of PSR, like the register 161A, and holds offset change information for each screen of subtitles and menu buttons.
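- The round trip around the set offset command can be sketched as: the control unit asks the graphics generator for the command tied to the operated button, stores the returned screen-unit offset change information in the PSR-backed register, and then hands that information back to the graphics generator. The class and method names below are assumptions for illustration:

```python
class ControlUnit611Sketch:
    """Illustrative model of the control unit 611 / register 611A interaction."""

    def __init__(self, graphics_generator):
        self.graphics = graphics_generator
        self.register_611a = None          # holds screen-unit offset change info

    def on_offset_change_button(self):
        # Request the command associated with the operated menu button.
        cmd = self.graphics.request_command("offset_change_button")
        # The set offset command carries offset change info in screen units.
        self.register_611a = cmd.offset_change_info
        # Supply the held information back to the 3D graphics generation unit.
        self.graphics.apply_offset_change(self.register_611a)
```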
- the 3D graphics generation unit 621 includes a caption generation unit 631 and a menu generation unit 632.
- the subtitle generation unit 631 decodes the left-eye subtitle data and the right-eye subtitle data PES packets supplied from the PID filter 521 in the same manner as the subtitle generation unit 531 of FIG. Similar to the caption generation unit 531, the caption generation unit 631 supplies the left-eye caption data and the right-eye caption data obtained as a result of decoding to the 3D display data generation unit 36 as 3D caption data.
- the caption generation unit 631 updates the caption data for the left eye based on the offset change information for each caption screen transmitted from the control unit 611.
- the menu generation unit 632 decodes the left-eye menu data and the right-eye menu data PES packet supplied from the PID filter 521, similarly to the menu generation unit 532 of FIG. Similarly to the menu generation unit 532, the menu generation unit 632 supplies the left-eye menu data and the right-eye menu data obtained as a result of decoding to the 3D display data generation unit 36 as 3D menu data.
- the menu generation unit 632 transmits a set offset command included in the ICS to the control unit 611 in response to a command request corresponding to the offset change button 195 from the control unit 611. Then, the menu generation unit 632 updates the menu data for the left eye based on the screen unit offset change information transmitted from the control unit 611 as a result.
- FIG. 64 is a block diagram illustrating a detailed configuration example of the caption generation unit 631 of FIG.
- The configuration of the caption generation unit 631 in FIG. 64 differs from the configuration of FIG. 56 mainly in that a left-eye decoder 641 is provided in place of the left-eye decoder 541-2, a CLUT 642 is provided in place of the CLUTs 543-1 and 543-2, and a depth adjustment unit 643 is newly provided.
- the configuration of the left-eye decoder 641 is different from the configuration of FIG. 56 in that a control unit 651 is provided instead of the control unit 565-2.
- The control unit 651 of the left-eye decoder 641 monitors the storage state of the left-eye caption object for one screen in the left-eye graphics plane 542-2 and, when the storage is completed, notifies the control unit 565-1 of the completion. Similar to the control unit 565-2, the control unit 651 instructs the left-eye graphics plane 542-2 to transfer based on the PTS included in the PES packet header or on the notification from the control unit 565-1.
- Further, the control unit 651 controls each unit in accordance with commands from the control unit 611 (FIG. 63).
- The control unit 651 also receives the offset change information for each subtitle screen held in the register 611A and transmitted from the control unit 611, and supplies it to the depth adjustment unit 643.
- the CLUT 642 stores a table in which index colors are associated with Y, Cr, and Cb values based on the PDS supplied from the control unit 565-1.
- This table corresponds to both the left-eye caption object and the right-eye caption object.
- the CLUT 642 converts the index color of the right-eye caption object supplied from the right-eye graphics plane 542-1 into image data composed of Y, Cr, and Cb values based on the stored table. Then, the CLUT 642 supplies the image data to the depth adjustment unit 643 as right-eye caption data.
- the CLUT 642 converts the index color of the left-eye caption object supplied from the left-eye graphics plane 542-2 into image data including Y, Cr, and Cb values based on the stored table. Then, the CLUT 642 supplies the image data to the depth adjustment unit 643 as left-eye caption data.
- The depth adjustment unit 643 generates subtitle data of the subtitles obtained by shifting, in units of screens, the subtitles corresponding to the left-eye subtitle data from the CLUT 642 by the offset value in the offset direction indicated by the offset change information from the control unit 651. Then, the depth adjustment unit 643 supplies that subtitle data to the 3D display data generation unit 36 as new left-eye subtitle data. Similarly, the depth adjustment unit 643 generates subtitle data of the subtitles obtained by shifting, in units of screens, the subtitles corresponding to the right-eye subtitle data from the CLUT 642 by the offset value in the offset direction indicated by the offset change information from the control unit 651, and supplies that subtitle data to the 3D display data generation unit 36 as new right-eye subtitle data.
- Note that the depth adjustment unit 643 may be divided into a right-eye function and a left-eye function and provided between the object buffer 563-1 and the right-eye graphics plane 542-1 and between the object buffer 563-2 and the left-eye graphics plane 542-2, respectively, instead of being provided after the CLUT 642.
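- The depth adjustment itself reduces to a horizontal shift of a decoded caption plane by the offset value in the indicated direction. The sketch below shifts a row-major plane of pixels and is only an illustration; in particular, which screen side each direction maps to, and whether the two eyes are shifted with the same or opposite sign, are assumptions not restated from the patent:

```python
def shift_plane_horizontally(plane, offset_value, direction):
    """Shift caption image data sideways by offset_value pixels, padding the
    vacated columns with transparent pixels (illustrative sketch).

    plane: list of rows, each a list of pixel values.
    direction: 'positive' shifts toward one side, 'negative' toward the other;
    the mapping of direction to screen side is an assumption.
    """
    transparent = 0
    shift = offset_value if direction == "positive" else -offset_value
    shifted = []
    for row in plane:
        if shift >= 0:
            shifted.append([transparent] * shift + row[:len(row) - shift])
        else:
            shifted.append(row[-shift:] + [transparent] * (-shift))
    return shifted
```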
- The menu generation unit 632 is configured in the same manner as the caption generation unit 631 of FIG. 64 except that the processing target is not caption data but menu data. However, in response to a command request corresponding to the offset change button 195 from the control unit 611, the control unit of the left-eye decoder of the menu generation unit 632 reads the set offset command included in the ICS from the composition buffer and transmits it to the control unit 611.
- FIG. 65 is a flowchart for explaining subtitle offset change processing by the subtitle generation unit 631 of the playback device 610.
- This subtitle offset change process is started when the control unit 611 transmits offset change information in response to a command corresponding to the operation of the offset change button 195 from the input unit 21.
- In step S401 of FIG. 65, the control unit 651 receives from the control unit 611 the offset change information for each subtitle screen held in the register 611A and supplies it to the depth adjustment unit 643.
- In step S402, the depth adjustment unit 643 generates new left-eye subtitle data based on the offset change information for each subtitle screen received from the control unit 611.
- In step S403, the depth adjustment unit 643 generates new right-eye caption data based on the offset change information for each caption screen received from the control unit 611. Then, the new right-eye caption data and left-eye caption data are output to the 3D display data generation unit 36 as 3D caption data, and the process ends.
- The menu offset change processing by the menu generation unit 632 is performed in the same manner as the subtitle offset change processing of FIG. 65 except that the processing target is not subtitle data but menu data, and a description thereof is therefore omitted.
- the offset control processing by the control unit 611 is the same as the offset control processing of FIG.
- FIG. 66 is a diagram illustrating an example of subtitles displayed in 3D on the display unit 51 of the playback device 610.
- An offset change button 195, as a 3D image having a predetermined length in the depth direction, is displayed on the screen of the display unit 51 based on the offset information in units of screens included in the ICS.
- In addition, subtitle #1 and subtitle #2, as 3D images having the same length in the depth direction, are also displayed on this screen based on the left-eye display set and the right-eye display set.
- The playback device 610 holds, in the register 611A, the offset change information in units of subtitle screens described in the set offset command included in the left-eye ICS corresponding to the offset change button 195.
- subtitle data of subtitles obtained as a result of shifting the subtitles in screen units corresponding to the subtitle data for the left eye by the offset value in the offset direction indicated by the offset change information is generated as new left eye subtitle data.
- As a result, the length in the depth direction of subtitle #1 and subtitle #2 increases by a length corresponding to the offset change information for each screen held in the register 611A.
- FIG. 67 is a diagram showing a configuration example of the epoch of menu data in the tenth embodiment of the disk to which the present invention is applied.
- On the disc 671, two AV streams, a left-eye AV stream and a right-eye AV stream, are recorded in the same manner as on the disc 501.
- the epoch structure of the menu data for the left eye and the epoch structure of the menu data for the right eye are the same.
- the relationship between the left-eye menu data display set and the right-eye menu data display set that are simultaneously reproduced on the disc 671 is the same as that of the disc 501 except for the following two points.
- the two different points are that the PDS is the same and that the button unit set offset command is described only in the left-eye ICS.
- a button unit set offset command is provided in the ICS for the left eye. Accordingly, a playback device 680 (described later) that plays back the disc 671 can change the length in the depth direction of the menu button in the screen corresponding to the ICS in units of menu buttons.
- FIG. 68 is a block diagram showing a configuration example of the playback device 680 that plays back the disk 671 described above.
- The configuration of the playback device 680 in FIG. 68 differs from the configuration of the playback device 610 mainly in that a control unit 681 is provided instead of the control unit 611 and a playback unit 682 is provided instead of the playback unit 612.
- the configuration of the playback unit 682 is different from the configuration of FIG. 63 in that a 3D graphics generation unit 691 is provided instead of the 3D graphics generation unit 621.
- The control unit 681 controls the playback unit 682 in response to commands from the input unit 21, as with the control unit 611 in FIG. 63. Further, the control unit 681 requests the 3D graphics generation unit 691 for a set offset command corresponding to the menu button in response to a command corresponding to the operation of the menu button from the input unit 21. Then, the control unit 681 supplies the menu generation unit 701 with the offset change information and button ID for each menu button described in the button-unit set offset command transmitted from the menu generation unit 701 as a result.
- The 3D graphics generation unit 691 includes the above-described caption generation unit 531 and a menu generation unit 701. Similar to the menu generation unit 632 in FIG. 63, the menu generation unit 701 decodes the PES packets of the left-eye menu data and the right-eye menu data supplied from the PID filter 521. Similarly to the menu generation unit 632, the menu generation unit 701 supplies the left-eye menu data and right-eye menu data obtained as a result of the decoding to the 3D display data generation unit 36 as 3D menu data.
- the menu generation unit 701 transmits a button unit set offset command included in the ICS to the control unit 681 in response to a command request corresponding to the offset change button 195 from the control unit 681. Then, the menu generation unit 701 updates the menu data for the left eye based on the offset change information and button ID for each menu button transmitted from the control unit 681 as a result.
- FIG. 69 is a block diagram illustrating a detailed configuration example of the menu generation unit 701 in FIG.
- the menu generation unit 701 includes a right-eye decoder 711-1, a left-eye decoder 711-2, a right-eye graphics plane 712-1, a left-eye graphics plane 712-2, a CLUT 713, and a depth adjustment unit 714. Composed.
- the right-eye decoder 711-1 includes an encoded data buffer 721-1, a stream graphics generation unit 722-1, an object buffer 723-1, a composition buffer 724-1, and a control unit 725-1.
- the left-eye decoder 711-2 includes an encoded data buffer 721-2, a stream graphics generation unit 722-2, an object buffer 723-2, a composition buffer 724-2, and a control unit 725-2.
- the configuration other than the control unit 725-2 and the depth adjustment unit 714 is the same as the configuration of the menu generation unit 632 in FIG. 63, and thus description thereof is omitted.
- The control unit 725-2 monitors the storage state of the left-eye menu object for one screen in the left-eye graphics plane 712-2 and notifies the control unit 725-1 of the completion of the storage.
- The control unit 725-2 instructs the left-eye graphics plane 712-2 to transfer based on the ICS from the composition buffer 724-2 or on the notification from the control unit 725-1.
- Further, the control unit 725-2 controls each unit in accordance with commands from the control unit 681 (FIG. 68).
- The control unit 725-2 reads the button-unit set offset command included in the ICS from the composition buffer 724-2 in response to a command request corresponding to the offset change button 195 from the control unit 681, and transmits it to the control unit 681. Further, the control unit 725-2 receives the offset change information and button ID for each menu button transmitted from the control unit 681 as a result. The control unit 725-2 supplies the received offset change information in units of menu buttons to the depth adjustment unit 714 as offset change information in units of the ODSs corresponding to the button IDs transmitted together with it.
- The depth adjustment unit 714 generates menu data of the menu buttons obtained by shifting each menu button in the screen corresponding to the left-eye menu data from the CLUT 713, based on the offset change information in ODS units corresponding to that menu button. Then, the depth adjustment unit 714 supplies the menu data to the 3D display data generation unit 36 as new left-eye menu data. The depth adjustment unit 714 also generates menu data of the menu buttons obtained by shifting each menu button in the screen corresponding to the right-eye menu data from the CLUT 713, based on the offset change information in ODS units corresponding to that menu button. Then, the depth adjustment unit 714 supplies the menu data to the 3D display data generation unit 36 as new right-eye menu data.
- FIG. 70 is a flowchart for explaining menu button offset change processing by the menu generation unit 701 of the playback device 680.
- This menu button offset change processing is started when the control unit 681 requests the menu generation unit 701 for a command corresponding to the offset change button 195 in response to a command corresponding to the operation of the offset change button 195 from the input unit 21.
- In step S421 of FIG. 70, the control unit 725-2 reads the button-unit set offset command included in the ICS from the composition buffer 724-2 in response to a command request corresponding to the offset change button 195 from the control unit 681.
- In step S422, the control unit 725-2 transmits the button-unit set offset command read in step S421 to the control unit 681.
- the control unit 681 transmits the menu button unit offset change information and the button ID described in the button unit set offset command transmitted from the control unit 725-2 to the control unit 725-2.
- In step S423, the control unit 725-2 receives the offset change information and button ID for each menu button from the control unit 681.
- the control unit 725-2 recognizes the ODS corresponding to the button ID received from the control unit 681, based on the button ID included in the ICS held in the composition buffer 724-2. Then, the control unit 725-2 supplies the offset change information in units of menu buttons received from the control unit 681 to the depth adjustment unit 714 as offset change information in ODS units of the recognized ODS.
- In step S424, the depth adjustment unit 714 generates new left-eye menu data and right-eye menu data based on the ODS-unit offset change information supplied from the control unit 725-2.
- In step S425, the depth adjustment unit 714 outputs the new left-eye menu data and right-eye menu data generated in step S424 to the 3D display data generation unit 36, and the process ends.
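- The button-unit variant differs from the screen-unit case only in granularity: each button ID from the command is resolved through the ICS to the ODS carrying that button's image, and the shift is applied to that image alone. A minimal sketch with assumed data shapes, reusing the shift helper from the earlier subtitle sketch, follows:

```python
def apply_button_offsets(button_images, button_offsets, shift_fn):
    """Shift each menu button image by its own offset (illustrative sketch).

    button_images: dict of button_id -> image (list of pixel rows), where each
                   button ID has been resolved through the ICS to its ODS.
    button_offsets: dict of button_id -> (offset_value, direction) taken from
                    the button-unit set offset command.
    shift_fn: e.g. shift_plane_horizontally from the earlier sketch.
    """
    shifted = {}
    for button_id, image in button_images.items():
        if button_id in button_offsets:
            value, direction = button_offsets[button_id]
            shifted[button_id] = shift_fn(image, value, direction)
        else:
            shifted[button_id] = image      # no offset change for this button
    return shifted
```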
- FIG. 71 is a diagram showing an example of menu buttons displayed in 3D on the display unit 51 of the playback device 680.
- Menu button #1, menu button #2, and an offset change button 195, as 3D images having the same length in the depth direction, are displayed on this screen based on the left-eye display set and the right-eye display set.
- The playback device 680 generates menu data obtained by shifting the position of each left-eye menu button currently displayed, based on the offset change information for each menu button. Then, the playback device 680 uses that menu data as new left-eye menu data. Further, the playback device 680 generates menu data obtained by shifting the position of each right-eye menu button currently displayed, based on the offset change information for each menu button. Then, the playback device 680 uses that menu data as new right-eye menu data.
- As a result, the length in the depth direction of menu button #1, menu button #2, and the offset change button 195 increases by a length corresponding to the offset change information for each menu button in the button-unit set offset command.
- Note that the right-eye menu button image and the left-eye menu button image of one menu button must not overlap with the right-eye menu button image or the left-eye menu button image of a different menu button, respectively.
- FIG. 72 is a diagram showing a configuration example of the epoch of menu data in the eleventh embodiment of the disk to which the present invention is applied.
- On the disc 751, the epoch structure of the left-eye menu data and the epoch structure of the right-eye menu data are the same.
- the relationship between the left-eye display set and the right-eye display set that are played back simultaneously on the disc 751 is the same as that of the disc 501 except for the following two points.
- the two different points are that the PDS is the same and that the 2D display command is described only in the left-eye ICS.
- the 2D display command describes information indicating a command for using the left-eye menu data as the menu data for both eyes.
- Accordingly, the playback device 760 (described later) that plays back the disc 751 can perform 2D display of the menu buttons.
- The structure of the epoch of subtitle data and the relationship between the left-eye display set and the right-eye display set that are played back simultaneously are the same as those of the disc 501 except that the PCS is the same, and a description thereof is therefore omitted.
- FIG. 73 is a block diagram illustrating a configuration example of a playback device 760 that plays the above-described disc 751.
- The configuration of the playback device 760 in FIG. 73 differs from the configuration of the playback device 510 mainly in that a control unit 771 is provided instead of the control unit 511 and a playback unit 772 is provided instead of the playback unit 512.
- the configuration of the playback unit 772 is different from the configuration of FIG. 55 in that a 3D graphics generation unit 781 is provided instead of the 3D graphics generation unit 523.
- the control unit 771 controls the playback unit 772 in accordance with a command from the input unit 21, similarly to the control unit 511. Further, the control unit 771 requests a command corresponding to the menu button from the 3D graphics generation unit 781 in response to a command corresponding to the operation of the menu button from the input unit 21. Then, the control unit 771 supplies a command corresponding to the 2D display command transmitted as a result to the 3D graphics generation unit 781.
- the 3D graphics generation unit 781 includes a caption generation unit 791 and a menu generation unit 792.
- the caption generation unit 791 decodes the PES packets of the left-eye caption data and the right-eye caption data supplied from the PID filter 521, similarly to the caption generation unit 531 of FIG. Similar to the caption generation unit 531, the caption generation unit 791 supplies the left-eye caption data and the right-eye caption data obtained as a result of decoding to the 3D display data generation unit 36 as 3D caption data. In addition, the caption generation unit 791 updates 3D caption data based on a command transmitted from the control unit 771.
- the menu generation unit 792 decodes the left-eye menu data and the right-eye menu data PES packet supplied from the PID filter 521, similarly to the menu generation unit 532 of FIG. Similarly to the menu generation unit 532, the menu generation unit 792 supplies the left-eye menu data and the right-eye menu data obtained as a result of decoding to the 3D display data generation unit 36 as 3D menu data.
- the menu generation unit 792 transmits a 2D display command included in the ICS of the menu data for the left eye to the control unit 771 in response to a command request corresponding to the 2D display button from the control unit 771. Then, the menu generation unit 792 updates the 3D menu data based on the command transmitted from the control unit 771 as a result.
- FIG. 74 is a block diagram illustrating a detailed configuration example of the caption generation unit 791 of FIG.
- The configuration of the caption generation unit 791 in FIG. 74 differs from the configuration of FIG. 56 in that a right-eye decoder 801-1 and a left-eye decoder 801-2 are provided instead of the right-eye decoder 541-1 and the left-eye decoder 541-2, a right-eye graphics plane 802 is provided instead of the right-eye graphics plane 542-1, and a CLUT 642 is provided instead of the CLUTs 543-1 and 543-2.
- the configuration of the right-eye decoder 801-1 is different from the configuration of FIG. 56 in that a control unit 811-1 is provided instead of the control unit 565-1.
- The configuration of the left-eye decoder 801-2 differs from the configuration of FIG. 56 in that an object buffer 810 is provided instead of the object buffer 563-2 and a control unit 811-2 is provided instead of the control unit 565-2.
- The control unit 811-1 of the right-eye decoder 801-1 monitors the storage state of the right-eye caption object for one screen in the right-eye graphics plane 802 and, when the storage is completed, notifies the control unit 811-2 of the completion. Similarly to the control unit 565-2, the control unit 811-1 instructs the right-eye graphics plane 802 to transfer based on the PTS included in the PES packet header or on the notification from the control unit 811-2.
- Further, the control unit 811-1 controls each unit in accordance with commands from the control unit 771 (FIG. 73).
- the object buffer 810 of the left-eye decoder 801-2 holds the left-eye caption object supplied from the stream graphics generation unit 562-2.
- The object buffer 810 deletes the held left-eye caption object in units of epochs. Further, the object buffer 810 reads out the held left-eye caption object in accordance with control from the control unit 811-2 and supplies it to the left-eye graphics plane 542-2.
- Also, in accordance with control from the control unit 811-2, the object buffer 810 supplies the held left-eye caption object to both the right-eye graphics plane 802 and the left-eye graphics plane 542-2.
- The control unit 811-2 monitors the storage state of the left-eye caption object for one screen in the left-eye graphics plane 542-2 and, when the storage is completed, notifies the control unit 811-1 of the completion. Similarly to the control unit 565-2, the control unit 811-2 instructs the left-eye graphics plane 542-2 to transfer based on the PTS included in the PES packet header or on the notification from the control unit 811-1.
- Further, the control unit 811-2 controls each unit in accordance with commands from the control unit 771 (FIG. 73).
- In response to a command transmitted from the control unit 771, the control unit 811-2 instructs the object buffer 810 to transfer the left-eye caption object to the right-eye graphics plane 802 and the left-eye graphics plane 542-2.
- the right-eye graphics plane 802 holds the right-eye caption object for one screen supplied from the object buffer 563-1, similarly to the right-eye graphics plane 542-1.
- the right-eye graphics plane 802 holds the left-eye caption object for one screen supplied from the object buffer 810 as the right-eye caption object for one screen.
- the right-eye graphics plane 802 erases the retained right-eye caption object in units of epochs, similarly to the right-eye graphics plane 542-1. Similarly to the right-eye graphics plane 542-1, the right-eye graphics plane 802 reads the stored right-eye caption object and supplies it to the CLUT 642 in response to a transfer instruction from the control unit 811-1.
- The menu generation unit 792 is configured in the same manner as the caption generation unit 791 in FIG. 74 except that the processing target is not caption data but menu data.
- the control unit of the left-eye decoder of the menu generation unit 792 reads the 2D display command included in the ICS from the composition buffer in response to a command request corresponding to the 2D display button from the control unit 771, and controls the control unit 771. Send to.
- FIG. 75 is a flowchart for explaining subtitle display change processing by the subtitle generation unit 791 of the playback device 760.
- This subtitle display change processing is started when the control unit 771 requests a 2D display command from the 3D graphics generation unit 781 in response to a command instructing 2D display from the input unit 21.
- In step S441 of FIG. 75, the control units 811-1 and 811-2 receive the command from the control unit 771.
- In step S442, the control unit 811-1 controls the object buffer 563-1 in accordance with the command received in step S441 and stops the reading of the right-eye caption object from the object buffer 563-1.
- In step S443, the control unit 811-2 controls the object buffer 810 in accordance with the command received in step S441 and transfers the left-eye caption object in the object buffer 810 to the right-eye graphics plane 802.
- the right-eye graphics plane 802 holds the left-eye caption object as the right-eye caption object. Then, the process proceeds to step S444.
- The processing of steps S444 to S447 is the same as the processing of steps S363 to S366 in FIG. 60.
- As a result, the right-eye caption data and the left-eye caption data become the same caption data, corresponding to the left-eye AV stream. Thereby, the user sees a 2D display of the subtitles. Therefore, the user can change the 3D display of subtitles to 2D display by instructing 2D display using the input unit 21, for example when the user's eyes are tired.
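- The 2D display switch therefore reduces to routing the left-eye object to both planes while the right-eye object is ignored. A minimal sketch of that switch, with assumed names, is shown below:

```python
class CaptionPlanesSketch:
    """Illustrative model of the 2D display command handling (cf. FIG. 75)."""

    def __init__(self):
        self.use_left_for_both = False

    def on_2d_display_command(self):
        # Corresponds to stopping the read-out of the right-eye caption object.
        self.use_left_for_both = True

    def fill_planes(self, left_object, right_object):
        left_plane = left_object
        # When 2D display is requested, the left-eye caption object is also
        # held by the right-eye graphics plane, so both eyes see the same image.
        right_plane = left_object if self.use_left_for_both else right_object
        return left_plane, right_plane
```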
- The menu display change processing by the menu generation unit 792 is performed in the same manner as the subtitle display change processing of FIG. 75 except that the processing target is not subtitle data but menu data, and a description thereof is therefore omitted.
- In the above, the 2D display command describes information indicating a command to use the left-eye caption data as the caption data for both eyes; however, information indicating a command to generate caption data common to both eyes from the left-eye and right-eye caption data may be described instead.
- FIG. 76 is a diagram illustrating a detailed configuration example of the caption generation unit 791 in such a case.
- The configuration of the caption generation unit 791 in FIG. 76 differs from the configuration of FIG. 74 in that a right-eye decoder 541-1 and a left-eye decoder 851 are provided instead of the right-eye decoder 801-1 and the left-eye decoder 801-2, a right-eye graphics plane 542-1 is provided instead of the right-eye graphics plane 802, and a 2D conversion unit 852 is newly provided.
- The configuration of the left-eye decoder 851 differs from the configuration of the left-eye decoder 801-2 in that an object buffer 563-2 is provided instead of the object buffer 810 and a control unit 861 is provided instead of the control unit 811-2.
- The control unit 861 of the left-eye decoder 851 monitors the storage state of the left-eye caption object for one screen in the left-eye graphics plane 542-2 and, when the storage is completed, notifies the control unit 565-1 of the completion. Similar to the control unit 811-2, the control unit 861 instructs the left-eye graphics plane 542-2 to transfer based on the PTS included in the PES packet header or on the notification from the control unit 565-1.
- Further, the control unit 861 controls each unit in accordance with commands from the control unit 771 (FIG. 73).
- The control unit 861 also receives the command transmitted from the control unit 771 and supplies it to the 2D conversion unit 852.
- the 2D conversion unit 852 generates common subtitle data for both eyes from the subtitle data for the left eye and the subtitle data for the right eye output from the CLUT 642 in response to a command supplied from the control unit 861.
- the 2D conversion unit 852 supplies the generated common subtitle data for both eyes to the 3D display data generation unit 36 as subtitle data for the left eye and subtitle data for the right eye.
- FIG. 77 is a diagram illustrating an example of a method for generating common caption data for both eyes by the 2D conversion unit 852 in FIG. 76. Transferring either left-eye or right-eye caption data as synthesized caption data is the simplest 2D conversion method.
- The position on the screen of the left-eye caption 871 is (XL, YL), and the position on the screen of the right-eye caption 872 is (XR, YR); YL and YR are the same.
- the 2D conversion unit 852 generates caption data obtained as a result of changing the position of each caption corresponding to the caption data for the right eye to the position (X, Y) as the caption data for the right eye. Further, the 2D conversion unit 852 generates caption data obtained as a result of changing the position of each caption corresponding to the caption data for the left eye to the position (X, Y) as caption data for the left eye.
- the user can view the 2D display of the caption.
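- The common position (X, Y) is not spelled out here; one natural choice, shown in the sketch below purely as an assumption, is the horizontal midpoint of the left-eye and right-eye positions together with the shared Y coordinate:

```python
def common_2d_position(left_pos, right_pos):
    """Pick a single on-screen position for both eyes (illustrative sketch).

    left_pos = (XL, YL), right_pos = (XR, YR); YL and YR are assumed equal.
    Using the midpoint in X is an assumption, not taken from the patent text.
    """
    (xl, yl), (xr, _yr) = left_pos, right_pos
    return ((xl + xr) // 2, yl)

# Both captions are then moved to this (X, Y), so the two eyes see the
# caption at the same screen position and it appears as a 2D image.
x, y = common_2d_position((100, 400), (110, 400))   # -> (105, 400)
```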
- the set offset command and the 2D display command are described in the menu data for the left eye, but the set offset command and the 2D display command may be described in the menu data for the right eye. Further, a set offset command and a 2D display command may be described in both left-eye and right-eye menu data.
- The ICS may also be provided with a 3D display command for switching from 2D display back to 3D display.
- By the 3D display command, the processing corresponding to the 2D display command is returned to the original processing.
- In the above description, one menu button or one subtitle corresponds to one ODS. Therefore, the “ODS unit offset information” described above is “offset information in units of menu buttons or subtitles”. The same applies to the “ODS unit offset change information”: when the button-unit set offset command is executed, the offset information is changed in units of menu buttons.
- a personal computer shown in FIG. 78 may be employed as at least a part of the above-described playback device.
- A CPU (Central Processing Unit) 901 executes various processes according to a program recorded in a ROM (Read Only Memory) 902 or a program loaded from a storage unit 908 into a RAM (Random Access Memory) 903.
- the RAM 903 also appropriately stores data necessary for the CPU 901 to execute various processes.
- the CPU 901, ROM 902, and RAM 903 are connected to each other via a bus 904.
- An input / output interface 905 is also connected to the bus 904.
- To the input/output interface 905, an input unit 906 made up of a keyboard, a mouse, and the like, an output unit 907 made up of a display and the like, a storage unit 908 made up of a hard disk and the like, and a communication unit 909 made up of a modem, a terminal adapter, and the like are connected.
- the communication unit 909 controls communication performed with other devices (not shown) via a network including the Internet.
- A drive 910 is connected to the input/output interface 905 as necessary, a removable medium 911 made of a magnetic disk, an optical disk, a magneto-optical disk, a semiconductor memory, or the like is mounted as appropriate, and a computer program read therefrom is installed in the storage unit 908 as necessary.
- When a series of processes is executed by software, a program constituting the software is installed from a network or a recording medium into a computer incorporated in dedicated hardware, or into a general-purpose personal computer capable of executing various functions by installing various programs.
- A recording medium containing such a program is configured not only by the removable medium (package medium) 911, which is distributed to provide the program to the user separately from the apparatus main body and consists of a magnetic disk (including a floppy disk), an optical disk (including a CD-ROM (Compact Disc-Read Only Memory), a DVD (Digital Versatile Disc), and a Blu-ray Disc), a magneto-optical disk (including an MD (Mini-Disc)), or a semiconductor memory on which the program is recorded, but also by the ROM 902 on which the program is recorded and the hard disk included in the storage unit 908, which are provided to the user in a state of being incorporated in the apparatus main body in advance.
- In the above, the 3D video generation unit 34 generates the right-eye and left-eye video data using the PES packet of the video data supplied from the PID filter 33; however, if right-eye and left-eye video data are recorded on the disc, the same processing as that of the 3D video generation unit 522 may be performed. In this case, the 3D video generation unit 34 decodes the PES packet of the left-eye video data supplied from the PID filter 33 to generate the left-eye video data, and decodes the PES packet of the right-eye video data supplied from the PID filter 33 to generate the right-eye video data.
- FIG. 79 is a diagram illustrating an example of the syntax of the PCS.
- FIG. 80 is a diagram illustrating an example of the syntax of ICS.
- An interactive_composition is arranged in the ICS, and offset information, button information, and the like are described in the interactive_composition.
- FIG. 81 is a diagram for explaining display data for the left eye and right eye generated by the 3D display data generation unit 36.
- The 3D display data generation unit 36 generates left-eye display data by combining three kinds of data: the left-eye video data, the left-eye caption data, and the left-eye menu data. Also, as shown in FIG. 81B, the 3D display data generation unit 36 generates right-eye display data by combining the right-eye video data, the right-eye caption data, and the right-eye menu data. Note that the superimposition order of the video data, caption data, and menu data for each eye is, from the bottom, video data, caption data, and menu data.
- FIG. 82 is a block diagram illustrating a detailed configuration example of the 3D display data generation unit 36.
- the 3D display data generation unit 36 includes a left-eye display data generation unit 1000 and a right-eye display data generation unit 1010.
- the left-eye display data generation unit 1000 includes a left-eye video plane 1001, a transmission unit 1002, a transmission unit 1003, a combination unit 1004, a transmission unit 1005, a transmission unit 1006, and a combination unit 1007.
- the left-eye video plane 1001 holds the left-eye video data supplied from the 3D video generation unit 34 (522).
- the transmission unit 1002 reads the left-eye video data held in the left-eye video plane 1001.
- the transmission unit 1002 converts the read left-eye video data so that the left-eye main image is transmitted at a preset transmittance of (1-α1L).
- the transmission unit 1002 supplies the converted video data for the left eye to the synthesis unit 1004.
- the transmission unit 1003 converts the left-eye caption data supplied from the caption generation unit 41 (111, 181, 231, 431, 531, 631, 791) so that the left-eye caption is transmitted at a preset transmittance of α1L.
- the transmission unit 1003 supplies the converted left-eye caption data to the synthesis unit 1004.
- the synthesizing unit 1004 synthesizes the left-eye video data supplied from the transmitting unit 1002 and the left-eye caption data supplied from the transmitting unit 1003, and supplies the resulting data to the transmitting unit 1005.
- the transmission unit 1005 converts the data so that the image corresponding to the data supplied from the synthesis unit 1004 is transmitted at a transmittance of (1-α2L), and supplies the converted data to the synthesis unit 1007.
- the transmission unit 1006 converts the left-eye menu data supplied from the menu generation unit 42 (112, 182, 232, 331, 432, 532, 632, 701, 792) so that the left-eye menu buttons are transmitted at a preset transmittance.
- the transmission unit 1006 supplies the converted menu data for the left eye to the synthesis unit 1007.
- the combining unit 1007 combines the data supplied from the transmission unit 1005 and the menu data supplied from the transmission unit 1006, and outputs the resulting data as display data for the left eye.
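- the chain of transmission and combining units described above is, in effect, ordinary alpha blending of three planes stacked with the video at the bottom, the captions above it, and the menu buttons on top; the following sketch (a minimal per-pixel Python illustration, with blend factors a1 and a2 standing in for the transmittance parameters rather than reproducing the patent's exact notation) shows one way such left-eye display data could be composed.

```python
def blend(bottom, top, alpha_top):
    """Alpha-blend two samples: the top layer contributes alpha_top, the bottom 1 - alpha_top."""
    return alpha_top * top + (1.0 - alpha_top) * bottom

def compose_left_eye_sample(video, caption, menu, a1, a2):
    """Compose one left-eye output sample from the three planes.

    a1: weight of the caption over the video (the video passes at 1 - a1).
    a2: weight of the menu over the combined video-plus-caption result.
    """
    video_and_caption = blend(video, caption, a1)  # roughly the role of combining unit 1004
    return blend(video_and_caption, menu, a2)      # roughly the role of combining unit 1007

# usage on a single luma sample
print(compose_left_eye_sample(video=100, caption=200, menu=50, a1=0.5, a2=0.25))
```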
- the right-eye display data generation unit 1010 includes a right-eye video plane 1011, a transmission unit 1012, a transmission unit 1013, a combination unit 1014, a transmission unit 1015, a transmission unit 1016, and a combination unit 1017.
- the processing of each unit of the right-eye display data generation unit 1010 is the same as the processing of each unit of the left-eye display data generation unit 1000 except that the processing target is data for the right eye, and thus description thereof is omitted.
- in the configuration described above, the caption data and menu data for each eye are generated in the 3D graphics generation unit 35 (101, 171, 221, 321, 421, 523, 621, 691, 781), but the 3D display data generation unit 36 may instead generate the caption data and menu data for each eye.
- in this case, the playback device 20 is configured as shown in FIG. 83.
- FIG. 83 is a block diagram showing another configuration example of the playback device 20 of FIG.
- the configuration of FIG. 83 differs from the configuration of FIG. 13 in that a playback unit 1030 is provided instead of the playback unit 23.
- the playback unit 1030 of the playback device 20 in FIG. 83 differs from the playback unit 23 of FIG. 13 in that a 3D graphics generation unit 1031 and a 3D display data generation unit 1032 are provided in place of the 3D graphics generation unit 35 and the 3D display data generation unit 36. In FIG. 83, the same components as those in FIG. 13 are denoted by the same reference numerals, and repeated description will be omitted as appropriate.
- the 3D graphics generation unit 1031 includes a caption generation unit 1041 and a menu generation unit 1042.
- the caption generation unit 1041 generates caption data and offset information for each screen using the PES packets of the caption data supplied from the PID filter 33, and supplies them to the 3D display data generation unit 1032. Details of the caption generation unit 1041 will be described with reference to FIG. 84.
- the menu generation unit 1042 generates menu data and offset information for each screen using the PES packet of the menu data supplied from the PID filter 33, and supplies it to the 3D display data generation unit 1032.
- the 3D display data generation unit 1032 generates left-eye caption data and right-eye caption data as 3D caption data based on the caption data and the offset information for each screen supplied from the caption generation unit 1041. Similarly, the 3D display data generation unit 1032 generates left-eye menu data and right-eye menu data as 3D menu data based on the menu data and the offset information for each screen supplied from the menu generation unit 1042.
- the 3D display data generation unit 1032 combines the 3D video data supplied from the 3D video generation unit 34, the 3D caption data, and the 3D menu data for each of the left eye and the right eye.
- the 3D display data generation unit 1032 supplies the resulting left-eye display data and right-eye display data to the display unit 51 as 3D display data. Details of the 3D display data generation unit 1032 will be described with reference to FIG. 85.
- FIG. 84 is a block diagram illustrating a detailed configuration example of the caption generation unit 1041 in FIG. 83.
- the configuration of the caption generation unit 1041 in FIG. 84 differs from the configuration of FIG. 14 in that the 3D generation unit 64, the right-eye graphics plane 65, and the left-eye graphics plane 66 are not provided, and a CLUT 1051 and a control unit 1052 are provided instead of the CLUT 67 and the control unit 69. Of the configurations shown in FIG. 84, configurations the same as those in FIG. 14 are denoted by the same reference numerals, and repeated description will be omitted as appropriate.
- like the CLUT 67 of FIG. 14, the CLUT 1051 of the caption generation unit 1041 stores a table in which index colors are associated with Y, Cr, and Cb values, based on the PDS supplied from the control unit 1052.
- the CLUT 1051 reads the caption object from the object buffer 63 and converts the index color of the caption object into image data composed of Y, Cr, and Cb values based on the stored table. Then, the CLUT 1051 supplies the image data of the caption object to the 3D display data generation unit 1032 (FIG. 83) as caption data.
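- the CLUT lookup described here is, per pixel, a table lookup from an index color to a (Y, Cr, Cb) triple; the sketch below (a minimal Python illustration with an assumed table layout, not the actual CLUT 1051 implementation) shows the idea.

```python
def apply_clut(index_pixels, clut):
    """Convert an index-color caption object into (Y, Cr, Cb) image data via a CLUT.

    index_pixels: 2D list of palette indices (the caption object from the object buffer).
    clut: dict mapping palette index -> (Y, Cr, Cb), as would be built from the PDS.
    """
    return [[clut[i] for i in row] for row in index_pixels]

# usage: a 2x2 caption object drawn with two palette entries
clut = {0: (16, 128, 128), 1: (235, 128, 128)}  # roughly black and white in Y, Cr, Cb
caption_data = apply_clut([[0, 1], [1, 0]], clut)
```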
- the control unit 1052 reads the offset information for each screen included in the PCS from the composition buffer 68 and supplies it to the 3D display data generation unit 1032. Further, the control unit 1052 reads the PDS from the composition buffer 68 and supplies it to the CLUT 1051. Furthermore, the control unit 1052 controls each unit in accordance with commands from the control unit 22 (FIG. 83).
- the menu generation unit 1042 is configured in the same manner as the caption generation unit 1041 of FIG. 84 except that the processing target is not caption data but menu data, so its illustration is omitted.
- FIG. 85 is a block diagram illustrating a detailed configuration example of the 3D display data generation unit 1032 in FIG. 83.
- the 3D display data generation unit 1032 includes a subtitle plane 1061, a menu plane 1062, a left-eye display data generation unit 1063, and a right-eye display data generation unit 1064.
- the subtitle plane 1061 holds subtitle data supplied from the subtitle generation unit 1041 (FIG. 83) of the 3D graphics generation unit 1031.
- the menu plane 1062 holds menu data supplied from the menu generation unit 1042 (FIG. 83) of the 3D graphics generation unit 1031.
- the configuration of the left-eye display data generation unit 1063 differs from that of the left-eye display data generation unit 1000 (FIG. 82) in that an offset addition unit 1071 and an offset addition unit 1072 are newly provided. In FIG. 85, the same configurations as those in FIG. 82 are denoted by the same reference numerals, and overlapping descriptions will be omitted as appropriate.
- the offset adding unit 1071 reads the caption data from the caption plane 1061.
- the offset adding unit 1071 generates left-eye caption data from the read caption data based on the offset information for each screen supplied from the caption generation unit 1041. Specifically, the offset adding unit 1071 generates, as the left-eye caption data, the caption data obtained by shifting the screen's caption corresponding to the read caption data by the offset value in the offset direction indicated by the offset information.
- the offset addition unit 1071 supplies the left-eye caption data to the transmission unit 1003.
- the offset adding unit 1072 reads the menu data from the menu plane 1062.
- the offset adding unit 1072 generates left-eye menu data from the read menu data based on the offset information for each screen supplied from the menu generating unit 1042. Specifically, the offset adding unit 1072 generates, as the left-eye menu data, the menu data obtained by shifting the screen's menu buttons corresponding to the read menu data by the offset value in the offset direction indicated by the offset information.
- the offset adding unit 1072 supplies the menu data for the left eye to the transmission unit 1006.
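- the offset addition performed by the offset adding units 1071 and 1072 amounts to a horizontal shift of an entire graphics plane; the sketch below (a minimal Python illustration that assumes the plane is a 2D array of pixels and that the offset direction is encoded as +1 or -1, which is not the patent's encoding) shows how a left-eye plane could be produced, with the right-eye plane conventionally shifted the opposite way.

```python
def shift_plane(plane, offset_value, direction, fill=0):
    """Shift a graphics plane horizontally by offset_value pixels.

    plane: list of rows, each row a list of pixel values.
    direction: +1 shifts right, -1 shifts left (assumed encoding of the offset direction).
    Pixels shifted in from the edge are filled with `fill` (e.g. transparent).
    """
    width = len(plane[0])
    shift = max(-width, min(width, offset_value * direction))
    shifted = []
    for row in plane:
        if shift >= 0:
            shifted.append([fill] * shift + row[:width - shift])
        else:
            shifted.append(row[-shift:] + [fill] * (-shift))
    return shifted

def make_stereo_planes(plane, offset_value, direction):
    """Left-eye and right-eye planes from one 2D caption or menu plane (assumed convention)."""
    return (shift_plane(plane, offset_value, direction),
            shift_plane(plane, offset_value, -direction))
```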
- the configuration of the right-eye display data generation unit 1064 is different from the configuration of the right-eye display data generation unit 1010 (FIG. 82) in that an offset addition unit 1081 and an offset addition unit 1082 are newly provided.
- the processing of each unit of the right-eye display data generation unit 1064 is the same as the processing of each unit of the left-eye display data generation unit 1063 except that the left-eye data is replaced with right-eye data, and thus its description is omitted.
- note that, in this specification, the steps describing the program recorded on the recording medium include not only processing performed in chronological order according to the described order, but also processing executed in parallel or individually without necessarily being processed in chronological order.
- the present invention can also be applied to a playback device that first identifies the display system, including displays that do not support 3D display, and then outputs an image signal suited to that display system.
- the present invention can be applied not only to a playback device that plays back a disc, but also to a playback device that plays back a broadcast signal of a digital broadcast or a signal distributed by IP (Internet Protocol).
- the disc described above may be a disc other than a BD-ROM.
Abstract
Description
[Configuration Example of a First Embodiment of the Disc]
FIG. 1 is a diagram showing a configuration example of a first embodiment of a disc to which the present invention is applied.
FIG. 2 is a diagram showing a detailed configuration example of the index file.
FIG. 3 is a diagram showing a detailed configuration example of the movie object file.
FIG. 5 is a diagram showing a detailed configuration example of the playlist file.
FIG. 6 is a diagram for explaining a detailed configuration example of the clip information file.
FIG. 7 is a diagram showing a detailed configuration example of the stream file.
FIG. 8 is a diagram for explaining extraction of PES packets.
FIG. 13 is a block diagram showing a configuration example of the playback device 20 that plays back the disc 11 described above.
FIG. 14 is a block diagram showing a detailed configuration example of the caption generation unit 41 of FIG. 13.
The menu generation unit 42 is configured in the same manner as the caption generation unit 41 of FIG. 14 except that the processing target is not caption data but menu data, so its illustration is omitted.
FIG. 15 is a flowchart explaining playback processing by the playback device 20. This playback processing is started, for example, when the disc 11 is loaded into the drive 31.
FIG. 18 is a diagram showing an example of captions displayed in 3D on the display unit 51 of the playback device 20.
[Configuration Example of Display Sets in a Second Embodiment of the Disc]
FIG. 19 is a diagram showing a configuration example of a display set of caption data in a second embodiment of the disc to which the present invention is applied, and FIG. 20 is a diagram showing a configuration example of a display set of menu data.
FIG. 21 is a block diagram showing a configuration example of the playback device 90 that plays back the disc 81 described above.
FIG. 22 is a block diagram showing a detailed configuration example of the caption generation unit 111 of the playback device 90.
The menu generation unit 112 is configured in the same manner as the caption generation unit 111 of FIG. 22 except that the processing target is not caption data but menu data, so its illustration is omitted.
The playback processing and 3D graphics generation processing by the playback device 90 are the same as the playback processing of FIG. 15 and the 3D graphics generation processing of FIG. 16, respectively, and thus their descriptions are omitted.
FIG. 24 is a diagram showing an example of captions displayed in 3D on the display unit 51 of the playback device 90.
[Configuration Example of Display Sets in a Third Embodiment of the Disc]
FIG. 25 is a diagram showing a configuration example of a display set of caption data in a third embodiment of the disc to which the present invention is applied, and FIG. 26 is a diagram showing a configuration example of a display set of menu data.
FIG. 27 is a block diagram showing a configuration example of the playback device 160 that plays back the disc 151 described above.
FIG. 28 is a block diagram showing a detailed configuration example of the caption generation unit 181 of the playback device 160.
The playback processing, 3D graphics generation processing, and caption generation processing by the playback device 160 are the same as the playback processing of FIG. 15, the 3D graphics generation processing of FIG. 16, and the caption generation processing of FIG. 17, respectively, and thus their descriptions are omitted.
FIG. 31 is a diagram showing an example of captions displayed in 3D on the display unit 51 of the playback device 160.
[Configuration Example of Display Sets in a Fourth Embodiment of the Disc]
FIG. 32 is a diagram showing a configuration example of a display set of caption data in a fourth embodiment of the disc to which the present invention is applied, and FIG. 33 is a diagram showing a configuration example of a display set of menu data.
FIG. 34 is a block diagram showing a configuration example of the playback device 210 that plays back the disc 201 described above.
FIG. 35 is a block diagram showing a detailed configuration example of the caption generation unit 231 of the playback device 210.
The playback processing, 3D graphics generation processing, caption offset change processing, and offset control processing by the playback device 210 are the same as the playback processing of FIG. 15, the 3D graphics generation processing of FIG. 16, the offset change processing of FIG. 29, and the offset control processing of FIG. 30, respectively, and thus their descriptions are omitted.
FIG. 37 is a diagram showing an example of captions displayed in 3D on the display unit 51 of the playback device 210.
FIG. 39 is a block diagram showing a configuration example of the playback device 310 that plays back the disc 301 described above.
FIG. 40 is a block diagram showing a detailed configuration example of the menu generation unit 331 of FIG. 39.
The playback processing and 3D graphics generation processing by the playback device 310 are the same as the playback processing of FIG. 15 and the 3D graphics generation processing of FIG. 16, respectively, and thus their descriptions are omitted. In addition, the caption generation processing and caption offset change processing by the playback device 310 are the same as the caption generation processing of FIG. 36 and the offset change processing of FIG. 29, respectively, and thus their descriptions are omitted.
FIG. 43 is a diagram showing an example of menu buttons displayed in 3D on the display unit 51 of the playback device 310.
[Configuration Example of Display Sets in a Sixth Embodiment of the Disc]
FIG. 44 is a diagram showing a configuration example of a display set of caption data in a sixth embodiment of the disc to which the present invention is applied, and FIG. 45 is a diagram showing a configuration example of a display set of menu data.
FIG. 46 is a block diagram showing a configuration example of the playback device 410 that plays back the disc 401 described above.
FIG. 47 is a block diagram showing a detailed configuration example of the caption generation unit 431 of the playback device 410.
The menu generation unit 432 of the playback device 410 is configured in the same manner as the caption generation unit 431 of FIG. 47 except that the processing target is not caption data but menu data, so its illustration is omitted. However, in response to a request for the command corresponding to the 2D display button from the control unit 411, the control unit of the menu generation unit 432 reads the 2D display command contained in the ICS from the composition buffer and transmits it to the control unit 411.
The playback processing, 3D graphics generation processing, and caption generation processing by the playback device 410 are the same as the playback processing of FIG. 15, the 3D graphics generation processing of FIG. 16, and the caption generation processing of FIG. 17, respectively, and thus their descriptions are omitted.
FIG. 50 is a block diagram showing a configuration example of a playback device that plays back a disc according to a seventh embodiment to which the present invention is applied.
The playback processing, 3D graphics generation processing, and caption generation processing by the playback device 460 are the same as the playback processing of FIG. 15, the 3D graphics generation processing of FIG. 16, and the caption generation processing of FIG. 17, respectively, and thus their descriptions are omitted.
[Configuration Example of Display Sets in an Eighth Embodiment of the Disc]
FIG. 53 is a diagram showing a configuration example of an epoch of caption data in an eighth embodiment of the disc to which the present invention is applied.
FIG. 54 is a diagram for explaining windows corresponding to display sets of caption data.
FIG. 55 is a block diagram showing a configuration example of the playback device 510 that plays back the disc 501 described above.
FIG. 56 is a block diagram showing a detailed configuration example of the caption generation unit 531 of FIG. 55.
Although not illustrated, the menu generation unit 532 is configured in the same manner as the caption generation unit 531 of FIG. 56 except that the processing target is not caption data but menu data.
FIG. 58 is a flowchart explaining playback processing by the playback device 510. This playback processing is started, for example, when the disc 501 is loaded into the drive 31.
[Configuration Example of Display Sets in a Ninth Embodiment of the Disc]
FIG. 62 is a diagram showing a configuration example of an epoch of menu data in a ninth embodiment of the disc to which the present invention is applied.
FIG. 63 is a block diagram showing a configuration example of the playback device 610 that plays back the disc 601 described above.
FIG. 64 is a block diagram showing a detailed configuration example of the caption generation unit 631 of FIG. 63.
Although not illustrated, the menu generation unit 632 is configured in the same manner as the caption generation unit 631 of FIG. 64 except that the processing target is not caption data but menu data. However, in response to a request for the command corresponding to the offset change button 195 from the control unit 611, the control unit of the left-eye decoder of the menu generation unit 632 reads the set offset command contained in the ICS from the composition buffer and transmits it to the control unit 611.
FIG. 66 is a diagram showing an example of captions displayed in 3D on the display unit 51 of the playback device 610.
[Configuration Example of Display Sets in a Tenth Embodiment of the Disc]
FIG. 67 is a diagram showing a configuration example of an epoch of menu data in a tenth embodiment of the disc to which the present invention is applied.
FIG. 68 is a block diagram showing a configuration example of the playback device 680 that plays back the disc 671 described above.
FIG. 69 is a block diagram showing a detailed configuration example of the menu generation unit 701 of FIG. 68.
The playback processing, 3D graphics generation processing, caption generation processing, and right-eye caption object generation processing by the playback device 680 are the same as the playback processing of FIG. 58, the 3D graphics generation processing of FIG. 59, the caption generation processing of FIG. 60, and the right-eye caption object generation processing of FIG. 61, respectively, and thus their descriptions are omitted.
FIG. 71 is a diagram showing an example of menu buttons displayed in 3D on the display unit 51 of the playback device 680.
[Configuration Example of Display Sets in an Eleventh Embodiment of the Disc]
FIG. 72 is a diagram showing a configuration example of an epoch of menu data in an eleventh embodiment of the disc to which the present invention is applied.
FIG. 73 is a block diagram showing a configuration example of the playback device 760 that plays back the disc 751 described above.
FIG. 74 is a block diagram showing a detailed configuration example of the caption generation unit 791 of FIG. 73.
Although not illustrated, the menu generation unit 792 is configured in the same manner as the caption generation unit 791 of FIG. 74 except that the processing target is not caption data but menu data. However, in response to a request for the command corresponding to the 2D display button from the control unit 771, the control unit of the left-eye decoder of the menu generation unit 792 reads the 2D display command contained in the ICS from the composition buffer and transmits it to the control unit 771.
The playback processing, 3D graphics generation processing, caption generation processing, and right-eye caption object generation processing by the playback device 760 are the same as the playback processing of FIG. 58, the 3D graphics generation processing of FIG. 59, the caption generation processing of FIG. 60, and the right-eye caption object generation processing of FIG. 61, respectively, and thus their descriptions are omitted.
FIG. 76 is a diagram showing a detailed configuration example of the caption generation unit 791 in such a case.
FIG. 77 is a diagram explaining an example of a method by which the 2D conversion unit 852 of FIG. 76 generates caption data common to both eyes. The simplest 2D conversion method is to transfer either the left-eye or the right-eye caption data as the combined caption data.
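As a trivial illustration of that simplest 2D conversion (a sketch only, not the 2D conversion unit 852 itself), selecting one eye's caption data as the common output could look like this:

```python
def to_2d(left_eye_caption, right_eye_caption, use_left=True):
    """Simplest 2D conversion: pass one eye's caption data through unchanged."""
    return left_eye_caption if use_left else right_eye_caption
```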
The series of processes described above can be executed by hardware, but it can also be executed by software.
FIG. 79 is a diagram showing an example of the syntax of the PCS, and FIG. 80 is a diagram showing an example of the syntax of the ICS.
FIG. 81 is a diagram explaining the left-eye and right-eye display data generated by the 3D display data generation unit 36.
[Configuration Example of the Playback Device]
FIG. 83 is a block diagram showing another configuration example of the playback device 20 of FIG. 13.
FIG. 84 is a block diagram showing a detailed configuration example of the caption generation unit 1041 of FIG. 83.
The menu generation unit 1042 is configured in the same manner as the caption generation unit 1041 of FIG. 84 except that the processing target is not caption data but menu data, so its illustration is omitted.
FIG. 85 is a block diagram showing a detailed configuration example of the 3D display data generation unit 1032 of FIG. 83.
Claims (20)
1. A data structure comprising: image data of a sub-image consisting of a caption or a menu button, used for 2D (two-dimensional) display of the sub-image; and offset information consisting of an offset direction, which represents the shift direction of an L image for the left eye and an R image for the right eye used for 3D display of the sub-image in units of screens relative to the sub-image in units of screens corresponding to the image data, and an offset value, which represents the shift amount.
2. The data structure according to claim 1, wherein the offset information consists of an offset direction, which represents the shift direction of the L image and the R image in units of sub-images relative to the sub-image in units of sub-images corresponding to the image data, and an offset value, which represents the shift amount.
3. The data structure according to claim 1 or 2, wherein the sub-image consists of the menu button, and the data structure further includes a set offset command for setting offset change information, the set offset command containing the offset change information, which represents changed offset information in units of screens.
4. The data structure according to claim 3, wherein the set offset command is a command for setting offset change information, the set offset command containing the offset change information, which represents changed offset information in units of menu buttons.
5. The data structure according to claim 2 or 4, wherein the offset information is determined so that, in the screen of whichever of the L image and the R image of the sub-image does not correspond to the image data, the sub-images do not overlap one another and are not positioned outside the screen.
6. A recording medium on which data having the data structure according to any one of claims 1 to 5 is recorded.
7. A playback device which, when playing back data having a data structure including image data of a sub-image consisting of a caption or a menu button, used for 2D (two-dimensional) display of the sub-image, and offset information consisting of an offset direction, which represents the shift direction of an L image for the left eye and an R image for the right eye used for 3D display of the sub-image in units of screens relative to the sub-image in units of screens corresponding to the image data, and an offset value, which represents the shift amount: reads the image data contained in the data; generates image data of the L image and the R image in units of screens from the image data in units of screens based on the offset information; and outputs the image data of the L image and the R image in units of screens.
8. The playback device according to claim 7, wherein the offset information consists of an offset direction, which represents the shift direction of the L image and the R image in units of sub-images relative to the sub-image in units of sub-images corresponding to the image data, and an offset value, which represents the shift amount, and the playback device generates image data of the L image and the R image in units of sub-images from the image data in units of sub-images based on the offset information.
9. The playback device according to claim 7 or 8, wherein the sub-image consists of the menu button, the data structure further includes a set offset command for setting offset change information, the set offset command containing the offset change information, which represents changed offset information in units of screens, and the playback device further generates image data of the L image and the R image in units of screens from the image data in units of screens contained in the data based on the offset change information, and updates the image data of the L image and the R image in units of screens.
10. The playback device according to claim 9, comprising holding means for holding the offset change information contained in the set offset command, wherein the playback device generates the image data of the L image and the R image in units of screens from the image data in units of screens contained in the data based on the offset change information held in the holding means.
11. The playback device according to claim 9, wherein the set offset command is a command for setting the offset information in units of menu buttons, the set offset command containing offset change information, which represents changed offset information in units of menu buttons, and the playback device further generates image data of the L image and the R image in units of menu buttons from the image data in units of menu buttons contained in the data based on the offset change information, and updates the image data of the L image and the R image in units of screens.
12. A playback method comprising a step in which a playback device that plays back data having a data structure including image data of a sub-image consisting of a caption or a menu button, used for 2D (two-dimensional) display of the sub-image, and offset information consisting of an offset direction, which represents the shift direction of an L image for the left eye and an R image for the right eye used for 3D display of the sub-image in units of screens relative to the sub-image in units of screens corresponding to the image data, and an offset value, which represents the shift amount: reads the image data contained in the data; generates image data of the L image and the R image in units of screens from the image data in units of screens based on the offset information; and outputs the image data of the L image and the R image in units of screens.
13. A program for causing a computer that executes control for playing back data having a data structure including image data of a sub-image consisting of a caption or a menu button, used for 2D (two-dimensional) display of the sub-image, and offset information consisting of an offset direction, which represents the shift direction of an L image for the left eye and an R image for the right eye used for 3D display of the sub-image in units of screens relative to the sub-image in units of screens corresponding to the image data, and an offset value, which represents the shift amount, to execute control processing including the steps of: reading the image data contained in the data; generating image data of the L image and the R image in units of screens from the image data in units of screens based on the offset information; and outputting the image data of the L image and the R image in units of screens.
14. A data structure comprising: image data of an L image for the left eye and an R image for the right eye of a menu button, used for 3D (three-dimensional) display of the menu button; and a set offset command for setting offset information, the set offset command containing the offset information, which consists of an offset direction representing a shift direction in units of screens and an offset value representing a shift amount for each of the image data of the L image and the image data of the R image.
15. The data structure according to claim 14, wherein the set offset command contains offset information consisting of an offset direction representing a shift direction of the image in units of menu buttons and an offset value representing a shift amount for each of the image data of the L image and the image data of the R image.
16. A recording medium on which data having the data structure according to claim 14 or 15 is recorded.
17. A playback device which, when playing back data having a data structure including image data of an L image for the left eye and an R image for the right eye of a menu button, used for 3D (three-dimensional) display of the menu button, and a set offset command for setting offset information, the set offset command containing the offset information, which consists of an offset direction representing a shift direction in units of screens and an offset value representing a shift amount for each of the image data of the L image and the image data of the R image: reads and outputs the image data of the L image and the R image in units of screens contained in the data; updates the image data of the L image and the image data of the R image in units of screens based on the offset information contained in the set offset command; and outputs the updated image data of the L image and the R image in units of screens.
18. The playback device according to claim 17, wherein the set offset command contains offset information consisting of an offset direction representing a shift direction of the image in units of menu buttons and an offset value representing a shift amount for each of the image data of the L image and the image data of the R image, and the playback device updates the image data of the L image and the image data of the R image in units of menu buttons based on the offset information contained in the set offset command.
19. A playback method comprising a step in which a playback device that plays back data having a data structure including image data of an L image for the left eye and an R image for the right eye of a menu button, used for 3D (three-dimensional) display of the menu button, and a set offset command for setting offset information, the set offset command containing the offset information, which consists of an offset direction representing a shift direction in units of screens and an offset value representing a shift amount for each of the image data of the L image and the image data of the R image: reads and outputs the image data of the L image and the R image in units of screens contained in the data; updates the image data of the L image and the image data of the R image in units of screens based on the offset information contained in the set offset command; and outputs the updated image data of the L image and the R image in units of screens.
20. A program for causing a computer that executes control for playing back data having a data structure including image data of an L image for the left eye and an R image for the right eye of a menu button, used for 3D (three-dimensional) display of the menu button, and a set offset command for setting offset information, the set offset command containing the offset information, which consists of an offset direction representing a shift direction in units of screens and an offset value representing a shift amount for each of the image data of the L image and the image data of the R image, to execute control processing including the steps of: reading and outputting the image data of the L image and the R image in units of screens contained in the data; updating the image data of the L image and the image data of the R image in units of screens based on the offset information contained in the set offset command; and outputting the updated image data of the L image and the R image in units of screens.
Priority Applications (3)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
CN201080001837.8A CN102282859B (zh) | 2009-04-15 | 2010-04-09 | 数据结构、记录介质、播放设备和播放方法以及程序 |
EP10764398.3A EP2421273A4 (en) | 2009-04-15 | 2010-04-09 | DATA STRUCTURE, RECORDING MEDIUM, REPRODUCTION DEVICE AND METHOD, AND PROGRAM THEREOF |
US12/999,293 US20120020640A1 (en) | 2009-04-15 | 2010-04-09 | Data structure, recording medium, playing device and playing method, and program |
Applications Claiming Priority (4)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
JP2009099412 | 2009-04-15 | ||
JP2009-099412 | 2009-04-15 | ||
JP2010063055A JP2010268431A (ja) | 2009-04-15 | 2010-03-18 | データ構造、記録媒体、再生装置および再生方法、並びにプログラム |
JP2010-063055 | 2010-03-18 |
Publications (1)
Publication Number | Publication Date |
---|---|
WO2010119814A1 true WO2010119814A1 (ja) | 2010-10-21 |
Family
ID=42982476
Family Applications (1)
Application Number | Title | Priority Date | Filing Date |
---|---|---|---|
PCT/JP2010/056418 WO2010119814A1 (ja) | 2009-04-15 | 2010-04-09 | データ構造、記録媒体、再生装置および再生方法、並びにプログラム |
Country Status (5)
Country | Link |
---|---|
US (1) | US20120020640A1 (ja) |
EP (1) | EP2421273A4 (ja) |
JP (1) | JP2010268431A (ja) |
CN (1) | CN102282859B (ja) |
WO (1) | WO2010119814A1 (ja) |
Families Citing this family (5)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
JP2011041249A (ja) * | 2009-05-12 | 2011-02-24 | Sony Corp | データ構造および記録媒体、並びに、再生装置、再生方法、プログラム、およびプログラム格納媒体 |
JP4957831B2 (ja) * | 2009-08-18 | 2012-06-20 | ソニー株式会社 | 再生装置および再生方法、並びに記録装置および記録方法 |
JP5502436B2 (ja) * | 2009-11-27 | 2014-05-28 | パナソニック株式会社 | 映像信号処理装置 |
KR20120017649A (ko) * | 2010-08-19 | 2012-02-29 | 삼성전자주식회사 | 디스플레이장치 및 그 제어방법 |
JP5302285B2 (ja) * | 2010-10-28 | 2013-10-02 | シャープ株式会社 | 立体映像出力装置、立体映像出力方法、立体映像出力プログラムおよびコンピュータ読み取り可能な記録媒体、ならびに、立体映像表示装置 |
Family Cites Families (10)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
JPH11289555A (ja) * | 1998-04-02 | 1999-10-19 | Toshiba Corp | 立体映像表示装置 |
US6864894B1 (en) * | 2000-11-17 | 2005-03-08 | Hewlett-Packard Development Company, L.P. | Single logical screen system and method for rendering graphical data |
AU2003234905A1 (en) * | 2003-05-07 | 2004-11-26 | Seijiro Tomita | Method and device for displaying image |
JP2005049668A (ja) * | 2003-07-30 | 2005-02-24 | Sharp Corp | データ変換装置、表示装置、データ変換方法、プログラム及び記録媒体 |
JP2005229384A (ja) * | 2004-02-13 | 2005-08-25 | Nippon Hoso Kyokai <Nhk> | マルチメディア情報配受信システム、マルチメディア情報配信装置およびマルチメディア情報受信装置 |
JP3746506B2 (ja) * | 2004-03-08 | 2006-02-15 | 一成 江良 | 立体視化パラメータ埋込装置及び立体視画像再生装置 |
KR100649523B1 (ko) * | 2005-06-30 | 2006-11-27 | 삼성에스디아이 주식회사 | 입체 영상 표시 장치 |
US8335425B2 (en) * | 2008-11-18 | 2012-12-18 | Panasonic Corporation | Playback apparatus, playback method, and program for performing stereoscopic playback |
US8269821B2 (en) * | 2009-01-27 | 2012-09-18 | EchoStar Technologies, L.L.C. | Systems and methods for providing closed captioning in three-dimensional imagery |
JP4588120B2 (ja) * | 2009-02-19 | 2010-11-24 | パナソニック株式会社 | 再生装置、記録方法、記録媒体再生システム |
2010
- 2010-03-18 JP JP2010063055A patent/JP2010268431A/ja active Pending
- 2010-04-09 WO PCT/JP2010/056418 patent/WO2010119814A1/ja active Application Filing
- 2010-04-09 EP EP10764398.3A patent/EP2421273A4/en not_active Withdrawn
- 2010-04-09 US US12/999,293 patent/US20120020640A1/en not_active Abandoned
- 2010-04-09 CN CN201080001837.8A patent/CN102282859B/zh not_active Expired - Fee Related
Patent Citations (4)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
JP2004104742A (ja) * | 2002-09-11 | 2004-04-02 | Remedia:Kk | 立体映像データの発生方法と装置 |
JP2004304767A (ja) | 2003-01-30 | 2004-10-28 | Sony Corp | 再生装置、再生方法、再生プログラムおよび記録媒体 |
WO2008044191A2 (en) * | 2006-10-11 | 2008-04-17 | Koninklijke Philips Electronics N.V. | Creating three dimensional graphics data |
WO2010010709A1 (ja) * | 2008-07-24 | 2010-01-28 | パナソニック株式会社 | 立体視再生が可能な再生装置、再生方法、プログラム |
Non-Patent Citations (1)
Title |
---|
See also references of EP2421273A4 * |
Cited By (5)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
WO2012063675A1 (ja) * | 2010-11-08 | 2012-05-18 | ソニー株式会社 | 立体画像データ送信装置、立体画像データ送信方法および立体画像データ受信装置 |
JP2012120142A (ja) * | 2010-11-08 | 2012-06-21 | Sony Corp | 立体画像データ送信装置、立体画像データ送信方法および立体画像データ受信装置 |
EP2806644A4 (en) * | 2012-01-18 | 2014-11-26 | Panasonic Corp | TRANSMISSION DEVICE, VIDEO DISPLAY DEVICE, TRANSMISSION METHOD, VIDEO PROCESSING METHOD, VIDEO PROCESSING PROGRAM AND INTEGRATED CIRCUIT |
EP2806644A1 (en) * | 2012-01-18 | 2014-11-26 | Panasonic Corporation | Transmission device, video display device, transmission method, video processing method, video processing program, and integrated circuit |
US9872008B2 (en) | 2012-01-18 | 2018-01-16 | Panasonic Corporation | Display device and video transmission device, method, program, and integrated circuit for displaying text or graphics positioned over 3D video at varying depths/degrees |
Also Published As
Publication number | Publication date |
---|---|
EP2421273A4 (en) | 2013-06-19 |
US20120020640A1 (en) | 2012-01-26 |
EP2421273A1 (en) | 2012-02-22 |
JP2010268431A (ja) | 2010-11-25 |
CN102282859A (zh) | 2011-12-14 |
CN102282859B (zh) | 2014-10-29 |
Similar Documents
Publication | Publication Date | Title |
---|---|---|
JP4985807B2 (ja) | 再生装置および再生方法 | |
JP2010250562A (ja) | データ構造、記録媒体、再生装置および再生方法、並びにプログラム | |
WO2010119814A1 (ja) | データ構造、記録媒体、再生装置および再生方法、並びにプログラム | |
JP2010252055A (ja) | データ構造、記録媒体、再生装置および再生方法、並びにプログラム | |
JP4985890B2 (ja) | 再生装置および再生方法、並びに記録方法 |
Legal Events
Date | Code | Title | Description |
---|---|---|---|
WWE | Wipo information: entry into national phase |
Ref document number: 201080001837.8 Country of ref document: CN |
WWE | Wipo information: entry into national phase |
Ref document number: 2010764398 Country of ref document: EP |
121 | Ep: the epo has been informed by wipo that ep was designated in this application |
Ref document number: 10764398 Country of ref document: EP Kind code of ref document: A1 |
WWE | Wipo information: entry into national phase |
Ref document number: 8180/CHENP/2010 Country of ref document: IN |
WWE | Wipo information: entry into national phase |
Ref document number: 12999293 Country of ref document: US |
NENP | Non-entry into the national phase |
Ref country code: DE |