WO2010113730A1 - Information processing device, information processing method, and program
- Publication number: WO2010113730A1 (PCT/JP2010/055134)
- Authority: WIPO (PCT)
- Prior art keywords: image, video, graphics, plane, mode
Classifications
- H04N9/8227: Recording of colour television signals involving the multiplexing of an additional signal and the colour video signal, the additional signal being at least another television signal
- G11B27/329: Table of contents on a disc [VTOC]
- H04N13/128: Adjusting depth or disparity
- H04N13/139: Format conversion, e.g. of frame-rate or size
- H04N13/156: Mixing image signals
- H04N13/161: Encoding, multiplexing or demultiplexing different image signal components
- H04N13/178: Metadata, e.g. disparity information
- H04N13/183: On-screen display [OSD] information, e.g. subtitles or menus
- H04N13/189: Recording image signals; Reproducing recorded image signals
- H04N21/42646: Internal components of the client for reading from or writing on a non-volatile solid state storage medium, e.g. DVD, CD-ROM
- H04N9/8205: Recording of colour television signals involving the multiplexing of an additional signal and the colour video signal
- G11B20/10527: Audio or video recording; Data buffering arrangements
- G11B2020/1062: Data buffering arrangements, e.g. recording or playback buffers
- G11B2020/10629: Data buffering arrangements, the buffer having a specific structure
- G11B2220/2541: Blu-ray discs; Blue laser DVR discs
- H04N5/85: Television signal recording using optical recording on discs or drums
Definitions
- The present invention relates to an information processing device, an information processing method, and a program, and more particularly to an information processing device, an information processing method, and a program capable of, for example, appropriately reproducing 3D (Dimension) image content from a recording medium.
- two-dimensional (2D) image content is the mainstream as content for movies and the like, but recently, three-dimensional (3D) image (graphic) content capable of stereoscopic viewing has attracted attention.
- There are various methods for displaying 3D images (hereinafter also referred to as stereo images). Regardless of the method used, the data amount of a 3D image is larger than that of a 2D image.
- high-resolution image content such as movies may have a large capacity, and in order to record such a large-capacity image content as a 3D image with a large amount of data, a large-capacity recording medium is required.
- Examples of such a large-capacity recording medium include the Blu-ray (registered trademark) Disc (hereinafter also referred to as BD), such as the BD-ROM (Read Only Memory).
- BD can handle BD-J (BD Java (registered trademark)), and BD-J can provide advanced interactive functions (Patent Document 1).
- the current BD standard does not stipulate how to record and play back 3D image content on a BD.
- the 3D image content may not be played back properly.
- the present invention has been made in view of such a situation, and makes it possible to appropriately reproduce 3D image content from a recording medium such as a BD.
- An information processing apparatus or a program according to one aspect of the present invention is an information processing apparatus, or a program for causing a computer to function as such an apparatus, in which a video plane that stores a video image is a storage area in which an L area, which is an image storage area for one image that stores a left-eye image, and an R area, which is an image storage area for one image that stores a right-eye image, are arranged side by side, wherein one of an L API (Application Programming Interface) for setting the size and position of the image stored in the L area and an R API for setting the size and position of the image stored in the R area sets the same size and position as the size and position of the image set by the other API, and wherein a graphics plane offset value, which is data for generating a left-eye image and a right-eye image from an original image by giving parallax to a graphics image, and a PG plane offset value, which is data for generating a left-eye image and a right-eye image from an original image by giving parallax to a PG (Presentation Graphics) image, are scaled at the scaling ratio used when scaling is performed to set the size of the video image.
- An information processing method according to one aspect of the present invention is an information processing method for an apparatus in which the video plane that stores a video image is a storage area in which the L area, which is an image storage area for one image that stores a left-eye image, and the R area, which is an image storage area for one image that stores a right-eye image, are arranged side by side, wherein one of the L API for setting the size and position of the image stored in the L area and the R API for setting the size and position of the image stored in the R area sets, for the video plane, the same size and position as the size and position of the image set by the other API.
- In one aspect of the present invention as described above, one of the L API, which is an API for setting the size and position of an image stored in the L area, and the R API, which is an API for setting the size and position of an image stored in the R area, sets the same size and position as the size and position of the image set by the other API. Further, the graphics plane offset value, which is data for generating a left-eye image and a right-eye image from an original image by giving parallax to a graphics image, and the PG plane offset value, which is data for generating a left-eye image and a right-eye image from an original image by giving parallax to a PG (Presentation Graphics) image, are scaled at the scaling ratio used when scaling is performed to set the size of the video image.
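- The following is a minimal sketch of the behaviour summarized above, under the assumption that the player simply multiplies the stored offset values by the same ratio used to scale the video; the class and method names are illustrative and not taken from the BD specification.

```java
// Illustrative sketch: when scaling is performed to set the size of the video
// image, the graphics plane offset value and the PG plane offset value are
// scaled at the same scaling ratio, so the parallax stays consistent with the
// scaled video.
public class OffsetScaling {
    double graphicsPlaneOffset;   // data giving parallax to the graphics image (pixels)
    double pgPlaneOffset;         // data giving parallax to the PG image (pixels)

    /** Called when scaling is performed to set the size of the video image. */
    void onVideoScaled(double scalingRatio) {
        graphicsPlaneOffset *= scalingRatio;
        pgPlaneOffset *= scalingRatio;
    }
}
```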
- the information processing apparatus may be an independent apparatus or an internal block constituting one apparatus.
- the program can be provided by being transmitted through a transmission medium or by being recorded on a recording medium.
- 3D image content can be appropriately reproduced.
- The drawings include: a block diagram illustrating a hardware configuration example of a BD player; a diagram explaining the outline of the BDMV format; a diagram describing drawing of a graphics 3D image on the graphics plane 11 by a BD-J application; a diagram illustrating the graphics modes in which a BD-J application reproduces a graphics image by drawing a graphics 3D image on the graphics plane 11; block diagrams showing functional configuration examples of a 3D-compatible player; a diagram describing switching between 3D image playback and 2D image playback in a 3D-compatible player; a diagram explaining the setting of the position and size of the video by the author and the correction of the position and size of the video by the 3D-compatible player; a diagram showing the first focus method and the second focus method; a flowchart explaining focus management in a 3D-compatible player; and a diagram showing the position on the display screen where the 3D image of a cursor appears and the position of the cursor on the graphics plane.
- First, the management structure (hereinafter also referred to as the BDMV format) of AV (Audio/Video) data and the like recorded on a BD-ROM, which is a read-only type of BD, as defined by "Blu-ray Disc Read-Only Format Ver1.0 part3 Audio Visual Specifications", will be described.
- a bit stream encoded by an encoding method such as MPEG (Moving Picture Experts Group) video or MPEG audio and multiplexed according to the MPEG2 system is called a clip AV stream (or AV stream).
- the clip AV stream is recorded on the BD as a file by a file system defined by "Blu-ray Disc Read-Only Format part2", which is one of the BD standards.
- A file storing a clip AV stream is called a clip AV stream file (or AV stream file).
- the clip AV stream file is a management unit on the file system, and information necessary for reproducing the clip AV stream file (the clip AV stream) is recorded on the BD as a database.
- This database is defined by "Blu-ray Disc Read-Only Format part3", one of the BD standards.
- FIG. 1 is a diagram for explaining the outline of the BDMV format.
- BDMV format consists of 4 layers.
- the lowest layer is a layer to which the clip AV stream belongs, and is hereinafter also referred to as a clip layer as appropriate.
- the layer one layer above the clip layer is a layer to which a playlist (Movie PlayList) belongs for designating a playback position for the clip AV stream, and is also referred to as a playlist layer hereinafter.
- The layer one level above the playlist layer is a layer to which movie objects (Movie Object), each composed of commands that designate the playback order of playlists and the like, belong, and is hereinafter also referred to as the object layer.
- the layer above the object layer (the highest layer) is a layer to which an index table for managing titles stored in the BD belongs, and is also referred to as an index layer hereinafter.
- the clip layer, playlist layer, object layer, and index layer will be further described.
- a clip AV stream is a stream in which video data or audio data as content data is in the form of TS (MPEG2 TS (Transport Stream)).
- Clip Information is information about the clip AV stream, and is recorded on the BD as a file.
- the clip AV stream includes graphics streams such as subtitles and menus as necessary.
- the subtitle (graphics) stream is called a presentation graphics (PG (Presentation Graphics)) stream
- the menu (graphics) stream is called an interactive graphics (IG (Interactive Graphics)) stream.
- a set of a clip AV stream file and a file (clip information file) of corresponding clip information (clip information related to the clip AV stream of the clip AV stream file) is called a clip.
- a clip is one object composed of a clip AV stream and clip information.
- a plurality of positions including the first and last positions (time) when the content corresponding to the clip AV stream constituting the clip is expanded on the time axis are set as access points.
- An access point is mainly designated with a time stamp by a playlist (PlayList) in a higher layer.
- the clip information constituting the clip includes the address (logical address) of the position of the clip AV stream represented by the access point designated by the time stamp in the playlist.
- A playlist (Movie PlayList) belongs to the playlist layer.
- the playlist is composed of a play item (PlayItem) including an AV stream file to be played, a playback start point (IN point) for specifying a playback position of the AV stream file, and a playback end point (OUT point).
- a playlist is composed of a set of play items.
- playback of a play item means playback of a section of a clip AV stream specified by an IN point and an OUT point included in the play item.
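- The play item structure described above can be pictured roughly as in the following data-structure sketch; the field names are illustrative and not taken from the BD specification.

```java
// Illustrative data-structure sketch of the playlist layer: a playlist is a
// set of play items, and each play item designates a section of a clip AV
// stream with a playback start point (IN point) and a playback end point
// (OUT point).
class PlayItem {
    String clipAvStreamFile;   // the clip AV stream file to be played
    long inPoint;              // playback start point (IN point)
    long outPoint;             // playback end point (OUT point)
}

class PlayList {
    PlayItem[] playItems;      // a playlist is composed of a set of play items
    // Playing back one play item means playing the section of the clip AV
    // stream between its IN point and its OUT point.
}
```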
- Movie objects (Movie Objects) and BD-J objects (Blu-ray Discs Java (registered trademark) Objects) belong to the object layer.
- A movie object includes an HDMV (High Definition Movie) navigation command program (navigation commands) and terminal information associated with the movie object.
- the navigation command is a command for controlling playback of the playlist.
- the terminal information includes information for permitting a user's interactive operation on a BD player that plays BD.
- user operations such as menu call and title search are controlled based on terminal information.
- BD-J object is a Java (registered trademark) program and can provide a user with a more advanced (sophisticated) interactive function than a navigation command.
- the index table belongs to the index layer.
- the index table is a top-level table that defines the title of the BD-ROM disc.
- An entry (column) in the index table corresponds to a title, and each entry is linked to an object (movie object, BD-J object) of a title (HDMV title, BD-J title) corresponding to the entry. .
- Fig. 2 is a diagram for explaining the BD file management structure defined by "Blu-ray Disc Read-Only Format part 3".
- In the following, a file "under" a directory means a file placed immediately under that directory, and a file "included in" a directory means a file placed immediately under that directory or a file placed under a so-called subdirectory of that directory.
- the top level directory of BD is the root directory.
- the directory “BDMV” and the directory “CERTIFICATE” exist immediately under the root directory.
- the directory “CERTIFICATE” stores copyright information (files).
- the directory “BDMV” stores the files in the BDMV format described in FIG.
- the file “index.bdmv” includes the index table described with reference to FIG. 1 as information related to the menu for playing the BD.
- Based on the file "index.bdmv", the BD player displays, for example, an initial menu (screen) including items such as playing all of the BD content, playing only specific chapters, playing repeatedly, and displaying a predetermined menu.
- a movie object (Movie Object) to be executed when each item is selected can be set.
- The BD player executes the Movie Object commands set in the file "index.bdmv".
- the file “MovieObject.bdmv” is a file including information about Movie Object.
- Movie Object includes a command for controlling playback of the PlayList recorded on the BD.
- The BD player plays content (a title) recorded on the BD by selecting and executing one of the Movie Objects recorded on the BD.
- the directory “PLAYLIST” stores a playlist database. That is, the playlist file “xxxxx.mpls” is stored in the directory “PLAYLIST”.
- As the file name of the file "xxxxx.mpls", a file name composed of a 5-digit number "xxxxx" and the extension "mpls" is used.
- The directory "CLIPINF" stores a database of clips. That is, the directory "CLIPINF" stores a clip information file "xxxxx.clpi" for each clip AV stream file. As the file name of the clip information file "xxxxx.clpi", a file name composed of a 5-digit number "xxxxx" and the extension "clpi" is used.
- the directory “STREAM” stores the clip AV stream file “xxxxx.m2ts”.
- TS is stored in the clip AV stream file “xxxxx.m2ts”.
- the file names of the clip information file “xxxxx.clpi” and the clip AV stream file “xxxxx.m2ts” constituting a certain clip are the same file names except for the extension. Thereby, the clip information file “xxxxx.clpi” and the clip AV stream file “xxxxx.m2ts” constituting a certain clip can be easily specified.
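- Because the two file names differ only in their extension, the clip information file that corresponds to a given clip AV stream file (and vice versa) can be derived mechanically, as in the illustrative sketch below.

```java
// Illustrative sketch: mapping between the clip AV stream file name
// ("xxxxx.m2ts") and the clip information file name ("xxxxx.clpi") of the
// same clip, which share the same 5-digit base name.
public class ClipNames {
    static String clipInfoFor(String streamFileName) {
        // e.g. "00001.m2ts" -> "00001.clpi"
        return streamFileName.substring(0, streamFileName.lastIndexOf('.')) + ".clpi";
    }

    static String streamFor(String clipInfoFileName) {
        // e.g. "00001.clpi" -> "00001.m2ts"
        return clipInfoFileName.substring(0, clipInfoFileName.lastIndexOf('.')) + ".m2ts";
    }

    public static void main(String[] args) {
        System.out.println(clipInfoFor("00001.m2ts"));   // prints 00001.clpi
        System.out.println(streamFor("00001.clpi"));     // prints 00001.m2ts
    }
}
```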
- the directory “AUXDATA” stores sound files, font files, font index files, bitmap files, and the like that are used for displaying menus.
- the file “sound.bdmv” stores predetermined sound data (audio data).
- “sound.bdmv” is fixedly used.
- A file with the extension "otf" stores font data used for subtitle display and in BD-J objects (applications).
- a 5-digit number is used for a portion other than the extension in the file name of the file having the extension “otf”.
- the directory “META” stores metadata files.
- the directory “BDJO” and the directory “JAR” store files of BD-J objects.
- the directory “BACKUP” stores backups of files recorded on the BD.
- FIG. 3 is a block diagram illustrating a hardware configuration example of a BD player that plays BDs.
- the BD player in FIG. 3 can play back a BD on which 3D image content is recorded.
- the BD player has a built-in processor (computer) such as a CPU (Central Processing Unit) 102.
- An input / output interface 110 is connected to the CPU 102 via the bus 101.
- The CPU 102 executes a program stored in a ROM (Read Only Memory) 103 accordingly.
- the CPU 102 loads a program recorded on the hard disk 105 or the disk 100 mounted in the drive 109 into a RAM (Random Access Memory) 104 and executes it.
- the CPU 102 performs various processes described later. Then, the CPU 102 outputs the processing result as necessary, for example, via the input / output interface 110, from the output unit 106, transmitted from the communication unit 108, and further recorded in the hard disk 105.
- the input unit 107 includes a keyboard, a mouse, a microphone, and the like.
- the output unit 106 includes an LCD (Liquid Crystal Display), a speaker, and the like.
- the communication unit 108 includes a network card or the like.
- the program executed by the CPU 102 can be recorded in advance on a hard disk 105 or a ROM 103 as a recording medium built in the BD player.
- the program can be stored (recorded) in a removable recording medium such as the disk 100.
- a removable recording medium can be provided as so-called package software.
- examples of the removable recording medium include a flexible disk, a CD-ROM (Compact Disc Read Only Memory), an MO (Magneto Optical) disc, a DVD (Digital Versatile Disc), a magnetic disc, and a semiconductor memory.
- the program can be installed on the BD player from the removable recording medium as described above, or can be downloaded to the BD player via the communication network or the broadcast network and installed on the built-in hard disk 105. That is, for example, the program is transferred from a download site to a BD player wirelessly via a digital satellite broadcasting artificial satellite, or wired to a BD player via a network such as a LAN (Local Area Network) or the Internet. Can be transferred.
- a disc 100 is, for example, a BD, in which 3D image content is recorded in a form that maintains compatibility with a BD that is played back by a legacy player.
- the disc 100 can be played back by a legacy player, and can also be played back by the BD player of FIG. 3 which is a BD player (hereinafter also referred to as a 3D-compatible player) capable of playing back 3D image content. it can.
- the legacy player is a BD player that can reproduce a BD on which a 2D image content is recorded, but cannot reproduce a 3D image content.
- the legacy player can play back 2D image content from the disc 100, but cannot play back 3D image content.
- The BD player of FIG. 3, which is a 3D-compatible player, can reproduce 2D image content from the disc 100 and can also reproduce 3D image content.
- the CPU 102 controls the drive 109 to reproduce the disc 100.
- On the disc 100, a BD-J application (BD-J title) (BD-J object) is recorded as one piece of 3D image content.
- the CPU 102 executes a Java (registered trademark) virtual machine, and a BD-J application is executed on the Java (registered trademark) virtual machine.
- FIG. 4 is a diagram for explaining the outline of 3D image processing (outline of BD-J stereoscopic graphics) by a 3D-compatible player.
- the 3D-compatible player draws a 3D image on the logical plane 10, the PG plane 12, or the video plane 13.
- the entities of the logical plane 10, the PG plane 12, and the video plane 13 are, for example, a partial storage area of the RAM 104 in FIG.
- 3D images drawn by a 3D compatible player include BD-J graphics, PG (Presentation Graphics), TextST (Text subtitle), video, and background defined in the BD standard.
- A graphics 3D image (stereo graphics source) consists of an image for the left eye (L (Left)-view), which is the image observed with the left eye, and an image for the right eye (R (Right)-view), which is the image observed with the right eye.
- Similarly, a PG 3D image (stereo PG source), a video 3D image (stereo video source), and a background 3D image (stereo background source) are each composed of a left-eye image and a right-eye image.
- The left-eye image and the right-eye image constituting a video 3D image or the like can be encoded by, for example, H.264 AVC (Advanced Video Coding)/MVC (Multi-view Video Coding).
- an image stream called a base view and an image stream called a dependent view are defined.
- the base view does not allow predictive coding using another stream as a reference image, but the dependent view allows predictive coding using the base view as a reference image.
- the left-eye image and the right-eye image for example, the left-eye image can be a base view, and the right-eye image can be a dependent view.
- the 3D-compatible player draws the 3D image drawn on the logical plane 10 on the graphics plane 11 or the background plane 14.
- The graphics plane 11 is composed of an L graphics plane (L (Left) graphics plane) 11L that stores an image for the left eye and an R graphics plane (R (Right) graphics plane) 11R that stores an image for the right eye.
- The left-eye image constituting the graphics 3D image drawn on the logical plane 10 is drawn on the L graphics plane 11L, and the right-eye image is drawn on the R graphics plane 11R.
- The L graphics plane 11L is a storage area (L area) for one image that stores an L (Left) image (left-eye image) observed with the left eye.
- The R graphics plane 11R is a storage area (R area) for one image that stores an R (Right) image (right-eye image) observed with the right eye.
- the entity of the L graphics plane 11L and the R graphics plane 11R, that is, the graphics plane 11 is a storage area of a part of the RAM 104 in FIG.
- The PG plane 12 is composed of an L-PG plane (L (Left) PG plane) 12L that stores an image for the left eye and an R-PG plane (R (Right) PG plane) 12R that stores an image for the right eye.
- the 3D-compatible player draws the left-eye image constituting the PG 3D image on the L-PG plane 12L, and draws the right-eye image on the R-PG plane 12R.
- The video plane 13 includes an L video plane (L (Left) video plane) 13L that stores an image for the left eye, and an R video plane (R (Right) video plane) 13R that stores an image for the right eye.
- the 3D-compatible player draws the left-eye image constituting the video 3D image on the L video plane 13L, and draws the right-eye image on the R video plane 13R.
- The background plane 14 is composed of an L background plane (L (Left) background plane) 14L that stores an image for the left eye and an R background plane (R (Right) background plane) 14R that stores an image for the right eye.
- the image for the left eye constituting the background 3D image drawn on the logical plane 10 is drawn on the L background plane 14L, and the image for the right eye is drawn on the R background plane 14R.
- the left-eye image and the right-eye image drawn (stored) on the graphics plane 11, the PG plane 12, the video plane 13, and the background plane 14 are supplied to the mixer 15.
- The mixer 15 blends (mixes, synthesizes) the graphics left-eye image from the graphics plane 11, the PG left-eye image from the PG plane 12, the video left-eye image from the video plane 13, and the background left-eye image from the background plane 14, and outputs the left-eye image resulting from the synthesis.
- Similarly, the mixer 15 blends the graphics right-eye image from the graphics plane 11, the PG right-eye image from the PG plane 12, the video right-eye image from the video plane 13, and the background right-eye image from the background plane 14, and outputs the right-eye image resulting from the synthesis.
- the left-eye image output from the mixer 15 is supplied to a display (not shown) as a left display output (L (Left) display output).
- the right-eye image output from the mixer 15 is supplied to a display (not shown) as a right display output (R (Right) display output).
- the left eye image and the right eye image from the mixer 15 are displayed alternately or simultaneously, thereby displaying a 3D image.
- the BD-J application can draw an image on the graphics plane 11 and the background plane 14 among the graphics plane 11, the PG plane 12, the video plane 13, and the background plane 14.
- the BD-J application can only access the logical plane 10 and cannot directly access the graphics plane 11 and the background plane 14.
- the BD-J application can only perform image drawing on the logical plane 10 and not directly on the graphics plane 11 and the background plane 14. Therefore, the BD-J application indirectly draws an image on the graphics plane 11 or the background plane 14 by drawing an image on the logical plane 10.
- Hereinafter, drawing of an image on the graphics plane 11 or the background plane 14 via the logical plane 10 by the BD-J application is also described simply as drawing an image on the graphics plane 11 or the background plane 14.
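- As a rough illustration of this indirection, the sketch below shows a BD-J application that simply paints into an AWT/HAVi component; from the application's point of view this is drawing on the logical plane 10, and routing the result to the graphics plane 11 (or its L and R regions) is left to the 3D-compatible player. The class and method names are those of the standard BD-J/HAVi environment, but the overall structure is only an assumption for illustration.

```java
// Illustrative BD-J Xlet sketch: the application draws via a normal paint()
// call; it never touches the graphics plane 11 directly.
import java.awt.Color;
import java.awt.Graphics;
import javax.tv.xlet.Xlet;
import javax.tv.xlet.XletContext;
import org.havi.ui.HContainer;
import org.havi.ui.HScene;
import org.havi.ui.HSceneFactory;

public class DrawXlet implements Xlet {
    private HScene scene;

    public void initXlet(XletContext ctx) {
        scene = HSceneFactory.getInstance().getDefaultHScene();
        HContainer view = new HContainer() {
            public void paint(Graphics g) {
                // Drawing seen by the application as "the logical plane";
                // the player copies (or offsets) it onto the graphics plane.
                g.setColor(Color.white);
                g.fillRect(100, 100, 300, 200);
            }
        };
        view.setBounds(0, 0, 1920, 1080);   // assumes a full-HD graphics configuration
        scene.add(view);
    }

    public void startXlet() { scene.setVisible(true); }
    public void pauseXlet() { }
    public void destroyXlet(boolean unconditional) {
        scene.setVisible(false);
        scene.removeAll();
    }
}
```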
- a 3D-compatible player can be configured without the logical plane 10.
- the BD-J application directly draws an image on the graphics plane 11 or the background plane 14.
- In addition to drawing images on the graphics plane 11 and the background plane 14, the BD-J application can perform playback control of video and PG, such as controlling the scaling and position (display position) of video and PG.
- In that case, video and PG are handled by the BD-J application as a set (collectively). That is, the BD-J application does not distinguish (cannot distinguish) between video and PG.
- FIG. 5 is a diagram for explaining the drawing of a graphics 3D image on the graphics plane 11 (Stereoscopic graphics planes) by the BD-J application.
- the first drawing method and the second drawing method can be adopted.
- FIG. 5A is a diagram for explaining the first drawing method.
- the author of the BD-J application draws on the stereo plane.
- In the first drawing method, graphics 3D image data is composed of left-eye image data and right-eye image data, and the BD-J application draws the left-eye image and the right-eye image on the logical plane 10.
- the image for the left eye and the image for the right eye drawn on the logical plane 10 are drawn on the graphics plane 11 as they are. That is, the image for the left eye drawn on the logical plane 10 is directly drawn on the L graphics plane 11L, and the image for the right eye drawn on the logical plane 10 is drawn on the R graphics plane 11R as it is. .
- FIG. 5B is a diagram for explaining the second drawing method.
- the author of the BD-J application draws on a monoplane.
- the author supplies an offset value (graphics plane offset value).
- the 3D-compatible player generates a stereo plane from the mono plane based on the offset value.
- In the second drawing method, graphics 3D image data is composed of the data of an original image from which a 3D image is generated, and parallax data for generating a left-eye image and a right-eye image from that original image by giving it parallax.
- BD-J application draws the original image on the logical plane 10.
- The 3D-compatible player draws the image for the left eye and the image for the right eye, generated by giving parallax to the original image drawn on the logical plane 10, on the L graphics plane 11L and the R graphics plane 11R, respectively.
- As the parallax data, an offset value (offset) can be employed, for example, the number of pixels by which the position of the original image is shifted in the horizontal direction (x direction).
- On the L graphics plane 11L, the original image drawn on the logical plane 10 is drawn at a position shifted in the horizontal direction by the offset value, with the direction from left to right as the positive direction. That is, an image obtained by shifting the horizontal position of the original image drawn on the logical plane 10 by the offset value is drawn on the L graphics plane 11L as the left-eye image.
- the original image drawn on the logical plane 10 is drawn at a position shifted in the horizontal direction by the offset value. That is, an image obtained as a result of shifting the horizontal position of the original image drawn on the logical plane 10 by the offset value is drawn on the R graphics plane 11R as the right-eye image.
- When the offset value is positive, the 3D image displayed by the left-eye image and the right-eye image appears to float to the near side in the depth direction perpendicular to the display screen of the display (not shown). On the other hand, when the offset value is negative, the 3D image displayed by the left-eye image and the right-eye image appears to be recessed to the far side in the depth direction.
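- The sketch below illustrates the second drawing method: a left-eye image and a right-eye image are derived from a mono original image by shifting it horizontally by the offset value. The pixel-array representation and the choice of shifting the two images in opposite directions are assumptions made only for this illustration; the actual drawing onto the L graphics plane 11L and the R graphics plane 11R is performed by the 3D-compatible player.

```java
// Illustrative sketch: generating left-eye and right-eye images from a mono
// original image and an offset value (second drawing method).
public class OffsetStereo {
    /** Shifts every row of src horizontally by dx pixels (positive = to the right). */
    static int[][] shift(int[][] src, int dx) {
        int h = src.length, w = src[0].length;
        int[][] dst = new int[h][w];
        for (int y = 0; y < h; y++) {
            for (int x = 0; x < w; x++) {
                int sx = x - dx;   // source column for this destination column
                dst[y][x] = (sx >= 0 && sx < w) ? src[y][sx] : 0;   // 0 = transparent
            }
        }
        return dst;
    }

    public static void main(String[] args) {
        int offset = 4;                                // graphics plane offset value (pixels)
        int[][] original = new int[540][960];          // mono image drawn on the logical plane
        int[][] leftEye  = shift(original, +offset);   // stored on the L graphics plane 11L
        int[][] rightEye = shift(original, -offset);   // stored on the R graphics plane 11R
        // A positive offset makes the displayed 3D image appear to float toward
        // the viewer; a negative offset makes it appear recessed behind the screen.
    }
}
```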
- FIG. 6 is a diagram illustrating a graphics mode in which a BD-J application reproduces a graphics image by drawing a graphics 3D image on the graphics plane 11.
- the 3D-compatible player always has a 2-plane (L graphics plane 11L and R graphics plane 11R), and the BD-J application is designed to draw on the logical plane 10.
- The graphics left-eye image drawn on the L graphics plane 11L is blended with the video (and PG) left-eye image drawn on the L video plane 13L, and the graphics right-eye image drawn on the R graphics plane 11R is blended with the video right-eye image drawn on the R video plane 13R.
- FIG. 6A shows a mono-logical-plane + offset value mode (hereinafter also referred to as an offset graphics mode) which is one mode Mode # 1 of the graphics mode.
- the BD-J application draws a mono image that is a graphics 2D image on the logical plane 10. Further, the BD-J application gives an offset value to the 3D-compatible player.
- the 3D-compatible player generates a stereo image, which is a graphics 3D image, from the mono image drawn on the logical plane 10 and the offset value given from the BD-J application. Further, the BD player draws (stores) the image for the left eye constituting the stereo image on the L graphics plane 11L (L region), and also converts the image for the right eye constituting the stereo image to the R graphics plane 11R. Draw (store) in (R area).
- the mixer 15 blends the graphics image for the left eye drawn (stored) on the L graphics plane 11L with the video (and PG) image for the left eye drawn on the L video plane 13L, and outputs the blended image. Furthermore, the mixer 15 blends the graphics right-eye image drawn on the R graphics plane 11R with the video right-eye image drawn on the R video plane 13R, and outputs the blended result.
- FIG. 6B shows a stereo-logical-plane mode (hereinafter also referred to as stereo graphics mode) which is one mode Mode # 2 of the graphics mode.
- the BD-J application draws the left-eye image and the right-eye image constituting the stereo image, which is a graphics 3D image, on the logical plane 10.
- the 3D-compatible player draws the left-eye image drawn on the logical plane 10 on the L graphics plane 11L, and draws the right-eye image drawn on the logical plane 10 on the R graphics plane 11R.
- the mixer 15 blends the graphics image for the left eye drawn on the L graphics plane 11L with the video image for the left eye drawn on the L video plane 13L, and outputs the blended image. Furthermore, the mixer 15 blends the graphics right-eye image drawn on the R graphics plane 11R with the video right-eye image drawn on the R video plane 13R, and outputs the blended result.
- FIG. 6C shows a forced-mono-logical-plane mode (hereinafter also referred to as forced mono graphics mode), which is one mode Mode #3 of the graphics mode.
- In the forced mono graphics mode, the BD-J application draws a stereo image, which is a graphics 3D image, on the logical plane 10.
- The 3D-compatible player draws only one of the L graphics image and the R graphics image of the stereo image drawn on the logical plane 10, for example, only the L graphics image, on only one of the L graphics plane 11L and the R graphics plane 11R, for example, only on the L graphics plane 11L.
- the mixer 15 blends the graphics mono image drawn on the L graphics plane 11L with the video image drawn on the L video plane 13L and outputs the blended image.
- FIG. 6D shows a flattened-stereo-logical-plane mode (hereinafter also referred to as flat stereo graphics mode), which is one mode Mode #4 of the graphics mode.
- In the flat stereo graphics mode, the BD-J application draws a left-eye image and a right-eye image that form a stereo image, which is a graphics 3D image, on the logical plane 10.
- The 3D-compatible player draws only one of the left-eye image and the right-eye image drawn on the logical plane 10, for example, only the left-eye image, on both the L graphics plane 11L and the R graphics plane 11R, and discards the other, right-eye, image.
- The graphics left-eye image drawn on the L graphics plane 11L is supplied to the mixer 15, and the graphics left-eye image drawn on the R graphics plane 11R is also supplied (as the right-eye image) to the mixer 15.
- FIG. 6E shows a mono-logical-plane mode (hereinafter also referred to as mono graphics mode), which is one mode Mode #5 of the graphics mode.
- the BD-J application draws a mono image that is a graphics 2D image on the logical plane 10.
- The 3D-compatible player draws the mono image drawn on the logical plane 10 on only one of the L graphics plane 11L and the R graphics plane 11R, for example, only on the L graphics plane 11L.
- the mixer 15 blends the graphics mono image drawn on the L graphics plane 11L with the video image drawn on the L video plane 13L and outputs the blended image.
- The offset value can be applied to the graphics plane 11 and the PG plane 12.
- Hereinafter, the offset value applied to the graphics plane 11 (data that gives parallax to a graphics image) is also referred to as a graphics plane offset value.
- an offset value (data that gives parallax to a PG image) applied to the PG plane 12 is also referred to as a PG plane offset (PG plane offset) value.
- the following API for reading and writing offset values is defined, and the graphics plane offset value can be set and obtained by the dedicated API.
- The setOffset() method is a method for storing (setting) the graphics plane offset value in an internal storage area, which is a storage area provided inside the BD player, and the getOffset() method is a method for acquiring the graphics plane offset value stored in the internal storage area of the BD player.
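- The following is a minimal sketch of how a BD-J application might use such a dedicated offset API. The text above names only the setOffset() and getOffset() methods, so the interface that hosts them and the calling code are hypothetical.

```java
// Hypothetical interface for the dedicated offset read/write API; only the
// method names setOffset() and getOffset() come from the description above.
interface GraphicsOffsetControl {
    void setOffset(int offsetValue);   // store the graphics plane offset value
    int getOffset();                   // read back the stored offset value
}

// Sketch of a BD-J application using it: push the 3D graphics slightly toward
// the viewer, then read the stored value back.
class OffsetUser {
    void apply(GraphicsOffsetControl control) {
        control.setOffset(4);                // positive value: image floats forward
        int current = control.getOffset();   // expected to return 4
        System.out.println("graphics plane offset = " + current);
    }
}
```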
- The BD player also has PSRs (Player Setting Registers) that store information related to BD playback, and the graphics plane offset value and the PG plane offset value can be stored in a PSR that is reserved in the legacy player, for example, PSR#21.
- the substance of the internal storage area and the PSR is a partial storage area of the RAM 104 and the hard disk 105 in FIG.
- A 3D-compatible player can indirectly read and write the PSR by defining the offset value as a General Preference.
- That is, the 3D-compatible player has a general preference API (Application Programming Interface) that reads and writes, to and from PSR#21 storing information related to BD playback, the offset values, which are data that give parallax to BD standard graphics and PG images, as one of the general preferences (General Preference) of the BD standard.
- PSR#21 is mapped to the General Preference of BD standard part3-2 Annex L, and its value can be set and obtained with the org.dvb.user.GeneralPreference API.
- the general preference name (General Preference name) for accessing the PSR with the General Preference API can be defined as follows.
- the general preference name of the graphics plane offset value can be defined as, for example, “graphics offset”.
- The general preference name of the PG plane offset value can be defined as, for example, "subtitle offset".
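- A minimal sketch of setting and getting the graphics plane offset through this mechanism is shown below. The preference names "graphics offset" and "subtitle offset" are the ones defined above; the string encoding of the value, the use of getMostFavourite(), and the exception handling are assumptions about the org.dvb.user API made only for illustration.

```java
// Illustrative sketch: reading and writing the graphics plane offset value
// through the BD standard General Preference API (org.dvb.user), which the
// player maps onto PSR#21.
import org.dvb.user.GeneralPreference;
import org.dvb.user.UserPreferenceManager;

class OffsetPreferences {
    static void setGraphicsOffset(int offsetValue) {
        try {
            GeneralPreference pref = new GeneralPreference("graphics offset");
            pref.add(Integer.toString(offsetValue));    // store the value as a string
            UserPreferenceManager.getInstance().write(pref);
        } catch (Exception e) {
            // writing may fail, e.g. if the preference is not supported
        }
    }

    static int getGraphicsOffset() {
        try {
            GeneralPreference pref = new GeneralPreference("graphics offset");
            UserPreferenceManager.getInstance().read(pref);
            String value = pref.getMostFavourite();     // currently stored value, if any
            return value == null ? 0 : Integer.parseInt(value);
        } catch (Exception e) {
            return 0;                                   // fall back to "no offset"
        }
    }
}
```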
- the following dedicated API can be defined, and the graphics plane offset value can be set and obtained by the dedicated API.
- The setOffset() method is a method for storing the graphics plane offset value in the internal storage area of the BD player (in this case, for example, the PSR), and the getOffset() method is a method for obtaining the graphics plane offset value stored in the internal storage area of the BD player.
- FIG. 7 is a block diagram showing a functional configuration example of the BD player of FIG. 3 as a 3D-compatible player that reads and writes the offset values of BD standard graphics and PG (hereinafter including TextST unless otherwise specified) as described above.
- FIG. 7A is a block diagram illustrating a functional configuration example of the BD player of FIG. 3 as a 3D-compatible player that has an API dedicated to reading and writing offset values, and that reads and writes the offset values of BD standard graphics and PG from and to the internal storage area of the 3D-compatible player.
- the BD-J application requests the offset value reading / writing (setting or acquisition) to the API for reading / writing the offset value (General Preference API).
- the API for reading / writing the offset value sets the offset value (graphics plane offset value, PG plane offset value) in the internal storage area of the 3D-compatible player in response to a request from the BD-J application, or Get the offset value from the internal storage area of the 3D-compatible player and return it to the BD-J application.
- the playback control engine (Playback Control Engine) is an image (original) drawn on the logical plane 10 by the BD-J application in accordance with the offset value set in the internal storage area of the 3D-compatible player. Control for generating (reproducing) an image for the left eye and an image for the right eye from the image).
- As described above, an API for reading and writing offset values is defined, and in response to requests from the BD-J application, that API sets and acquires, in and from the internal storage area, the offset values, which are data that give parallax to BD standard graphics and PG images. Therefore, the offset value that gives parallax to an image can be indirectly set and acquired from the BD-J application.
- FIG. 7B is a block diagram illustrating a functional configuration example of the BD player of FIG. 3 as a 3D-compatible player that has a general preference API for reading and writing the offset values to and from PSR#21, treating the offset values of BD standard graphics and PG as one of the general preferences of the BD standard.
- the BD-J application requests the general preference API (General Preference API) to read / write (set or acquire) the offset value.
- When reading or writing the graphics plane offset value, the BD-J application calls the general preference API with the general preference name (General Preference name) "graphics offset".
- When reading or writing the PG plane offset value, the BD-J application calls the general preference API with the general preference name "subtitle offset".
- In response to a request from the BD-J application, the general preference API sets an offset value in PSR#21 of the PSRs (Player Setting Registers), or obtains an offset value from PSR#21 and returns it to the BD-J application.
- the playback control engine uses an image (original image) drawn on the logical plane 10 by the BD-J application in accordance with the offset value set in PSR # 21. Control for generating (reproducing) a left-eye image and a right-eye image is performed.
- As described above, the general preference API reads and writes, to and from PSR#21, the offset values, which are data that give parallax to BD standard graphics and PG images, as one of the General Preferences of the BD standard. Therefore, the offset value that gives parallax to an image can be indirectly set and acquired from the BD-J application.
- FIG. 8 is a diagram showing a video mode for reproducing a video image, which is one of the configurations of the video plane 13.
- FIG. 8A shows a mono-video mode (hereinafter also referred to as mono video mode), which is one mode Mode #1 of the video mode.
- In the mono video mode, the 3D-compatible player draws (stores) a mono image, which is a 2D video image, on only one of the L video plane 13L (L region) and the R video plane 13R (R region), for example, only on the L video plane 13L.
- the video mono image drawn (stored) only on the L video plane 13L is supplied to the mixer 15.
- FIG. 8B shows a dual-mono-video mode (hereinafter also referred to as a dual mono video mode) which is one mode Mode # 2 of the video mode.
- the 3D-compatible player draws (stores) a mono image that is a 2D image of a video on the L video plane 13L (L region) (as an image for the left eye) and displays the mono image. , (As an image for the right eye), is drawn (stored) on the R video plane 13R (R region).
- the video mono image drawn (stored) on the L video plane 13L and the video mono image drawn on the R video plane 13R are both supplied to the mixer 15.
- FIG. 8C shows a stereo-video mode (hereinafter also referred to as stereo video mode), which is one mode Mode #3 of the video mode.
- the 3D-compatible player draws the left-eye image constituting the stereo image that is the 3D image of the video on the L video plane 13L, and the right-eye image constituting the stereo image is rendered as the R video. Draw on the plane 13R.
- the video image for the left eye drawn (stored) on the L video plane 13L and the video image for the right eye drawn on the R video plane 13R are both supplied to the mixer 15.
- FIG. 8D shows a flattened-stereo-video mode (hereinafter also referred to as a flat stereo video mode) which is one mode Mode # 4 of the video mode.
- In the flat stereo video mode, the 3D-compatible player draws only one of the left-eye image and the right-eye image constituting the stereo image, which is a 3D video image, for example, only the left-eye image, on both the L video plane 13L and the R video plane 13R, and discards the other, right-eye, image.
- the video image for the left eye drawn (stored) on the L video plane 13L is supplied to the mixer 15, and the video image for the left eye drawn on the R video plane 13R is (the image for the right eye). ) To the mixer 15.
- FIG. 8E shows a forced-mono-video mode (hereinafter also referred to as forced mono video mode), which is one mode Mode #5 of the video mode.
- In the forced mono video mode, the 3D-compatible player draws only one of the left-eye image and the right-eye image that form the stereo image, which is a 3D video image, for example, only the left-eye image, on only one of the L video plane 13L and the R video plane 13R, for example, only on the L video plane 13L, and discards the other, right-eye, image.
- FIG. 9 is a diagram showing a background mode in which a background image is reproduced, which is one of the configurations of the background plane 14.
- FIG. 9A shows a dual-mono-background mode (hereinafter also referred to as dual mono background mode) which is one mode Mode # 1 of the background mode.
- the BD-J application draws a mono image, which is a background 2D image, on the logical plane 10 as an image for the left eye and an image for the right eye.
- In the dual mono background mode, the 3D-compatible player draws (stores) the image for the left eye drawn on the logical plane 10 on the L background plane 14L (L region), and draws (stores) the image for the right eye drawn on the logical plane 10 on the R background plane 14R (R region).
- the background image for the left eye drawn (stored) on the L background plane 14L and the image for the right eye background drawn on the R background plane 14R are both supplied to the mixer 15.
- FIG. 9B shows a stereo-background mode (hereinafter also referred to as a stereo background mode) which is one mode Mode # 2 of the background mode.
- the BD-J application draws the left-eye image and the right-eye image constituting the stereo image that is the background 3D image on the logical plane 10.
- In the stereo background mode, the 3D-compatible player draws the image for the left eye drawn on the logical plane 10 on the L background plane 14L, and draws the image for the right eye drawn on the logical plane 10 on the R background plane 14R.
- the background left-eye image drawn on the L background plane 14L and the background right-eye image drawn on the R background plane 14R are both supplied to the mixer 15.
- FIG. 9C shows a flattened-stereo-background mode (hereinafter also referred to as flat stereo background mode) which is one mode Mode # 3 of the background mode.
- the BD-J application draws the image for the left eye and the image for the right eye constituting the stereo image that is the background 3D image on the logical plane 10.
- In the flat stereo background mode, the 3D-compatible player draws only one of the left-eye image and the right-eye image drawn on the logical plane 10, for example, only the left-eye image, on both the L background plane 14L and the R background plane 14R, and discards the other, right-eye, image.
- The background left-eye image drawn on the L background plane 14L is supplied to the mixer 15, and the left-eye image drawn on the R background plane 14R is also supplied (as the right-eye image) to the mixer 15.
- FIG. 9D shows a mono-background mode (hereinafter also referred to as a mono background mode) which is one mode Mode # 4 of the background mode.
- the BD-J application draws a mono image that is a background 2D image on the logical plane 10.
- the 3D-compatible player draws the mono image drawn on the logical plane 10 only on the L background plane 14L, for example, one of the L background plane 14L and the R background plane 14R.
- the background mono image drawn on the L background plane 14L is supplied to the mixer 15.
- FIG. 9E shows a forced-mono-background mode (hereinafter also referred to as forced mono background mode), which is one mode Mode #5 of the background mode.
- the BD-J application draws the image for the left eye and the image for the right eye constituting the stereo image that is the background 3D image on the logical plane 10.
- In the forced mono background mode, the 3D-compatible player draws only one of the left-eye image and the right-eye image drawn on the logical plane 10, for example, only the left-eye image, on only one of the L background plane 14L and the R background plane 14R, for example, only on the L background plane 14L, and discards the other, right-eye, image.
- The background left-eye image drawn on the L background plane 14L is supplied to the mixer 15.
- Hereinafter, the graphics plane 11 for storing graphics, the video plane 13 for storing video (and the PG plane 12 for storing PG), and the background plane 14 for storing the background, shown in FIG. 4, are also collectively referred to as device planes.
- The configuration of the device planes is defined by (1) the image frame and color depth, (2) the video mode (Video mode), (3) the graphics mode (BD-J Graphics mode), and (4) the background mode (Background mode).
- FIG. 10 shows the relationship among the graphics plane 11, the PG plane 12, the video plane 13, and the background plane 14, which are device planes.
- the graphics plane 11 includes an L graphics plane 11L as an L area that is a storage area for storing a left-eye image and an R graphics plane 11R as an R area that is a storage area for storing a right-eye image. .
- In the graphics plane 11, the L graphics plane 11L and the R graphics plane 11R are arranged one above the other, with the L graphics plane 11L, which is the L region, on the upper side and the R graphics plane 11R, which is the R region, on the lower side.
- Images drawn on the graphics plane 11, the PG plane 12, the video plane 13, and the background plane 14 are superimposed (blended) in that order from the front side, that is, in the order of the graphics plane 11, the PG plane 12, the video plane 13, and the background plane 14, and the resulting L region image and R region image are drawn (stored) alternately on, for example, the logical screen 21, which abstracts the display screen of the display.
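- The sketch below illustrates this blending order for one eye, assuming simple per-pixel alpha compositing over int ARGB buffers; the actual behaviour of the mixer 15 and the pixel formats of the planes are not specified here.

```java
// Illustrative sketch: back-to-front composition of the device planes for one
// eye (background, then video, then PG, then graphics), assuming ARGB pixels.
public class PlaneMixer {
    /** Composites src over dst in place, using the source alpha channel. */
    static void blendOver(int[] dst, int[] src) {
        for (int i = 0; i < dst.length; i++) {
            int a = (src[i] >>> 24) & 0xFF;
            if (a == 0xFF) { dst[i] = src[i]; continue; }   // fully opaque source pixel
            if (a == 0x00) continue;                        // fully transparent source pixel
            int r = mix((dst[i] >> 16) & 0xFF, (src[i] >> 16) & 0xFF, a);
            int g = mix((dst[i] >> 8) & 0xFF, (src[i] >> 8) & 0xFF, a);
            int b = mix(dst[i] & 0xFF, src[i] & 0xFF, a);
            dst[i] = 0xFF000000 | (r << 16) | (g << 8) | b;
        }
    }

    static int mix(int back, int front, int alpha) {
        return (front * alpha + back * (255 - alpha)) / 255;
    }

    /** Output for one eye: graphics over PG over video over background. */
    static int[] composeEye(int[] background, int[] video, int[] pg, int[] graphics) {
        int[] out = (int[]) background.clone();
        blendOver(out, video);
        blendOver(out, pg);
        blendOver(out, graphics);
        return out;
    }
}
```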
- the substance of the logical screen 21 is a partial storage area of the RAM 104.
- the device plane is a storage area in which an L area and an R area, which are storage areas for one image, are arranged one above the other, and is therefore a storage area for two images.
- the screen 21 is a storage area for one image.
- the configuration of the device plane is defined for the entire device plane, which is a storage area for two images, for 3D images.
- FIG. 11 shows (1) image frame (Resolution) and color depth (color-depth), which is one of the device plane configurations.
- In FIG. 11, the image frame (the number of horizontal pixels of the device plane × the number of vertical pixels) (resolution) and the color depth in the five rows from the top indicate the image frame and color depth of 3D images, and the image frame and color depth in the remaining five rows (the five rows from the bottom) indicate the image frame and color depth of 2D images.
- the 3D image is composed of an image for the left eye and an image for the right eye, so it becomes an image for two surfaces.
- Since the device plane is a storage area in which an L area and an R area, each a storage area for one image, are arranged vertically, the image frame of a 3D image stored in such a device plane has a size obtained by doubling the number of vertical pixels of the corresponding 2D image (a 2D image having the same size as the left-eye image (or right-eye image)).
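- As a minimal illustration of this arithmetic (the 2D resolutions below are those listed in FIG. 11; the helper name is ours, not part of the BD standard), the 3D device plane keeps the horizontal pixel count of the corresponding 2D image frame and doubles the vertical pixel count:

```java
// Sketch of the image-frame arithmetic described above: the 3D device plane
// keeps the width of the corresponding 2D image frame and doubles its height,
// because the L region and the R region are stacked vertically.
public class DevicePlaneFrame {
    static int[] frame3D(int width2D, int height2D) {
        return new int[] { width2D, height2D * 2 };
    }

    public static void main(String[] args) {
        int[][] frames2D = { {1920, 1080}, {1280, 720}, {720, 576}, {720, 480}, {960, 540} };
        for (int[] f : frames2D) {
            int[] f3d = frame3D(f[0], f[1]);
            // e.g. 1920 x 1080 -> 1920 x 2160, 960 x 540 -> 960 x 1080
            System.out.println(f[0] + " x " + f[1] + " -> " + f3d[0] + " x " + f3d[1]);
        }
    }
}
```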
- In principle, the image frames of the graphics (image) stored in the graphics plane 11 and the background (image) stored in the background plane 14 both match the image frame of the video stored in the video plane 13.
- For 2D images, however, there is a case (hereinafter also referred to as a mismatch case) in which the image frame of the background stored in the background plane 14 and the image frame of the video stored in the video plane 13 are 1920 × 1080 pixels, while the image frame of the graphics stored in the graphics plane 11 is 960 × 540 pixels, that is, 1/2 of the video image frame in each of the horizontal and vertical directions.
- In that case, the 960 × 540 pixel graphics stored in the graphics plane 11 is displayed after being doubled in the horizontal and vertical directions so as to match the 1920 × 1080 pixel image frame of the video stored in the video plane 13.
- For 3D images as well, there are cases corresponding to the 2D image mismatch cases (hereinafter also referred to as 3D image mismatch cases).
- That is, there is a case in which the image frame of the background stored in the background plane 14 and the image frame of the video stored in the video plane 13 are 1920 × 2160 pixels, while the image frame of the graphics stored in the graphics plane 11 is 960 × 1080 pixels, that is, 1/2 of the video image frame in each of the horizontal and vertical directions (in FIG. 11, the third row from the top).
- FIG. 12 is a diagram for explaining a method of drawing a 3D image by the second drawing method (B in FIG. 5) in the case of mismatching 3D images.
- the original image from which the 3D image is generated is drawn on the logical plane 10, and then the original image is moved in the horizontal direction by an offset value.
- the left-eye image and the right-eye image generated by sliding are drawn on the graphics plane 11.
- In other words, this can be said to be a method of drawing, on the graphics plane 11, as the left-eye image and the right-eye image, the two images obtained by shifting, in the horizontal direction according to an offset value, each of the upper half and the lower half of a vertically long image in which the original image and a copy of the original image are arranged one above the other.
- In the 3D image mismatch case, if the 960 × 540 pixel left-eye image and right-eye image obtained by shifting each of the upper half and the lower half of 960 × 1080 pixel graphics horizontally according to the offset value are drawn on the graphics plane 11, and the left-eye image and right-eye image on the graphics plane 11 are then doubled in the horizontal and vertical directions, the resulting left-eye image and right-eye image have a horizontal shift amount that is twice the offset value.
- In that case, the position in the depth direction of the 3D image displayed by the left-eye image and the right-eye image differs from the position intended by the author; care must therefore be taken so that the position in the depth direction of the 3D image displayed by the left-eye image and the right-eye image becomes the position intended by the author.
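- The following is a minimal sketch, using java.awt, of the second drawing method described above; it is an illustration and not the BD-J drawing API itself, and the shift directions are an assumption. A mono original image is shifted horizontally according to the offset value, and the resulting pair is placed in the top half (L region) and bottom half (R region) of a double-height graphics plane.

```java
import java.awt.Graphics2D;
import java.awt.image.BufferedImage;

// Illustrative sketch of the second drawing method: generate a left-eye/right-eye
// pair from a mono original image by shifting it horizontally according to an
// offset value, and draw the pair into the top half (L region) and bottom half
// (R region) of a plane that is twice as tall as the original image.
public class OffsetDrawing {
    static BufferedImage drawOnGraphicsPlane(BufferedImage mono, int offset) {
        int w = mono.getWidth(), h = mono.getHeight();
        BufferedImage plane = new BufferedImage(w, h * 2, BufferedImage.TYPE_INT_ARGB);
        Graphics2D g = plane.createGraphics();
        g.drawImage(mono, +offset, 0, null); // left-eye image in the L region (top half)
        g.drawImage(mono, -offset, h, null); // right-eye image in the R region (bottom half)
        g.dispose();
        return plane;
    }
}
```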
- FIG. 13 is a diagram for explaining the device plane.
- In the current BD standard, an image storage area for one screen is assumed as the logical screen 21, and it is not supposed that a left-eye image (Left-eye image) and a right-eye image (Right-eye image) are drawn alternately on the logical screen 21, which is the image storage area for one screen.
- the current BD standard assumes that there is a one-to-one relationship between the configuration of the device plane and the logical screen 21. Under this premise, the processing of 3D images requires two separate logical screens, that is, a logical screen for drawing a left-eye image and a logical screen for drawing a right-eye image. .
- Therefore, the device configuration for L/R is defined as a single plane by doubling the resolution definition in the vertical direction.
- the logical screen itself remains a single plane as before, and a drawing model that simultaneously draws the L/R output onto it is defined.
- the BD player in FIG. 3 includes device planes (graphics plane 11, video plane 13 (PG plane 12), and background plane 14) that store BD-standard graphics, video, or background images. .
- The device plane is a storage area in which an L area, which is a storage area for one image storing the left-eye image, and an R area, which is a storage area for one image storing the right-eye image, are arranged one above the other, and the configuration of the device plane is defined for the entire device plane, which is the storage area for two image planes.
- the image for the left eye and the image for the right eye stored in the device plane are drawn on the logical screen 21 alternately, for example.
- Therefore, it is not necessary to separately provide, as logical screens, a logical screen for storing the left-eye image (L image) and a logical screen for storing the right-eye image (R image).
- Configuration can be specified (set) by providing a bit field for specifying the configuration in the BD-J object (Object) file.
- FIG. 14 shows bit fields provided in the BD-J object file to specify the configuration.
- In the BD-J object file, four fields, initial_configuration_id, initial_graphics_mode, initial_video_mode, and initial_background_mode, can be provided to specify the configuration.
- Initial_configuration_id is a field for specifying (1) image frame and color depth. Assuming that the value taken by initial_configuration_id is configuration id, the following value is defined as configuration id.
- HD_1920_1080 represents the image frame and color depth of the sixth row from the top of FIG. 11,
- HD_1280_720 represents the image frame and color depth of the eighth row from the top of FIG. 11,
- SD_720_576 represents the image frame and color depth of the tenth row from the top of FIG. 11,
- SD_720_480 represents the image frame and color depth of the ninth row from the top of FIG. 11,
- QHD_960_540 represents the image frame and color depth of the seventh row from the top of FIG. 11,
- HD_1920_2160 represents the image frame and color depth of the first row from the top of FIG. 11,
- HD_1280_1440 represents the image frame and color depth of the second row from the top of FIG. 11,
- SD_720_1152 represents the image frame and color depth of the fifth row from the top of FIG. 11,
- SD_720_960 represents the image frame and color depth of the fourth row from the top of FIG. 11,
- QHD_960_1080 represents the image frame and color depth of the third row from the top of FIG. 11, respectively.
- Initial_graphics_mode is a field for specifying (3) graphics mode.
- The following values are defined for initial_graphics_mode, which specifies the graphics mode.
- GRAPHICS_MONO_VIEW represents the mono graphics mode
- GRAPHICS_STEREO_VIEW represents the stereo graphics mode
- GRAPHICS_PLANE_OFFSET represents the offset graphics mode
- GRAPHICS_DUAL_MONO_VIEW represents the flat stereo graphics mode
- GRAPHICS_FORCED_MONO_VIEW represents the forced mono graphics mode
- Note that when initial_configuration_id is set to any of the values 1, 2, 3, 4, and 5, initial_graphics_mode is ignored.
- Initial_video_mode is a field for specifying (2) video mode.
- As the video mode (Video mode), there are the dual mono video mode (dual-mono), the stereo video mode (stereo), the flat stereo video mode (flattened-stereo), the mono video mode (mono) (Legacy playback mode), and the forced mono video mode (forced-mono) described above.
- The following values are defined for initial_video_mode, which specifies the video mode.
- VIDEO_MONO_VIEW represents the mono video mode
- VIDEO_STEREO_VIEW represents the stereo video mode
- VIDEO_FLATTENED_STEREO_VIEW represents the flat stereo video mode
- VIDEO_DUAL_MONO_VIEW represents the dual mono video mode
- VIDEO_FORCED_MONO_VIEW represents the forced mono video mode
- Note that when initial_configuration_id is set to any of the values 1, 2, 3, 4, and 5, initial_video_mode is ignored.
- Initial_background_mode is a field for specifying (4) background mode.
- As the background mode (Background mode), there are the dual mono background mode (dual-mono), the stereo background mode (stereo), the flat stereo background mode (flattened-stereo), the mono background mode (mono) (Legacy playback mode), and the forced mono background mode (forced-mono) described with reference to FIG. 9.
- The following values are defined for initial_background_mode, which specifies the background mode.
- BACKGROUND_MONO_VIEW represents the mono background mode
- BACKGROUND_STEREO_VIEW represents the stereo background mode
- BACKGROUND_FLATTENED_STEREO_VIEW represents the flat stereo background mode
- BACKGROUND_DUAL_MONO_VIEW represents the dual mono background mode
- BACKGROUND_FORCED_MONO_VIEW represents the forced mono background mode
- Note that when initial_configuration_id is set to any of the values 1, 2, 3, 4, and 5, initial_background_mode is ignored.
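- The constants below summarize the four BD-J Object fields as an illustrative Java sketch; the enum names mirror the values listed above, but the actual bit-field encodings in the BD-J Object file are not reproduced here, and any numeric mapping would be hypothetical.

```java
// Illustrative sketch only: the names mirror the BD-J Object fields described
// above; the underlying bit-field encodings are not specified here.
public final class BdjObjectConfiguration {
    enum ConfigurationId {              // initial_configuration_id
        HD_1920_1080, HD_1280_720, SD_720_576, SD_720_480, QHD_960_540,   // 2D image frames
        HD_1920_2160, HD_1280_1440, SD_720_1152, SD_720_960, QHD_960_1080 // 3D image frames
    }
    enum GraphicsMode {                 // initial_graphics_mode
        GRAPHICS_MONO_VIEW, GRAPHICS_STEREO_VIEW, GRAPHICS_PLANE_OFFSET,
        GRAPHICS_DUAL_MONO_VIEW, GRAPHICS_FORCED_MONO_VIEW
    }
    enum VideoMode {                    // initial_video_mode
        VIDEO_MONO_VIEW, VIDEO_STEREO_VIEW, VIDEO_FLATTENED_STEREO_VIEW,
        VIDEO_DUAL_MONO_VIEW, VIDEO_FORCED_MONO_VIEW
    }
    enum BackgroundMode {               // initial_background_mode
        BACKGROUND_MONO_VIEW, BACKGROUND_STEREO_VIEW, BACKGROUND_FLATTENED_STEREO_VIEW,
        BACKGROUND_DUAL_MONO_VIEW, BACKGROUND_FORCED_MONO_VIEW
    }
    private BdjObjectConfiguration() {}
}
```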
- When only initial_configuration_id is specified in the BD-J Object file, default values of initial_video_mode, initial_graphics_mode, and initial_background_mode are required.
- FIG. 15 shows default specified values of initial_video_mode, initial_graphics_mode, and initial_background_mode.
- STEREO_VIEW of the video mode represents the above-mentioned VIDEO_STEREO_VIEW or VIDEO_FLATTENED_STEREO_VIEW
- MONO_VIEW represents the above-mentioned VIDEO_MONO_VIEW or VIDEO_DUAL_MONO_VIEW.
- STEREO_VIEW of the graphics mode represents the above GRAPHICS_STEREO_VIEW or GRAPHICS_PLANE_OFFSET
- MONO_VIEW represents the above GRAPHICS_MONO_VIEW or GRAPHICS_DUAL_MONO_VIEW.
- STEREO_VIEW in the background mode represents the above-mentioned BACKGROUND_STEREO_VIEW or BACKGROUND_FLATTENED_STEREO_VIEW
- MONO_VIEW represents the above-mentioned BACKGROUND_MONO_VIEW or BACKGROUND_DUAL_MONO_VIEW.
- The configuration can be changed at the following timings: when a BD-J title is activated, when auto-reset is performed during PlayList playback (dynamic change), and when an API is called by the BD-J application (dynamic change).
- That is, the plane configuration can be changed even during AV playback (while an AV stream (video) is being played).
- The configuration is changed so that the image frames are aligned: when a BD-J title is activated, the video/background image frames are aligned with the graphics image frame; during PlayList playback, the graphics/background image frames are aligned with the video image frame; and when an API is called by the BD-J application, the image frames of the planes whose configuration is not set by the API are aligned with the image frame of the plane set by the API.
- This process is performed by a 3D-compatible player. Also, error processing when changing the configuration depends on the 3D-compatible player.
- KEEP_RESOLUTION playback is a playback mode for combining SD (Standard Definition) video with HD (High Definition) graphics and an HD background: in some cases, Graphics of 1920 × 1080 pixels, Video + PG of 720 × 480 pixels, and a background of 1920 × 1080 pixels are combined, and in other cases, Graphics of 1920 × 1080 pixels, Video + PG of 720 × 576 pixels, and a background of 1920 × 1080 pixels are combined.
- Note that playback of an HD image of 1280 × 720 pixels is not included in KEEP_RESOLUTION playback.
- FIG. 16 and 17 show combinations of resolutions (image frames) of Video + PG, BD-J graphics, and background for playback other than KEEP_RESOLUTION playback.
- FIG. 17 is a figure following FIG.
- FIG. 18 shows an example of configuration change processing.
- FIG. 18A shows an example of the processing of the 3D-compatible player when the configuration (mode) of graphics (the graphics plane 11) is changed from STEREO_VIEW to MONO_VIEW.
- Suppose that graphics are drawn on the L graphics plane 11L and the R graphics plane 11R constituting the graphics plane 11 of 1920 × 2160 pixels, and that the mode is then changed from STEREO_VIEW to MONO_VIEW without resetting the graphics plane 11 (as a storage area).
- In this case, the 3D-compatible player may forcibly terminate image playback as an error.
- FIG. 18B shows an example of processing of a 3D-compatible player when the video mode is changed from MONO_VIEW to STEREO_VIEW.
- the graphics drawn on the L graphics plane 11L are copied to the R graphics plane 11R, and the graphics drawn on the L graphics plane 11L are supplied to the logical screen 21 as an image for the left eye.
- graphics copied to the R graphics plane 11R are supplied to the logical screen 21 as an image for the right eye.
- Alternatively, the 3D-compatible player may forcibly terminate image playback as an error.
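- A minimal sketch of the MONO_VIEW to STEREO_VIEW handling in FIG. 18B follows, assuming the L and R graphics planes are modeled as java.awt.image.BufferedImage objects (an illustration, not the player's internal implementation): the contents of the L graphics plane are copied to the R graphics plane so that the left-eye and right-eye outputs initially show the same graphics.

```java
import java.awt.Graphics;
import java.awt.image.BufferedImage;

// Sketch of the MONO_VIEW -> STEREO_VIEW handling: copy the L graphics plane to
// the R graphics plane; afterwards the L plane is output as the left-eye image
// and the R plane (now identical to it) as the right-eye image.
public class MonoToStereoChange {
    static void copyLtoR(BufferedImage lGraphicsPlane, BufferedImage rGraphicsPlane) {
        Graphics g = rGraphicsPlane.getGraphics();
        g.drawImage(lGraphicsPlane, 0, 0, null);
        g.dispose();
    }
}
```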
- Rule #1-1 is a rule that, in the configuration (of the device plane), the resolutions (image frames) of the three images, Graphics, Video, and Background, must always be the same.
- Rule #1-2 is a rule that, when PlayList playback other than KEEP_RESOLUTION playback is performed, the resolutions (image frames) of the three images, Graphics, Video, and Background, must match the video resolution in the configuration.
- Rule # 1-3 is a rule that, when the graphics is QHD graphics in the configuration, the resolution after scaling by 2 times in the vertical direction and 2 times in the horizontal direction is used as the resolution of the configuration. .
- The video mode, graphics mode, and background mode are determined according to the default values corresponding to the initial_configuration_id specified in the BD-J object file.
- Rule #2-1 is a rule that, in the configuration (of the device plane), the resolutions (image frames) of the three images, Graphics, Video, and Background, must always be the same.
- Rule #2-2 is a rule that, when PlayList playback other than KEEP_RESOLUTION playback is performed, the resolutions (image frames) of the three images, Graphics, Video, and Background, must match the video resolution in the configuration.
- Rule # 2-3 is a rule that, when the graphics is QHD graphics in the configuration, the resolution after scaling by 2 times in the vertical direction and 2 times in the horizontal direction is used as the resolution of the configuration. .
- the video plane configuration is automatically aligned with the video attribute of the PlayList.
- Furthermore, the current BD standard specifies, as a mandatory function on the BD player side, that the graphics plane and the background plane are also automatically aligned with the video plane attributes.
- At that time, the graphics mode and the background mode are set to predetermined initial values.
- FIG. 19 shows predetermined initial values of the graphics mode and the background mode.
- FIG. 20 shows the graphics and background images that are played back when a video 3D image (stereo image) of 1920 × 2160 pixels is played back.
- In this case, as graphics, a 3D image of 1920 × 2160 pixels is played back, and as background, a 3D image of 1920 × 2160 pixels is played back.
- Rule #3-1 is a rule that, in the configuration (of the device plane), the resolutions (image frames) of the three images, Graphics, Video, and Background, must always be the same.
- Rule # 3-3 is a rule that when the graphics is QHD graphics in the configuration, the resolution after scaling by 2 times in the vertical direction and 2 times in the horizontal direction is set as the resolution of the configuration. .
- FIG. 21 is a diagram for explaining a change in resolution (image frame) as a configuration by calling an API by a BD-J application.
- When the resolution of the graphics 3D image is changed by calling an API, the 3D-compatible BD player automatically changes the resolutions of the video 3D image and the background 3D image in accordance with rules #3-1, #3-2, and #3-3 described above.
- When the resolution of the background 3D image is changed by calling an API, the 3D-compatible BD player automatically changes the resolutions of the graphics 3D image and the video 3D image in accordance with rules #3-1, #3-2, and #3-3 described above.
- When the resolution of the video 3D image is changed by calling an API, the 3D-compatible BD player automatically changes the resolutions of the graphics 3D image and the background 3D image in accordance with rules #3-1, #3-2, and #3-3 described above.
- the 3D-compatible player can seamlessly change (switch) the graphics mode between the stereo graphics mode (stereo graphics) and the offset graphics mode (offset graphics).
- FIG. 22 is a diagram for explaining the change of the graphics mode.
- FIG. 22A shows a case where, during playback of a graphics 3D image (plane offset gfx (graphics)), a video (and PG) 3D image (stereo video + PG), and a background 3D image (stereo background) in the offset graphics mode, the graphics mode is changed from the offset graphics mode to the stereo graphics mode.
- In this case, playback of the graphics 3D image (plane offset gfx), the video (and PG) 3D image (stereo video + PG), and the background 3D image (stereo background) is switched to playback of the graphics 3D image (stereo gfx (graphics)), the video (and PG) 3D image (stereo video + PG), and the background 3D image (stereo background), and this switching can be performed seamlessly.
- The reverse switching, that is, switching from playback of the graphics 3D image (stereo gfx), the video (and PG) 3D image (stereo video + PG), and the background 3D image (stereo background) in the stereo graphics mode to playback of the graphics 3D image (plane offset gfx), the video (and PG) 3D image (stereo video + PG), and the background 3D image (stereo background) in the offset graphics mode, can also be performed seamlessly.
- B of FIG. 22 shows a case where, during playback of a graphics 3D image (stereo gfx), a video (and PG) 3D image (stereo video + PG), and a background 2D image (mono background), the graphics mode is changed from the stereo graphics mode to the offset graphics mode.
- FIG. 23 shows a change of the graphics mode from the stereo graphics mode to the offset graphics mode.
- the playback target is switched from the graphics 3D image (stereo gfx) in the stereo graphics mode to the graphics 3D image (plane offset gfx) in the offset graphics mode.
- blackout may occur if the resolution is changed when the graphics mode is changed.
- the 3D-compatible player can seamlessly change (switch) the background mode between the stereo background mode (stereo background) and the mono background mode (mono background).
- FIG. 24 is a diagram for explaining the change of the background mode.
- FIG. 24A shows a case where, during playback of a graphics 3D image (stereo gfx), a video (and PG) 3D image (stereo video + PG), and a background 3D image (stereo background) in the stereo background mode, the background mode is changed from the stereo background mode to the mono background mode.
- This switching can be performed seamlessly, and the reverse switching can also be performed seamlessly.
- B of FIG. 24 shows a case where, during playback of a graphics 3D image (plane offset gfx), a video (and PG) 3D image (stereo video + PG), and a background 2D image (mono background) in the mono background mode, the background mode is changed from the mono background mode to the stereo background mode.
- This switching can be performed seamlessly, and the reverse switching can also be performed seamlessly.
- the 3D-compatible player changes (switches) the video mode among the stereo video mode (stereo video), the flat stereo video mode (flattened-stereo video), and the dual mono video mode (dual-mono video). Can be done seamlessly.
- FIG. 25 is a diagram for explaining the change of the video mode.
- FIG. 25A is a diagram for explaining the change of the video mode when a video image is reproduced together with a graphics 3D image (stereo gfx) and a background 3D image (stereo background).
- When the video mode is the stereo video mode and the video (and PG) 3D image (stereo video + PG) in the stereo video mode is being played back, suppose the video mode is changed from the stereo video mode to the flat stereo video mode.
- In this case, the video image is switched from the 3D image (stereo video + PG) of the video (and PG) in the stereo video mode to the 3D image (flattened video + PG) of the video (and PG) in the flat stereo video mode, and this switching can be performed seamlessly.
- The reverse switching can also be performed seamlessly.
- When the video mode is the flat stereo video mode and the video (and PG) 3D image (flattened video + PG) in the flat stereo video mode is being played back, suppose the video mode is changed from the flat stereo video mode to the dual mono video mode.
- In this case, the video image is switched from the 3D image (flattened video + PG) of the video (and PG) in the flat stereo video mode to the 3D image (dual-mono video + PG) of the video (and PG) in the dual mono video mode, and this switching can be performed seamlessly.
- The reverse switching can also be performed seamlessly.
- B of FIG. 25 is a diagram for explaining the change of the video mode when a video image is played back together with a graphics 3D image (plane offset gfx) and a background 2D image (mono background).
- When the video mode is the dual mono video mode and the video (and PG) 3D image (dual-mono video + PG) in the dual mono video mode is being played back, suppose the video mode is changed from the dual mono video mode to the flat stereo video mode.
- In this case, the video image is switched from the 3D image (dual-mono video + PG) of the video (and PG) in the dual mono video mode to the 3D image (flattened video + PG) of the video (and PG) in the flat stereo video mode, and this switching can be performed seamlessly.
- The reverse switching can also be performed seamlessly.
- When the video mode is the flat stereo video mode and the video (and PG) 3D image (flattened video + PG) in the flat stereo video mode is being played back, suppose the video mode is changed from the flat stereo video mode to the stereo video mode.
- In this case, the video image is switched from the 3D image (flattened video + PG) of the video (and PG) in the flat stereo video mode to the 3D image (stereo video + PG) of the video (and PG) in the stereo video mode, and this switching can be performed seamlessly.
- The reverse switching can also be performed seamlessly.
- the configuration is defined by resolution (image frame) and color depth. For this reason, changing the configuration changes the resolution. However, when the resolution is changed, playback is temporarily stopped and the display screen is blacked out.
- For example, the playback mode of the Graphics plane, mono-logical-plane + offset value, could be specified as a 1920 × 1080/32bpp configuration, but in that case there is a possibility that blackout is induced by, for example, switching from mono-logical-plane + offset value to stereo-logical-plane.
- Therefore, the plane configurations are unified into plane definitions for two image planes (1920 × 2160 pixels, 1280 × 1440 pixels, 960 × 1080 pixels, 720 × 960 pixels, and 720 × 1152 pixels), and attributes other than resolution/color depth are defined as mode values. Then, when only the mode is changed without changing the resolution, the configuration can be changed without putting the display screen into a blackout state. Further, as in a legacy player, the configuration can be changed by calling the Configuration Preference setting API.
- FIG. 26 is a block diagram showing a functional configuration example of the BD player in FIG. 3 as such a 3D-compatible player.
- In the 3D-compatible player of FIG. 26, the configuration of the device plane, which is a storage area in which an L area that is a storage area for one image storing the left-eye image and an R area that is a storage area for one image storing the right-eye image are arranged one above the other (a storage area for two image planes), is defined for the entire device plane.
- five modes of a mono graphics mode, a stereo graphics mode, an offset graphics mode, a forced mono graphics mode, and a flat stereo graphics mode are defined as graphics modes.
- five modes of a mono video mode, a dual mono video mode, a stereo video mode, a forced mono video mode, and a flat stereo video mode are defined as video modes.
- As the background mode, five modes of a mono background mode, a dual mono background mode, a stereo background mode, a forced mono background mode, and a flat stereo background mode are defined.
- the configuration of the device plane includes (1) image frame (resolution) and color depth, (2) video mode, (3) graphics mode, and (4) background mode.
- Of these, the setting (change) of (2) video mode, (3) graphics mode, and (4) background mode can be performed by the configuration mode setting API.
- When changing the video mode, the graphics mode, or the background mode, the BD-J application calls the configuration mode setting API and requests the change (setting) of the video mode, the graphics mode, or the background mode.
- In response to that request, the configuration mode setting API directly or indirectly controls the necessary ones of the presentation engine (Presentation Engine), the video decoder (Video decoder), and the display processor (Display processor), thereby changing (setting) the video mode, the graphics mode, or the background mode.
- On the other hand, when changing (1) image frame (resolution) and color depth, the BD-J application calls the resolution setting API and requests the change (setting) of the resolution and the like.
- In response to the request from the BD-J application, the resolution setting API directly or indirectly controls the necessary ones of the presentation engine, the video decoder, and the display processor, thereby changing (setting) the resolution and the color depth.
- Here, the presentation engine (Presentation Engine) provides decoding functions and presentation functions (Presentation functions) for audio, video, and HDMV graphics to a playback control engine (Playback Control Engine), not shown, that controls playback of BDs.
- The video decoder (Video decoder) decodes images.
- The display processor (Display processor) is hardware that superimposes the graphics plane, the video (video + PG) plane, and the background plane on one another and outputs the image obtained by the superimposition to the display connected to the BD player.
- As described above, in the 3D-compatible player, the device plane configuration is defined for the entire device plane, which is an image storage area for two planes, and the device plane configuration includes the resolution (image frame) and color depth as well as the video mode, the graphics mode, and the background mode.
- In the 3D-compatible player, the graphics mode and the like are set (changed) by calling the configuration mode setting API. In this way, the graphics mode and the like can be changed without changing the resolution.
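- The hypothetical sketch below illustrates this mode-change path; the interface and method names are illustrative stand-ins for the configuration mode setting API and are not defined by the BD standard. Because only a mode value is touched, resolution and color depth stay fixed and no blackout is triggered.

```java
// Hypothetical sketch of the configuration mode setting API described above.
public class ConfigurationModeExample {
    // Stand-in for the player-side configuration mode setting API (names are ours).
    interface ConfigurationModeControl {
        void setGraphicsMode(String mode);   // e.g. "GRAPHICS_STEREO_VIEW"
        void setVideoMode(String mode);      // e.g. "VIDEO_FLATTENED_STEREO_VIEW"
        void setBackgroundMode(String mode); // e.g. "BACKGROUND_MONO_VIEW"
    }

    static void switchGraphicsOnly(ConfigurationModeControl control) {
        // Only the mode changes; resolution and color depth stay fixed,
        // so the display screen is not blacked out.
        control.setGraphicsMode("GRAPHICS_STEREO_VIEW");
    }
}
```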
- BD-J application handles Video + PG / TextST (Text subtitle) all together (without distinction). Further, the BD-J application cannot control the PG plane 12 individually, but can control the position and scaling (size) of the video. In the current BD standard, when video position and scaling are controlled from a BD-J application, PG / TextST is aligned with video.
- the PG plane offset value is scaled by a scaling ratio (enlargement ratio or reduction ratio) for scaling the video.
- For PG of 3D images, it is desirable to be able to set a mode (2-planes) for playing back a PG image that is a stereo image, and a mode (1-plane + offset) for playing back a left-eye image and a right-eye image (with parallax) generated from a 2D image and an offset value.
- Therefore, the 3D-compatible player indirectly performs PG plane control (configuration switching among 1-plane (legacy playback), 1-plane + offset, and 2-planes) through the selection of a PG stream.
- Here, as PG streams of BD-standard PG images, there are defined a mono PG stream that is a PG stream of a PG image of a mono image that is a 2D image, a stereo PG stream that is a PG stream of a PG image of a stereo image that is a 3D image, and an offset PG stream that is a PG stream of a mono-image PG used to generate a stereo image together with an offset value that gives parallax to the mono image (for example, a stream including the mono PG image and the offset value).
- the mono 1-stream (legacy content) mode, L / R 2 stream mode, and 1-stream + plane-offset mode are defined as PG playback modes for playing PG images.
- When the PG playback mode is the mono 1-stream mode, a 2D PG image is played back using the mono PG stream.
- When the PG playback mode is the L/R 2 stream mode, a 3D PG image is played back by playing back the left-eye image and the right-eye image using the stereo PG stream.
- When the PG playback mode is the 1-stream + plane-offset mode, a left-eye image and a right-eye image are generated based on the offset value using the offset PG stream, and the 3D PG image is played back by playing back the left-eye image and the right-eye image.
- Similarly, as TextST streams of BD-standard TextST images, there are defined a mono TextST stream that is a TextST stream of a TextST image of a mono image that is a 2D image, a stereo TextST stream that is a TextST stream of a TextST image of a stereo image that is a 3D image, and an offset TextST stream that is a TextST stream of a mono-image TextST image used to generate a stereo image together with an offset value that gives parallax to the mono image (for example, a stream including the mono TextST image and the offset value).
- the mono 1-stream (legacy content) mode, L / R 2 stream mode, and 1-stream + plane-offset mode are defined as TextST playback modes for playing TextST images.
- When the TextST playback mode is the mono 1-stream mode, a 2D TextST image is played back using the mono TextST stream.
- When the TextST playback mode is the L/R 2 stream mode, a 3D TextST image is played back by playing back the left-eye image and the right-eye image using the stereo TextST stream.
- When the TextST playback mode is the 1-stream + plane-offset mode, a left-eye image and a right-eye image are generated based on the offset value using the offset TextST stream, and the 3D TextST image is played back by playing back the left-eye image and the right-eye image.
- 3D-compatible players can switch (set) the PG/TextST configuration through the API for selecting a stream.
- FIG. 27 shows a PG playback mode and a TextST playback mode that can be selected in each video mode.
- When the video mode is any of the mono video mode (mono), the flat stereo video mode (flattened stereo), the dual mono video mode (dual-mono), the forced mono video mode (forced-mono), and the stereo video mode (stereo), it is possible to select the 1-stream + plane-offset mode (mono + offset) (the offset PG stream).
- That is, the offset PG stream can be selected regardless of whether the video mode is the mono video mode, the flat stereo video mode, the dual mono video mode, the forced mono video mode, or the stereo video mode.
- the video modes are flat stereo video mode (flattened stereo), dual mono video mode (dual-mono), forced mono video mode (forced-mono), and stereo video mode (stereo). In either case, it is possible to select the L / R 2 stream mode (stereo) (stereo PG stream).
- the stereo PG stream can be selected regardless of whether the video mode is any of the flat stereo video mode, the dual mono video mode, the forced mono video mode, and the stereo video mode.
- Note that, when the video mode is the mono video mode (mono) or the forced mono video mode (forced-mono) and the offset PG stream (mono + offset) is selected, the mono image of the offset PG stream is played back with the offset value ignored (with the offset value set to 0).
- Also, when the video mode is the mono video mode (mono) or the forced mono video mode (forced-mono) and the stereo PG stream (stereo) is selected, only one of the left-eye image and the right-eye image constituting the stereo image corresponding to the stereo PG stream, for example, only the left-eye image (L PG stream), is played back.
- Further, when the video mode is the flat stereo video mode or the dual mono video mode and a stereo PG stream is selected, if there is an offset PG stream whose stream number (the number assigned to the stream) matches that of the selected stereo PG stream (that is, if such a stream is recorded on the BD), the mono image of the offset PG stream having the same stream number as the stereo PG stream is played back with the offset value ignored, instead of the selected stereo PG stream.
- When the video mode is any of the mono video mode (mono), the flat stereo video mode (flattened stereo), the forced mono video mode (forced-mono), and the dual mono video mode (dual-mono), it is possible to select the 1-stream + plane-offset mode (mono + offset) (the offset Text subtitle stream).
- That is, the offset TextST stream (offset Text subtitle stream) can be selected when the video mode is any of the mono video mode, the flat stereo video mode, the forced mono video mode, and the dual mono video mode.
- the video mode is flat stereo video mode (flattened stereo), dual mono video mode (dual-mono), forced mono video mode (forced-mono), and stereo video mode (stereo). In either case, it is possible to select the L / R 2 stream mode (stereo) (stereo text subtitle stream).
- a stereo TextST stream (stereo Text subtitle stream) should be selected regardless of whether the video mode is a flat stereo video mode, a dual mono video mode, a forced mono video mode, or a stereo video mode. Is possible.
- Note that, when the video mode is the mono video mode or the forced mono video mode and a stereo TextST stream (stereo) is selected, only one of the left-eye image and the right-eye image constituting the corresponding stereo image, for example, only the left-eye image (L TextST stream), is played back.
- Further, when the video mode is the flat stereo video mode or the dual mono video mode and a stereo TextST stream is selected, if there is an offset TextST stream whose stream number (the number assigned to the stream) matches that of the selected stereo TextST stream, the mono image of the offset TextST stream having the same stream number as the stereo TextST stream is played back with the offset value ignored, instead of the selected stereo TextST stream.
- FIG. 28 is a block diagram illustrating a functional configuration example of the BD player in FIG. 3 as a 3D-compatible player that plays back PG and TextST images as described above.
- the 3D-compatible player includes a BD-J application, a PG / TextST stream selection API, a video control API, a PG selection engine (Playback Control Function), a TextST selection engine (Playback Control Function), and a video control engine (Playback Control Function). ), A playback control engine (Playback Control Engine), a presentation engine (Presentation Engine), and the like.
- BD-J application calls PG / TextST stream selection API and requests selection of PG stream.
- In response to the request from the BD-J application, the PG/TextST stream selection API selects the requested PG stream.
- That is, if the PG stream requested by the BD-J application can be selected for the current video mode, the PG/TextST stream selection API controls the PG selection engine so as to select that PG stream.
- The PG selection engine selects a PG stream from the PG streams recorded on the disc 100 (FIG. 3), which is a BD, in accordance with the control of the PG/TextST stream selection API, and supplies it to a stereo PG decoder or a mono PG decoder, neither of which is shown in FIG. 28.
- That is, when the PG stream selected by the PG selection engine is a stereo PG stream, the stereo PG stream is supplied to the stereo PG decoder.
- When the PG stream selected by the PG selection engine is an offset PG stream, the offset PG stream is supplied to the mono PG decoder.
- The stereo PG decoder decodes the PG stream supplied from the PG selection engine into the left-eye image and the right-eye image that form a stereo image, and draws them on the L-PG plane 12L and the R-PG plane 12R of the PG plane 12, respectively.
- the mono PG decoder decodes the offset PG stream supplied from the PG selection engine into a mono image and draws it on the logical plane 10.
- The PG generation API generates a left-eye image and a right-eye image from the mono image drawn on the logical plane 10, using an offset value (for example, an offset value included in the offset PG stream, or an offset value stored in the internal storage area of the 3D-compatible player or in PSR#21).
- the PG generation API draws the left-eye image and the right-eye image on the L-PG plane 12L and the R-PG plane 12R of the PG plane 12, respectively.
- Note that, depending on the video mode, instead of the stereo image corresponding to the stereo PG stream being displayed, only one of the left-eye image and the right-eye image, for example, only the left-eye image, may be played back, or the offset value may be ignored and only the mono image corresponding to the offset PG stream may be played back.
- As described above, in the 3D-compatible player, a mono PG stream that is a PG stream of a mono image that is a 2D image, a stereo PG stream that is a PG stream of a stereo image that is a 3D image, and an offset PG stream that is a PG stream of a mono PG image used together with an offset value that is data giving parallax to the mono image are defined.
- the PG / TextST stream selection API selects a mono PG stream, a stereo PG stream, or an offset PG stream in accordance with a request from the BD-J application.
- PG image playback (PG configuration) can be controlled indirectly from the BD-J application.
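- The sketch below illustrates this indirect control path; the type and method names are hypothetical stand-ins for the PG/TextST stream selection API (the BD standard itself only exposes stream selection to the BD-J application, not direct plane control).

```java
// Illustrative sketch of indirect PG control via stream selection; the names
// below are ours, not the real BD-J API.
public class PgStreamSelectionExample {
    enum PgStreamType { MONO, STEREO, OFFSET }

    interface PgTextStStreamSelection {
        // Selects the PG stream with the given stream number; the player maps this
        // to 1-plane, 2-planes, or 1-plane+offset playback internally.
        void selectPgStream(int streamNumber, PgStreamType type);
    }

    static void requestStereoSubtitles(PgTextStStreamSelection api) {
        api.selectPgStream(1, PgStreamType.STEREO);
    }
}
```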
- FIG. 30 is a diagram for explaining switching between 3D image reproduction and 2D image reproduction in a 3D-compatible player.
- the operation mode of the 3D-compatible player is a 3D playback mode (3D playback mode) for playing back 3D images.
- In FIG. 30, the graphics mode is the stereo graphics mode (stereo gfx (graphics)), the video mode is the stereo video mode (stereo video), and the background mode is the mono background mode (mono background).
- the graphics mode has been changed to the offset graphics mode (plane offset gfx), and the video mode has been changed to dual mono video mode (dual-mono video).
- the operation mode is changed from the 3D playback mode to the 2D playback mode (Legacy playback mode) for playing back 2D images in the same manner as the legacy player.
- With this change of the operation mode, the graphics mode is changed from the offset graphics mode (plane offset gfx) to the mono graphics mode (mono gfx). Further, the video mode is changed from the dual mono video mode (dual-mono video) to the mono video mode (mono video). Note that the background mode remains the mono background mode (mono background).
- the operation mode is changed again from the 2D playback mode to the 3D playback mode.
- the graphics mode is changed from the mono graphics mode (mono gfx) to the stereo graphics mode (stereo gfx) according to the change of the operation mode. Furthermore, the video mode is changed from the mono video mode (mono video) to the flat stereo video mode (flattened stereo video). Note that the background mode remains the mono background mode (mono background).
- the background mode is changed from the mono background mode (mono background) to the stereo background mode (stereo background).
- Note that when the operation mode is changed from the 3D playback mode to the 2D playback mode, the display screen may be blacked out if the resolution (image frame) is changed.
- For setting the position and size of video, JMF (Java (registered trademark) Media Framework) controls such as "javax.tv.media.AWTVideoSizeControl" and "org.dvb.media.BackgroundVideoPresentationControl" can be used.
- The 3D-compatible player must correct the position and size of each of the left-eye image (L video source) and the right-eye image (R video source).
- Here, the display coordinate system is a coordinate system having a size of 1920 × 1080 pixels, whose vertical size is 1/2 of the device plane.
- the author must set the position and size of the video as follows, for example.
- FIG. 31 is a diagram for explaining the setting of the video position and size by the author, and the correction of the video position and size by the 3D-compatible player.
- the author sets the position and size of the video image for the left eye.
- the position and size of the video image for the left eye are set for a display coordinate system having a size of 1920 × 1080 pixels.
- the 3D-compatible player sets the position and size of the video image for the left eye relative to the display coordinate system to the L video plane 13L of the video plane 13 as they are.
- the 3D-compatible player applies the settings of the video position and size of the L video plane 13L to the R video plane 13R as they are.
- Note that the video producer should produce the video image so that the intended 3D image is displayed. Therefore, if the 3D-compatible player performs processing such as shifting the position of the video image (the left-eye image and the right-eye image) drawn on the video plane 13 according to externally given information such as an offset value stored in PSR#21 (FIG. 7), an image unintended by the video producer may be displayed.
- The L/R video planes are defined in the configuration, but the author of the BD-J application is restricted to handling only the L video plane. That is, the 3D-compatible player must apply an L video scaling / L video positioning API call made by the BD-J application to R video scaling / R video positioning as it is.
- As described in [Switching of PG/Text subtitle configuration], the PG plane offset value is scaled by the scaling ratio (enlargement ratio or reduction ratio) used to scale the video; the graphics plane offset value is likewise scaled by the video scaling ratio.
- FIG. 32 is a block diagram illustrating a functional configuration example of the BD player in FIG. 3 as a 3D-compatible player that performs video position setting (correction) and size setting (scaling) as described above.
- The 3D-compatible player in FIG. 32 has an L API for setting the size and position of the image stored in the L video plane 13L (L area), and an R API for setting the size and position of the image stored in the R video plane 13R (R area).
- One API of the L API and the R API sets the same size and position as the image size and position set by the other API.
- In FIG. 32, the video decoder (Video decoder) decodes the video and supplies the resulting left-eye image and right-eye image to the L API and the R API.
- The L API consists of an L video scaling (L(Left) video scaling) API and an L video positioning (L(Left) positioning) API, and sets the position and size of the left-eye image from the video decoder in response to requests from the BD-J application for setting the position and size of the video.
- the L video scaling API performs scaling for controlling the size of the image for the left eye from the video decoder to a size in accordance with a request from the BD-J application, and supplies it to the L video positioning API.
- The L video positioning API controls the position of the left-eye image from the L video scaling API to a position according to the request from the BD-J application, and draws the resulting left-eye image on the L video plane 13L (that is, it draws the left-eye image from the L video scaling API at the position on the L video plane 13L requested by the BD-J application).
- the L video scaling API calls the R video scaling API described later and makes the same request as the BD-J application. Further, the L video positioning API calls an R video positioning API, which will be described later, and makes a request similar to the request from the BD-J application.
- Further, in response to a call for setting the video size, the L video scaling API sets the scaling ratio (enlargement ratio or reduction ratio) S by which the video image (the left-eye image) is scaled.
- The R API consists of an R video scaling (R(Right) video scaling) API and an R video positioning (R(Right) positioning) API, and sets the position and size of the right-eye image from the video decoder in response to the requests for setting the position and size of the video from the L API.
- the R video scaling API controls the size of the right-eye image from the video decoder to a size according to the request from the L video scaling API, and supplies it to the R video positioning API.
- the R video positioning API controls the position of the right eye image from the R video scaling API to a position according to the request from the L video positioning API, and the resulting right eye image is transferred to the R video plane 13R. draw.
- As described above, of the L API, which sets the size and position of the image stored in the L video plane 13L (L region), and the R API, which sets the size and position of the image stored in the R video plane 13R (R region), one API, for example the R API, sets the same size and position as those that the other API, the L API, sets in response to a request from the BD-J application.
- Therefore, for the video plane 13 that stores BD-standard video images, the author can handle only the L video plane 13L, which is one of the L video plane 13L (L region) and the R video plane 13R (R region), and a video image unintended by the video producer can be prevented from being displayed.
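- The sketch below illustrates this mirroring rule; the interfaces are hypothetical stand-ins for the player-internal L/R video scaling and positioning APIs, and the point is only that a call applied to the left-eye video is forwarded unchanged to the right-eye video.

```java
// Sketch of the L/R mirroring rule: a call that scales or positions the
// left-eye video is applied unchanged to the right-eye video.
public class VideoScalingExample {
    interface VideoPlaneControl {
        void setSize(int width, int height);
        void setPosition(int x, int y);
    }

    static class LVideoControl implements VideoPlaneControl {
        private final VideoPlaneControl rControl; // stand-in for the R video API

        LVideoControl(VideoPlaneControl rControl) { this.rControl = rControl; }

        public void setSize(int width, int height) {
            // ... scale the left-eye image on the L video plane ...
            rControl.setSize(width, height);   // mirror the request to the R video plane
        }
        public void setPosition(int x, int y) {
            // ... position the left-eye image on the L video plane ...
            rControl.setPosition(x, y);        // mirror the request to the R video plane
        }
    }
}
```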
- the processing described in FIG. 29 is further performed for PG.
- That is, the PG plane offset value (for example, the offset value included in the offset PG stream, or the offset value stored in the internal storage area of the 3D-compatible player or in PSR#21) is scaled using the scaling ratio S from the L video scaling API (the PG plane offset value is multiplied by the scaling ratio S).
- a left-eye image and a right-eye image are generated from the mono image drawn on the logical plane 10 using the scaled PG plane offset value.
- For graphics recorded on the disc 100 (FIG. 3), which is a BD, processing is performed in accordance with the graphics mode set in response to a request from the BD-J application.
- That is, when the graphics mode is, for example, the stereo graphics mode, the left-eye image and the right-eye image of the graphics, which is a stereo image, are drawn on the L graphics plane 11L and the R graphics plane 11R of the graphics plane 11, respectively.
- When the graphics mode is, for example, the offset graphics mode, a graphics image that is a mono image is drawn on the logical plane 10.
- In this case, the graphics generation API scales the graphics plane offset value (for example, the offset value stored in the internal storage area of the 3D-compatible player or in PSR#21) using the scaling ratio S from the L video scaling API.
- The graphics generation API then generates a left-eye image and a right-eye image from the mono image drawn on the logical plane 10 using the scaled graphics plane offset value, and draws them on the L graphics plane 11L and the R graphics plane 11R, respectively.
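- A minimal worked sketch of the offset scaling described above (the helper name is ours): when the video is scaled by the ratio S, the PG plane offset value, and likewise the graphics plane offset value, is multiplied by S before the left-eye/right-eye pair is generated.

```java
// Sketch of the offset scaling: the plane offset value is multiplied by the
// video scaling ratio S before being used to generate the L/R image pair.
public class PlaneOffsetScaling {
    static int scaledOffset(int planeOffset, double scalingRatioS) {
        return (int) Math.round(planeOffset * scalingRatioS);
    }

    public static void main(String[] args) {
        // e.g. video scaled to half size -> the offset is halved as well
        System.out.println(scaledOffset(8, 0.5)); // prints 4
    }
}
```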
- The effective pixel coordinate system for the stereo graphics configuration (the configuration for displaying graphics 3D images) is one of (0, 0)-(1920, 2160), (0, 0)-(1280, 1440), (0, 0)-(720, 960), (0, 0)-(720, 1152), and (0, 0)-(960, 1080).
- Top-half is assigned to L graphics view and bottom-half is assigned to R graphics view.
- FIG. 33 shows the graphics plane 11 having 1920 × 2160 pixels.
- An image drawn on the L graphics plane 11L which is an upper storage area (top-half) of the graphics plane 11, becomes a left-eye image (L (Left) graphics view) observed with the left eye.
- An image drawn on the R graphics plane 11R which is the lower storage area (bottom-half), is an image for the right eye (R (Right) graphics view) observed with the right eye.
- In FIG. 33, one container (Root container) and two components (Components) that are children of the container are drawn on the graphics plane 11.
- the coordinates of a component are expressed as relative coordinates based on the container that is the parent of the component.
- a buffer area for guard purposes should not be provided at the edge of the graphics plane 11.
- 3D-compatible players must introduce a mechanism to prevent inconsistencies with L-view / R-view.
- the BD player which is a legacy player, does not have a mechanism for detecting completion of drawing by the BD-J application and transferring it to the monitor after completion.
- output mismatch may occur between L / R graphics.
- some API call is defined as a signal indicating the completion of drawing by the BD-J application. Conversely, if the BD-J application does not call the corresponding drawing completion notification API, nothing is output to the screen. The author must use this method.
- If the drawing contents of the graphics plane 11 are output for display as they are as the left-eye image and the right-eye image while the left-eye image and the right-eye image are not yet aligned so that they can be viewed as a 3D image (for example, because the drawing of the right-eye image is still incomplete), the user who sees the image on the display screen feels a sense of discomfort.
- the 3D-compatible player has a function of suppressing inconsistency between the left-eye image and the right-eye image, that is, to be viewed as a 3D image. It has a function of preventing the left-eye image and the right-eye image that are not in a matched state from being displayed on the display screen.
- the 3D-compatible player displays the left-eye image and the right-eye image for display after the drawing of both the left-eye image and the right-eye image on the graphics plane 11 is completed. Output to.
- the 3D-compatible player needs to recognize that drawing of both the left-eye image and the right-eye image on the graphics plane 11 has been completed.
- In Direct-drawing, however, the 3D-compatible player has no way of knowing whether or not the issuance of drawing commands for drawing a graphics image from the BD-J application has been completed.
- That is, after the BD-J application issues drawing commands #1, #2, ..., #N and the image is drawn on the graphics plane 11 according to drawing commands #1 to #N, the 3D-compatible player cannot recognize whether or not a further drawing command will be issued from the BD-J application, that is, whether or not the BD-J application has finished issuing drawing commands.
- Therefore, for example, a call of a drawing integrity guarantee API, which guarantees the integrity of graphics drawing, is made obligatory as signaling to the 3D-compatible player.
- Alternatively, a call of a drawing completion notification API, which notifies that drawing of an image on the graphics plane 11 has been completed, is made obligatory as signaling to the 3D-compatible player.
- Alternatively, a call of a drawing start notification API, which notifies that drawing of an image on the graphics plane 11 is started, followed by a call of the drawing completion notification API after the subsequent drawing of the image on the graphics plane 11, is made obligatory as signaling to the 3D-compatible player.
- By the BD-J application calling the drawing integrity guarantee API, calling the drawing completion notification API, or calling the drawing start notification API and then calling the drawing completion notification API, the 3D-compatible player can recognize that drawing of the image on the graphics plane 11 has been completed, that is, that the issuance of drawing commands has been completed.
- As the drawing integrity guarantee API, a dedicated API that takes a drawing command sequence as an argument can be defined.
- As the drawing completion notification API, for example, the java.awt.Toolkit#sync() method can be adopted.
- the 3D-compatible player does not output the image drawn on the graphics plane 11 unless the java.awt.Toolkit # sync () method is called, and therefore the graphics plane 11 is displayed on the display screen. The image drawn on is not displayed.
- As the drawing start notification API, for example, a predetermined Java (registered trademark) method or a dedicated API can be defined.
- Note that if the java.awt.Toolkit#sync() method is called more than once in succession, or repeatedly with only a small amount of drawing in between, the graphics frame may drop frames; such calls should therefore be avoided.
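- The following is a sketch of the Direct-drawing convention described above. A BufferedImage stands in for the graphics plane (in a real BD-J application the drawing commands would target the plane exposed by the player, and the offset handling is an assumption for illustration); the essential point is the final java.awt.Toolkit#sync() call, which signals that the application has finished issuing drawing commands for both the L and R views.

```java
import java.awt.Color;
import java.awt.Graphics2D;
import java.awt.Toolkit;
import java.awt.image.BufferedImage;

// Sketch of Direct-drawing followed by the drawing completion notification.
public class DirectDrawingExample {
    static void drawFrame(BufferedImage graphicsPlane, int offset) {
        Graphics2D g = graphicsPlane.createGraphics();
        int h = graphicsPlane.getHeight() / 2;
        g.setColor(Color.WHITE);
        g.drawString("subtitle", 100 + offset, h - 50);      // L region (top half)
        g.drawString("subtitle", 100 - offset, 2 * h - 50);  // R region (bottom half)
        g.dispose();
        // Drawing completion notification: without this call the player does not
        // output the newly drawn image to the display.
        Toolkit.getDefaultToolkit().sync();
    }
}
```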
- In drawing based on the repaint() method, the repaint() method of the root container, which is a part constituting a graphics image, calls the update() method of each component, which is also a part constituting the graphics image.
- In this case, the 3D-compatible player can completely control (fully control) the graphics image drawing process, so the 3D-compatible player can recognize that drawing of the image on the graphics plane 11 has finished.
- Therefore, the 3D-compatible player can be implemented so that the left-eye image and the right-eye image are displayed in an aligned state even without the above-described drawing completion notification API being called.
- FIG. 34 is a block diagram showing an example of the functional configuration of the BD player in FIG. 3 as a 3D-compatible player that recognizes that drawing of an image has been completed through the BD-J application calling the drawing integrity guarantee API, calling the drawing completion notification API, or calling the drawing start notification API and then calling the drawing completion notification API.
- the BD-J application calls the drawing completion notification API.
- In FIG. 34, the 3D-compatible player has buffers 201L and 201R and buffers 202L and 202R as the graphics plane 11.
- the buffers 201L and 202L correspond to the L graphics plane 11L
- the buffers 201R and 202R correspond to the R graphics plane 11R.
- the set of buffers 201L and 201R and the set of buffers 202L and 202R alternately function as a back buffer (hidden buffer) and a front buffer.
- Here, the back buffer is a buffer into which the BD-J application draws graphics images, and the front buffer is a buffer that stores the image displayed on the display screen (logical screen 21) while an image is being drawn into the back buffer.
- FIG. 34A shows the 3D-compatible player in a state in which the set of buffers 201L and 201R serves as the back buffer and the set of buffers 202L and 202R serves as the front buffer.
- In the state of FIG. 34A, graphics images (a left-eye image and a right-eye image) are drawn by the BD-J application into the buffers 201L and 201R serving as the back buffer, and the images (a left-eye image and a right-eye image) stored in the buffers 202L and 202R serving as the front buffer are output to the display screen.
- the BD-J application calls the drawing completion notification API when drawing of graphics images is finished with respect to the buffers 201L and 201R serving as back buffers.
- the 3D-compatible player starts outputting the image stored in the back buffer to the display screen instead of the front buffer.
- FIG. 34B shows the 3D-compatible player immediately after the drawing completion notification API is called.
- That is, the 3D-compatible player starts outputting to the display screen the images stored in the buffers 201L and 201R serving as the back buffer, instead of the images stored in the buffers 202L and 202R serving as the front buffer.
- the 3D-compatible player copies the images stored in the buffers 201L and 201R serving as back buffers to the buffers 202L and 202R serving as front buffers.
- the 3D-compatible player swaps the back buffer and the front buffer.
- the 3D-compatible player sets the buffers 201L and 201R serving as back buffers as front buffers and the buffers 202L and 202R serving as front buffers as back buffers.
- FIG. 34C shows a 3D-compatible player in which the set of buffers 201L and 201R is a front buffer and the set of buffers 202L and 202R is a back buffer.
- the BD-J application starts drawing graphics images to the buffers 202L and 202R serving as back buffers, and the same processing is repeated thereafter.
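- The sketch below illustrates this double-buffering behaviour, assuming each L/R pair of planes is modeled as a pair of BufferedImages (the class and field names are ours). When the drawing completion notification arrives, the player outputs the back buffer, copies it to the front buffer, and then swaps the two roles.

```java
import java.awt.image.BufferedImage;

// Sketch of the back/front buffer handling triggered by the drawing
// completion notification API.
public class GraphicsPlaneBuffers {
    private BufferedImage[] back;   // drawn into by the BD-J application {L, R}
    private BufferedImage[] front;  // currently output for display       {L, R}

    GraphicsPlaneBuffers(BufferedImage[] back, BufferedImage[] front) {
        this.back = back;
        this.front = front;
    }

    void onDrawingCompleted() {
        output(back);                 // output the back buffer instead of the front buffer
        for (int i = 0; i < 2; i++) { // copy the back buffer to the front buffer
            front[i].getGraphics().drawImage(back[i], 0, 0, null);
        }
        BufferedImage[] tmp = back;   // swap the roles of the two buffer sets
        back = front;
        front = tmp;
    }

    private void output(BufferedImage[] planes) {
        // In a real player this would send the L/R images to the display path.
    }
}
```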
- FIG. 35 is a flowchart for explaining graphics processing by the 3D-compatible player in FIG. 34 when the BD-J application calls the drawing integrity assurance API.
- In step S11, the 3D-compatible player determines whether or not the drawing integrity guarantee API has been called from the BD-J application. If it is determined that it has not been called, the process returns to step S11.
- If it is determined in step S11 that the drawing integrity guarantee API has been called, the process proceeds to step S12, where the 3D-compatible player sequentially executes the drawing command sequence that is the argument of the drawing integrity guarantee API, draws the graphics image obtained as a result of the execution into the back buffer, and outputs the graphics image stored in the front buffer to the display screen (outputs it for display).
- In step S13, the 3D-compatible player outputs the graphics image stored in the back buffer to the display screen instead of the front buffer (outputs it for display).
- In step S14, the 3D-compatible player copies the graphics image stored in the back buffer to the front buffer.
- In step S15, the 3D-compatible player switches the back buffer and the front buffer, the process returns to step S11, and the same processing is repeated thereafter.
- As described above, in the 3D-compatible player, when the BD-J application calls the drawing integrity guarantee API, which guarantees the integrity of drawing of graphics images on the graphics plane 11 (serving as the back buffer), the image drawn on the graphics plane 11 is output for display.
- Therefore, the image drawn on the graphics plane 11 can be displayed after waiting for drawing of the graphics image to be completed, and it is possible to prevent a left-eye image and a right-eye image that are not in a consistent state from being displayed on the display screen.
- FIG. 36 is a flowchart for explaining graphics processing by the 3D-compatible player in FIG. 34 when the BD-J application calls the drawing completion notification API.
- the 3D-compatible player waits for a drawing command issued from the BD-J application, and executes the drawing command in step S21.
- In step S22, the 3D-compatible player draws the graphics image obtained as a result of executing the drawing command into the back buffer, and outputs the graphics image stored in the front buffer to the display screen (outputs it for display).
- In step S23, the 3D-compatible player determines whether or not the drawing completion notification API has been called from the BD-J application.
- If it is determined in step S23 that the drawing completion notification API has not been called, the process waits for a drawing command to be issued from the BD-J application, returns to step S21, and the same processing is repeated thereafter.
- If it is determined in step S23 that the drawing completion notification API has been called, the process proceeds to step S24, where the 3D-compatible player outputs the graphics image stored in the back buffer to the display screen instead of the front buffer (outputs it for display).
- In step S25, the 3D-compatible player copies the graphics image stored in the back buffer to the front buffer.
- In step S26, the 3D-compatible player switches the back buffer and the front buffer, the process waits for a drawing command to be issued from the BD-J application and returns to step S21, and the same processing is repeated thereafter.
- As described above, in the 3D-compatible player, when the BD-J application calls the drawing completion notification API, which notifies the player that drawing of the graphics image to the graphics plane 11 (that is, the back buffer) has been completed, an image drawn on the graphics plane 11 is output for display.
- Therefore, the image drawn on the graphics plane 11 is displayed only after the notification that the drawing of the graphics image by the BD-J application has been completed, so it is possible to prevent a left-eye image and a right-eye image that are not in a consistent state from being displayed on the display screen.
- FIG. 37 is a flowchart for explaining graphics processing by the 3D-compatible player in FIG. 34 when the BD-J application calls the drawing start notification API and then calls the drawing completion notification API.
- In step S31, the 3D-compatible player determines whether or not the drawing start notification API has been called from the BD-J application. If it is determined that it has not been called, the process returns to step S31.
- If it is determined in step S31 that the drawing start notification API has been called, the 3D-compatible player waits for a drawing command to be issued from the BD-J application, proceeds to step S32, and executes the drawing command.
- In step S33, the 3D-compatible player determines whether or not the drawing completion notification API has been called from the BD-J application.
- If it is determined in step S33 that the drawing completion notification API has not been called, the process waits for a drawing command to be issued from the BD-J application, returns to step S32, and thereafter the same processing is repeated.
- If it is determined in step S33 that the drawing completion notification API has been called, the 3D-compatible player proceeds to step S34, draws the graphics images obtained as a result of executing the drawing commands in the back buffer, and outputs the graphics image stored in the front buffer to the display screen (outputs it for display).
- In step S35, the 3D-compatible player outputs the graphics image stored in the back buffer, instead of the front buffer, to the display screen (outputs it for display).
- In step S36, the 3D-compatible player copies the graphics image stored in the back buffer to the front buffer.
- In step S37, the 3D-compatible player swaps the back buffer and the front buffer, the process returns to step S31, and thereafter the same processing is repeated.
- As described above, in the 3D-compatible player, when the drawing start notification API, which indicates the start of drawing of the graphics image to the graphics plane 11 (that is, the back buffer), is called from the BD-J application and the drawing completion notification API, which notifies the player that the subsequent drawing of the graphics image has been completed, is then called, an image drawn on the graphics plane 11 is output for display.
- Therefore, the image drawn on the graphics plane 11 is displayed only after the notification that the drawing of the graphics image by the BD-J application has been completed, so it is possible to prevent a left-eye image and a right-eye image that are not in a consistent state from being displayed on the display screen.
- The effective pixel coordinate system in the stereo background configuration is one of (0, 0)-(1920, 2160), (0, 0)-(1280, 1440), (0, 0)-(720, 960), or (0, 0)-(720, 1152).
- The top half is assigned to the L background view, and the bottom half is assigned to the R background view.
- The background image format (contents format) is one of single-color, JPEG (JFIF), or MPEG2 drip-feed; when the format is MPEG2 drip-feed, the background image must be an SD image (SD video only).
- JPEG: Joint Photographic Experts Group
- GUI: Graphical User Interface
- a container is a component (part) of a graphics image and can have a parent (upper layer) and a child (lower layer).
- A container that has no parent and has only children is called a root container.
- a component is a kind of container and can have a parent but cannot have children.
- In the left-eye image and the right-eye image constituting a 3D image, the focus must be held by corresponding containers, and the focus transitions must be the same (equivalent).
- If a container constituting one of the left-eye image and the right-eye image is focused but the corresponding container constituting the other image is not focused, the user who sees the 3D image displayed by the left-eye image and the right-eye image feels uncomfortable.
- Therefore, the 3D-compatible player manages focus so that the focus transition is the same between a container of the left-eye image and the corresponding container of the right-eye image.
- FIG. 38 shows an example of a GUI drawn on the graphics plane 11.
- The GUI in FIG. 38 is composed of one root container and components #1, #2, and #3 that are children of the root container.
- Components #1, #2, and #3 drawn on the L graphics plane 11L constitute the left-eye image, and components #1, #2, and #3 drawn on the R graphics plane 11R constitute the right-eye image.
- When component #i of the left-eye image is focused, component #i of the right-eye image, which is the component corresponding to it, must also be focused.
- To make widget state transitions and state management symmetrical between L and R, the 3D-compatible player allows two containers or components to hold the focus at the same time. For this purpose, each container or component instance must have a flag representing whether or not it holds the focus, so that the focus can be managed.
- A third focus request must fail; that is, the number of containers or components that hold the focus is limited to zero or two.
- There are a first focus method and a second focus method as methods for giving the focus to two corresponding containers (components) of the left-eye image and the right-eye image.
- FIG. 39 shows the first focus method and the second focus method.
- FIG. 39A shows the first focus method (1-root-container across L/R graphics plane).
- In the first focus method, two corresponding containers (components), one on the L graphics plane 11L and one on the R graphics plane 11R, both children of a container (root container) straddling the L graphics plane 11L and the R graphics plane 11R, are given the focus at the same time.
- FIG. 39B shows the second focus method (2-root-containers (one for L graphics plane, another for R graphics plane)).
- In the second focus method, a root container is drawn on each of the L graphics plane 11L and the R graphics plane 11R, and the respective root containers are activated (brought into a focused state) simultaneously.
- FIG. 40 is a flowchart for explaining the focus management of the BD player in FIG. 3 as a 3D-compatible player that gives the focus to two corresponding containers (components) of the left-eye image and the right-eye image.
- Each container (component) constituting the GUI drawn on the graphics plane 11 has a focus flag indicating whether or not it holds the focus.
- In step S51, the 3D-compatible player sets a variable i for counting containers to 0 as an initial value.
- In step S52, the 3D-compatible player determines, based on the focus flags of the components, whether or not two focused components (hereinafter also referred to as focus-holding components) already exist among the components (containers) that are children of the container c(i) on the graphics plane 11.
- If it is determined in step S52 that two focus-holding components do not exist among the components that are children of the container c(i), the 3D-compatible player proceeds to step S53 and gives the focus to the two corresponding components for which the focus was requested. Further, in step S53, the 3D-compatible player sets the focus flag of each of the two focused components to a value indicating that the focus is held, and the process proceeds to step S54.
- If it is determined in step S52 that two focus-holding components already exist among the components that are children of the container c(i), the 3D-compatible player skips step S53 and proceeds to step S54, where the variable i is incremented by 1, and the process proceeds to step S55.
- In step S55, the 3D-compatible player determines whether or not the variable i is less than the number N of containers on the graphics plane 11. If it is determined that the variable i is less than the number N of containers on the graphics plane 11, the process returns to step S52 and the same processing is repeated.
- If it is determined in step S55 that the variable i is not less than the number N of containers on the graphics plane 11, the process ends.
- As described above, in the 3D-compatible player, a container of the L graphics plane 11L (L region), which stores the left-eye image, and the corresponding container of the R graphics plane 11R (R region), which stores the right-eye image, are brought into a focused state together.
- As a result, the focus transition can be made the same for the containers of the left-eye image and the containers of the right-eye image.
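- The following Java sketch is a minimal model of the focus management in FIG. 40; the FocusManager, Container, and Component classes and the focus flag field are hypothetical stand-ins introduced for the sketch, not the HAVi/BD-J widget API itself.
import java.util.ArrayList;
import java.util.List;
// Hypothetical sketch: focus is only ever granted to a corresponding L/R pair of
// components, each instance carries a focus flag, and a third focus request fails.
class FocusManager {
    static class Component {
        boolean focusFlag;                         // true while the component holds the focus
    }
    static class Container {
        final List<Component> children = new ArrayList<Component>();
    }
    // Steps S52 and S53 for one container c(i): grant the focus to the requested pair
    // unless two focus-holding components already exist among the children.
    static boolean requestFocus(Container c, Component onLPlane, Component onRPlane) {
        int holders = 0;
        for (Component child : c.children) {
            if (child.focusFlag) {
                holders++;
            }
        }
        if (holders >= 2) {
            return false;                          // the request fails (zero or two holders only)
        }
        onLPlane.focusFlag = true;                 // component of the left-eye image
        onRPlane.focusFlag = true;                 // corresponding component of the right-eye image
        return true;
    }
}
- With such a flag, the loop of steps S51 to S55 simply visits the N containers on the graphics plane 11 and applies the check above to each of them.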
- FIG. 41 shows, for example, the position on the display screen where the 3D image of the cursor of a pointing device such as a mouse can be seen, and the position of the cursor on the graphics plane 11.
- The cursor is displayed by the BD player, and in a 3D-compatible player it is desirable that the 3D image of the cursor be displayed (appear) at a position in front of the graphics 3D image (the 3D image reproduced from the disc 100).
- When the 3D image of the cursor is to appear at the position (x, y) on the display screen, the cursor of the left-eye image is drawn on the logical screen 21 at the position (x+Δx, y), shifted by a predetermined offset value Δx, and the cursor of the right-eye image is drawn on the logical screen 21 at the position (x−Δx, y), shifted by the same predetermined offset value Δx.
- the position of the 3D image cursor in the depth direction changes according to a predetermined offset value ⁇ x.
- For example, the value max-depth is recorded on the disc 100 (FIG. 3), which is a BD, and the 3D-compatible player can set (store) the value max-depth in a PSR (FIG. 7) (for example, PSR #21).
- In this case, by referring to the value max-depth stored in the PSR, the offset value Δx for displaying the cursor in front of the position represented by the value max-depth can be obtained.
- the 3D image of the cursor can be displayed at a position in front of the graphics 3D image.
- OSD: On Screen Display
- In the 3D-compatible player, an OSD (On Screen Display) can also be displayed at a position in front of the graphics 3D image in the same manner as the cursor.
- a value min-depth that represents the position in the depth direction of the 3D image reproduced from the disc 100 that is a BD is recorded together with the value max-depth.
- In the PSR, both the value max-depth and the value min-depth can be set.
- As described above, by setting in the PSR the value max-depth or the like, which represents the position in the depth direction of the 3D image recorded on the disc 100 that is a BD, when playback from the BD is performed, the cursor and the OSD can be displayed in front of the 3D image.
- the 3D-compatible player can arbitrarily set the offset value ⁇ x for displaying the 3D image of the cursor.
- the offset value ⁇ x does not need to be constant, and can be changed (set) for each frame, for example.
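- A minimal sketch of this offset handling follows; the mapping from the value max-depth to the offset Δx is an assumption of the sketch (the text only requires that the cursor end up in front of the position that max-depth represents), and the parameter maxDepthFromPsr stands for the value read from the PSR.
// Hypothetical sketch: derive the cursor offset from max-depth stored in a PSR and
// compute the cursor positions on the L and R graphics planes for a cursor that
// should appear at (x, y) on the display screen.
class CursorOffset {
    // Assumed monotone mapping; the real relation between depth and parallax is not
    // given in this document.
    static int offsetFor(int maxDepth) {
        return maxDepth + 1;                    // any value placing the cursor in front of max-depth
    }
    // Returns { x for the L graphics plane, x for the R graphics plane }.
    static int[] cursorX(int x, int maxDepthFromPsr) {
        int dx = offsetFor(maxDepthFromPsr);
        return new int[] { x + dx, x - dx };    // left-eye at (x+Δx, y), right-eye at (x-Δx, y)
    }
}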
- When the display screen position (x, y) is used as the cursor position, the BD-J application must obtain the position (x+Δx, y) (or (x−Δx, y)) of the cursor on the graphics plane 11 by performing coordinate conversion of the display screen position (x, y).
- the coordinate system of mouse events is limited to the L graphics plane.
- BD players are obliged to adopt coordinates on the L graphics plane as the two-dimensional position information when issuing mouse events.
- That is, the 3D image of the cursor of a pointing device such as a mouse is composed of a left-eye image and a right-eye image, but when an event having the cursor position as an argument is issued, the position on one of the L graphics plane 11L (L region) and the R graphics plane 11R (R region) of the graphics plane 11, for example the position on the L graphics plane 11L (L region), is used as the cursor position.
- Thereby, the BD-J application can know (recognize) the position on the L graphics plane 11L as the cursor position of the 3D image, so the author of the BD-J application can describe processing for an event (such as a mouse event) having the cursor position as an argument, using the position on the L graphics plane 11L as that cursor position, as in the sketch below.
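- This sketch uses the standard java.awt.event classes that BD-J's UI framework builds on; the widget bounds are assumed to be known in L graphics plane coordinates, and the handler name is hypothetical.
import java.awt.Rectangle;
import java.awt.event.MouseAdapter;
import java.awt.event.MouseEvent;
// Sketch: mouse event coordinates are positions on the L graphics plane (L region),
// so hit testing is done against bounds of widgets drawn on the L graphics plane only;
// the R-plane copy of the widget needs no separate test.
class LPlaneMouseHandler extends MouseAdapter {
    private final Rectangle buttonBoundsOnLPlane;
    LPlaneMouseHandler(Rectangle buttonBoundsOnLPlane) {
        this.buttonBoundsOnLPlane = buttonBoundsOnLPlane;
    }
    @Override
    public void mouseClicked(MouseEvent e) {
        if (buttonBoundsOnLPlane.contains(e.getX(), e.getY())) {
            // React to the click on the widget (application specific).
        }
    }
}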
- 3D-compatible players must ensure the consistency of the L-view and the R-view. That is, it must be guaranteed that the graphics left-eye image and right-eye image are displayed on the display screen only after they have been drawn to the graphics plane 11 in a consistent state (so that they can be viewed as a 3D image).
- The same applies to initialization (resetting) of the graphics plane 11; that is, when one of the L graphics plane 11L and the R graphics plane 11R of the graphics plane 11 is initialized, the other must also be initialized.
- FIG. 42 is a diagram for explaining the consistency between the graphics image for the left eye and the image for the right eye.
- In FIG. 42A, the drawing of the left-eye image on the L graphics plane 11L and the drawing of the right-eye image on the R graphics plane 11R have both been completed, and the 3D-compatible player must display the left-eye image and the right-eye image drawn in this way on the display screen.
- The drawing integrity guarantee API described with reference to FIG. 35 takes a drawing command sequence as its argument; since that drawing command sequence draws the left-eye image and the right-eye image in a consistent state (so that they can be viewed as a 3D image), calling the drawing integrity guarantee API guarantees that the graphics left-eye image and right-eye image are drawn in a consistent state.
- The 3D-compatible player must not display on the display screen a left-eye image and a right-eye image in the state shown in FIG. 42B.
- the consistency between the graphics left-eye image and right-eye image can be ensured, for example, by adopting triple buffering in a 3D-compatible player.
- FIG. 43 is a block diagram showing a functional configuration example of the BD player in FIG. 3 as a 3D-compatible player adopting triple buffering.
- the 3D-compatible player has a back buffer (hidden buffer) 211 as a graphics plane 11, and front buffers 212 and 213.
- the back buffer 211 includes buffers 211L and 211R.
- the front buffer 212 includes buffers 212L and 212R, and the front buffer 213 includes buffers 213L and 213R.
- buffers 211L, 212L, and 213L correspond to the L graphics plane 11L, and store an image for the left eye.
- the buffers 211R, 212R, and 213R correspond to the R graphics plane 11R and store an image for the right eye.
- the BD-J application issues a drawing command, and graphics 3D images (left-eye image and right-eye image) as a result of executing the drawing command are drawn in the back buffer 211.
- The front buffers 212 and 213 are alternately selected, and the left-eye image and the right-eye image stored in the selected buffer (hereinafter also referred to as the selection buffer) are displayed on the display screen (supplied to the display processor).
- After the drawing of the left-eye image and the right-eye image to the back buffer 211 has been completed, the left-eye image and the right-eye image stored (drawn) in the back buffer 211 are copied to the front buffer that is not the selection buffer.
- The switching that alternately selects the front buffers 212 and 213 as the selection buffer is executed at the timing of the VBI (Vertical Blanking Interval) after the reading (copying) of the left-eye image and the right-eye image from the back buffer has been completed up to the last horizontal line. A rough sketch of this triple buffering follows.
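- The following Java sketch is a rough model of this triple buffering; TripleBufferedGraphicsPlane and its methods are hypothetical names, and the actual copy and VBI-synchronized switch are performed by the player's display pipeline, which is not shown.
import java.awt.image.BufferedImage;
// Hypothetical sketch of FIG. 43: the BD-J application draws only into the back buffer 211;
// a completed frame is copied into whichever front buffer is not being displayed, and the
// selection is switched (in the real player, at the next VBI after the copy completes).
class TripleBufferedGraphicsPlane {
    private final BufferedImage back;                           // back buffer 211 (L and R regions)
    private final BufferedImage[] front = new BufferedImage[2]; // front buffers 212 and 213
    private int selected = 0;                                   // index of the displayed front buffer
    TripleBufferedGraphicsPlane(int width, int height) {
        back = new BufferedImage(width, height, BufferedImage.TYPE_INT_ARGB);
        front[0] = new BufferedImage(width, height, BufferedImage.TYPE_INT_ARGB);
        front[1] = new BufferedImage(width, height, BufferedImage.TYPE_INT_ARGB);
    }
    BufferedImage backBuffer() {
        return back;                                            // target of the BD-J drawing commands
    }
    // Called once the left-eye and right-eye images have been completely drawn into the back buffer.
    void frameCompleted() {
        int hidden = 1 - selected;                              // the front buffer not on screen
        front[hidden].createGraphics().drawImage(back, 0, 0, null);
        selected = hidden;                                      // switch the selection buffer
    }
    BufferedImage displayed() {
        return front[selected];                                 // image supplied to the display processor
    }
}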
- VBI: Vertical Blanking Interval
- FAA: Frame Accurate Animation
- In Image Frame Accurate Animation and Sync Frame Accurate Animation, the left-eye image for animation and the right-eye image for animation are operated in synchronization.
- In Image Frame Accurate Animation or Sync Frame Accurate Animation, it is desirable that the drawing of the left-eye image for animation and the drawing of the right-eye image for animation be performed separately (so that the animation operates at two locations simultaneously).
- In the existing FAA, animation operates in only one place. If an image or buffer that straddles L/R is used, animation in two places can be simulated, but, due to the performance requirements imposed on the BD player, a sufficient animation frame rate cannot be achieved.
- FIG. 44 is a diagram for explaining animation by images straddling L / R.
- In FIG. 44, one image of w × (h + 1080) pixels is drawn so as to straddle the L graphics plane 11L and the R graphics plane 11R of the 1920 × 2160-pixel graphics plane 11.
- By painting the portion (center portion) of the w × (h + 1080)-pixel image other than the upper w × h-pixel image and the lower w × h-pixel image with transparent pixels (a transparent color), the upper w × h-pixel image can be made the left-eye image for animation and the lower w × h-pixel image can be made the right-eye image for animation.
- In appearance, the single image is then drawn at the same position on the L graphics plane 11L and on the R graphics plane 11R. Therefore, it is possible to realize an animation of a 3D image in which the w × h-pixel image on the L graphics plane 11L and the w × h-pixel image on the R graphics plane 11R operate in synchronization, as sketched below.
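- A minimal Java sketch of this layout follows; it simply builds the w × (h + 1080) image from one animation frame, assuming h is less than 1080 so that a transparent band separates the two copies, and the class name is hypothetical.
import java.awt.Graphics2D;
import java.awt.image.BufferedImage;
// Sketch of FIG. 44: one w x (h + 1080) image whose upper w x h part becomes the left-eye
// animation image (L region) and whose lower w x h part becomes the right-eye animation
// image (R region), with the middle band left transparent.
class StraddlingAnimationImage {
    static BufferedImage build(BufferedImage frame) {
        int w = frame.getWidth();
        int h = frame.getHeight();
        // TYPE_INT_ARGB pixels start out fully transparent, which covers the center portion.
        BufferedImage straddling = new BufferedImage(w, h + 1080, BufferedImage.TYPE_INT_ARGB);
        Graphics2D g = straddling.createGraphics();
        g.drawImage(frame, 0, 0, null);        // upper w x h: left-eye image for animation
        g.drawImage(frame, 0, 1080, null);     // lower w x h: right-eye image for animation
        g.dispose();
        return straddling;
    }
}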
- FIG. 45 is a diagram illustrating drawing of an image for the left eye for animation and drawing of an image for the right eye for animation.
- In the 3D-compatible player, the left-eye image for animation is drawn on the L graphics plane 11L (L region), and the right-eye image for animation is drawn on the R graphics plane 11R (R region) separately from the drawing of the left-eye image for animation on the L graphics plane 11L (L region).
- FIG. 46 is a block diagram showing a functional configuration example of the BD player in FIG. 3 as a 3D-compatible player that separately performs the drawing of the left-eye image for animation on the L graphics plane 11L and the drawing of the right-eye image for animation on the R graphics plane 11R.
- FIG. 46A shows a configuration example of a 3D-compatible player that draws an animation as Image Frame Accurate Animation.
- An image buffer (Image buffer) 231 is a buffer that functions as a cache memory for a BD-J application to load and save resources from the disc 100 (FIG. 3) that is a BD.
- In the image buffer 231, a list of left-eye images for animation (a list of images for L) and a list of right-eye images for animation (a list of images for R) are stored.
- The pixel transfer unit 232L sequentially reads out the left-eye images for animation from the image buffer 231 in units of pixels and draws them on the L graphics plane 11L.
- The pixel transfer unit 232R sequentially reads out the right-eye images for animation from the image buffer 231 in units of pixels and draws them on the R graphics plane 11R.
- FIG. 46B shows a configuration example of a 3D-compatible player that draws an animation as Sync Frame Accurate Animation.
- The graphics memory 241 is a work memory of the 3D-compatible player and stores a left-eye image buffer for animation (L image buffer) and a right-eye image buffer for animation (R image buffer).
- the pixel transfer unit 242L sequentially reads out images for the left eye for animation from the graphics memory 241 in units of pixels and draws them on the L graphics plane 11L.
- the pixel transfer unit 242R sequentially reads out images for the right eye for animation from the graphics memory 241 in units of pixels and draws them on the R graphics plane 11R.
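- As a rough sketch of the two transfers in FIG. 46, the following Java fragment draws the left-eye and right-eye animation frames independently into the L region and R region of a graphics plane modeled as a single 1920 × 2160 image; the class and parameter names are hypothetical, and the lists stand in for the image buffer 231 or the graphics memory 241.
import java.awt.Graphics2D;
import java.awt.image.BufferedImage;
import java.util.List;
// Hypothetical sketch: two independent transfers per animation frame, one into the
// L region (upper half of the plane) and one into the R region (lower half).
class SeparateLrAnimation {
    static void drawFrame(BufferedImage graphicsPlane,      // 1920 x 2160 plane, L region above R region
                          List<BufferedImage> leftImages,   // animation frames for the left eye
                          List<BufferedImage> rightImages,  // animation frames for the right eye
                          int frameIndex, int x, int y) {
        Graphics2D g = graphicsPlane.createGraphics();
        g.drawImage(leftImages.get(frameIndex), x, y, null);          // into the L graphics plane 11L
        g.drawImage(rightImages.get(frameIndex), x, y + 1080, null);  // into the R graphics plane 11R
        g.dispose();
    }
}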
- FIG. 47 shows the definition of the extended API of Image Frame Accurate Animation.
- FIG. 48 shows the definition of the extended API of Sync Frame Accurate Animation.
- FIGS. 49 and 50 show sample code for Image Frame Accurate Animation.
- FIG. 50 is a diagram continuing from FIG. 49.
- FIGS. 51 and 52 show sample code for Sync Frame Accurate Animation.
- FIG. 52 is a diagram continuing from FIG. 51.
- In the above description, the 3D-compatible player in FIG. 3 processes 3D image content recorded on the disc 100 and a Java (registered trademark) application.
- However, the 3D image content and the Java (registered trademark) application to be processed by the player may be supplied by data supply means other than a recording medium such as the disc 100, specifically, for example, by an object carousel or a data carousel used in digital broadcasting. Therefore, the 3D-compatible player can also process 3D image content and a Java (registered trademark) application supplied from an object carousel or a data carousel.
Landscapes
- Engineering & Computer Science (AREA)
- Multimedia (AREA)
- Signal Processing (AREA)
- Library & Information Science (AREA)
- Human Computer Interaction (AREA)
- Testing, Inspecting, Measuring Of Stereoscopic Televisions And Televisions (AREA)
- Signal Processing For Digital Recording And Reproducing (AREA)
- Television Signal Processing For Recording (AREA)
- Processing Or Creating Images (AREA)
- Management Or Editing Of Information On Record Carriers (AREA)
Abstract
Description
In the flattened stereo graphics mode, the BD-J application draws the left-eye image and the right-eye image constituting a stereo image, which is a graphics 3D image, to the logical plane 10.
public void setOffset(int offset)
The default value is 0.
public int getOffset()
The default value is 0.
public void setOffset(int offset)
The default value is 0.
public int getOffset()
The default value is 0.
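The class that exposes the setOffset()/getOffset() pair is not named in the fragments above, so the following usage sketch introduces a hypothetical OffsetCapablePlane interface purely for illustration; only the two method signatures and their default value of 0 come from the text, and the offset value 8 is an arbitrary example.
// Hedged usage sketch of the offset API shown above.
interface OffsetCapablePlane {
    void setOffset(int offset);   // default value is 0
    int getOffset();              // default value is 0
}
class OffsetExample {
    static void applyOffset(OffsetCapablePlane plane) {
        plane.setOffset(8);               // give the plane a parallax offset (example value)
        int current = plane.getOffset();  // reads back 8
    }
}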
HD_1280_720 = 2
SD_720_576 = 3
SD_720_480 = 4
QHD_960_540 = 5
HD_1920_2160 = 6
HD_1280_1440 = 7
SD_720_1152 = 8
SD_720_960 = 9
QHD_960_1080 = 10
GRAPHICS_STEREO_VIEW = 23
GRAPHICS_PLANE_OFFSET = 24
GRAPHICS_DUAL_MONO_VIEW = 25
GRAPHICS_FORCED_MONO_VIEW = 26
VIDEO_STEREO_VIEW = 28
VIDEO_FLATTENED_STEREO_VIEW = 29
VIDEO_DUAL_MONO_VIEW = 30
VIDEO_FORCED_MONO_VIEW = 31
BACKGROUND_STEREO_VIEW = 18
BACKGROUND_FLATTENED_STEREO_VIEW = 19
BACKGROUND_DUAL_MONO_VIEW = 20
BACKGROUND_FORCED_MONO_VIEW = 21
For example, for the 1920 × 2160-pixel video plane 13, the display coordinate system is a coordinate system with a size of 1920 × 1080 pixels, half the size in the vertical direction. In this case, the author must set the position and size of the video, for example, as follows.
// src below is assumed to cover the full 1920 x 1080 display coordinate system; its definition is not shown in this excerpt.
Rectangle src = new Rectangle(0, 0, 1920, 1080);
Rectangle dest = new Rectangle(100, 100, 960, 540);
AWTVideoSizeControl videoSizeControl = (AWTVideoSizeControl)player.getControl("javax.tv.media.AWTVideoSizeControl");
videoSizeControl.setSize(new AWTVideoSize(src, dest));
(0, 0)-(1920, 2160)
(0, 0)-(1280, 1440)
(0, 0)-(720, 960)
(0, 0)-(720, 1152)
(0, 0)-(960, 1080)
It is one of the above.
(0, 0)-(1920, 2160)
(0, 0)-(1280, 1440)
(0, 0)-(720, 960)
(0, 0)-(720, 1152)
It is one of the above.
Claims (3)
- A video plane that stores video images is a storage area in which image storage areas for two planes are arranged side by side: an L region, which is an image storage area for one plane that stores a left-eye image, and an R region, which is an image storage area for one plane that stores a right-eye image,
an L API (Application Programming Interface) that sets the size and position of the image stored in the L region, and
an R API that sets the size and position of the image stored in the R region
are provided,
one of the L API and the R API sets the same size and position as the size and position of the image set by the other API, and
a graphics plane offset value, which is data for giving parallax to a graphics image so as to generate a left-eye image and a right-eye image from the original image, and a PG (Presentation Graphics) plane offset value, which is data for giving parallax to a PG image so as to generate a left-eye image and a right-eye image from the original image, are scaled by the scaling ratio used when the L API and the R API perform the scaling that sets the size of the video image stored in the video plane.
An information processing device. - A video plane that stores video images is a storage area in which image storage areas for two planes are arranged side by side: an L region, which is an image storage area for one plane that stores a left-eye image, and an R region, which is an image storage area for one plane that stores a right-eye image,
an L API (Application Programming Interface) that sets the size and position of the image stored in the L region, and
an R API that sets the size and position of the image stored in the R region,
one of which sets the same size and position as the size and position of the image set by the other API, and
a graphics plane offset value, which is data for giving parallax to a graphics image so as to generate a left-eye image and a right-eye image from the original image, and a PG (Presentation Graphics) plane offset value, which is data for giving parallax to a PG image so as to generate a left-eye image and a right-eye image from the original image, are scaled by the scaling ratio used when the L API and the R API perform the scaling that sets the size of the video image stored in the video plane.
An information processing method. - A video plane that stores video images is a storage area in which image storage areas for two planes are arranged side by side: an L region, which is an image storage area for one plane that stores a left-eye image, and an R region, which is an image storage area for one plane that stores a right-eye image,
an L API (Application Programming Interface) that sets the size and position of the image stored in the L region, and
an R API that sets the size and position of the image stored in the R region
constitute the program,
one of the L API and the R API sets the same size and position as the size and position of the image set by the other API, and
a graphics plane offset value, which is data for giving parallax to a graphics image so as to generate a left-eye image and a right-eye image from the original image, and a PG (Presentation Graphics) plane offset value, which is data for giving parallax to a PG image so as to generate a left-eye image and a right-eye image from the original image, are scaled by the scaling ratio used when the L API and the R API perform the scaling that sets the size of the video image stored in the video plane.
A program.
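As an informal illustration of the mechanism in the claims above, the following Java sketch mirrors the size and position set through one API into the other region and scales the graphics plane offset value and the PG plane offset value by the video scaling ratio; all class, field, and method names, as well as the base offset values, are hypothetical and are not part of the claimed apparatus.
// Hypothetical model: one call sets the same size and position for both the L region and
// the R region of the video plane, and the parallax offsets follow the video scaling ratio.
class PlaneScalingModel {
    static final int FULL_WIDTH = 1920;          // assumed width of one view before scaling
    double baseGraphicsPlaneOffset = 8.0;        // graphics plane offset value (example)
    double basePgPlaneOffset = 4.0;              // PG plane offset value (example)
    double scaledGraphicsPlaneOffset = baseGraphicsPlaneOffset;
    double scaledPgPlaneOffset = basePgPlaneOffset;
    int x, y, width, height;                     // applied identically to the L and R regions
    // Models the L API; calling the R API would produce the same result, since one API
    // always mirrors the size and position set by the other.
    void setVideoSizeAndPosition(int x, int y, int width, int height) {
        this.x = x;
        this.y = y;
        this.width = width;
        this.height = height;
        double ratio = (double) width / FULL_WIDTH;              // video scaling ratio
        scaledGraphicsPlaneOffset = baseGraphicsPlaneOffset * ratio;
        scaledPgPlaneOffset = basePgPlaneOffset * ratio;
    }
}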
Priority Applications (4)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
US12/993,417 US8866885B2 (en) | 2009-04-03 | 2010-03-24 | Information processing device, information processing method, and program |
EP10758504.4A EP2273797A4 (en) | 2009-04-03 | 2010-03-24 | INFORMATION PROCESSING DEVICE, INFORMATION PROCESSING METHOD, AND CORRESPONDING PROGRAM |
CN2010800016955A CN102301726B (zh) | 2009-04-03 | 2010-03-24 | 信息处理设备、信息处理方法 |
HK12104571.5A HK1164001A1 (en) | 2009-04-03 | 2012-05-10 | Information processing device, information processing method |
Applications Claiming Priority (4)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
JP2009091162 | 2009-04-03 | ||
JP2009-091162 | 2009-04-03 | ||
JP2010046030A JP4915458B2 (ja) | 2009-04-03 | 2010-03-03 | 情報処理装置、情報処理方法、及び、プログラム |
JP2010-046030 | 2010-03-03 |
Publications (1)
Publication Number | Publication Date |
---|---|
WO2010113730A1 true WO2010113730A1 (ja) | 2010-10-07 |
Family
ID=42828025
Family Applications (1)
Application Number | Title | Priority Date | Filing Date |
---|---|---|---|
PCT/JP2010/055134 WO2010113730A1 (ja) | 2009-04-03 | 2010-03-24 | 情報処理装置、情報処理方法、及び、プログラム |
Country Status (7)
Country | Link |
---|---|
US (1) | US8866885B2 (ja) |
EP (1) | EP2273797A4 (ja) |
JP (1) | JP4915458B2 (ja) |
CN (3) | CN103167297B (ja) |
HK (1) | HK1164001A1 (ja) |
MY (1) | MY156833A (ja) |
WO (1) | WO2010113730A1 (ja) |
Families Citing this family (9)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
JP4915456B2 (ja) | 2009-04-03 | 2012-04-11 | ソニー株式会社 | 情報処理装置、情報処理方法、及び、プログラム |
JP2010244245A (ja) * | 2009-04-03 | 2010-10-28 | Sony Corp | 情報処理装置、情報処理方法、及び、プログラム |
JP4915459B2 (ja) * | 2009-04-03 | 2012-04-11 | ソニー株式会社 | 情報処理装置、情報処理方法、及び、プログラム |
JP5510700B2 (ja) * | 2009-04-03 | 2014-06-04 | ソニー株式会社 | 情報処理装置、情報処理方法、及び、プログラム |
JP4915457B2 (ja) | 2009-04-03 | 2012-04-11 | ソニー株式会社 | 情報処理装置、情報処理方法、及び、プログラム |
JP4919122B2 (ja) | 2009-04-03 | 2012-04-18 | ソニー株式会社 | 情報処理装置、情報処理方法、及び、プログラム |
JP2010245761A (ja) * | 2009-04-03 | 2010-10-28 | Sony Corp | 情報処理装置、情報処理方法、及び、プログラム |
JP7392374B2 (ja) * | 2019-10-08 | 2023-12-06 | ヤマハ株式会社 | 無線送信装置、無線受信装置、無線システム及び無線送信方法 |
CN110996124B (zh) * | 2019-12-20 | 2022-02-08 | 北京百度网讯科技有限公司 | 原创视频确定方法及相关设备 |
Citations (3)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
JP2005049668A (ja) * | 2003-07-30 | 2005-02-24 | Sharp Corp | データ変換装置、表示装置、データ変換方法、プログラム及び記録媒体 |
JP2006115198A (ja) * | 2004-10-14 | 2006-04-27 | Canon Inc | 立体画像生成プログラム、立体画像生成システムおよび立体画像生成方法 |
WO2009090868A1 (ja) * | 2008-01-17 | 2009-07-23 | Panasonic Corporation | 3d映像が記録された記録媒体、3d映像を記録する記録装置、並びに3d映像を再生する再生装置及び再生方法 |
Family Cites Families (22)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
JPH02305291A (ja) | 1989-05-19 | 1990-12-18 | Sharp Corp | 立体画像通信装置 |
JPH08191463A (ja) | 1994-11-11 | 1996-07-23 | Nintendo Co Ltd | 立体画像表示装置およびそれに用いられる記憶装置 |
US20050146521A1 (en) * | 1998-05-27 | 2005-07-07 | Kaye Michael C. | Method for creating and presenting an accurate reproduction of three-dimensional images converted from two-dimensional images |
JP2004191745A (ja) | 2002-12-12 | 2004-07-08 | Toshiba Corp | 画像表示方法、コンピュータ、およびディスプレイドライバ |
US7173635B2 (en) * | 2003-03-25 | 2007-02-06 | Nvidia Corporation | Remote graphical user interface support using a graphics processing unit |
JP4251907B2 (ja) * | 2003-04-17 | 2009-04-08 | シャープ株式会社 | 画像データ作成装置 |
CN101841728B (zh) * | 2003-04-17 | 2012-08-08 | 夏普株式会社 | 三维图像处理装置 |
EP1720352A4 (en) | 2004-02-23 | 2009-01-07 | Panasonic Corp | DISPLAY PROCESSING DEVICE |
CN101814310B (zh) * | 2004-07-22 | 2012-11-28 | 松下电器产业株式会社 | 重放装置和重放方法 |
JP4523368B2 (ja) * | 2004-09-10 | 2010-08-11 | 株式会社マーキュリーシステム | 立体視画像生成装置およびプログラム |
JP2007295391A (ja) | 2006-04-26 | 2007-11-08 | Sharp Corp | 撮影機能付き携帯情報端末 |
JP2010505174A (ja) * | 2006-09-28 | 2010-02-18 | コーニンクレッカ フィリップス エレクトロニクス エヌ ヴィ | メニューディスプレイ |
KR101377736B1 (ko) * | 2006-10-11 | 2014-03-24 | 코닌클리케 필립스 엔.브이. | 3차원 그래픽 데이터의 생성 |
KR20080057940A (ko) * | 2006-12-21 | 2008-06-25 | 삼성전자주식회사 | 축소 화면을 표시하는 영상 디스플레이 장치 및 그 방법 |
US8301013B2 (en) | 2008-11-18 | 2012-10-30 | Panasonic Corporation | Reproduction device, reproduction method, and program for stereoscopic reproduction |
JP5510700B2 (ja) | 2009-04-03 | 2014-06-04 | ソニー株式会社 | 情報処理装置、情報処理方法、及び、プログラム |
JP4915459B2 (ja) | 2009-04-03 | 2012-04-11 | ソニー株式会社 | 情報処理装置、情報処理方法、及び、プログラム |
JP4915457B2 (ja) | 2009-04-03 | 2012-04-11 | ソニー株式会社 | 情報処理装置、情報処理方法、及び、プログラム |
JP2010244245A (ja) | 2009-04-03 | 2010-10-28 | Sony Corp | 情報処理装置、情報処理方法、及び、プログラム |
JP4915456B2 (ja) | 2009-04-03 | 2012-04-11 | ソニー株式会社 | 情報処理装置、情報処理方法、及び、プログラム |
JP4919122B2 (ja) | 2009-04-03 | 2012-04-18 | ソニー株式会社 | 情報処理装置、情報処理方法、及び、プログラム |
JP2010245761A (ja) | 2009-04-03 | 2010-10-28 | Sony Corp | 情報処理装置、情報処理方法、及び、プログラム |
-
2010
- 2010-03-03 JP JP2010046030A patent/JP4915458B2/ja active Active
- 2010-03-24 CN CN201310042303.5A patent/CN103167297B/zh not_active Expired - Fee Related
- 2010-03-24 EP EP10758504.4A patent/EP2273797A4/en not_active Ceased
- 2010-03-24 US US12/993,417 patent/US8866885B2/en not_active Expired - Fee Related
- 2010-03-24 CN CN201310042875.3A patent/CN103179422B/zh not_active Expired - Fee Related
- 2010-03-24 CN CN2010800016955A patent/CN102301726B/zh active Active
- 2010-03-24 WO PCT/JP2010/055134 patent/WO2010113730A1/ja active Application Filing
- 2010-03-24 MY MYPI2010005571A patent/MY156833A/en unknown
-
2012
- 2012-05-10 HK HK12104571.5A patent/HK1164001A1/xx not_active IP Right Cessation
Patent Citations (3)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
JP2005049668A (ja) * | 2003-07-30 | 2005-02-24 | Sharp Corp | データ変換装置、表示装置、データ変換方法、プログラム及び記録媒体 |
JP2006115198A (ja) * | 2004-10-14 | 2006-04-27 | Canon Inc | 立体画像生成プログラム、立体画像生成システムおよび立体画像生成方法 |
WO2009090868A1 (ja) * | 2008-01-17 | 2009-07-23 | Panasonic Corporation | 3d映像が記録された記録媒体、3d映像を記録する記録装置、並びに3d映像を再生する再生装置及び再生方法 |
Also Published As
Publication number | Publication date |
---|---|
CN103179422A (zh) | 2013-06-26 |
US8866885B2 (en) | 2014-10-21 |
CN102301726A (zh) | 2011-12-28 |
CN102301726B (zh) | 2013-03-20 |
CN103179422B (zh) | 2015-04-29 |
CN103167297A (zh) | 2013-06-19 |
US20120075416A1 (en) | 2012-03-29 |
JP2010259054A (ja) | 2010-11-11 |
JP4915458B2 (ja) | 2012-04-11 |
MY156833A (en) | 2016-03-31 |
EP2273797A1 (en) | 2011-01-12 |
CN103167297B (zh) | 2015-06-10 |
EP2273797A4 (en) | 2013-07-10 |
HK1164001A1 (en) | 2012-09-14 |
Similar Documents
Publication | Publication Date | Title |
---|---|---|
JP4919122B2 (ja) | 情報処理装置、情報処理方法、及び、プログラム | |
WO2010113728A1 (ja) | 情報処理装置、情報処理方法、及び、プログラム | |
WO2010113729A1 (ja) | 情報処理装置、情報処理方法、及び、プログラム | |
JP4915458B2 (ja) | 情報処理装置、情報処理方法、及び、プログラム | |
JP4915456B2 (ja) | 情報処理装置、情報処理方法、及び、プログラム | |
JP4962825B2 (ja) | 情報処理装置、情報処理方法、及び、プログラム | |
JP4962670B2 (ja) | 情報処理装置、情報処理方法、及び、プログラム | |
JP4962674B1 (ja) | 情報処理装置、情報処理方法、及び、プログラム | |
JP4962814B2 (ja) | 情報処理装置、情報処理方法、及び、プログラム | |
JP4998649B2 (ja) | 情報処理装置、情報処理方法、及び、プログラム |
Legal Events
Date | Code | Title | Description |
---|---|---|---|
WWE | Wipo information: entry into national phase |
Ref document number: 201080001695.5 Country of ref document: CN |
|
WWE | Wipo information: entry into national phase |
Ref document number: 2010758504 Country of ref document: EP |
|
121 | Ep: the epo has been informed by wipo that ep was designated in this application |
Ref document number: 10758504 Country of ref document: EP Kind code of ref document: A1 |
|
WWE | Wipo information: entry into national phase |
Ref document number: 7674/CHENP/2010 Country of ref document: IN |
|
NENP | Non-entry into the national phase |
Ref country code: DE |
|
WWE | Wipo information: entry into national phase |
Ref document number: 12993417 Country of ref document: US |