WO2010058547A1 - Playback device, integrated circuit, and playback method considering special playback - Google Patents

Playback device, integrated circuit, and playback method considering special playback Download PDF

Info

Publication number
WO2010058547A1
Authority
WO
WIPO (PCT)
Prior art keywords
playback
video
stream
eye
mode
Prior art date
Application number
PCT/JP2009/006135
Other languages
English (en)
French (fr)
Japanese (ja)
Inventor
Masafumi Okubo (大久保雅文)
Original Assignee
Panasonic Corporation
Germano Leichsenring
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Panasonic Corporation and Germano Leichsenring
Priority to JP2010539135A (patent JP5632291B2, ja)
Priority to EP09827328.7A (patent EP2348747A4, en)
Priority to CN200980117335.9A (patent CN102027749B, zh)
Publication of WO2010058547A1 (ja)

Links

Images

Classifications

    • G - PHYSICS
    • G11 - INFORMATION STORAGE
    • G11B - INFORMATION STORAGE BASED ON RELATIVE MOVEMENT BETWEEN RECORD CARRIER AND TRANSDUCER
    • G11B 27/00 - Editing; Indexing; Addressing; Timing or synchronising; Monitoring; Measuring tape travel
    • G11B 27/10 - Indexing; Addressing; Timing or synchronising; Measuring tape travel
    • G11B 27/102 - Programmed access in sequence to addressed parts of tracks of operating record carriers
    • G11B 27/105 - Programmed access in sequence to addressed parts of tracks of operating record carriers of operating discs
    • G - PHYSICS
    • G11 - INFORMATION STORAGE
    • G11B - INFORMATION STORAGE BASED ON RELATIVE MOVEMENT BETWEEN RECORD CARRIER AND TRANSDUCER
    • G11B 27/00 - Editing; Indexing; Addressing; Timing or synchronising; Monitoring; Measuring tape travel
    • G11B 27/005 - Reproducing at a different information rate from the information rate of recording
    • G - PHYSICS
    • G11 - INFORMATION STORAGE
    • G11B - INFORMATION STORAGE BASED ON RELATIVE MOVEMENT BETWEEN RECORD CARRIER AND TRANSDUCER
    • G11B 27/00 - Editing; Indexing; Addressing; Timing or synchronising; Monitoring; Measuring tape travel
    • G11B 27/10 - Indexing; Addressing; Timing or synchronising; Measuring tape travel
    • G11B 27/11 - Indexing; Addressing; Timing or synchronising; Measuring tape travel by using information not detectable on the record carrier
    • G - PHYSICS
    • G11 - INFORMATION STORAGE
    • G11B - INFORMATION STORAGE BASED ON RELATIVE MOVEMENT BETWEEN RECORD CARRIER AND TRANSDUCER
    • G11B 27/00 - Editing; Indexing; Addressing; Timing or synchronising; Monitoring; Measuring tape travel
    • G11B 27/10 - Indexing; Addressing; Timing or synchronising; Measuring tape travel
    • G11B 27/19 - Indexing; Addressing; Timing or synchronising; Measuring tape travel by using information detectable on the record carrier
    • G11B 27/28 - Indexing; Addressing; Timing or synchronising; Measuring tape travel by using information detectable on the record carrier by using information signals recorded by the same method as the main recording
    • G11B 27/32 - Indexing; Addressing; Timing or synchronising; Measuring tape travel by using information detectable on the record carrier by using information signals recorded by the same method as the main recording on separate auxiliary tracks of the same or an auxiliary record carrier
    • G11B 27/327 - Table of contents
    • G11B 27/329 - Table of contents on a disc [VTOC]
    • H - ELECTRICITY
    • H04 - ELECTRIC COMMUNICATION TECHNIQUE
    • H04N - PICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N 13/00 - Stereoscopic video systems; Multi-view video systems; Details thereof
    • H04N 13/10 - Processing, recording or transmission of stereoscopic or multi-view image signals
    • H04N 13/106 - Processing image signals
    • H04N 13/161 - Encoding, multiplexing or demultiplexing different image signal components
    • H - ELECTRICITY
    • H04 - ELECTRIC COMMUNICATION TECHNIQUE
    • H04N - PICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N 13/00 - Stereoscopic video systems; Multi-view video systems; Details thereof
    • H04N 13/10 - Processing, recording or transmission of stereoscopic or multi-view image signals
    • H04N 13/106 - Processing image signals
    • H04N 13/172 - Processing image signals comprising non-image signal components, e.g. headers or format information
    • H04N 13/178 - Metadata, e.g. disparity information
    • H - ELECTRICITY
    • H04 - ELECTRIC COMMUNICATION TECHNIQUE
    • H04N - PICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N 13/00 - Stereoscopic video systems; Multi-view video systems; Details thereof
    • H04N 13/10 - Processing, recording or transmission of stereoscopic or multi-view image signals
    • H04N 13/106 - Processing image signals
    • H04N 13/172 - Processing image signals comprising non-image signal components, e.g. headers or format information
    • H04N 13/183 - On-screen display [OSD] information, e.g. subtitles or menus
    • H - ELECTRICITY
    • H04 - ELECTRIC COMMUNICATION TECHNIQUE
    • H04N - PICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N 13/00 - Stereoscopic video systems; Multi-view video systems; Details thereof
    • H04N 13/10 - Processing, recording or transmission of stereoscopic or multi-view image signals
    • H04N 13/189 - Recording image signals; Reproducing recorded image signals
    • H - ELECTRICITY
    • H04 - ELECTRIC COMMUNICATION TECHNIQUE
    • H04N - PICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N 5/00 - Details of television systems
    • H04N 5/76 - Television signal recording
    • H04N 5/78 - Television signal recording using magnetic recording
    • H04N 5/782 - Television signal recording using magnetic recording on tape
    • H - ELECTRICITY
    • H04 - ELECTRIC COMMUNICATION TECHNIQUE
    • H04N - PICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N 5/00 - Details of television systems
    • H04N 5/76 - Television signal recording
    • H04N 5/84 - Television signal recording using optical recording
    • H04N 5/85 - Television signal recording using optical recording on discs or drums
    • H - ELECTRICITY
    • H04 - ELECTRIC COMMUNICATION TECHNIQUE
    • H04N - PICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N 5/00 - Details of television systems
    • H04N 5/76 - Television signal recording
    • H04N 5/907 - Television signal recording using static stores, e.g. storage tubes or semiconductor memories
    • H - ELECTRICITY
    • H04 - ELECTRIC COMMUNICATION TECHNIQUE
    • H04N - PICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N 9/00 - Details of colour television systems
    • H04N 9/79 - Processing of colour television signals in connection with recording
    • H04N 9/80 - Transformation of the television signal for recording, e.g. modulation, frequency changing; Inverse transformation for playback
    • H04N 9/804 - Transformation of the television signal for recording, e.g. modulation, frequency changing; Inverse transformation for playback involving pulse code modulation of the colour picture signal components
    • H04N 9/8042 - Transformation of the television signal for recording, e.g. modulation, frequency changing; Inverse transformation for playback involving pulse code modulation of the colour picture signal components involving data reduction
    • H - ELECTRICITY
    • H04 - ELECTRIC COMMUNICATION TECHNIQUE
    • H04N - PICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N 9/00 - Details of colour television systems
    • H04N 9/79 - Processing of colour television signals in connection with recording
    • H04N 9/80 - Transformation of the television signal for recording, e.g. modulation, frequency changing; Inverse transformation for playback
    • H04N 9/82 - Transformation of the television signal for recording, e.g. modulation, frequency changing; Inverse transformation for playback the individual colour picture signal components being recorded simultaneously only
    • H04N 9/8205 - Transformation of the television signal for recording, e.g. modulation, frequency changing; Inverse transformation for playback the individual colour picture signal components being recorded simultaneously only involving the multiplexing of an additional signal and the colour video signal
    • H - ELECTRICITY
    • H04 - ELECTRIC COMMUNICATION TECHNIQUE
    • H04N - PICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N 9/00 - Details of colour television systems
    • H04N 9/79 - Processing of colour television signals in connection with recording
    • H04N 9/80 - Transformation of the television signal for recording, e.g. modulation, frequency changing; Inverse transformation for playback
    • H04N 9/82 - Transformation of the television signal for recording, e.g. modulation, frequency changing; Inverse transformation for playback the individual colour picture signal components being recorded simultaneously only
    • H04N 9/8205 - Transformation of the television signal for recording, e.g. modulation, frequency changing; Inverse transformation for playback the individual colour picture signal components being recorded simultaneously only involving the multiplexing of an additional signal and the colour video signal
    • H04N 9/8233 - Transformation of the television signal for recording, e.g. modulation, frequency changing; Inverse transformation for playback the individual colour picture signal components being recorded simultaneously only involving the multiplexing of an additional signal and the colour video signal the additional signal being a character code signal

Definitions

  • The present invention belongs to the technical field of superimposing captions and graphics on stereoscopic video.
  • This superposition technology provides a playback device with separate display planes for the video stream, subtitles, and graphics, and superimposes subtitles and graphics on individual video frames of a high-quality video stream.
  • The composited result is output to a connected device such as a display; the technique is widely used to realize a strong sense of realism.
  • One of the commonly used methods for stereoscopic viewing uses shutter glasses.
  • The fields of view of the viewer's left and right eyes are covered alternately by the glasses at high speed, and the image on the display device is updated at high speed for the left and right eyes in synchronization with the operation of the glasses.
  • As a result, the left-eye image displayed on the display device can be seen only by the left eye, and the right-eye image only by the right eye.
  • In order to show the viewer stereoscopic video at a frame rate equivalent to that of ordinary flat video, the display device needs twice the normal response performance: to display, for example, 60 frames of video per second, it must switch at least 120 frames per second, and the video stream to be displayed must therefore be encoded at 120 frames per second.
  • If the playback device decides whether to play a digital stream in 3D or 2D based only on whether the digital stream itself supports 2D or 3D, inconvenience may occur.
  • A playback device capable of switching its display mode between a 3D mode, in which the user views video frames stereoscopically, and a 2D mode, in which the user views video frames as flat images, comprises: reading means for reading a digital stream including a left-eye video stream and a right-eye video stream from a recording medium; a mode storage unit that stores the current display mode; a dimension determination unit that determines whether the digital stream read from the recording medium is compatible with the 3D mode; a demultiplexer that separates both the right-eye video stream and the left-eye video stream from the digital stream when the condition is satisfied that the digital stream supports the 3D mode and the current display mode is the 3D mode, and separates only one of the right-eye and left-eye video streams when the condition is not satisfied; and a video decoder that obtains video frames for stereoscopic or planar viewing by decoding the separated video stream(s).
  • The playback device configured in this way determines both whether the digital stream supports 2D or 3D and whether the playback device itself is set to play back in 2D or in 3D, so a stereoscopic video can be generated appropriately. A minimal sketch of this decision follows.
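  • Minimal sketch of the decision just described; the class, enum, and method names are illustrative, not from the specification:

```java
/** Sketch of the dimension/mode decision; names are illustrative. */
class DimensionDecision {
    enum DisplayMode { MODE_2D, MODE_3D }

    /** True when both eye streams should be separated and decoded. */
    static boolean separateBothEyes(boolean streamSupports3d, DisplayMode currentMode) {
        // Both conditions must hold: the stream is 3D-capable AND the
        // stored display mode is 3D; otherwise only one eye's video
        // stream is separated for planar output.
        return streamSupports3d && currentMode == DisplayMode.MODE_3D;
    }
}
```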
  • The video frame subject to repeated output is a video frame obtained by decoding one of the right-eye and left-eye video streams separated from the digital stream, and it is desirable that this repeatedly output video frame be written to both the right-eye video plane and the left-eye video plane. Even when special playback is performed by a playback device that plays back a stereoscopic video stream, this prevents the stereoscopic processing from failing to keep up with the playback speed, as sketched below.
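  • Sketch of that duplication (method and buffer names assumed); one decoded frame fills both planes:

```java
/** Sketch: reuse one decoded eye's frame for both planes during trick play. */
class TrickPlayOutput {
    /** Write the single decoded frame into both video planes. */
    static void duplicateFrame(int[] decodedFrame, int[] leftPlane, int[] rightPlane) {
        System.arraycopy(decodedFrame, 0, leftPlane, 0, decodedFrame.length);
        System.arraycopy(decodedFrame, 0, rightPlane, 0, decodedFrame.length);
        // The display still receives a left/right pair every output cycle,
        // so stereoscopic processing never falls behind the playback speed.
    }
}
```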
  • Since the graphics stream can also be displayed in 3D, the user can enjoy a variety of stereoscopic images.
  • When the display mode is changed from 2D to 3D or from 3D to 2D, the playback device notifies the display device that the display mode has been switched. When the display device that received the notification is actually ready to display in the switched mode, it notifies the playback device that preparation is complete. From the playback device's notification until the display device's ready notification, writing to the planes is possible but nothing is shown on the display device (blackout); during this period, the contents written to the planes are not displayed.
  • the application is a program that can entertain the user by providing, for example, a game using the video of the main part of the movie.
  • an animation may be played in accordance with the main video and audio.
  • An application that draws the graphics images forming such an animation needs to draw them on the interactive graphics plane in synchronization with the video stream that constitutes the main video.
  • During the blackout described above, however, the application cannot synchronize its graphics drawing with the video stream.
  • The playback device further includes: a platform unit for executing bytecode applications; and transmission means for transmitting video frames to the display device connected to the playback device and causing the display device to output them.
  • The transmission means re-authenticates the display device and, after the re-authentication, upon receiving a notification that output in the post-switch mode has become possible, should notify the bytecode application executed by the platform unit that output in the post-switch mode is now possible.
  • After the playback device notifies the display device that the display mode has been switched, and until the display device notifies the playback device that it is ready to display in the switched mode, nothing the application draws to the interactive graphics plane is displayed.
  • To keep the application from proceeding with its processing by writing graphics images during this period, a mechanism can be provided that notifies the application once the display device is ready to display in the post-switch mode. As a result, the application can draw graphics images on the interactive graphics plane in synchronization with the video stream; a sketch of such a mechanism follows.
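  • Hypothetical sketch of such a notification mechanism; none of the names below come from the specification:

```java
/** Hypothetical notification mechanism; all names here are assumed. */
interface DisplayReadyListener {
    void onDisplayReady(String newMode);  // e.g. "2D" or "3D"
}

class ModeSwitchNotifier {
    private final java.util.List<DisplayReadyListener> listeners =
            new java.util.ArrayList<>();

    void register(DisplayReadyListener l) { listeners.add(l); }

    /** Called when the display device reports it can output in the new mode. */
    void displayReady(String newMode) {
        for (DisplayReadyListener l : listeners) {
            l.onDisplayReady(newMode);  // the application resumes drawing here
        }
    }
}
```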
  • FIG. 2 is a diagram showing the internal configuration of a BD-ROM 100.
  • Diagrams showing in detail the internal structure of the "playback attribute information" and "playback data information" portions of a playlist file, a specific description of a playlist file defining a 2D playlist, and an example of a playlist file defining a 3D playlist.
  • FIG. 10 is a flowchart showing the processing procedure by which the playback device 200 reads a playlist as video data and displays stereoscopic captions and stereoscopic graphics superimposed on the stereoscopic video.
  • A diagram showing which of the I-pictures, B-pictures, and P-pictures in a GOP are selected for playback, and which of the multiple Closed-GOPs and Open-GOPs constituting the video stream are selected for playback.
  • Diagrams showing the internal structure of the demultiplexer and the video decoder; a flowchart showing the processing procedure of special playback that takes double-speed playback into account; a flowchart showing the processing sequence of video frame processing in 3D mode; and a flowchart showing the image plane processing sequence in response to user operations and requests.
  • Diagrams showing the plane shift processing procedure for the image plane 8, the pixel data stored in the graphics plane, and the stored contents of the graphics plane after the shift is performed.
  • FIG. 1 is a diagram showing the internal configuration of the most basic playback apparatus provided with the problem-solving means of the present application. Since this playback apparatus is limited to the problem-solving means described above and omits unnecessary components as far as possible, it consists of a reading means 201, a dimension determination unit 202, a mode storage unit 203, a demultiplexer 204, and a video decoder 205.
  • FIG. 2 is a flowchart showing the most basic reproduction method provided with the problem solving means of the present application.
  • the reproduction method shown in the figure includes a reading step S101, a mode storage step S102, a separation step S103, and a decoding and frame output step S104, which are time series elements corresponding to the above problem solving means.
  • FIG. 3 is a diagram showing a usage pattern in using the playback apparatus equipped with the above problem solving means as a specific electrical appliance.
  • A BD-ROM 100, which is an example of a recording medium, and a playback device 200 constitute a home theater system together with a remote controller 300, a display device 400, and liquid crystal glasses 500, and are used by the user.
  • BD-ROM 100 supplies, for example, movie works to the home theater system.
  • the playback device 200 is connected to the display device 400 and plays back the BD-ROM 100.
  • the reproduced video reproduced in this way includes 2D video and 3D video.
  • the 2D video is an image that is expressed by pixels on the XY plane by regarding the display screen of the display device as the XY plane, and is also called a planar view image.
  • 3D video is an image in which the depth in the Z-axis direction is added to the pixels on the XY plane on the screen of the display device.
  • 3D video is presented to the user by playing both the left-eye video, intended to be viewed with the left eye, and the right-eye video, intended to be viewed with the right eye, and producing a stereoscopic effect between the two.
  • the user feels that the pixel having the positive Z-axis coordinate is in front of the screen of the display device, and feels that the pixel having the negative Z-axis coordinate exists behind the screen.
  • The remote controller 300 is a device that accepts from the user various operations related to playback control as well as operations on the hierarchical GUI. To receive such operations, the remote controller 300 provides a menu key for calling a pop-up menu, arrow keys for moving the focus among the GUI parts that make up the pop-up menu, a decision key for confirming the selected GUI part in the pop-up menu, a return key for returning the hierarchical pop-up menu to a higher level, and numeric keys.
  • the display device 400 provides a user with an interactive operating environment by displaying a playback image of a movie work or displaying a menu or the like.
  • the liquid crystal glasses 500 are composed of a liquid crystal shutter and a control unit, and realize stereoscopic viewing using parallax in both eyes of the user.
  • the liquid crystal shutter of the liquid crystal glasses 500 is a shutter using a liquid crystal lens having a property that light transmittance is changed by changing an applied voltage.
  • The control unit of the liquid crystal glasses 500 receives the synchronization signal, sent from the playback device, for switching between output of the right-eye image and the left-eye image, and switches between a first state and a second state according to this signal.
  • The first state is a state in which the applied voltage is adjusted so that the liquid crystal lens corresponding to the right eye does not transmit light while the lens corresponding to the left eye does; in this state, the left-eye image is viewed.
  • The second state is a state in which the applied voltage is adjusted so that the liquid crystal lens corresponding to the right eye transmits light while the lens corresponding to the left eye does not; in this state, the right-eye image is viewed.
  • Because of the difference in position between the right eye and the left eye, the image seen by the right eye differs slightly from the image seen by the left eye, and humans use this difference to recognize what they see as three-dimensional. Therefore, if the liquid crystal glasses 500 synchronize the switching between the first and second states described above with the switching of the right-eye and left-eye image output, the user has the illusion that a flat display looks three-dimensional. Next, the time interval for displaying the right-eye and left-eye videos will be described.
  • If the right-eye image and the left-eye image differ by an amount corresponding to human parallax and are displayed while being switched at short time intervals, it looks as if a three-dimensional display has been made.
  • This time interval need only be short enough that the switching display described above produces the illusion of three-dimensionality. A sketch of the shutter switching follows.
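  • A minimal sketch of the shutter-state switching, with assumed names and the hardware details abstracted away:

```java
/** Sketch of the first/second shutter states; names are assumed. */
class ShutterGlassesController {
    enum State { FIRST, SECOND }  // FIRST: left eye open; SECOND: right eye open

    private State state = State.SECOND;

    /** Called on each synchronization signal from the playback device. */
    void onSyncSignal() {
        // Alternate which liquid crystal lens transmits light, in step
        // with the playback device's left/right frame output.
        state = (state == State.FIRST) ? State.SECOND : State.FIRST;
        applyVoltage(state);
    }

    private void applyVoltage(State s) {
        // Hardware-specific: adjust the applied voltage so that one lens
        // transmits light and the other blocks it.
    }
}
```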
  • FIG. 4 is a diagram showing an internal configuration of the BD-ROM 100.
  • BD-ROM is shown in the 4th row of this figure, and tracks on the BD-ROM are shown in the 3rd row.
  • the track in this figure is drawn by extending the track formed in a spiral shape from the inner periphery to the outer periphery of the BD-ROM in the horizontal direction.
  • This track includes a lead-in area, a volume area, and a lead-out area.
  • A special area called the BCA (Burst Cutting Area) also exists.
  • the volume area in this figure has a layer model of a file system layer and an application layer, and application data such as video data is recorded in the file system layer with the file system information at the top.
  • The file system is UDF, ISO9660, or the like; logical data recorded in the same way as on a normal PC can be read out using a directory and file structure, and file names and directory names of up to 255 characters can be read out.
  • the disk root certificate file (app.discroot.cert) exists under the CERTIFICATE directory.
  • app.discroot.cert is a digital certificate used in the process of verifying the signature of a Java (registered trademark) application (hereinafter, signature verification) when performing dynamic scenario control using a Java (registered trademark) virtual machine.
  • The BDMV directory is the directory in which data such as the AV content and management information handled by the BD-ROM is recorded. Under the BDMV directory there are six subdirectories called PLAYLIST, CLIPINF, STREAM, BDJO, JAR, and META, and two files, INDEX.BDMV and MovieObject.bdmv, are arranged.
  • the STREAM directory is, so to speak, a directory that stores files that become the transport stream itself, and there is a file (00001. m2ts) with the extension “m2ts”.
  • the CLIPINF directory contains a file (00001.clpi) with the extension “clpi”.
  • the BDJO directory contains a file (XXXXX.bdjo) with the extension “bdjo”.
  • the XML file (ZZZZZ.xml) exists in the META directory.
  • Index.bdmv is management information related to the entire BD-ROM, and after the disc is inserted into the playback device, the index.bdmv is read first so that the disc is uniquely recognized by the playback device.
  • Index.bdmv defines the correspondence between the individual titles constituting the title structure of the optical disc and the operation mode objects that define their operation modes.
  • The title structure starts with playback of a title (the first-play title) accompanied by viewer warnings, content-provider logo displays, and the like.
  • Index.bdmv specifies in detail in which operation mode each title of the optical disc operates.
  • MovieObject.bdmv stores a movie object.
  • A movie object is one of the operation mode objects.
  • In the command-based operation mode referred to as HDMV mode, a movie object is a batch-job program that supplies a plurality of navigation commands to the playback device as a batch job and operates the playback device based on those navigation commands.
  • The movie object includes one or more commands and mask flags that define whether to mask menu calls and title calls when the user makes them on the GUI.
  • the navigation command is a control command described in a so-called interpreter language, and a plurality of navigation commands are decoded as a batch job by an interpreter (job control program), and causes the CPU to execute a desired job.
  • the navigation command is composed of an operation code and an operand, and the operation such as title branching, reproduction, and calculation can be instructed to the reproduction device by the operation code.
  • the operand is a playlist number or title number, and an object to be operated can be designated.
  • a file with the extension “m2ts” is a digital AV stream in the MPEG-TS (TransportStream) format, and is obtained by multiplexing a video stream, one or more audio streams, and subtitle data.
  • the video stream indicates the moving image portion of the movie
  • the audio stream indicates the audio portion of the movie.
  • A transport stream containing only streams for 2D playback is referred to as a "2D stream".
  • A transport stream containing a 3D video stream is referred to as a "3D stream".
  • Both left-eye and right-eye data can be placed in a single m2ts file, or separate m2ts files can be prepared for the left eye and the right eye.
  • A video stream compression-encoded with a multiview codec (for example, MPEG-4 AVC MVC) is referred to as an MVC video stream.
  • a file with the extension “mpls” is a file storing playlist (PL) information.
  • the playlist information is information that defines a playlist with reference to an AV clip.
  • a dimension identification flag for identifying whether the stream to be played is for 2D or 3D exists on the BD-ROM.
  • a dimension identification flag is embedded in playlist (PL) information.
  • AV playback can be started by a Java TM application for playback control instructing the Java TM virtual machine to generate a JMF player instance that plays back this playlist information.
  • A JMF (Java Media Framework) player instance is actual data generated on the heap memory of the virtual machine based on the JMF player class.
  • a 2D playlist includes only a 2D playback stream
  • a 3D playlist includes a 3D stereoscopic stream in addition to a 2D stream.
  • A file with the extension "clpi" holds Clip information corresponding one-to-one to an AV clip. Being management information, the Clip information holds information such as the stream encoding format, frame rate, bit rate, and resolution of the AV clip, as well as an EP_map indicating GOP head positions.
  • the above Clip information and playlist information are classified as “static scenarios”.
  • the BD-J object is an operation mode object that causes the playback device to operate in a bytecode application-based operation mode (referred to as BD-J mode).
  • A bytecode application is an application generated using a compiler-type language such as an object-oriented language, for example the Java (TM) language. Since the BD-J object defines the operation of the playback device using a "compiler language" such as the Java (TM) language, it stands in contrast to the movie object, which is described by commands in an interpreted language.
  • the BD-J object includes the following “application management table”.
  • The application management table includes a plurality of entries. Each entry includes a "control code" indicating whether the application is started automatically (AutoStart) in the title or waits for a call from another application (Present), an "application ID" that identifies the target application using the five-digit numerical value serving as the JAR file name, and "application detailed information".
  • "Application detailed information" stores, for each application, the "priority" when the application is loaded, "binding information" indicating whether the application is title-unbound or disc-bound, a character string indicating the name of the application, a "language code" indicating the language attribute of the application, and an "icon locator" indicating the location of the icon associated with the application. A sketch of one entry follows.
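  • As a minimal sketch, one entry of the application management table might be modeled as follows; the field names and types are assumptions:

```java
/** Hypothetical model of one application-management-table entry. */
class AppManagementEntry {
    enum ControlCode { AUTO_START, PRESENT }  // start automatically or wait for a call

    ControlCode controlCode;   // life-cycle control within the title
    String applicationId;      // five-digit number that is also the JAR file name
    int priority;              // priority when the application is loaded
    boolean discBound;         // binding information: disc-bound vs. title-unbound
    String name;               // character string displayed as the application name
    String languageCode;       // language attribute of the application
    String iconLocator;        // location of the icon associated with the application
}
```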
  • The application management table can manage consumption of memory resources and the like by each application by treating the playback unit called a title as the application's life cycle.
  • Thanks to this, even if resource use by multiple applications competes during playback of a title and falls into a deadlock state, all of those applications are terminated when the user selects a different title, so the deadlock is forcibly eliminated; and even if a runaway application occupies memory during playback of a title, that application is forcibly terminated when the user selects another title, so the memory shortage is resolved. In this way, stable memory-resource management can be realized without consuming unnecessary memory, which makes the invention all the more effective for implementation in home appliances with limited memory capacity.
  • a bytecode application whose operation is defined by the application management table in the BD-J object is called a “BD-J application”.
  • The Java (registered trademark) application is, in practice, the Java (registered trademark) archive file (YYYYY.jar) stored in the JAR directory under the BDMV directory.
  • the application is, for example, a Java (registered trademark) application, and includes one or more xlet programs loaded in the heap area (also called work memory) of the virtual machine.
  • the metafile (ZZZZZ.xml) stored in the META directory stores various information related to video works on the disc.
  • Information stored in the metafile includes the disc name and an image representing the disc, information on who created the disc, the title name associated with each title, and the like.
  • The metafile is not an essential file, and some BD-ROMs do not store it. This completes the description of the BD-ROM 100.
  • the playlist information that realizes stereoscopic and planar views is configured as follows.
  • A "playlist" is a playback path defined by specifying playback sections on the time axis of the AV stream and logically ordering those playback sections; it plays the role of defining which parts of which streams are played back, and in what order the scenes unfold.
  • the playlist information stored in the MPLS file defines the “type” of the playlist.
  • the playback path defined by the playlist information is so-called “multipath”. Multipath is a bundle of a playback path (main path) defined for a main AV stream and a playback path (subpath) defined for a substream. A chapter position is defined on the multipath playback time axis.
  • the playback device can realize random access to an arbitrary time point on the multipath time axis.
  • the playlist information includes reproduction attribute information, MainPath information, Subpath information, and PlayListMark information.
  • MainPath information defines a logical playback section by defining one or more pairs of an In_Time point and an Out_Time point on the playback time axis of the base-view video stream, and contains an STN_table. This MainPath information corresponds to the playback data information.
  • PlayListMark information includes the designation of a time point to become a chapter within the part of the video stream designated by the combination of In_Time information and Out_Time information.
  • SubPath information is composed of one or more pieces of SubPlayItem information; SubPlayItem information specifies the dependent-view stream to be played in synchronization with the base-view video stream, and includes a pair of In_Time information and Out_Time information on the playback time axis of that dependent-view stream. A sketch of these structures follows.
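  • A minimal sketch of the structures just described, with assumed field names and types:

```java
/** Hypothetical model of the playlist information described above. */
class PlayListInfo {
    PlaybackAttribute attributes;            // version, playback type, dimension flag
    java.util.List<PlayItem> mainPath;       // MainPath information
    java.util.List<SubPlayItem> subPath;     // SubPath information
    java.util.List<Long> playListMarks;      // chapter time points

    static class PlaybackAttribute {
        String version;                      // e.g. "2.00"
        String playbackType;                 // movie/slide show, sequential/random
        boolean is3d;                        // dimension identification flag
    }

    static class PlayItem {
        long inTime, outTime;                // section on the base-view time axis
        String streamFile;                   // e.g. "00001.m2ts"
        Object stnTable;                     // stream registration information
    }

    static class SubPlayItem {
        long inTime, outTime;                // section on the dependent-view time axis
        String dependentViewStream;          // stream synchronized with the base view
    }
}
```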
  • FIG. 5 is a diagram showing in detail the internal structure of “playback attribute information” and “playback data information” in the internal structure of the playlist file.
  • "Playback attribute information" includes the "version" of the standard on which the playlist file is based, a "playback type" designating whether the playlist file is a movie or a slide show and whether the PlayItems constituting the playlist are played back sequentially or at random, and a "dimension identification flag" indicating whether the playlist file is a 2D playlist or a 3D playlist.
  • Playback data information is composed of N+1 pieces of PlayItem information (PlayItem #0 information to PlayItem #N information in the figure).
  • Individual PlayItem information includes "stream file information" indicating the transport stream to which the PlayItem corresponds, "playback time information" indicating the playback time length of the stream file, and "stream registration information" indicating, for each packetized elementary stream, whether its playback is permitted.
  • The "dimension identification flag" indicates a 3D playlist when the digital stream in the transport stream file referred to by the playlist file is a 3D-capable digital stream.
  • The "dimension identification flag" indicates a 2D playlist when the digital stream in the transport stream file referred to by the playlist file is a digital stream that supports only 2D.
  • FIG. 6 shows a specific description of a playlist file that defines a 2D playlist.
  • In this example, the "version information" is 2.00, the playback type of the playlist file is "movie", and the playback type of the playback data information is set to "sequential", meaning that the PlayItem information is played in order from the head of the stream file it specifies. The dimension identification flag indicates a stream configuration that can only be displayed in 2D.
  • the stream file used by PlayItem # 0 is 00001.m2ts under the STREAM directory, and the playback time of PlayItem # 0 is 0x002932E0.
  • the three stream registration information of PlayItem # 0 indicates details of three packetized elementary streams identified by three logical stream numbers of video # 1, audio # 1, and subtitle # 1.
  • the packetized elementary stream identified by the logical stream number of video # 1 is a video stream composed of TS packets having a packet identifier of 0x02.
  • the packetized elementary stream identified by the logical stream number of audio # 1 is composed of TS packets having a packet identifier of 0x80, and indicates that the audio stream is in Japanese.
  • The packetized elementary stream identified by the logical stream number of subtitle #1 is composed of TS packets with the packet identifier 0x92; its language is Japanese and its size is the normal size.
  • FIG. 7 shows an example of a playlist file that defines a 3D playlist.
  • The version of the playback attribute information in this figure is "2.00", the playback type of the playlist file is "movie", and "sequential", meaning that the PlayItems included in the playback data information are played in order from the top, is set as the playback type.
  • the dimension identification flag indicates that the 3D playlist can be displayed in 3D.
  • the playback data information indicates that there is one PlayItem and the stream file used by PlayItem # 0 is 00001.m2ts under the STREAM directory.
  • PlayItem # 0 has a playback time of 0x002932E0, and PlayItem # 0 includes stream registration information indicating details of four packetized elementary streams.
  • the packetized elementary stream to which the logical stream number of video # 1 is assigned is composed of TS packets whose packet identifier is 0x02, and has a visual attribute for the left eye.
  • a packetized elementary stream to which a logical stream number of video # 2 is assigned is composed of TS packets having a packet identifier of 0x02, and has a visual attribute for right eye use.
  • The packetized elementary stream to which the logical stream number of audio #1 is assigned is composed of TS packets having the packet identifier 0x80 and has a Japanese language attribute.
  • It can be seen that the packetized elementary stream to which the logical stream number of subtitle #1 is assigned is composed of TS packets with the packet identifier 0x92, has a Japanese language attribute, and is subtitle data with a normal character size.
  • FIG. 8 is a diagram showing the internal configuration of the playback apparatus.
  • The playback device includes a BD drive 1a, a network device 1b, a local storage 1c, read buffers 2a and 2b, a virtual file system 3, a demultiplexer 4, video decoders 5a and 5b, a video plane 6, image decoders 7a and 7b, image memories 7c and 7d, an image plane 8, an audio decoder 9, an interactive graphics plane 10, a background plane 11, a register set 12, a static scenario memory 13, a playback control engine 14, a scaling engine 15, a synthesis unit 16, an HDMI transmission/reception unit 17, a display function flag holding unit 18, a left-right process storage unit 19, a plane shift engine 20, a shift information memory 21, a BD-J platform 22, a dynamic scenario memory 23, a mode management module 24, an HDMV module 25, a UO detection module 26, and a rendering engine 27a.
  • The video stream input to the playback apparatus 200 includes a left-eye stream and a right-eye stream, while the subtitle/GUI stream is input in common for both the left eye and the right eye.
  • The description here assumes that the right-eye stream and the left-eye stream are embedded in one stream file in advance. This is to suppress as far as possible the amount of computation required of a device with limited memory and graphics resources (for example, a CE device).
  • the BD drive 1a includes an optical head having a semiconductor laser, a collimator lens, a beam splitter, an objective lens, a condenser lens, and a photodetector.
  • the light beam emitted from the semiconductor laser passes through the collimator lens, the beam splitter, and the objective lens, and is condensed on the information surface of the optical disk.
  • the condensed light beam is reflected / diffracted on the optical disk, and is collected on the photodetector through the objective lens, the beam splitter, and the condenser lens.
  • a reproduction signal is generated according to the amount of light collected by the photodetector. By demodulating the reproduction signal, various data recorded on the BD can be decoded.
  • The network interface 1b is for communicating with the outside of the playback device and can access a server on the Internet or a server connected via a local network. For example, it can be used to download additional BD-ROM content published on the Internet, or to play back content using the network function by performing data communication with a server on the Internet specified by the content.
  • the BD-ROM additional content is content that does not exist in the original BD-ROM, and includes, for example, additional sub audio, subtitles, privilege video, and applications.
  • the network interface 1b can be controlled from the BD-J platform, and additional content published on the Internet can be downloaded to the local storage 1c.
  • the local storage 1c includes built-in media and removable media, and is used for storing downloaded additional content, data used by applications, and the like.
  • the storage area for additional content is divided for each BD-ROM, and the area that an application can use to hold data is divided for each application.
  • Merge management information, which describes how downloaded content and data on the BD-ROM are merged, is also stored on the built-in media or removable media.
  • Built-in media is a writable recording medium such as a hard disk drive or memory built into the playback device.
  • the removable media is, for example, a portable recording medium, and preferably a portable semiconductor memory card such as an SD card.
  • The playback device has a slot (not shown) for attaching removable media and an interface (for example, a memory card I/F) for reading removable media installed in the slot. When a semiconductor memory is inserted into the slot, the removable media and the playback device are electrically connected, and the data recorded in the semiconductor memory can be converted into an electrical signal and read out using the interface (for example, the memory card I/F).
  • The read buffer 2a is a buffer for temporarily storing the source packets constituting the extents of the left-eye stream read from the BD drive 1a, adjusting the transfer speed, and transferring the packets to the demultiplexer 4.
  • The read buffer 2b is a buffer for temporarily storing the source packets constituting the extents of the right-eye stream read from the BD drive 1a, adjusting the transfer speed, and transferring the packets to the demultiplexer 4.
  • Based on the merge management information downloaded to the local storage 1c together with the additional content, the virtual file system 3 merges the additional content stored in the local storage with the content on the loaded BD-ROM to build a virtual BD-ROM package (virtual package).
  • the virtual file system 3 for constructing a virtual package has an “application data association module 3a” for generating and updating application data association information.
  • the application data association information is information that associates local storage information with an application based on file system information on the BD-ROM disc and alias access information set by the application.
  • The HDMV module, which is the HDMV-mode operating subject, and the BD-J platform, which is the BD-J-mode operating subject, can refer to the virtual package and the original BD-ROM without distinction.
  • the playback device performs playback control using both data on the BD-ROM and data on the local storage.
  • The demultiplexer 4 includes a source packet depacketizer and a PID filter. It is given the packet identifier corresponding to the stream to be played (the stream being contained on the loaded BD-ROM and on the local storage corresponding to the loaded BD-ROM), executes packet filtering based on that packet identifier, and outputs the resulting TS packets to the decoders.
  • the demultiplexer 4 can sort the video frame of the left-eye video stream and the video frame of the right-eye video stream from the stream header information.
  • In 3D playback, the demultiplexer alternately processes video frames of the left-eye video stream and video frames of the right-eye video stream, and when both a left-eye video frame and a right-eye video frame have been decoded, both are output. Depending on the hardware configuration, when there are two outputs, the left-eye video and the right-eye video are output separately. A minimal sketch of the PID filtering follows.
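  • Sketch of PID-based filtering (class and method names assumed; an MPEG-TS packet carries a 13-bit PID in bytes 1-2 of its header):

```java
/** Sketch of PID-based packet filtering; names are illustrative. */
class PidFilter {
    private final java.util.Set<Integer> wantedPids;

    PidFilter(java.util.Set<Integer> wantedPids) {
        this.wantedPids = wantedPids;  // PIDs of the streams selected for playback
    }

    /** Extract the 13-bit PID from bytes 1-2 of the TS packet header and test it. */
    boolean accept(byte[] tsPacket) {
        int pid = ((tsPacket[1] & 0x1F) << 8) | (tsPacket[2] & 0xFF);
        return wantedPids.contains(pid);
    }
}
```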
  • The video decoder 5a decodes the TS packets constituting the left-eye video stream output from the demultiplexer 4 and writes uncompressed video frames to the left-eye video plane 6 (indicated by the symbol (L) in the video plane 6 in FIG. 8).
  • The video decoder 5b decodes the TS packets constituting the right-eye video stream output from the demultiplexer 4 and writes uncompressed video frames to the right-eye video plane 6 (indicated by the symbol (R) in the video plane 6 in FIG. 8).
  • The video plane 6 is a plane memory that can store picture data constituting video frames at a resolution of, for example, 1920×2160 (1280×1440); it is composed of a left-eye plane with a resolution of 1920×1080 (1280×720) (indicated by the symbol (L) in the video plane 6 in FIG. 8) and a right-eye plane with a resolution of 1920×1080 (1280×720) (indicated by the symbol (R) in the video plane 6 in FIG. 8).
  • The image decoders 7a and 7b decode the TS packets that are output from the demultiplexer 4 and written in the image memories 7c and 7d, and write uncompressed graphics subtitles to the image plane 8.
  • The "subtitle data" decoded by the image decoders 7a and 7b is data representing subtitles compressed by run-length encoding, and is defined by pixel codes indicating a Y value, Cr value, Cb value, and α value, together with the run lengths of those pixel codes.
  • The image plane 8 is a graphics plane that can store graphics data (for example, caption data obtained by decoding subtitle data) at a resolution of, for example, 1920×1080 (1280×720). It is composed of a left-eye plane (indicated by the symbol (L) in the image plane 8 shown in FIG. 8) having a storage area that can store data at a resolution of 1920×1080 (1280×720), and a right-eye plane (indicated by the symbol (R) in the image plane 8 shown in FIG. 8) having a storage area that can store data at a resolution of 1920×1080 (1280×720).
  • the audio decoder 9 decodes the audio frame output from the demultiplexer 4 and outputs uncompressed audio data.
  • The interactive graphics plane 10 is a graphics plane having storage areas in which graphics data drawn by a BD-J application using the rendering engine 27a can be stored at a resolution of, for example, 1920×1080 (1280×720): a left-eye plane (indicated by the symbol (L) in the interactive graphics plane 10 of FIG. 8) and a right-eye plane (indicated by the symbol (R) in the interactive graphics plane 10 of FIG. 8).
  • "Graphics data" stored in the interactive graphics plane 10 is graphics in which each pixel is defined by an R value, G value, B value, and α value. Graphics written to the interactive graphics plane 10 are images and widgets mainly used to compose a GUI. Although the data representing the pixels differs, both image data and graphics data are covered by the expression "graphics data".
  • There are two types of graphics planes that are the subject of the present application: the image plane 8 and the interactive graphics plane 10. When simply called a "graphics plane", this indicates either or both of the image plane 8 and the interactive graphics plane 10.
  • The background plane 11 is a plane memory that can store still image data to serve as a background image at a resolution of, for example, 1920×1080 (1280×720).
  • The register set 12 is a collection of registers including a playback state register that stores the playback state of the playlist, a playback setting register that stores configuration information indicating the configuration of the playback device, and a general-purpose register that can store arbitrary information used by the content.
  • The playback state of the playlist indicates, for example, which of the various AV data described in the playlist is being used and which position (time) of the playlist is currently being played.
  • According to instructions from the HDMV module, the HDMV-mode operating subject, or the Java platform, the BD-J-mode operating subject, the playback control engine 14 can store a value specified by the application into the register set 12, or pass a stored value to the application.
  • the static scenario memory 13 is a memory for storing current playlist information and current clip information.
  • Current playlist information refers to information that is currently processed among multiple playlist information that can be accessed from a BD-ROM, a built-in media drive, or a removable media drive.
  • Current clip information refers to information that is currently processed among a plurality of clip information that can be accessed from a BD-ROM, a built-in media drive, or a removable media drive.
  • the playback control engine 14 executes an AV playback function and a playlist playback function.
  • The AV playback function is a group of functions inherited from DVD players and CD players: processes such as playback start, playback stop, pause, release of pause, release of the still-image function, fast forward at a playback speed designated as an immediate value, rewind at a playback speed designated as an immediate value, audio switching, sub-video switching, and angle switching.
  • a movie object executed by the HDMV module or a BD-J application executed by the BD-J platform makes a processing request to the playback control engine 14 to perform normal playback such as playback start and playback stop.
  • the playlist playback function refers to performing playback start and playback stop in accordance with current playlist information and current clip information constituting the current playlist in the AV playback function.
  • AV stream playback may be triggered by a user operation (for example, a playback button) or automatically triggered by some event in the terminal.
  • the playback device middleware provides APIs for executing the various functions of the playback control engine to the BD-J application.
  • An API library for causing the playback control engine to execute each playback function is “AV playback library 14a”.
  • Each API in the "AV playback library 14a" includes various member functions. A BD-J application calls the member functions (methods) of the AV playback library with specified arguments, thereby causing the playback control engine to execute the functions of those member functions. A movie object, on the other hand, issues the navigation command corresponding to a member function to cause the playback control engine to execute the processing corresponding to these APIs.
  • "selectPlayList" is an API by which a BD-J application orders switching of playlists; the argument when calling this API is a BD-J locator.
  • The BD-J locator is a locator dedicated to BD-J applications that can specify the playlist to be selected using title_id, playlist_id, and PlayItem_id.
  • With this locator, the playlist to be played is designated.
  • The result of function execution by the playback control engine is notified to the BD-J application by an event. Therefore, when using the AV playback library, an event listener must be registered in the BD-J application so that the event indicating the execution result can be received, as sketched below.
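  • A hypothetical usage sketch: the API name selectPlayList and the BD-J locator appear in the description above, but the exact signatures, locator syntax, and listener interface below are assumptions:

```java
/** Assumed shapes for the playlist-switching API and its result event. */
class PlaylistSwitcher {
    interface PlaybackListener { void onPlaybackEvent(String result); }

    private final java.util.List<PlaybackListener> listeners =
            new java.util.ArrayList<>();

    /** Register a listener so the execution-result event can be received. */
    void addPlaybackListener(PlaybackListener l) { listeners.add(l); }

    /** Order the playback control engine to switch playlists. */
    void selectPlayList(String bdjLocator) {
        // bdjLocator would name title_id, playlist_id and PlayItem_id;
        // the engine executes asynchronously and reports via an event.
    }

    /** Called by the playback control engine when execution finishes. */
    void notifyResult(String result) {
        for (PlaybackListener l : listeners) l.onPlaybackEvent(result);
    }
}
```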
  • The scaling engine 15 can perform reduction, enlargement, and unity-magnification control of the video on the image plane 8 and the video plane 6. If a value has been set in the plane shift engine 20 at the time image data or a video frame is decoded, the scaling engine 15 regards scaling as having occurred, and scaling is performed by the scaling engine 15 before the decoded graphics are stored in the plane.
  • the synthesis unit 16 performs layer synthesis of the interactive graphics plane 10, the image plane 8, the video plane 6, and the background plane 11.
  • the plane memories such as the interactive graphics plane 10, the image plane 8, the video plane 6, and the background plane 11 form a layer model.
  • Layer composition by the synthesis unit 16 superimposes the pixel values of the pixel data stored in the plane memories of two layers, and does so for all combinations of layers in the layer model.
  • In the superimposition, the line-unit pixel values of the plane memory located in a certain layer are multiplied by a transmittance α as a weight,
  • the line-unit pixel values of the plane memory located in the layer below are multiplied by (1 - α),
  • the two weighted pixel values are added together, and the result is used as the line-unit pixel values of that layer.
  • Layer composition is realized by repeating this superimposition between line-unit pixels located in two adjacent layers of the layer model; a sketch of the per-pixel computation follows.
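  • The weighted addition above is the standard alpha-blend formula, out = α·upper + (1 - α)·lower; a minimal per-component sketch:

```java
/** Sketch of the per-pixel weighted addition used in layer composition. */
class LayerComposer {
    /**
     * Blend one colour component of an upper-layer pixel over the
     * corresponding lower-layer component: out = alpha*upper + (1-alpha)*lower.
     */
    static int blendComponent(int upper, int lower, double alpha) {
        return (int) Math.round(alpha * upper + (1.0 - alpha) * lower);
    }
}
```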
  • The HDMI transmission/reception unit 17 includes an interface conforming to, for example, the HDMI standard (HDMI: High-Definition Multimedia Interface), and performs HDMI-conformant transmission/reception with the device connected to the playback apparatus (the display device 400 in this example).
  • the picture data stored in the video plane and the uncompressed audio data decoded by the audio decoder 9 are transmitted to the display device 400 via the HDMI transmission / reception unit 17.
  • The display device 400 holds, for example, information on whether it supports stereoscopic display, information on the resolutions at which it can display planar video, and information on the resolutions at which it can display stereoscopic video.
  • When a request arrives from the playback device via the HDMI transmission/reception unit 17, the display device 400 returns the requested information (for example, information on whether it supports stereoscopic display, on resolutions capable of planar display, and on resolutions capable of stereoscopic display) to the playback device. In this way, information on whether the display device 400 supports stereoscopic display can be acquired from the display device 400 via the HDMI transmission/reception unit 17.
  • the display function flag holding unit 18 stores a 3D display function flag indicating whether or not the playback apparatus can display 3D.
  • the left-right process storage unit 19 stores whether the current output process is an output for the left eye or an output for the right eye.
  • The flag in the left-right processing storage unit 19 indicates whether the output to the display device (the television in the example of FIG. 1) connected to the playback apparatus shown in FIG. 1 is left-eye output or right-eye output. While left-eye output is in progress, the flag is set to indicate left-eye output; while right-eye output is in progress, it is set to indicate right-eye output.
  • The plane shift engine 20 also has an area for storing image plane shift information. After determining, from the left-right processing storage unit 19, whether the current processing target is the left-eye video or the right-eye video, the plane shift engine 20 calculates the horizontal shift amount of the image plane using the plane offset indicated by the stored image plane shift information, and performs the shift.
  • The perceived depth changes with the horizontal shift width of the subtitles/GUI. For example, a visual effect can be obtained in which the left-eye caption and the right-eye caption appear closer to the viewer the further apart they are displaced in one direction, and appear deeper the further apart they are displaced in the opposite direction.
  • the shift information memory 21 is a module that temporarily stores a value when there is an image plane shift information update request from a user or an application.
• the image plane shift information is, for example, an integer in the range -255 to 255 representing depth (255 is frontmost, -255 is deepest), and it is converted into pixel coordinates indicating the final shift width.
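As a minimal sketch of this conversion (the class and method names are invented, and the linear depth-to-pixel scale is an assumption; the patent only states that the signed value is converted into a pixel shift width), the mapping might look like this in Java:

```java
/** Hypothetical helper: converts image plane shift information (-255..255)
 *  into a horizontal pixel shift for each eye. The scale factor is assumed. */
public final class PlaneShiftMath {
    private PlaneShiftMath() {}

    /** @param shiftInfo depth value, 255 = frontmost, -255 = deepest
     *  @param leftEye   true while left-eye output is being produced
     *  @return signed pixel offset to apply to the image plane */
    public static int toPixelShift(int shiftInfo, boolean leftEye) {
        if (shiftInfo < -255 || shiftInfo > 255)
            throw new IllegalArgumentException("out of range: " + shiftInfo);
        // Assumed linear mapping: 255 depth units correspond to 16 pixels of parallax.
        int magnitude = Math.round(shiftInfo * 16f / 255f);
        // Opposite directions for the two eyes produce the parallax.
        return leftEye ? magnitude : -magnitude;
    }

    public static void main(String[] args) {
        System.out.println(toPixelShift(255, true));   // 16  (left eye)
        System.out.println(toPixelShift(255, false));  // -16 (right eye)
    }
}
```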
• the BD-J platform 22 is the Java platform that is the main operating subject of the BD-J mode, and fully implements Java2Micro_Edition (J2ME) Personal Basis Profile (PBP 1.0) and the Globally Executable MHP specification (GEM 1.0.2) for package media targets.
• the dynamic scenario memory 23 is a memory that stores the current dynamic scenario and is used for processing by the HDMV module, the operating subject of the HDMV mode, and by the Java platform, the operating subject of the BD-J mode.
• the current dynamic scenario refers to the Index.bdmv, BD-J object, or movie object that is currently being executed from among those on the BD-ROM, built-in media, and removable media.
  • the mode management module 24 holds Index.bdmv read from the BD-ROM, built-in media drive, or removable media drive, and performs mode management and branch control.
  • the mode management by the mode management module 24 is a module assignment that determines which of the BD-J platform 22 and the HDMV module 25 is to execute a dynamic scenario.
• the HDMV module 25 is a DVD virtual player that is the operating and execution subject of the HDMV mode.
  • This module has a command interpreter and executes HDMV mode control by decoding and executing navigation commands constituting the movie object. Since navigation commands are described in a syntax similar to DVD-Video, DVD-Video-like playback control can be realized by executing such navigation commands.
  • the UO detection module 26 receives a user operation for the GUI.
• User operations accepted through the GUI include title selection (which of the titles recorded on the BD-ROM is selected), subtitle selection, and audio selection.
• In particular, a level of depth perception for the stereoscopic image may be received. For example, the sense of depth may be given in three levels such as far, normal, and close, or may be accepted as a numerical input such as how many centimeters or millimeters of depth is desired.
• when the UO detection module 26 receives a command for changing the scaling of the image plane through the operation of a button on the remote controller or on the device itself, a module in the device directly issues a scaling command.
• the rendering engine 27a includes basic software such as Java2D and OPEN-GL; in accordance with requests from the BD-J application, it decodes JPEG data / PNG data, obtains images and widgets, and writes them to the interactive graphics plane and the background graphics plane.
  • the image data obtained by decoding the JPEG data becomes a GUI wallpaper and is embedded in the background graphics plane.
  • Pixel data obtained by decoding the PNG data is written in the interactive graphics plane, and button display with animation can be realized.
• the images and widgets obtained by decoding these JPEG / PNG data are used to display a pop-up menu through which the BD-J application accepts title selection, subtitle selection, and audio selection, and to construct GUI parts when a game linked to stream playback is running.
• when a BD-J application accesses a WWW site, they are also used to construct the browser screen for that WWW site.
  • the rendering memory 27b is a memory into which PNG data and JPEG data to be decoded by the rendering engine are read.
  • a cache area is secured in the rendering memory 27b when the BD-J application executes the live display mode.
  • the live display mode is a combination of a browser screen of a WWW site existing on the network and stream playback using a BD-ROM.
• the cache area is a cache memory for caching, in the live display mode, the current browser screen and the previous browser screen as uncompressed PNG data or uncompressed JPEG data; whatever constitutes the browser screen is stored here.
  • the display mode setting initial display setting unit 28 sets the playback mode and resolution based on the BD-J object in the current title provided to the BD-J platform unit.
  • the dimension mode storage unit 29 stores a playback mode and a stereo mode.
  • the playback mode that is the terminal setting stored in the dimension mode storage unit 29 can be switched to either 2D or 3D.
• the state where the playback mode indicates "3D" is called the 3D playback mode, and the state where the playback mode indicates "2D" is called the 2D playback mode.
• the register set 12 is an example of a component that holds the capabilities of the playback device and content setting information.
  • FIG. 9 schematically shows an example of the contents of the register set 12.
  • the register set 12 includes “reproduction status registers (0) to (127)” and “general-purpose registers (0) to (4096)”.
• The playback status registers are a collection of numbered storage locations for storing particular values. For example, the identifier of the playlist currently being played back is entered at one numbered location, and the identifier of the audio currently in use is stored at another numbered location.
  • Each value is entered by the playback control engine 18, the HDMV module 25, or the BD-J platform 22.
• the HDMV module 25 or the BD-J platform 22 can obtain the contents of the playback status registers or the general-purpose registers, and can store a value at a designated number.
• the third bit from the top is assigned to the setting of whether to operate in 2D or 3D.
• the third bit from the bottom is assigned to the setting of whether or not the playback device has the ability to produce 3D video.
  • Subtitle / GUI data exists as data handled by the playback apparatus 200 according to the present embodiment.
  • Subtitle / GUI data must be displayed in front of the video stream to reduce eye fatigue.
• the result of shifting the subtitle / GUI stream along the horizontal axis is synthesized with the video and output.
• the perceived depth changes with the horizontal separation of the subtitle / GUI. For example, the further apart the left-eye caption and the right-eye caption are in one direction, the nearer to the viewer they appear; the further apart they are in the opposite direction, the deeper behind the screen they appear.
  • a playback mode that realizes stereoscopic viewing by executing plane shift based on the plane offset is referred to as “1 plane + offset mode”.
  • the plane shift engine 20 realizes this 1 plane + Offset mode.
• "shift amount" refers to the difference between the coordinates of the original pixel data and the coordinates of each pixel datum after shifting to the right or left. This shift amount should be calculated from a depth value indicating how much depth the image plane 8 or the interactive graphics plane 10 has in stereoscopic view. It can also be derived from any parameter that can serve as the binocular parallax in stereoscopic reproduction.
• "plane offset" refers to a parameter for moving the pixel data in the graphics plane to the left or right by the shift amount described above. Whereas the shift amount is a scalar, the plane offset is a signed vector that indicates whether the coordinates of the pixel data are to be moved rightward or leftward from their current positions.
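A minimal sketch of applying a plane offset to a graphics plane, under the assumption of a flat ARGB int buffer and transparent fill for vacated pixels (none of this layout is specified by the patent):

```java
/** Hypothetical plane shift: moves every line of an ARGB pixel buffer
 *  horizontally by the signed plane offset, filling vacated pixels
 *  with transparency (0x00000000). */
public final class PlaneShifter {
    private PlaneShifter() {}

    public static int[] shift(int[] plane, int width, int height, int offset) {
        int[] out = new int[plane.length]; // transparent by default
        for (int y = 0; y < height; y++) {
            for (int x = 0; x < width; x++) {
                int src = x - offset;            // where this pixel comes from
                if (src >= 0 && src < width)
                    out[y * width + x] = plane[y * width + src];
            }
        }
        return out;
    }

    public static void main(String[] args) {
        int[] plane = {1, 2, 3, 4};              // one 4x1 line
        int[] right = shift(plane, 4, 1, 1);     // [0, 1, 2, 3]
        int[] left  = shift(plane, 4, 1, -1);    // [2, 3, 4, 0]
        System.out.println(java.util.Arrays.toString(right));
        System.out.println(java.util.Arrays.toString(left));
    }
}
```

Shifting with a positive offset moves the plane to the right; the sign of the offset supplies the left/right parallax for the two eyes.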
• the content creator embeds in advance, in the playlist information, information on the depth at which the subtitle / GUI is to be displayed; while playing back the stream associated with that playlist information, the playback device 200 keeps displaying the subtitle / GUI stereoscopically in front of the video stream based on this information.
  • the pixel shift amount and the pixel shift direction to be used as a plane offset are supplied from the outside of the playback device, such as a recording medium or a user operation.
  • a set of pixel shift amount and pixel shift direction supplied from the outside of the playback apparatus, which is information that is a source of plane offset, is referred to as “image plane shift information”.
• the value described in the image plane shift information may be used as it is, but a derived value can also be used, for example a value obtained by multiplying or combining the image plane shift information with a value set in the terminal in advance.
• otherwise the shift amount of the image plane may become too large, and a phenomenon may occur in which the picture merely looks doubled instead of appearing stereoscopic.
• in that case, adjustment is made so that subtitles and graphics are not displayed too far forward, by combining the value described in the image plane shift information with the resolution and size information of the display device to obtain the shift amount.
  • FIG. 10 is a diagram showing a video frame in which an image plane subjected to plane shift is combined.
  • 9L and 9R are examples of video frames stored in the video plane by the decoder. The difference in the orientation and position of the woman's face indicates that the left-eye stream and the right-eye stream were taken from different angles.
  • 9S is a diagram showing the contents of the graphics plane that has not been plane-shifted
  • 9LS is a snapshot of the image plane when the subtitle “I love you” is shifted to the left
  • 9RS is a snapshot of the image plane when the subtitle “I love you” is shifted to the right.
• 9LL is a composite image obtained by combining the left-eye video frame with the image plane shifted in the left direction; 9RR is a composite image obtained by combining the right-eye video frame with the image plane shifted in the right direction. It can be seen that the subtitle "I love you" is shifted to the left in the 9LL left-eye video and to the right in the 9RR right-eye video. When the television is watched without the liquid crystal glasses, these 9LL and 9RR images appear superimposed, as shown in FIG. 11.
  • FIG. 11 shows a stereoscopic image that appears when viewing the image plane after the plane shift in the left direction and the image plane after the plane shift in the right direction with the liquid crystal glasses 500.
• the images for the right eye and the left eye are separated by filtering, for example through the liquid crystal glasses 500 shown in FIG. 1, so that a different image is presented to each eye.
• the point to note here is that not only is the video stream rendered stereoscopic by superimposing the left and right images, but the subtitle "I love you" is also shifted laterally, that is, given a depth (in this embodiment, it is displayed in the foreground). In this way, stereoscopic video and subtitle reproduction that reduces eye strain for the viewer becomes possible.
  • FIG. 12 is a flowchart when the playback apparatus 200 according to the first embodiment reads the playlist that is the video data and projects stereoscopic captions / stereoscopic graphics superimposed on the stereoscopic video.
  • the playlist playback request is triggered by an instruction from the content or a user operation (for example, a playback button).
  • a title switching request may be triggered when a disc is inserted or a menu is selected.
• when playback starts, the playlist and transport stream currently subject to playback processing are extracted from the plurality of playlists and streams on the BD-ROM disc into the static scenario memory 11, and the current playlist information is set (S1).
• in step S5, a value indicating the depth at which the subtitle / GUI is displayed (hereinafter referred to as image plane shift information) is extracted and stored in a storage area inside the plane shift engine (S5).
• Step S2 is a determination of whether the dimension identification flag in the current playlist information indicates that 3D mode playback is permitted.
• Step S3 is a determination of whether the playback mode of the playback device is the 3D mode. This determination is made by referring to the flag (2D or 3D) stored in the dimension mode storage unit 29; the value stored there is assumed to have been switched in advance by, for example, a user operation or an instruction from an application.
• if either of these determinations is negative, the playback mode is switched to the 2D playback mode in step S12, and play item playback is performed in 2D mode (step S13).
• if both are affirmative, step S4 and step S5 are executed, and the process proceeds to the loop of step S6 to step S13.
• the loop initializes the current play item number to 1 in step S6 and then repeats the processing of step S7 to step S13 until its end condition is satisfied.
• the loop end condition is that the current play item becomes the last number in the playlist; unless this condition is satisfied, the current play item number is incremented (step S13).
• within the loop, the AV stream specified by the stream file information of the current play item information is set as the current stream (step S7), and the left-eye video stream and the right-eye video stream are separated by setting the packet identifier of the left-eye video stream and the packet identifier of the right-eye video stream of the current play item information in the demultiplexer (step S8). It is then determined whether the playback type of the current play item information is a movie or a slide show (step S9); if it is a slide show, slide show playback is performed by video frame processing in 3D mode (step S10), and if it is a movie, movie playback is performed by video frame processing in 3D mode (step S11).
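A hedged sketch of the S2/S3/S12 decision described above (all names are invented; the patent gives only the flowchart):

```java
/** Hypothetical sketch of the playback-mode decision in FIG. 12 (S2/S3/S12). */
public final class PlaylistModeDecision {
    enum Mode { MODE_2D, MODE_3D }

    /** @param playlistAllows3D  dimension identification flag in the playlist (S2)
     *  @param terminalMode      flag in the dimension mode storage unit (S3) */
    static Mode decide(boolean playlistAllows3D, Mode terminalMode) {
        if (playlistAllows3D && terminalMode == Mode.MODE_3D)
            return Mode.MODE_3D;          // S4/S5, then the 3D play item loop
        return Mode.MODE_2D;              // S12: switch to 2D playback mode
    }

    public static void main(String[] args) {
        System.out.println(decide(true,  Mode.MODE_3D)); // MODE_3D
        System.out.println(decide(true,  Mode.MODE_2D)); // MODE_2D
        System.out.println(decide(false, Mode.MODE_3D)); // MODE_2D
    }
}
```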
  • FIG. 13 is a flowchart showing video frame processing in 3D mode.
  • the demultiplexer 4 demultiplexes the transport stream on the disk and stores the graphic stream in the image memories 7c and d (S802).
  • the image decoder 8 decodes the graphic stream or the like stored in the image memories 7c and d and writes it in the image plane 8 (S803).
• the demultiplexer 4 demultiplexes the transport stream on the disc, extracts the corresponding video stream based on the flag of the left / right processing storage unit 19, and stores the video decoded through the video decoder 5 in the video plane (S804).
• by default, the flag in the left / right processing storage unit is set to left-eye processing.
• the order of S802 to S804 is only an example; these steps may be performed in any order.
• after the image plane has been stored, the plane shift engine refers to the flag of the left / right processing storage unit 19 and, based on the image plane shift information stored in step S5, shifts the image plane in the corresponding direction; the synthesis unit 16 then synthesizes the shifted image of the graphics plane 9 onto the video plane 5 (S805).
• the direction in which the plane shift engine shifts in S805 differs depending on whether the image plane is to be displayed in front of or behind the screen; in the first embodiment, it is assumed that the left-eye image is shifted to the right, that is, displayed in front.
  • the final video synthesized by the synthesis unit 16 in S805 is output to the display device 400 as a left-eye video (S806).
  • the playback apparatus changes the flag of the left / right processing storage unit 19. That is, when the left eye process is set, the process is switched to the right process, and when the right eye process is set, the process is switched to the left process.
• the demultiplexer 4 demultiplexes the transport stream on the disc, extracts the corresponding video stream based on the flag of the left / right processing storage unit 19, and stores the video decoded through the video decoder 5 in the video plane (S807). Since right-eye processing is set in this pass, the right-eye video stream is extracted.
  • the image decoder 8 decodes the graphic stream stored in the image memory 7c, d and writes it into the image plane 8 (S808).
• the image plane is shifted in a certain direction with reference to the flag of the left / right processing storage unit 19, and the synthesis unit 16 synthesizes the shifted image of the graphics plane 9 onto the video plane 5 (S809).
• in S805 the processing was for the left eye, so the shift was to the right; this time the processing is for the right eye, so the shift is performed in the opposite direction, that is, to the left.
  • the final video synthesized by the synthesis unit 16 in S809 is output to the display device 400 as a right-eye image (S810).
  • the playback device changes the flag of the left / right processing storage unit 19. That is, when the left eye process is set, the process is switched to the right process, and when the right eye process is set, the process is switched to the left process.
  • the playback device 200 repeatedly executes the processing of S802 to S810 as long as the next frame exists when S810 is completed.
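The per-frame alternation of FIG. 13 could be sketched as below, reusing the hypothetical PlaneShiftMath and PlaneShifter helpers from the earlier sketches; the compositing rule (non-transparent graphics pixels win) is an assumption:

```java
/** Hypothetical sketch of the per-frame loop of FIG. 13: decode one eye's
 *  video, shift the image plane the corresponding way, composite, output,
 *  then toggle the left/right flag. All types are invented stand-ins. */
public final class StereoFrameLoop {
    private boolean leftEye = true;   // left-eye processing is the default

    interface Port { void output(int[] frame, boolean leftEye); }

    void renderOneEye(int[] videoFrame, int[] imagePlane,
                      int width, int height, int shiftInfo, Port port) {
        int offset = PlaneShiftMath.toPixelShift(shiftInfo, leftEye); // S805/S809
        int[] shifted = PlaneShifter.shift(imagePlane, width, height, offset);
        int[] composite = composite(videoFrame, shifted);
        port.output(composite, leftEye);      // S806/S810
        leftEye = !leftEye;                   // toggle the left/right flag
    }

    private static int[] composite(int[] video, int[] graphics) {
        int[] out = video.clone();
        for (int i = 0; i < out.length; i++)
            if (graphics[i] != 0) out[i] = graphics[i]; // non-transparent wins
        return out;
    }
}
```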
  • FIG. 14 shows a playlist playback processing procedure in the 2D playback mode.
• when playback starts, the current play item number is set to 1 (step S21), and the AV stream specified by the stream file information of the current play item information is set as the current stream (step S22). Thereafter, the processing of steps S24-S25 or steps S26-S27 is selectively executed according to the result of the determination in step S23.
• Step S23 is a determination of whether the current stream includes a left-eye video stream and a right-eye video stream. If it does, the packet identifier of the base-view video stream, the one of the left-eye and right-eye video streams that can be played back independently, is set in the demultiplexer to separate that stream (step S24), and frame processing of the base-view video stream is executed (step S25).
• if it does not, the packet identifier of the video stream is set in the demultiplexer to separate the video stream (step S26), and frame processing of the video stream is executed (step S27).
  • FIG. 15 is a flowchart showing a processing procedure of 2D stream video frame processing in 2D mode.
  • the demultiplexer 4 demultiplexes the transport stream on the disk and stores the graphic stream in the image memories 7c and d (S1103).
  • the image decoder 8 decodes the graphic stream stored in the image memories 7c and d and writes it into the image plane 8 (S1104).
  • the demultiplexer 4 demultiplexes the transport stream on the disk, extracts the video stream, and stores the video decoded through the video decoder 5 in the video plane (S1105).
  • the synthesizing unit 16 synthesizes the image of the graphic plane 9 on the video plane 5 (S1106).
  • the final video synthesized by the synthesis unit 16 in S1106 is output to the display device 400 (S1107).
• after the final video is output to the display device 400 in S1107, it is determined whether this is the first frame processed after a switch of the playback mode occurred (S1108).
• thereafter, the final video continues to be output to the display device 400 for each subsequent frame.
• FIG. 16 is a flowchart showing the processing procedure of video frame processing, in 2D mode, of a stream that contains left-eye and right-eye video.
• this 2D video output process will be described with reference to the flowchart of FIG. 16.
  • the demultiplexer 4 demultiplexes the transport stream on the disk and stores the graphic stream in the image memories 7c and d (S1201).
  • the image decoder 8 decodes the graphic stream or the like stored in the image memory 7c, d and writes it into the image plane 8 (S1202).
  • the demultiplexer 4 demultiplexes the transport stream on the disc, extracts the video stream for the left eye, and stores the video decoded through the video decoder 5 in the video plane (S1203).
  • the synthesizing unit 16 synthesizes the image of the graphic plane 9 on the video plane 5 (S1204).
  • the final video synthesized by the synthesis unit 16 in S1204 is output to the display device 400 (S1205).
• although the left-eye video stream is extracted in S1203, the right-eye video stream may be extracted instead and combined in S1204. In this way, even when the transport stream to be played back is 3D (S2: 3D), the playback device 200 can output 2D video if the terminal setting is the 2D mode (S3: 2D).
• in this way, video can be generated appropriately whether the playback device is set to play back in 2D or in 3D.
• the previous embodiments used two planes, an image plane and a video plane. When a video plane and two or more graphics planes are used, image memories and image planes are provided according to the number of planes, and each image plane is shifted based on its own image plane shift information and then superimposed.
  • FIG. 17 is a flowchart showing a processing procedure of playlist reproduction processing that can support a multi-image plane.
• the playback apparatus 200 extracts image plane shift information, one entry per image plane, from the current playlist information in the static scenario memory 11, and stores the entries as an array in the plane shift engine 28 (S1301).
  • the demultiplexer 4 demultiplexes the transport stream on the disc, extracts the left-eye video stream, and stores the video decoded through the video decoder 5 in the video plane (S1302).
  • the demultiplexer 4 demultiplexes the transport stream on the disk and stores the graphic stream in the image memories 7c and d (S1303).
  • the image decoder 8 decodes the graphic stream and the like stored in the image memories 7c and d and writes them in the image plane 8 (S1304).
• the plane shift engine shifts the image plane in a certain direction based on the first value of the image plane shift information array stored in S1301, and the synthesis unit 16 synthesizes the shifted image of the graphics plane 9 onto the video plane 5 (S1305).
• when S1305 is executed for the second or subsequent time, the shifted image of the graphics plane 9 is not synthesized onto the video plane itself; instead, a new image plane is superimposed on the video synthesized in the previous S1305, using the second and subsequent entries of the image plane shift information array.
• the playback apparatus 200 determines whether all the image planes have been combined, based on whether processing corresponding to every entry of the image plane shift information array has been performed (S1306). If not all image planes have been combined (S1306: No), the processing of S1303 to S1305 is repeated using the next image plane shift information to process the next image plane. When all image planes have been combined (S1306: Yes), the final video combined by the synthesis unit 16 in S1305 is output to the display device 400 as the left-eye image (S1307). After the left-eye video is output in S1307, the right-eye video is processed in the same way (S1302 to S1307).
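A sketch of the FIG. 17 multi-plane loop, again reusing the hypothetical helpers above; the overlay rule and array layout are assumptions:

```java
/** Hypothetical sketch of FIG. 17: composite several image planes onto one
 *  video frame, each shifted by its own entry of the shift-info array. */
public final class MultiPlaneCompositor {
    static int[] composeEye(int[] videoFrame, int[][] imagePlanes,
                            int[] shiftInfoArray, int width, int height,
                            boolean leftEye) {
        int[] result = videoFrame.clone();
        for (int i = 0; i < imagePlanes.length; i++) {        // S1306 loop
            int off = PlaneShiftMath.toPixelShift(shiftInfoArray[i], leftEye);
            int[] shifted = PlaneShifter.shift(imagePlanes[i], width, height, off);
            for (int p = 0; p < result.length; p++)
                if (shifted[p] != 0) result[p] = shifted[p];  // S1305 overlay
        }
        return result;                                        // S1307 output
    }
}
```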
  • the BD-J application assumed in this embodiment is a Java (TM) Xlet controlled by the application manager in the platform through the Xlet interface.
• the Xlet interface has four states, "loaded", "paused", "active", and "destroyed", and is event-driven, that is, it performs state transitions and control in response to events.
• key events that trigger application operations are registered in advance; this registration of a key event serving as an operation trigger is performed with an EventListener.
• the operation of a BD-J application differs from that of movie objects in the following way. For example, when the command interpreter that is the command execution subject in HDMV mode is instructed to play a 10-minute digital stream, it returns no response for 10 minutes, responding only after the 10 minutes have elapsed. By contrast, since the BD-J application is event-driven, the Java virtual machine returns a response to the BD-J application immediately after decoding the playback command and issuing an instruction to a lower layer. Thus the behavior of the execution subject varies with the operation mode.
• events to be notified at key control points are determined in advance, and an event listener for receiving each such event must be registered in advance in the Xlet interface of the class file so that the virtual machine operates properly. For example, when the playback mode of the playback device is switched from 2D to 3D or from 3D to 2D, an event indicating the switch is output; by registering an event listener that receives it in the Xlet interface of the BD-J application, the processing of the BD-J application can be switched in accordance with the change in playback mode.
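A minimal illustration of such event-driven switching (the listener interface and event delivery are invented stand-ins, not the BD-J platform's actual event classes):

```java
/** Hypothetical sketch: an event-driven BD-J-style application that registers
 *  a listener for playback-mode-switch events. */
interface PlaybackModeListener {
    void modeSwitched(boolean nowStereoscopic);   // 2D<->3D notification
}

final class ModeAwareApp implements PlaybackModeListener {
    @Override public void modeSwitched(boolean nowStereoscopic) {
        // Redraw graphics to match the new mode, as FIG. 18/19 suggest.
        System.out.println(nowStereoscopic ? "draw for 3D" : "draw for 2D");
    }

    public static void main(String[] args) {
        ModeAwareApp app = new ModeAwareApp();
        // The platform would invoke the listener when the S1109 event fires.
        app.modeSwitched(true);
        app.modeSwitched(false);
    }
}
```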
• when the playback mode is switched, the display device is notified that the mode of the playback device has changed, and the playback device receives from the display device a notification that output conforming to the switched mode can now be performed.
• FIG. 18 is a flowchart showing the processing procedure of video frame processing in 3D mode, incorporating the output procedure of the playback mode switching completion notification event. This figure is drawn on the basis of FIG. 13 and differs from it in that steps S1108 and S1109 are added between step S810 and step S811.
• it is determined whether this is the first frame after a switch of the playback mode occurred (S1108); if it is, the terminal notifies the application of the completion of the playback mode switch (S1109).
  • FIG. 19 is a flowchart showing the processing procedure of the video frame processing in the 2D mode, which incorporates the output procedure of the playback mode switching completion notification event.
• This figure is drawn on the basis of FIG. 15, and differs from it in that step S1108 and step S1109 are added between step S1107 and step S1110. It is determined whether this is the first frame after a switch of the playback mode occurred (S1108); if it is, the terminal notifies the application of the completion of the playback mode switch (S1109). In this way, the playback apparatus 200 performs 2D video output processing when the transport stream to be played back is 2D and the terminal setting is the 2D mode, while still notifying the application when a mode switch completes.
• by outputting an event that prompts drawing appropriate to the 3D mode to a BD-J application having the event-driven Xlet interface, graphics rendering by the BD-J application can also be switched from 2D to 3D and from 3D to 2D along with the video content.
  • FIG. 20 is a flowchart showing a processing procedure of video frame processing in 3D mode in consideration of special playback in addition to normal playback. This figure is drawn on the basis of FIG. 13 and is different from FIG. 13 as the base in that steps S1401 to S1404 are added.
• Step S1401 determines whether the video currently being played back is in special playback or normal playback.
• for normal playback, an example in which both video and subtitles are displayed in 3D is shown.
• in normal playback, the demultiplexer 4 demultiplexes the transport stream on the disc, extracts the corresponding video stream based on the flag of the left / right processing storage unit 19, decodes it through the video decoder 5, and stores the result in the video plane (S1402).
• by default, the flag in the left / right processing storage unit is set to left-eye processing.
  • the final video synthesized by the synthesis unit 16 is output to the display device 400 as a left-eye video (S1403).
  • the playback apparatus changes the flag of the left / right processing storage unit 19. That is, when the left eye process is set, the process is switched to the right process, and when the right eye process is set, the process is switched to the left process.
  • the final video synthesized by the synthesis unit 16 is output to the display device 400 as a right-eye image (S1404).
  • the reproducing apparatus changes the flag of the left / right processing storage unit 19. That is, when the left eye process is set, the process is switched to the right process, and when the right eye process is set, the process is switched to the left process.
  • the playback device 200 repeatedly executes the processing of S1401 to S1404 as long as the next frame exists when S1404 is completed.
• Processing to continue 3D display when playback is stopped, paused, or put into slow playback includes "3D continuation by repeated display", "next frame setting", and "exception by capability".
• 3D continuation by repeated display: for example, when video being played back in 3D display is stopped, paused, or put into slow playback by a user operation using the remote control 300 or by the content (a Java (registered trademark) application or MovieObject), 3D display is continued by repeatedly displaying the left-eye video frame, the right-eye video frame, and the subtitle data at the position where the video stopped. That is, 3D display is continued by setting the next frame used in next-frame processing to the video frame position of the left-eye video stream at the stopped position. It is also conceivable to continue 3D display of the video only and mute the subtitles.
• if 3D display cannot be continued due to resource constraints such as memory, the video and subtitles may both be muted. By continuing 3D display as far as possible in this way, the occurrence of unnatural differences in stereoscopic effect can be suppressed as much as possible, and discomfort to the viewer can be reduced.
• next frame setting: 3D display can be continued by always setting the next frame used in next-frame processing to the position of the video frame of the left-eye video stream at which the video stopped.
• here too, if 3D display cannot be continued due to resource constraints such as memory, the video and subtitles may be muted; by continuing 3D display as far as possible, unnatural differences in stereoscopic effect are suppressed and viewer discomfort is reduced.
  • FIG. 21 shows an example of a flowchart in the case of prohibiting special playback by an instruction from a user operation, a BD-J application, or a movie object.
• when a special playback request is made (S1501), the playback control engine 18 acquires the dimension identification flag 40111 from the current playlist (PL) information in the static scenario memory 11 and determines whether the video is 2D or 3D (S1502). If it is determined in S1502 that the video is 3D, the special playback request of S1501 is rejected and normal video playback continues (S1503). If it is determined in S1502 that the video is 2D, the special playback request of S1501 is accepted and video playback changes to special playback (S1504).
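The S1501-S1504 gate might be sketched as follows (types and names are invented; only the accept/reject logic comes from the flowchart):

```java
/** Hypothetical sketch of FIG. 21: reject special-playback requests while a
 *  3D playlist is playing. */
public final class SpecialPlaybackGate {
    enum Dimension { DIM_2D, DIM_3D }

    /** @return true if the special playback request is accepted (S1504). */
    static boolean requestSpecialPlayback(Dimension dimFlag) {
        if (dimFlag == Dimension.DIM_3D) {
            return false;   // S1503: reject and continue normal playback
        }
        return true;        // S1504: switch to special playback
    }

    public static void main(String[] args) {
        System.out.println(requestSpecialPlayback(Dimension.DIM_3D)); // false
        System.out.println(requestSpecialPlayback(Dimension.DIM_2D)); // true
    }
}
```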
• FIG. 22 is a block diagram showing the internal configuration of a playback apparatus to which the depth calculation method is applied. Compared with the internal configuration shown in the first embodiment, it can be seen that a depth calculation engine 34 has been added.
• the depth calculation engine 34 has a function for calculating the depth for the left-eye subtitle / GUI and the video frame.
• in the depth calculation method, a 2D video stream and the depth of each screen pixel for each frame of that stream are input, and the playback device generates a left-eye 3D video stream and a right-eye 3D AV stream based on the input.
  • This scheme is shown in US Pat. No. 5,929,859.
• by slightly modifying the method shown so far, stereoscopic subtitles and graphics can be superimposed on the 3D video stream.
  • FIG. 23 is a flowchart showing a processing procedure of video frame processing in the 3D mode when the depth calculation method is used. This figure is drawn based on FIG. 20, and is different from FIG. 20 as the base in that steps S1701 to S1703 are added between step S1401 and step S802.
  • the depth calculation engine 34 extracts the entire screen depth information indicating the depth information of each pixel of the screen (S1701). Next, the Depth calculation engine 34 extracts depth information corresponding to the pixel determined to be closest to the screen from the entire screen depth information extracted in S1701 (S1702). The depth calculation engine 34 stores the value extracted in S1702 in the storage area of the plane shift engine 28 (S1703).
• in step S1703, in order to display subtitles / graphics slightly in front of the nearest point in the video, it is desirable to store in the storage area of the plane shift engine 28 not the value extracted in S1702 as-is, but a value adjusted to come slightly closer to the viewer.
  • the playback device 200 performs the processes of S802 to S810 and S1401 to S1404. Since the detailed processing of S802 to S810 and S1401 to S1404 has been described with reference to FIGS. 8 and 14 of the first embodiment, it will not be described here. After the processing of steps S810 and S1404 is completed, the processing from S1701 onward is repeated as the next frame processing.
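A sketch of S1701-S1703 under the assumption that larger depth values mean nearer pixels and that "slightly closer" is a small fixed bias (neither convention is fixed by the patent):

```java
/** Hypothetical sketch of S1701-S1703: scan a per-pixel depth map, find the
 *  depth of the nearest pixel, and bias it slightly forward before handing
 *  it to the plane shift engine. */
public final class DepthScan {
    /** @param depthMap per-pixel depth for one frame, larger = nearer
     *  @return shift-info value for the image plane */
    static int nearestDepthPlusMargin(int[] depthMap) {
        int nearest = Integer.MIN_VALUE;
        for (int d : depthMap)                   // S1701/S1702: full-screen scan
            if (d > nearest) nearest = d;
        int margin = 8;                          // assumed "slightly closer" bias
        return Math.min(255, nearest + margin);  // clamp to the shift-info range
    }

    public static void main(String[] args) {
        int[] depthMap = {10, 240, 37, 122};
        System.out.println(nearestDepthPlusMargin(depthMap)); // 248
    }
}
```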
• note that the video handled in S804 and S807 of FIG. 13 is not obtained by the demultiplexer 4 demultiplexing the transport stream on the disc into a left-eye video stream and a right-eye video stream; rather, the left-eye video frames and right-eye video frames generated by processing the input 2D video stream so that it appears in 3D become the targets of steps S804 and S807 of FIG. 13.
• when special playback such as fast-forwarding or rewinding is performed in the depth calculation method, in which the 2D video stream and the depth information of each pixel of the screen are input, even if the right-eye video frame cannot be generated from the decoding of the left-eye video, video can still be output for both eyes. This not only prevents flickering of the image but also prevents unnatural reproduction in which forcibly displayed subtitles do not match the video.
  • playback combining the first embodiment and the sixth embodiment is also possible.
• for example, it is also possible to switch to the method of the second embodiment during special playback and to play back the content by the method of the first embodiment during normal playback.
• the HDMI transmission / reception unit 17 transfers one line's worth of uncompressed, plaintext pixel data in the layer-composited picture data to the display device at a high transfer rate, in accordance with the horizontal synchronization period of the display device.
• it likewise transfers audio data in uncompressed, plaintext format to the devices connected to the playback device (including not only the display device but also amplifiers and speakers).
• in this way, devices such as display devices, amplifiers, and speakers connected via HDMI can receive the uncompressed, plaintext picture data and audio data and realize playback output.
• because HDMI transmits picture data and audio data uncompressed and in plaintext, a strict judgment is made as to whether the connection partner is a legitimate device. For this reason, in an HDMI connection, mutual authentication is performed between the connected playback device and display device. Since this mutual authentication is in principle also executed when the frequency changes, mutual authentication between the playback device and the display device is performed when the mode changes.
• FIG. 24 shows a communication sequence between the playback device 200 and the display device 400; the time axis runs vertically.
• HDMI here goes through three phases: a transmission phase, a mutual authentication phase, and another transmission phase.
• switching from the transmission phase to the mutual authentication phase is triggered by a 2D→3D switching request (or a 3D→2D switching request), and switching from the mutual authentication phase back to the transmission phase is triggered by the end of authentication. That is, while the L image is being decoded on the playback device side and displayed on the display device 400, a switching request starts mutual authentication; during this time the display device 400 is in a blackout state. In the authentication end phase, an event is output to the BD-J application. Then the L image and the R image are decoded on the playback device side, and the L image and the R image are displayed on the display device 400.
  • FIG. 25 is a flowchart showing a procedure for mode switching in the HDMI interface.
• in this procedure, the new mode is output to the display device 400 via HDMI (step S31), resynchronization processing and re-authentication processing are executed with the display device 400 (step S32), and the device waits for authentication to complete (step S33). When authentication completes, an event is output to the BD-J application (step S34).
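A sketch of the S31-S34 sequence (the HdmiLink interface is invented for illustration; real HDMI re-authentication is handled by the link hardware and driver):

```java
/** Hypothetical sketch of FIG. 25: notify the sink of the new mode over HDMI,
 *  re-authenticate, wait, then fire the completion event (S31-S34). */
public final class HdmiModeSwitch {
    interface HdmiLink {
        void announceMode(String mode);                     // S31
        void reauthenticate() throws InterruptedException;  // S32, blocks (S33)
    }

    static void switchMode(HdmiLink link, String newMode, Runnable bdjEvent)
            throws InterruptedException {
        link.announceMode(newMode);   // S31: tell the display the new mode
        link.reauthenticate();        // S32/S33: resync + mutual re-auth
        bdjEvent.run();               // S34: notify the BD-J application
    }

    public static void main(String[] args) throws InterruptedException {
        HdmiLink fake = new HdmiLink() {
            public void announceMode(String m) { System.out.println("mode " + m); }
            public void reauthenticate() { System.out.println("re-auth done"); }
        };
        switchMode(fake, "3D-48Hz", () -> System.out.println("event to BD-J"));
    }
}
```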
• of the left-eye video stream and right-eye video stream described so far, the one that can be played back independently is called the base-view video stream.
  • a video stream composed of video frames that are compression-encoded based on the correlation with individual video frames constituting the base-view video stream is called a dependent-view stream.
• the output mode in which video frames of the base-view video stream and video frames of the dependent-view stream are alternately output is called the "BD presentation mode".
• by contrast, there is a playback type in which, while the playback mode is maintained in 3D mode, the same video frame is output two or more times in succession, so that the video frame written to the video plane is used for the output to both areas (L, R).
• this type is called the BB presentation mode. In the BB presentation mode, only the video frames of the independently playable base-view video stream are repeatedly output, as "B"-"B"-"B"-"B".
• FIG. 26 shows how the output pictures, the display frequency, and the HDMI state change when the playback mode transitions from the BD presentation mode to the BB presentation mode and back to the BD presentation mode.
  • the first level shows a picture output to the display device 400
• the second level shows the display frequency. This frequency displays the left-eye and right-eye video frames at the frequency of film material, and has a value of 48 frames/second (2 × 24 frames/second).
  • the third level shows the HDMI status.
• the transition from the BD presentation mode to the BB presentation mode is made when the start of special playback is instructed, and the transition from the BB presentation mode back to the BD presentation mode is made when special playback ends. Even when such transitions between the BB presentation mode and the BD presentation mode occur, the display frequency is maintained at 48 Hz as shown in the second level, and, as shown in the third level, HDMI re-authentication does not occur.
  • FIG. 27 shows how the decoding content in the playback device and the display content on the display device 400 change due to switching from normal playback to fast forward and from fast forward to normal playback.
  • the time axis in this figure is the vertical direction, and is composed of three phases: a normal playback phase, a special playback phase, and a normal playback phase.
  • the playback device In the normal playback phase, the playback device is in BD presentation mode, and L and R images are decoded and output. In the display device 400, L images and R images are alternately output.
• in the special playback phase, the playback device is in the BB presentation mode; L images alone are decoded and output, and on the display device 400 the L image is output repeatedly.
  • the playback device In the normal playback phase, the playback device is in BD presentation mode, and L and R images are decoded and output. In the display device 400, L images and R images are alternately output.
• a video stream in MPEG4-AVC format consists of I pictures, B pictures, and P pictures, the same as a video stream in MPEG2 format.
• there are two types of I picture: IDR pictures and Non-IDR I pictures.
• Non-IDR I pictures, P pictures, and B pictures are compression-coded based on frame correlation with other pictures.
  • a B picture refers to a picture composed of Bidirectionally predictive (B) format slice data
  • a P picture refers to a picture composed of Predictive (P) format slice data.
• B pictures include reference B pictures and non-reference B pictures.
  • the IDR picture and the B picture and P picture that follow this IDR picture constitute one Closed-GOP.
  • the Non-IDR I picture and the B picture and P picture following the Non-IDR I picture constitute one Open-GOP.
• a Closed-GOP has an IDR picture at its head. Although the IDR picture is not the head in display order, the other pictures (B pictures, P pictures) cannot have dependency relationships with pictures existing in GOPs before the Closed-GOP. In this way, the Closed-GOP has the role of terminating dependency chains.
  • the difference between the encoding order and the display order is that the order of IDR picture, Non-IDR I picture, and P picture is interchanged.
  • the B picture exists before the Non-IDR I picture.
  • the B picture before the Non-IDR I picture has a dependency relationship with the previous GOP.
  • pictures after the Non-IDR I picture cannot have a dependency relationship with the previous GOP.
  • the Open-GOP can have a dependency relationship with the previous picture.
  • the audio stream is also composed of a plurality of audio frames.
  • the above is the GOP structure in MPEG4-AVC.
• with such a GOP structure, the playback speed during double-speed playback can be adjusted by intermittent playback: selecting which of the I pictures, B pictures, and P pictures existing in a GOP are to be played back, and selecting which of the multiple Closed-GOPs and Open-GOPs constituting the video stream are to be played back.
• in 3D mode, when executing double-speed playback, only the picture data constituting the base-view video stream, out of the base-view video stream and the dependent-view stream, is read from the BD-ROM, and the BB presentation mode is set, in order to reduce the access load.
• FIG. 28 shows examples of realizing double-speed playback in which the speed is adjusted by selecting which I pictures, B pictures, and P pictures existing in a GOP are played back, and by selecting which of the multiple Closed-GOPs and Open-GOPs constituting the video stream are played back.
  • FIG. 28 (a) is a diagram illustrating normal playback in which picture data included in a plurality of GOPs of the base-view video stream and a plurality of GOPs of the dependent-view stream are sequentially played back.
  • FIG. 28B is a diagram illustrating IP reading in which a B picture existing in a GOP existing in the base-view video stream is skipped and only an I picture and a P picture are sequentially read. In this figure (b), it can be seen that the dependent view stream is not accessed in the 3D mode.
  • FIG. 28 (c) is a diagram showing I reading in which B pictures and P pictures existing in the GOP are skipped and I pictures are read sequentially. In this figure (c) as well, it can be seen that the dependent view stream is not accessed in the 3D mode.
  • FIG. 28 (d) is a diagram showing skip reading in which a plurality of GOPs are skipped.
• in skip reading, the I picture of a GOP is played back, then the readout position jumps as indicated by the arrow, and the I picture of a GOP several GOPs ahead is played back.
• in this figure (d) as well, although the mode is 3D, it can be seen that the dependent-view stream is not accessed.
• if the IP reading illustrated in FIG. 28(b) is performed, the playback device performs approximately double-speed playback; if the I reading illustrated in FIG. 28(c) is performed, approximately 10x-speed playback results; and if the skip reading shown in FIG. 28(d) is performed, playback at 30x speed or more results. To reduce the access load in 3D mode, in FIGS. 28(b) to 28(d) only the GOPs of the left-eye video stream, which is the base-view video stream, are accessed, and only their pictures are played back in the BB presentation mode.
• the playback speed in the video segment is increased or decreased by adjusting the number of picture data to be skipped according to the speed received from the remote controller. This completes the explanation of double-speed playback. Next, the details of the video decoder will be described.
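The choice among the FIG. 28 reading patterns could be sketched as below; the exact thresholds are taken loosely from the approximate speeds in the text and are otherwise assumptions:

```java
/** Hypothetical sketch of the FIG. 28 strategy choice: pick an intermittent
 *  reading pattern from the requested speed (IP-read ~2x, I-read ~10x,
 *  GOP-skip 30x+). */
public final class TrickPlaySelector {
    enum ReadPattern { NORMAL, IP_READ, I_READ, GOP_SKIP }

    static ReadPattern select(double speed) {
        if (speed <= 1.0) return ReadPattern.NORMAL;   // FIG. 28(a)
        if (speed <= 2.0) return ReadPattern.IP_READ;  // FIG. 28(b)
        if (speed <= 10.0) return ReadPattern.I_READ;  // FIG. 28(c)
        return ReadPattern.GOP_SKIP;                   // FIG. 28(d)
    }

    public static void main(String[] args) {
        System.out.println(select(1.0));  // NORMAL
        System.out.println(select(2.0));  // IP_READ
        System.out.println(select(10.0)); // I_READ
        System.out.println(select(32.0)); // GOP_SKIP
    }
}
```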
  • FIG. 29 shows the internal configuration of the demultiplexer and video decoder.
  • the demultiplexer 4 includes an ATC counter 41, a source depacketizer 42, a PID filter 43, an STC counter 44, an ATC counter 45, a source depacketizer 46, and a PID filter 47.
  • the ATC counter 41 generates Arrival Time Clock (ATC) and adjusts the operation timing in the playback device.
• the source depacketizer 42 transfers only the TS packet to the PID filter at the moment the ATC value generated by the ATC counter becomes equal to the ATS of the source packet, in accordance with the recording rate of the AV clip. In this transfer, the input time to the decoder is adjusted according to the ATS of each source packet.
• the PID filter 43 transfers those TS packets output from the source depacketizer 42 whose PID matches a PID required for playback to the appropriate decoder according to the PID.
  • the STC counter 44 generates System Time Clock (STC) and adjusts the operation timing of each decoder.
  • the ATC counter 45 generates Arrival Time Clock (ATC) and adjusts the operation timing in the playback device.
• the source depacketizer 46 transfers only the TS packet to the PID filter at the moment the ATC value generated by the ATC counter becomes equal to the ATS of the source packet, in accordance with the system rate of the AV clip. In this transfer, the input time to the decoder is adjusted according to the ATS of each source packet.
• the PID filter 47 transfers those TS packets output from the source depacketizer 46 whose PID matches a PID described in the stream selection table of the current play item to the appropriate decoder according to the PID.
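A sketch of the depacketizer/PID-filter pair described above, assuming a simple source-packet layout (ATS header plus TS payload) and a PID-to-decoder routing table; both are illustrative inventions:

```java
import java.util.List;
import java.util.Map;
import java.util.function.Consumer;

/** Hypothetical sketch of the source depacketizer / PID filter pair: release
 *  a TS packet when the ATC clock reaches its ATS, then route it by PID. */
public final class Depacketizer {
    record SourcePacket(long ats, int pid, byte[] tsPayload) {}

    /** @param atcNow current Arrival Time Clock value
     *  @param routes PID -> decoder input, like the PID filter's table */
    static void pump(List<SourcePacket> queue, long atcNow,
                     Map<Integer, Consumer<byte[]>> routes) {
        while (!queue.isEmpty() && queue.get(0).ats() <= atcNow) {
            SourcePacket sp = queue.remove(0);       // ATS reached: release
            Consumer<byte[]> decoder = routes.get(sp.pid());
            if (decoder != null)                     // PID needed for playback
                decoder.accept(sp.tsPayload());      // forward the TS packet only
        }
    }
}
```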
  • the video decoder 5 comprises TB51, MB52, EB53, TB54, MB55, EB56, decoder core 57, buffer switch 58, DPB59, and picture switch 60.
  • the Transport Buffer (TB) 51 is a buffer that temporarily accumulates TS packets as they are when TS packets including the left-eye video stream are output from the PID filter 43.
  • a Multiplexed Buffer (MB) 52 is a buffer for temporarily storing PES packets when outputting a video stream from TB to EB.
• the Elementary Buffer (EB) 53 is a buffer in which video access units in the encoded state are stored. The PES header is removed when data is transferred from MB to EB.
  • the Transport Buffer (TB) 54 is a buffer that temporarily accumulates TS packets as they are when TS packets including the right-eye video stream are output from the PID filter.
  • Multiplexed Buffer (MB) 55 is a buffer for temporarily storing PES packets when outputting a video stream from TB to EB.
• the Elementary Buffer (EB) 56 is a buffer in which video access units in the encoded state are stored. The PES header is removed when data is transferred from MB to EB.
  • the decoder core 57 creates a frame / field image by decoding each video access unit of the video elementary stream at a predetermined decoding time (DTS). Since there are MPEG2, MPEG4AVC, VC1, and the like in the compression encoding format of the video stream multiplexed into the AV clip, the decoding method of the decoder core 57 is switched according to the stream attribute. In decoding the picture data constituting the base-view video stream, the decoder core 57 performs motion compensation by using the picture data existing in the future direction or the past direction as a reference picture.
  • the decoder core 57 when decoding individual picture data constituting the dependent-view video stream, the decoder core 57 performs motion compensation by using the picture data constituting the base-view video stream as a reference picture.
  • the decoder core 57 transfers the decoded frame / field image to the DPB 59 and transfers the corresponding frame / field image to the picture switch at the display time (PTS) timing.
• the buffer switch 58 uses the decode switch information acquired when the decoder core 57 decoded the previous video access unit to determine whether to pull the next access unit from EB 53 or EB 56, and transfers the pictures stored in EB 53 and EB 56 to the decoder core 57 at the timing of the decoding time stamp (DTS) assigned to each video access unit.
• since the DTSs of the left-eye video stream and the right-eye video stream are set to alternate picture by picture on the time axis, it is desirable to transfer video access units to the decoder core 57 picture by picture, for example when decoding ahead of schedule while ignoring the DTS.
• the Decoded Picture Buffer (DPB) 59 is a buffer that temporarily stores decoded frame / field images.
  • the decoder core 57 decodes a video access unit such as a P picture or a B picture subjected to inter-picture prediction encoding, it is used to refer to a picture that has already been decoded.
• the picture switch 60 switches the writing destination between the left-eye video plane and the right-eye video plane.
• when the left-eye stream has been decoded, the uncompressed picture data is written instantly to the left-eye video plane; when the right-eye stream has been decoded, the uncompressed picture data is written instantly to the right-eye video plane.
  • the video decoder having the internal configuration as described above realizes double speed reproduction by reading out the picture data while skipping.
• at this time, subtitle / GUI display, video muting, and the determination between 2D and 3D display are performed according to the playback speed, along the following lines (a sketch follows this list):
• at a playback speed at which video, audio, and subtitles can be played back as in 1x playback (for example, fast playback requiring about 1.3x speed, frame advance, or frame return), 3D display of the subtitle / GUI and video continues as in normal playback (S1401).
• at a speed beyond that, 3D display continues for the video only.
• at a playback speed at which video and subtitles can be combined only if the video is decoded for left-eye or right-eye data alone, the subtitles are decoded but the left-eye and right-eye videos are not shifted by the plane shift engine 28.
• at still higher speeds, both video and subtitles are muted.
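A sketch of this speed-dependent policy; the tier boundaries are assumptions, since the text gives only the 1.3x example explicitly:

```java
/** Hypothetical sketch of the speed-dependent display policy reconstructed
 *  from the list above; the tier boundaries and names are assumptions. */
public final class TrickPlayDisplayPolicy {
    enum Policy { FULL_3D, VIDEO_ONLY_3D, ALL_2D, ALL_MUTED }

    static Policy decide(double speed) {
        if (speed <= 1.3) return Policy.FULL_3D;       // subtitles + video in 3D
        if (speed <= 2.0) return Policy.VIDEO_ONLY_3D; // mute subtitles
        if (speed <= 10.0) return Policy.ALL_2D;       // one eye, no plane shift
        return Policy.ALL_MUTED;                       // too fast: mute both
    }

    public static void main(String[] args) {
        for (double s : new double[] {1.0, 1.3, 2.0, 8.0, 30.0})
            System.out.println(s + "x -> " + decide(s));
    }
}
```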
  • FIG. 30 is a flowchart showing a special reproduction processing procedure considering double-speed reproduction.
• in this flowchart, steps S53, S54, and S55 are selectively executed according to the determination results of steps S51 and S52.
• Step S51 is a determination of whether the requested special playback includes decoding control that depends on the playback speed; Step S52 is a determination of whether the playback speed is one at which the BD presentation mode is possible.
• if the playback speed is up to 1.3x, the process proceeds to step S53, switching to the BD presentation mode and executing decoding control in the BD presentation mode. If it is 2.0x or more, the mode is switched to the BB presentation mode in step S54, and decoding control such as IP reading and I reading in the BB presentation mode is executed. If decoding control depending on the speed is not included, the requested special playback is simply executed (step S55).
• in step S52, a configuration is disclosed in which step S53 is executed if the playback speed is up to 1.3x and step S54 is executed if it is 2x or more; the range above 1.3x and below 2x is not considered, which is based on the assumption that the playback device has no function for performing double-speed playback at speeds in that range.
• alternatively, in step S52, when the playback speed is higher than 1.3x, the processing may be changed so that step S54 is executed.
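The S52/S53/S54 branch reduces to a threshold check; a minimal sketch with the thresholds named in the text:

```java
/** Hypothetical sketch of FIG. 30 (S51-S55): choose the presentation mode for
 *  a speed-dependent special-playback request; thresholds from the text. */
public final class PresentationModeSwitch {
    enum Mode { BD_PRESENTATION, BB_PRESENTATION }

    static Mode forSpeed(double speed) {
        // S52: up to 1.3x the BD presentation mode keeps up (S53); above that,
        // switch to the BB presentation mode with IP/I reading (S54), which
        // also covers the alternative noted for speeds between 1.3x and 2x.
        return (speed <= 1.3) ? Mode.BD_PRESENTATION : Mode.BB_PRESENTATION;
    }

    public static void main(String[] args) {
        System.out.println(forSpeed(1.3)); // BD_PRESENTATION
        System.out.println(forSpeed(2.0)); // BB_PRESENTATION
    }
}
```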
  • the UO mask table for play items consists of the following flags.
• the chapter_search_mask flag is a flag that specifies whether or not to mask the request when playback control called chapter search is requested by the user.
  • the chapter search is a reproduction control in which a number input is received from a user and reproduction is started from a chapter designated by the number.
• the time_search_mask flag is a flag that defines whether or not to mask the request when playback control called time search is requested by the user.
  • the time search is a reproduction control in which a reproduction time input operation is accepted from a user and reproduction is started from a time point designated as the reproduction time.
  • Skip_next_mask flag, skip_back_mask flag are flags indicating whether or not the request is masked when a skip next and skip back are requested by the user, as in the first embodiment.
  • the play_mask flag is a flag indicating whether or not to mask the request when the playback control to start playback is requested by the user.
  • Stop_mask flag is a flag indicating whether or not to mask the request when the user requests playback control to stop playback.
  • Pause_on_mask flag is a flag indicating whether or not to mask the request when the playback control of pause on (pause) is requested by the user.
  • Pause_off_mask flag is a flag indicating whether or not the request is masked when the playback control of pause-off (pause release) is requested by the user.
  • Still_off_mask flag is a flag indicating whether or not the request is masked when a reproduction control for still image mode off is requested by the user.
  • Forward_play_mask flag, backward_play_mask flag are flags that specify whether or not to mask the request when playback control such as fast forward and fast reverse is requested from the user, as in the first embodiment.
  • Resume_mask flag is a flag indicating whether or not to mask the request when the user requests playback control to resume playback.
  • Audio_change_mask flag is a flag indicating whether or not to mask the request when a playback control called audio switching is requested by the user.
  • the PG_textST_change_mask flag is a flag indicating whether or not to mask a request when switching between a subtitle drawn by graphics (Presentation Graphics) and a subtitle drawn by text is requested.
  • Angle_change_mask flag is a flag indicating whether or not to mask the request when playback control called angle switching is requested by the user, as in the first embodiment.
  • Pop-up_on_mask flag is a flag indicating whether or not the request is masked when a playback control called pop-up menu is requested.
  • Pop-up_off_mask flag is a flag indicating whether or not to mask the request when playback control for displaying the pop-up menu is turned off.
  • the select_menu_language_mask flag is a flag indicating whether or not to mask the request when the user requests playback control to select a menu description language.
• note that the UO mask table masks only user operations; special playback instructions from a BD-J application or movie object are not masked.
  • the above is the UO mask table.
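A sketch of consulting such a table, assuming a set-based representation of the masked operations (the bypass for BD-J applications and movie objects follows the note above):

```java
import java.util.Set;

/** Hypothetical sketch of a play item's UO mask table: user operations are
 *  masked per flag, while requests from a BD-J application or movie object
 *  bypass the table. Names invented; flags follow the list above. */
public final class UoMaskTable {
    enum Uo { CHAPTER_SEARCH, TIME_SEARCH, SKIP_NEXT, SKIP_BACK, PAUSE_ON,
              FORWARD_PLAY, BACKWARD_PLAY }
    enum Source { USER, BDJ_APPLICATION, MOVIE_OBJECT }

    private final Set<Uo> masked;
    UoMaskTable(Set<Uo> masked) { this.masked = masked; }

    /** @return true if the request should be carried out. */
    boolean permits(Uo op, Source source) {
        if (source != Source.USER) return true;  // only user operations mask
        return !masked.contains(op);
    }

    public static void main(String[] args) {
        UoMaskTable table = new UoMaskTable(Set.of(Uo.FORWARD_PLAY));
        System.out.println(table.permits(Uo.FORWARD_PLAY, Source.USER));            // false
        System.out.println(table.permits(Uo.FORWARD_PLAY, Source.BDJ_APPLICATION)); // true
    }
}
```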
  • the video frame processing using this UO mask table is as shown in FIG.
  • FIG. 31 is a flowchart showing a processing procedure of video frame processing in the 3D mode. This is a loop that repeats the processes of steps S41 to S43, steps S802 to S810, and step S811. The end condition of this loop is that it is determined in step S811 that the next video frame does not exist.
  • Step S41 is a determination as to whether or not the user has requested special playback, step S42 is a determination as to whether or not a stop request has been made, and step S43 is a determination as to whether or not a pause request has been made.
  • Step S44 is a determination as to whether or not the special playback request is masked in the UO mask table of the current play item. If it is not masked, the special playback is executed in step S45.
  • If a stop request is made in step S42, then in step S47 the video frame of the left-eye video stream and the video frame of the right-eye video stream at the playback stop position are output alternately while the BD presentation mode is maintained, and the process returns.
  • If a pause is requested in step S43, it is determined in step S46 whether or not the pause request is masked in the UO mask table of the current play item. If it is not masked, then in step S47 the video frame of the left-eye video stream and the video frame of the right-eye video stream at the playback stop position are output alternately while the BD presentation mode is maintained, and the process returns.
  • Without such masking, 3D content playback may fail to decode video frames in time during special playback; not only would the video be distorted, but subtitle data could also fail to be decoded in time with the video frames, presenting viewers with objectionable content.
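The loop of FIG. 31 can be summarized in code. The following is a minimal sketch in Java; every method is a hypothetical stand-in for the player's internal processing, not an actual API:

```java
// Minimal sketch of the FIG. 31 video-frame loop in 3D mode.
abstract class VideoFrameLoop3D {
    abstract boolean specialPlaybackRequested();  // step S41
    abstract boolean stopRequested();             // step S42
    abstract boolean pauseRequested();            // step S43
    abstract boolean maskedInCurrentPlayItem();   // steps S44/S46: UO mask table lookup
    abstract void executeSpecialPlayback();       // step S45
    abstract void outputStillFramesAlternately(); // step S47: L/R frames at the stop position
    abstract void outputNextFramePair();          // steps S802-S810: decode and output L/R frames
    abstract boolean hasNextVideoFrame();         // step S811: loop end condition

    final void run() {
        do {
            if (specialPlaybackRequested() && !maskedInCurrentPlayItem()) {
                executeSpecialPlayback();          // S45
            }
            if (stopRequested()) {
                outputStillFramesAlternately();    // S47, BD presentation mode kept
                return;
            }
            if (pauseRequested() && !maskedInCurrentPlayItem()) {
                outputStillFramesAlternately();    // S47
                return;
            }
            outputNextFramePair();
        } while (hasNextVideoFrame());
    }
}
```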
  • FIG. 32 is a flowchart showing an image plane processing procedure in response to a user operation, a request from a BD-J application, and a request from a movie object.
  • In step S61, it is determined whether or not the user has issued a special playback instruction; in step S62, whether or not a stop request or a pause request has been made, or whether or not the end of the video has been reached; and in step S63, whether or not slideshow playback is in progress.
  • The PlayItem playback type in the playlist file indicates whether or not the stream file is to be played back as a slideshow. Therefore, when the current PlayItem switches to a PlayItem indicating slideshow playback, step S63 is judged as Yes.
  • In step S67, it is determined whether or not the image plane can be composited. If it can, graphics whose pixel coordinates have been changed by plane shift are composited with one of the video frame of the left-eye video stream and the video frame of the right-eye video stream, and the process returns to the loop.
  • If it cannot, in step S69 the graphics in the image plane are not composited, and one of the video frame of the left-eye video stream and the video frame of the right-eye video stream is output.
  • In step S70, it is determined whether or not plane shift and composition are possible. If they are, graphics whose pixel coordinates have been changed by plane shift are composited with each of the video frame of the left-eye video stream and the video frame of the right-eye video stream at the playback stop position (step S71).
  • If they are not, in step S72 the video frame of the left-eye video stream and the video frame of the right-eye video stream are output alternately, without compositing the graphics in the image plane.
  • The present embodiment is an improvement for providing functions to a BD-J application.
  • First, acquisition of the display status and register setting values by the BD-J application will be described.
  • The current display state may be described in the register set 12. In this way, the content itself can be controlled so that it does not look unnatural, and discomfort to the viewer can be reduced.
  • The process in which the playback device 200 generates output data for the display device 400 is performed in response to requests issued by the BD-J application, and is realized by commands from the BD-J platform 22 to the playback control engine.
  • Here, the commands from the BD-J module 22 to the playback control engine 14 in the playback start process will be described.
  • The playback control engine provides the following three commands to the BD-J application.
  • The playback preparation command is a command that suggests preparation. It only suggests preparation; whether or not the playback control engine actually prepares in response to it can be acquired from the property "preparation for playback" in the BD-J platform 22.
  • When "preparation for playback" is "Yes", playback preparation is performed by the playback preparation command: the current playlist is acquired, the playback mode of the current playlist is determined, and the playback mode in the playback device is decided. When the playlist is 2D, the playback mode is switched. An AV stream playback request is then made in response to the synchronization start command.
  • The synchronization start command is a command for synchronizing the playback mode with the mode attribute of the AV stream.
  • When "preparation for playback" is "Yes", the playback mode is switched to 2D by the playback preparation command; when it is "No", the playback mode is switched to 2D by the synchronization start command.
  • The playback start command is a command that integrates the above two commands, performing playback preparation and synchronization start. Regardless of the value of "preparation for playback", the current playlist is acquired, the playback mode of the current playlist is determined, and the playback mode in the playback device is decided. When the playlist is 2D, the playback mode is switched. Thereafter, a playback request for the AV stream is made.
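The relationship between the three commands can be sketched as an interface. This is a hypothetical rendering in Java for illustration; the real commands are internal to the BD-J platform and the playback control engine:

```java
// Minimal sketch of the three commands offered to a BD-J application.
interface PlaybackControlEngineCommands {

    /** Playback preparation command: suggests preparation (acquire the current
        playlist, determine its playback mode, decide the device's playback
        mode, switch to 2D if the playlist is 2D). Whether preparation actually
        happens depends on the "preparation for playback" property. */
    void preparePlayback();

    /** Synchronization start command: synchronizes the playback mode with the
        AV stream's mode attribute and issues the AV stream playback request. */
    void startSynchronization();

    /** Playback start command: the two commands above in one call, performed
        regardless of the "preparation for playback" property. */
    default void startPlayback() {
        preparePlayback();
        startSynchronization();
    }
}
```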
  • The recording method according to the present embodiment is not limited to real-time recording, in which the files described above are created in real time and written directly to the file system area of the recording medium. It also includes preformat recording, in which the entire image of the bitstream to be recorded in the file system area is created in advance, a master disc is created based on that bitstream, and the master disc is pressed to mass-produce the optical disc.
  • The recording method according to the present embodiment may thus be either a recording method using real-time recording or a recording method using preformat recording.
  • FIG. 33 is a flowchart showing the processing procedure of the recording method.
  • In step S201, data materials such as moving images, audio, subtitles, and menus are imported.
  • In step S202, the data materials are digitized and compression-encoded according to the MPEG standard, and packetized elementary streams are obtained.
  • In step S203, the packetized elementary streams are multiplexed, and the corresponding clip information is generated.
  • In step S204, the AV clip and the clip information are stored in separate files.
  • In step S205, a playlist that defines the playback path of the AV clip, a program that defines a control procedure using the playlist, and management information for these are created.
  • In step S206, the AV clip, the clip information, the playlist, the program, and the other management information are written to the recording medium.
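Steps S201 to S206 form a linear pipeline, which can be sketched as follows in Java; all types and method names are hypothetical stand-ins for an authoring tool's internals:

```java
// Minimal sketch of the FIG. 33 recording workflow (S201-S206).
abstract class RecordingMethod {
    abstract Object importMaterials();                  // S201: video, audio, subtitles, menus
    abstract Object encodeToPes(Object materials);      // S202: digitize + MPEG encode + packetize
    abstract Object multiplexToClip(Object pesStreams); // S203: multiplex, derive clip information
    abstract void storeClipAndInfo(Object clipAndInfo); // S204: AV clip and clip info in separate files
    abstract Object createScenario();                   // S205: playlist, program, management info
    abstract void writeToMedium(Object clipAndInfo, Object scenario); // S206

    final void record() {
        Object materials = importMaterials();
        Object pes = encodeToPes(materials);
        Object clip = multiplexToClip(pes);
        storeClipAndInfo(clip);
        Object scenario = createScenario();
        writeToMedium(clip, scenario);
    }
}
```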
  • FIG. 34 is a diagram showing a hardware internal configuration of the playback device.
  • The main components constituting the playback device in this figure are a front end unit 101, a system LSI 102, a memory device 103, a back end unit 104, a nonvolatile memory 105, a host microcomputer 106, and a network I/F 107.
  • The front end unit 101 is a data input source. It includes, for example, the BD drive 1a and the local storage 1c shown in the figure described earlier.
  • The system LSI 102 is composed of logic elements and forms the core of the playback device. At least the demultiplexer 4, the video decoders 5a and 5b, the image decoders 7a and 7b, the audio decoder 9, the register set 12, the playback control engine 14, the composition unit 16, and the plane shift engine 20 are incorporated in this system LSI.
  • The memory device 103 is configured as an array of memory elements such as SDRAM. The memory device 103 includes, for example, the read buffers 2a and 2b, the dynamic scenario memory 23, the static scenario memory 13, the video plane 6, the image plane 8, the interactive graphics plane 10, and the background plane 11.
  • The back end unit 104 is a connection interface between the playback device and other devices, and includes the HDMI transmission/reception unit 17.
  • The nonvolatile memory 105 is a readable/writable recording medium that can hold its recorded contents even when no power is supplied. It is used to back up the playback mode stored in a dimension mode storage unit 29 described later. As the nonvolatile memory 105, for example, a flash memory, FeRAM, or the like can be used.
  • The host microcomputer 106 is a microcomputer system composed of a ROM, a RAM, and a CPU. A program for controlling the playback device is recorded in the ROM; this program is read into the CPU, and by the program cooperating with the hardware resources, the functions of the virtual file system 3, the HDMV module 24, the BD-J platform 22, the mode management module 24, and the UO detection module 26 are realized.
  • A system LSI is an integrated circuit in which bare chips are mounted on a high-density substrate and packaged. A system LSI in which a plurality of bare chips are mounted on a high-density substrate and packaged so as to give them the external structure of a single LSI is also included among system LSIs (such a system LSI is called a multichip module).
  • Focusing on the package type, system LSIs are classified into QFP (Quad Flat Package) and PGA (Pin Grid Array). QFP is a system LSI with pins attached to the four sides of the package. PGA is a system LSI with many pins attached to the entire bottom surface.
  • These pins serve as an interface with other circuits. Because the pins of a system LSI have this interface role, the system LSI plays its role as the core of the playback device 200 when other circuits are connected to these pins.
  • Such a system LSI can be incorporated not only in the playback device 200 but also in various devices that handle video playback, such as TVs, game consoles, personal computers, and one-seg mobile phones, and can thus broaden the application of the present invention.
  • The system LSI architecture preferably conforms to the Uniphier architecture. A system LSI that conforms to the Uniphier architecture consists of the following circuit blocks.
  • Data parallel processor (DPP): a SIMD-type processor in which multiple element processors perform the same operation. By operating the arithmetic units incorporated in each element processor simultaneously with a single instruction, the decoding of the multiple pixels constituting a picture is parallelized.
  • Instruction parallel processor (IPP): consists of a "Local Memory Controller" made up of instruction RAM, instruction cache, data RAM, and data cache; a "Processing Unit" made up of an instruction fetch unit, a decoder, an execution unit, and a register file; and a "Virtual Multi Processor Unit" that makes the Processing Unit execute multiple applications in parallel.
  • MPU block: consists of an ARM core, an external bus interface (Bus Control Unit: BCU), a DMA controller, a timer, a vectored interrupt controller, and peripheral interfaces such as UART, GPIO (General Purpose Input Output), and a synchronous serial interface.
  • Stream I/O block: performs data input/output with drive devices, hard disk drive devices, and SD memory card drive devices connected to the external bus via a USB interface or an ATA Packet interface.
  • AV I/O block: composed of audio input/output, video input/output, and an OSD controller; performs data input/output with the TV and the AV amplifier.
  • Memory control block: realizes reading from and writing to the SDRAM connected via the external bus. It consists of an internal bus connection unit that controls internal connections between the blocks, an access control unit that transfers data to and from the SDRAM connected outside the system LSI, and an access schedule unit that arbitrates the SDRAM access requests from the blocks.
  • In circuit design, the buses connecting circuit elements, ICs, and LSIs, their peripheral circuits, external interfaces, and so on are defined, as are connection lines, power supply lines, ground lines, clock signal lines, and the like. The circuit diagram is completed while adjusting the operation timing of each component in consideration of the LSI specifications and making adjustments such as securing the necessary bandwidth for each component.
  • Mounting design is a board layout creation operation that determines where on the board the parts (circuit elements, ICs, LSIs) from the circuit design are to be placed and how the connection lines are to be wired on the board. The mounting design result is converted into CAM data and output to equipment such as NC machine tools. NC machine tools perform SoC mounting and SiP mounting based on this CAM data.
  • SoC (System on Chip) mounting is a technology that integrates multiple circuits onto a single chip. SiP (System in Package) mounting is a technology that combines multiple chips into one package with resin or the like.
  • The integrated circuit generated as described above may be called an IC, an LSI, a super LSI, or an ultra LSI depending on the degree of integration.
  • The hardware configuration shown in each embodiment can also be realized with a programmable logic device whose logic is defined by LUTs. The LUTs are stored in SRAM, and the contents of the SRAM disappear when the power is turned off; therefore, configuration information must be written to the SRAM so that the LUTs realizing the hardware configuration shown in each embodiment are defined.
  • The middleware, the hardware corresponding to the system LSI, the hardware other than the system LSI, the interface part to the middleware, the interface part between the middleware and the system LSI, and the necessary hardware other than the middleware and the system LSI operate in cooperation with one another to provide specific functions.
  • By appropriately defining these interfaces, the user interface part, middleware part, and system LSI part of the playback device can each be developed independently and in parallel, making development more efficient. There are various ways of partitioning each interface.
  • FIG. 35 is a diagram for explaining the principle by which an image appears in front of the display screen when the sign of the plane offset is positive (the left-eye graphics image is shifted to the right and the right-eye graphics image is shifted to the left).
  • When the stereo mode of the 3D mode is off, the image seen by the left eye is made to appear at a position to the right of where it appears when the plane offset is 0; at that moment the liquid crystal shutter glasses let the right eye see nothing. Conversely, the image seen by the right eye is made to appear at a position to the left of where it appears when the plane offset is 0; at that moment the glasses let the left eye see nothing (FIG. 35B).
  • Humans focus using both eyes and perceive an image as lying at the focal position. Therefore, when the liquid crystal shutter glasses alternate at short intervals between the state in which the image is visible to the left eye and the state in which it is visible to the right eye, both eyes try to adjust the focal position to a point in front of the display screen. As a result, an illusion occurs as if the image were at that focal position in front of the display screen (FIG. 35C).
  • FIG. 36 is a diagram for explaining the principle by which an image appears behind the display screen when the sign of the plane offset is negative (the left-eye graphics image is shifted to the left and the right-eye graphics image is shifted to the right).
  • In the figure, a circle indicates an image displayed on the display screen. When there is no plane offset, the image seen by the right eye and the image seen by the left eye are at the same position, so the focal position when viewing the image with both eyes lies on the display screen (FIG. 36A).
  • When the stereo mode of the 3D mode is off, the image seen by the left eye is made to appear at a position to the left of where it appears when the plane offset is 0; at that moment the liquid crystal shutter glasses let the right eye see nothing. Conversely, the image seen by the right eye is made to appear at a position to the right of where it appears when the plane offset is 0; at that moment the glasses let the left eye see nothing (FIG. 36B).
  • FIG. 37 is a diagram illustrating how the degree to which subtitles pop out changes with the size of the plane offset.
  • The front side shows the right-eye graphics image, output using the graphics plane shifted for right-eye output. The back side shows the left-eye graphics image, output using the graphics plane shifted for left-eye output.
  • Figure (a) shows the case where the sign of the plane offset is positive (the left-eye graphics image is shifted to the right and the right-eye graphics image is shifted to the left). When the plane offset is a positive value, as shown in FIG. 35, the subtitle for left-eye output appears to the right of the subtitle for right-eye output. That is, since the convergence point (focal position) comes in front of the screen, the subtitle also appears to come forward.
  • Figure (b) shows the case where the sign of the plane offset is negative. When it is a negative value, as shown in FIG. 36, the subtitle for left-eye output appears to the left of the subtitle for right-eye output. In other words, since the convergence point (focal position) moves deeper than the screen, the subtitle also appears deeper.
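The sign convention of FIGS. 35 to 37 reduces to a pair of mirrored horizontal shifts. A minimal sketch in Java, with hypothetical names, assuming a positive offset means the image pops out of the screen:

```java
// Minimal sketch of the per-eye shift directions implied by the plane offset.
final class PlaneOffsetConvention {
    /** Shift (in pixels) applied to the graphics plane for left-eye output.
        A positive plane offset moves the left-eye graphics to the right. */
    static int leftEyeShift(int planeOffset)  { return +planeOffset; }

    /** Shift applied for right-eye output: the mirror image of the left eye.
        A positive plane offset moves the right-eye graphics to the left, so
        the convergence point comes in front of the screen; a negative offset
        reverses both directions and pushes the image behind the screen. */
    static int rightEyeShift(int planeOffset) { return -planeOffset; }
}
```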
  • FIG. 38 shows the internal configuration of the image plane 8.
  • The image plane 8 is composed of storage elements of 1920 horizontal × 1080 vertical × 8 bits, as shown in the figure. This corresponds to a memory allocation capable of storing one 8-bit pixel code per pixel at a resolution of 1920 × 1080.
  • The 8-bit pixel code stored in each storage element is converted into a Y value, a Cr value, and a Cb value by color conversion using a color lookup table. In this color lookup table, the correspondence between pixel codes and Y, Cr, and Cb values is defined by the palette definition segment in the subtitle data.
  • The graphics data stored in the image plane 8 consists of pixel data corresponding to the foreground portion (the portion constituting the subtitle "I love") and pixel data corresponding to the background portion. The storage elements corresponding to the background portion hold a pixel code indicating the transparent color, and when composited with the video plane, the moving image on the video plane shows through this portion. The storage elements corresponding to the foreground portion hold pixel codes indicating colors other than the transparent color, and the subtitle is drawn with those non-transparent Y, Cr, Cb, and α values.
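The pixel-code-to-color conversion can be sketched as a small table. The following Java sketch is illustrative only; the entry values would in practice come from the palette definition segment of the subtitle data, and all names are hypothetical:

```java
// Minimal sketch of a color lookup table (CLUT) for the image plane:
// an 8-bit pixel code maps to Y, Cr, Cb, and alpha values.
final class ColorLookupTable {
    // entries[code] = {Y, Cr, Cb, alpha}; a dedicated code (often 0) with
    // alpha 0 represents the transparent color of the background portion.
    private final int[][] entries = new int[256][4];

    void define(int code, int y, int cr, int cb, int alpha) {
        entries[code] = new int[] { y, cr, cb, alpha };
    }

    int[] lookup(int pixelCode) {
        return entries[pixelCode & 0xFF];
    }
}
```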
  • FIG. 39 shows the pixel data of the foreground area and the pixel data of the background area after a shift in the right direction and a shift in the left direction have been performed. (a) is the pixel data before shifting, (b) is the pixel data after shifting in the right direction, and (c) is the pixel data after shifting in the left direction.
  • Since the shift amount is 15 pixels, it can be seen that the character "o" of the subtitle characters "you", which follow the subtitle characters "I love", comes into view.
  • FIG. 40 is a diagram showing a plane shift processing procedure in the image plane 8.
  • FIG. 40A shows the left-shifted graphics plane and the right-shifted graphics plane generated from the image plane 8.
  • (b) shows the shift in the right direction. The horizontal shift to the right is performed in the following steps (1-1) to (1-3). (1-1) The right-end area of the image plane 8 is cut out. (1-2) The position of each piece of pixel data in the image plane 8 is shifted to the right, as described above, by the shift amount indicated by the plane offset. (1-3) A transparent area is added at the left end of the image plane 8.
  • (c) shows the shift in the left direction. The horizontal shift to the left is performed in the following steps (2-1) to (2-3). (2-1) The left-end area of the image plane 8 is cut out. (2-2) The position of each piece of pixel data in the image plane 8 is shifted to the left by the shift amount indicated by the plane offset. (2-3) A transparent area is added at the right end of the image plane 8.
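Steps (1-1) to (1-3) and (2-1) to (2-3) amount to shifting every scan line horizontally, discarding the pixels pushed past one edge, and filling the vacated edge with the transparent color. A minimal sketch in Java, with a hypothetical transparent pixel code:

```java
// Minimal sketch of the image-plane shift described above.
import java.util.Arrays;

final class PlaneShift {
    static final byte TRANSPARENT = 0; // assumed pixel code for the transparent color

    /** plane[y][x] holds an 8-bit pixel code; offset > 0 shifts right
        (steps 1-1 to 1-3), offset < 0 shifts left (steps 2-1 to 2-3). */
    static void shiftHorizontally(byte[][] plane, int offset) {
        for (byte[] line : plane) {
            byte[] shifted = new byte[line.length];
            Arrays.fill(shifted, TRANSPARENT);      // vacated edge becomes transparent
            for (int x = 0; x < line.length; x++) {
                int nx = x + offset;
                if (nx >= 0 && nx < line.length) {  // pixels pushed past the edge are cut off
                    shifted[nx] = line[x];
                }
            }
            System.arraycopy(shifted, 0, line, 0, line.length);
        }
    }
}
```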
  • The graphics data is composed of pixel data with a resolution of 1920 × 1080 or 1280 × 720.
  • FIG. 41 is a diagram showing pixel data stored in the graphics plane.
  • Each square frame is a storage element with a word length of 32 bits or 8 bits; hexadecimal numbers such as 0001, 0002, 0003, 0004, 07A5, 07A6, 07A7, 07A8, 07A9, 07AA, and 07AB are the addresses continuously assigned to these storage elements in the MPU's memory space.
  • Numerical values such as (0,0), (1,0), (2,0), (3,0), (1916,0), (1917,0), (1918,0), and (1919,0) in the storage elements indicate the coordinates of the pixel data stored in each storage element.
  • The pixel data at coordinates (0,0) is stored in the storage element at address 0001, the pixel data at coordinates (1,0) in the storage element at address 0002, the pixel data at coordinates (1918,0) in the storage element at address 07A7, and the pixel data at coordinates (0,1) in the storage element at address 07A9. That is, the graphics data is stored so that the lines constituting the graphics occupy continuous addresses. This makes it possible to read out the pixel data in bursts by performing DMA transfers sequentially over the storage elements to which these continuous addresses are assigned.
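With the lines stored at continuous addresses, the storage element holding a given pixel follows from its coordinates by row-major arithmetic. A minimal sketch, assuming one storage element per pixel; the names and base address are illustrative:

```java
// Minimal sketch of row-major addressing for a graphics plane whose lines
// occupy continuous addresses. One storage element per pixel is assumed.
final class GraphicsPlaneAddressing {
    final int width; // pixels per line, e.g. 1920
    final int base;  // address of the storage element holding (0, 0)

    GraphicsPlaneAddressing(int width, int base) {
        this.width = width;
        this.base = base;
    }

    /** Address of the storage element holding the pixel at (x, y); consecutive
        x values on one line map to consecutive addresses, which is what makes
        burst DMA transfer of whole lines possible. */
    int addressOf(int x, int y) {
        return base + y * width + x;
    }
}
```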
  • FIG. 42 shows the contents stored in the graphics plane after the shift.
  • Figure (a) shows the graphics plane shifted to the right with the plane offset set to "3". Since the plane offset is "3", the storage element at address 0004 holds the pixel data at coordinates (0,0) of the graphics plane coordinate system, the storage element at address 0005 holds the pixel data at coordinates (1,0), and the storage element at address 0006 holds the pixel data at coordinates (2,0).
  • Likewise, the storage element at address 07AC holds the pixel data at coordinates (0,1) of the graphics plane coordinate system, the storage element at address 07AD holds the pixel data at coordinates (1,1), and the storage element at address 07AE holds the pixel data at coordinates (2,1).
  • Figure (b) shows the graphics plane shifted to the left with the plane offset set to "3". Since the plane offset is "3", the storage element at address 0001 holds the pixel data at coordinates (3,0) of the graphics plane coordinate system, the storage element at address 0002 holds the pixel data at coordinates (4,0), and the storage element at address 0003 holds the pixel data at coordinates (5,0).
  • Likewise, the storage element at address 07A9 holds the pixel data at coordinates (3,1) of the graphics plane coordinate system, the storage element at address 07AA holds the pixel data at coordinates (4,1), and the storage element at address 07AB holds the pixel data at coordinates (5,1).
  • In this way, the coordinates of each piece of pixel data in the graphics plane are shifted right or left from their original coordinates by the number of pixels indicated by the plane offset.
  • The shift of the graphics plane can thus be realized by changing, by a predetermined amount, the addresses of the storage elements in which the pieces of pixel data constituting the graphics data are placed. Naturally, the shift can also be realized by an equivalent process that does not actually change the addresses of the storage elements in which the pixel data are placed.
  • It is desirable to mount a video decoder that decodes the left-eye video stream and a video decoder that decodes the right-eye video stream, setting the number of installed video decoders to "2".
  • Switching from the 2D playback mode to the 3D mode, and from the 3D mode to the 2D playback mode, may be performed at the timing when the title corresponding to a BD-J object is selected. In this case, the processing procedure of the flowchart of FIG. 12 is executed with a title switching request as the trigger, such as when a disc is inserted or a menu is selected.
  • "Accessible playlist information" includes the designation of a playlist to be played back automatically when the title corresponding to the BD-J object becomes the current title. It also includes the designation of the playlists that can be selected by the applications that can run when that title is the current title.
  • The playback control engine in the playback device starts playback of the playlist specified by the accessible playlist information corresponding to the selected current title without waiting for a playback instruction from the application, and if the BD-J application terminates before playlist playback ends, the playlist continues to be played back.
  • Even if the application terminates abnormally due to resource depletion and its GUI is automatically deleted, the playlist playback screen continues to be displayed as it is, meaning that the playback video of the playlist continues to be output to the display device. Thanks to this continued output, even when the Java(TM) language program terminates abnormally, the display device still shows something for the time being, preventing a situation in which the screen blacks out because of the application's abnormal termination.
  • In the above description, the playback mode is changed from 3D to 2D, but the same effect can be obtained when the playback mode is changed from 2D to 3D.
  • In step S808 in FIG. 13, the graphics stream stored in the image memories 7c and 7d is decoded and output to the image plane 8. However, this step need not be executed: the image plane used in step S805 of FIG. 13 may be reused and, in step S809 in FIG. 13, shifted in the reverse direction by twice the original amount, making it unnecessary to decode the image a second time.
  • The configuration diagram of FIG. 7 includes one video decoder, one video plane, and one image plane adder each. For speeding up, two of each part may be provided, for example, so that the left-eye video and the right-eye video can be processed in parallel.
  • The BD-ROM has a dimension identification flag for identifying whether the stream to be played back is for 2D or for 3D. In the present embodiment, the dimension identification flag is embedded in the playlist (PL) information, but the information may be recorded on the BD-ROM in any other form, as long as it can identify the stream itself and whether that stream is for 2D or for 3D.
  • If it is desired that only the subtitle/graphics data have depth over a 2D video stream, the processing of steps S802 to S809 in FIG. 13 is performed, except that in the processing of S804 and S807 a frame of the 2D stream is used for both eyes instead of using the video frame of the left-eye video stream and the video frame of the right-eye video stream.
  • The above description is written on the premise that the image plane shift information exists in the current playlist information. However, an offset table may be provided in the MVC video stream, and it is desirable to store the image plane shift information there; in that way, precise synchronization can be realized in units of pictures.
  • Alternatively, in the process of step S5 in FIG. 12, the image plane shift information may be acquired from the current stream management information.
  • The plane shift engine calculates the horizontal shift value based on the image plane shift information in the current playlist information. That is, in the series of processes that generate the left-eye and right-eye video output, whenever playlist switching occurs, the image plane shift information must be re-acquired in step S5 of FIG. 12.
  • There are also cases where it is not desired to change the depth of the image plane for each playlist; for example, a single depth may be desired per disc or per title.
  • HDMI may be used to connect a playback device such as a BD-ROM player and a display device such as a TV; in such a configuration, it is possible to prevent the video flicker that HDMI re-authentication would cause when switching from 3D display to 2D display.
  • Subtitles and graphics may be enlarged by scaling.
  • The stream registration information in the playback data information is preferably configured as an STN_table. The STN_table is a table that, when the play item containing it among the plurality of play items constituting the playlist becomes the current play item, specifies for each of a plurality of stream types which elementary streams, including those multiplexed in the AV clip referenced by the main path of the multi-path, are permitted to be played back.
  • The stream types here are types such as the primary video stream in picture-in-picture, the secondary video stream in picture-in-picture, the primary audio stream in sound mixing, the secondary audio stream in sound mixing, presentation graphics/text subtitle data, and the interactive graphics stream. The STN_table can register, for each of these stream types, the streams that are permitted to be played back.
  • Specifically, the STN_table is composed of an array of stream registration information. A stream registration indicates, in association with a stream number, what kind of elementary stream should be permitted to be played back when the play item to which the STN_table belongs becomes the current play item. The stream registration information has a data structure in which a combination of a stream entry and a stream attribute is associated with a logical stream number.
  • The stream number in a stream registration is represented by an integer value such as 1, 2, or 3, and the largest stream number equals the number of streams of the corresponding stream type.
  • The stream attribute includes information indicating the stream's language code and the stream's encoding method. The stream entry for a stream on the main path side includes a packet identifier, while the stream entry for a stream on the sub path side includes an identifier identifying the transport stream file, an identifier identifying the sub play item, and a packet identifier.
  • In the stream entry, the packet identifier of the elementary stream to be played back is described. Because the packet identifier can be described in the stream entry, the playback device stores the stream number from the stream registration information in its stream number register, and causes its PID filter to perform packet filtering based on the packet identifier in that stream registration information's stream entry. As a result, the TS packets of the elementary streams permitted to be played back in the STN_table are output to the decoder, and those elementary streams are played back.
  • The pieces of stream registration information in the stream number table are arranged in order of stream number, and this order serves as the criterion for preferentially selecting one stream when there are a plurality of streams satisfying the condition "the playback device can play it".
  • In this way, streams that the playback device cannot play are excluded from playback, and when there are a plurality of streams satisfying the condition, the author can inform the playback device which of them should be preferentially selected.
  • The determination of whether a stream satisfying the condition "the playback device can play it" exists, and the selection of a stream satisfying that condition, are executed when the current play item switches to a new one or when stream switching is requested by the user.
  • Since the stream registration information sequence in the stream number table uniformly assigns priorities to both the streams specified by the sub play item information and the streams specified by the play item information, even a stream that is not multiplexed with the video stream is a candidate when selecting the stream to be played back in synchronization with the video stream, provided it is specified by the sub play item information.
  • Furthermore, if the playback device can play back the stream specified by the sub play item information, and the priority of that stream is higher than the priority of the graphics stream multiplexed with the video stream, the stream specified by the sub play item information is used for playback in place of the stream multiplexed with the video stream. The essence of the STN_table is that it provides this way for a stream specified by sub play item information to be used for playback instead of a stream multiplexed with the video stream.
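The selection rule described above, walking the registrations in stream-number order and taking the first playable one, can be sketched as follows in Java; the record and class names are hypothetical stand-ins for the STN_table structures:

```java
// Minimal sketch of stream selection over an STN_table sequence.
import java.util.List;
import java.util.Optional;
import java.util.function.Predicate;

record StreamRegistration(int streamNumber, int packetId) {}

final class StnTableSelector {
    /** registrations must already be ordered by stream number, which doubles
        as the author's priority order. Returns the first entry the playback
        device can play, or empty if none qualifies. */
    static Optional<StreamRegistration> select(
            List<StreamRegistration> registrations,
            Predicate<StreamRegistration> playable) {
        return registrations.stream().filter(playable).findFirst();
    }
}
```

The chosen entry's stream number would then be set in the stream number register, and its packet identifier would drive the PID filter.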
  • STN_table_SS (StereoScopic)
  • This STN_table_SS is composed of a stream registration information sequence for the left-eye video stream, a stream registration information sequence for the right-eye video stream, a stream registration information sequence for the left-eye presentation graphics stream, a stream registration information sequence for the right-eye presentation graphics stream, a stream registration information sequence for the left-eye interactive graphics stream, and a stream registration information sequence for the right-eye interactive graphics stream.
  • In the 3D mode, the stream registration information sequence for each stream type in the STN_table_SS is combined with the stream registration information sequence of the same stream type in the STN_table.
  • This combination is made by replacing the stream registration information sequence of the primary video stream in the STN_table with the stream registration information sequences of the left-eye video stream and the right-eye video stream in the STN_table_SS; replacing the stream registration information sequence of the secondary video stream in the STN_table with the stream registration information sequences of the left-eye secondary video stream and the right-eye secondary video stream; replacing the stream registration information sequence of the presentation graphics stream in the STN_table with the stream registration information sequences of the left-eye presentation graphics stream and the right-eye presentation graphics stream; and replacing the stream registration information sequence of the interactive graphics stream in the STN_table with the stream registration information sequences of the left-eye interactive graphics stream and the right-eye interactive graphics stream.
  • The above selection procedure is then executed on the combined STN_table, and the stream registration information of the elementary streams to be selected in the 3D mode is chosen from the stream registration information sequences in the combined STN_table. The stream number in that stream registration information is set in the stream number register of the playback device.
  • The packet identifier is extracted from the stream registration information, and the PID filter of the playback device performs packet filtering based on it. In this way, the TS packets constituting the left-eye stream and the right-eye stream are input to the decoder and played back and output. Combining these playback outputs makes stereoscopic playback possible.
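The combination of the two tables can likewise be sketched. This reuses the hypothetical StreamRegistration record from the previous sketch and is illustrative only:

```java
// Minimal sketch of combining one stream type's sequences for 3D mode:
// the 2D sequence from the STN_table is replaced by the left-eye and
// right-eye sequences from the STN_table_SS.
import java.util.ArrayList;
import java.util.List;

final class StnTableCombiner {
    static List<StreamRegistration> combineFor3D(
            List<StreamRegistration> from2dTable,   // STN_table sequence (replaced in 3D mode)
            List<StreamRegistration> leftEye,       // STN_table_SS sequence
            List<StreamRegistration> rightEye) {    // STN_table_SS sequence
        List<StreamRegistration> combined = new ArrayList<>(leftEye);
        combined.addAll(rightEye);
        return combined;
    }
}
```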
  • The recording medium in each embodiment includes all package media, such as optical discs and semiconductor memory cards. The present embodiment has been described taking as an example an optical disc in which the necessary data is recorded in advance (for example, an existing readable optical disc such as a BD-ROM or DVD-ROM), but the recording medium is not limited to this.
  • For example, a terminal device having a function of writing 3D content, including the data necessary for carrying out the present invention, distributed via broadcasting or a network (the function may be built into the playback device, or the device may be separate from the playback device) may write the content to a writable optical disc (for example, an existing writable optical disc such as a BD-RE or DVD-RAM), and the present invention can be implemented by applying the recorded optical disc to the playback device of the present invention.
  • The present invention can also be implemented when the recording medium is a semiconductor memory card, such as an SD memory card, instead of an optical disc. The playback procedure when, for example, a semiconductor memory card is used as the recording medium is as follows.
  • Whereas an optical disc is configured so that data is read via an optical disc drive, when a semiconductor memory card is used, the device need only be configured so that data is read via an I/F for reading data from the semiconductor memory card. The playback device and the semiconductor memory card are electrically connected via the semiconductor memory card I/F, and the device need only be configured so that the data recorded on the semiconductor memory card is read through the semiconductor memory card I/F.
  • A program may be created based on the processing procedures shown in the flowcharts of each embodiment, and the present invention may be implemented as a computer-readable recording medium on which that program is recorded.
  • First, the software developer writes, in a programming language, a source program that implements each flowchart and the functional components. In accordance with the syntax of the programming language, the developer describes this source program using class structures, variables, array variables, and calls to external functions.
  • The described source program is given to a compiler as a file. The compiler translates the source program and generates an object program.
  • Translation by the compiler consists of processes such as syntax analysis, optimization, resource allocation, and code generation. In syntax analysis, lexical analysis, parsing, and semantic analysis of the source program are performed, and the source program is converted into an intermediate program. In optimization, operations such as basic block formation, control flow analysis, and data flow analysis are performed on the intermediate program. In resource allocation, to adapt to the instruction set of the target processor, the variables in the intermediate program are allocated to the registers or memory of the target processor. In code generation, each intermediate instruction in the intermediate program is converted into program code, and the object program is obtained.
  • The object program generated here is composed of one or more program codes that cause a computer to execute the individual steps of the flowcharts shown in the embodiments and the individual procedures of the functional components. There are various kinds of program code, such as processor-native code and JAVA bytecode. When a step is realized by an external function, the call statement that calls that external function becomes the program code. A program code realizing a single step may also belong to different object programs. Each step of a flowchart may likewise be realized by combining arithmetic operation instructions, logical operation instructions, branch instructions, and the like.
  • When the object programs have been generated, the programmer activates the linker for them. The linker allocates these object programs and the related library programs to a memory space and combines them into one to generate a load module. The load module generated in this way is premised on being read by a computer, and causes the computer to execute the processing procedures shown in each flowchart and the processing procedures of the functional components. Such a program may be recorded on a computer-readable recording medium and provided to the user.
  • The present invention relates to a technique for superimposing and displaying subtitles and graphics on a stereoscopic video stream in a playback device that plays back the stereoscopic video stream. In particular, it can be applied to a stereoscopic video playback device that stereoscopically outputs not only the stereoscopic video stream but also the superimposed subtitles and graphics.
  • BD-ROM; 200 playback device; 300 remote control; 400 TV; 500 liquid crystal glasses; 1a BD drive; 1b network device; 1c local storage; 2a, 2b read buffers; 3 virtual file system; 4 demultiplexer; 5 video decoder; 6 video plane; 7a, 7b image decoders; 7c, 7d image memories; 8 image plane; 9 audio decoder; 10 interactive graphics plane; 11 background plane; 12 register set; 13 static scenario memory; 14 playback control engine; 15 scaler engine; 16 composition unit; 17 HDMI transmission/reception unit; 18 display function flag holding unit; 19 left/right processing storage unit; 20 plane shift engine; 21 shift information memory; 22 BD-J platform

Landscapes

  • Engineering & Computer Science (AREA)
  • Multimedia (AREA)
  • Signal Processing (AREA)
  • Human Computer Interaction (AREA)
  • Library & Information Science (AREA)
  • Testing, Inspecting, Measuring Of Stereoscopic Televisions And Televisions (AREA)
  • Television Signal Processing For Recording (AREA)
  • Signal Processing For Digital Recording And Reproducing (AREA)
PCT/JP2009/006135 2008-11-18 2009-11-16 特殊再生を考慮した再生装置、集積回路、再生方法 WO2010058547A1 (ja)

Priority Applications (3)

Application Number Priority Date Filing Date Title
JP2010539135A JP5632291B2 (ja) 2008-11-18 2009-11-16 特殊再生を考慮した再生装置、集積回路、再生方法
EP09827328.7A EP2348747A4 (en) 2008-11-18 2009-11-16 REPRODUCTION DEVICE, INTEGRATED CIRCUIT, AND REPRODUCTION METHOD WHEREAS SPECIALIZED REPRODUCTION
CN200980117335.9A CN102027749B (zh) 2008-11-18 2009-11-16 考虑特殊再现的再现装置、集成电路、再现方法

Applications Claiming Priority (8)

Application Number Priority Date Filing Date Title
US11574208P 2008-11-18 2008-11-18
US61/115,742 2008-11-18
JP2008-294501 2008-11-18
JP2008294501 2008-11-18
JP2009-099914 2009-04-16
JP2009099914 2009-04-16
US18403809P 2009-06-04 2009-06-04
US61/184,038 2009-06-04

Publications (1)

Publication Number Publication Date
WO2010058547A1 true WO2010058547A1 (ja) 2010-05-27

Family

ID=42197990

Family Applications (1)

Application Number Title Priority Date Filing Date
PCT/JP2009/006135 WO2010058547A1 (ja) 2008-11-18 2009-11-16 特殊再生を考慮した再生装置、集積回路、再生方法

Country Status (6)

Country Link
US (1) US8548308B2 (ko)
EP (1) EP2348747A4 (ko)
JP (1) JP5632291B2 (ko)
KR (1) KR20110095128A (ko)
CN (1) CN102027749B (ko)
WO (1) WO2010058547A1 (ko)

Cited By (8)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
JP2012015671A (ja) * 2010-06-30 2012-01-19 Hitachi Consumer Electronics Co Ltd 記録装置/方法/媒体、再生装置/方法
JP2012015672A (ja) * 2010-06-30 2012-01-19 Hitachi Consumer Electronics Co Ltd 記録装置/方法/媒体、再生装置/方法
WO2012017643A1 (ja) * 2010-08-06 2012-02-09 パナソニック株式会社 符号化方法、表示装置、及び復号方法
CN102447922A (zh) * 2010-10-13 2012-05-09 华晶科技股份有限公司 将双镜头影像合成为单镜头影像的方法
WO2012161077A1 (ja) * 2011-05-26 2012-11-29 ソニー株式会社 記録装置、記録方法、再生装置、再生方法、プログラム、および記録再生装置
WO2013100641A1 (ko) * 2011-12-27 2013-07-04 엘지전자 주식회사 입체영상 디스플레이가 가능한 디지털 방송 수신방법 및 수신장치
US8890933B2 (en) 2010-11-26 2014-11-18 Kabushiki Kaisha Toshiba Parallax image conversion apparatus
JP2016528800A (ja) * 2013-07-19 2016-09-15 コーニンクレッカ フィリップス エヌ ヴェKoninklijke Philips N.V. Hdrメタデータ転送

Families Citing this family (65)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN102293001B (zh) * 2009-01-21 2014-05-14 株式会社尼康 图像处理装置、图像处理方法及记录方法
US8830301B2 (en) * 2009-06-10 2014-09-09 Lg Electronics Inc. Stereoscopic image reproduction method in case of pause mode and stereoscopic image reproduction apparatus using same
CN102144395A (zh) * 2009-06-10 2011-08-03 Lg电子株式会社 高速搜索模式中的立体图像再现方法和使用该方法的立体图像再现装置
TW201130289A (en) * 2009-07-14 2011-09-01 Panasonic Corp Image reproducing apparatus
US20110285827A1 (en) * 2009-07-14 2011-11-24 Panasonic Corporation Image reproducing apparatus and image display apparatus
KR20110025123A (ko) * 2009-09-02 2011-03-09 삼성전자주식회사 비디오 영상 배속 재생 방법 및 장치
JP2011082666A (ja) * 2009-10-05 2011-04-21 Sony Corp 信号伝送方法、信号送信装置及び信号受信装置
US20110081133A1 (en) * 2009-10-05 2011-04-07 Xuemin Chen Method and system for a fast channel change in 3d video
JP2011101230A (ja) * 2009-11-06 2011-05-19 Sony Corp 表示制御装置、表示制御方法、プログラム、出力装置、および送信装置
JP2011109294A (ja) * 2009-11-16 2011-06-02 Sony Corp 情報処理装置、情報処理方法、表示制御装置、表示制御方法、およびプログラム
KR20110063002A (ko) * 2009-12-04 2011-06-10 삼성전자주식회사 3차원 디스플레이 장치 및 이에 적용되는 3차원 영상 검출방법
JP2013042196A (ja) * 2009-12-21 2013-02-28 Panasonic Corp 再生装置
US20120120207A1 (en) * 2009-12-28 2012-05-17 Hiroaki Shimazaki Image playback device and display device
WO2011080907A1 (ja) * 2009-12-28 2011-07-07 パナソニック株式会社 表示装置と方法、記録媒体、送信装置と方法、及び再生装置と方法
JP5450118B2 (ja) * 2010-01-14 2014-03-26 ソニー株式会社 映像伝送装置、映像表示装置、映像表示システム、映像伝送方法及びコンピュータプログラム
JP2011223558A (ja) * 2010-03-26 2011-11-04 Panasonic Corp 映像信号処理装置およびアクティブシャッターメガネ
JP2011223247A (ja) * 2010-04-08 2011-11-04 Sony Corp 情報処理装置、情報記録媒体、および情報処理方法、並びにプログラム
US9030536B2 (en) 2010-06-04 2015-05-12 At&T Intellectual Property I, Lp Apparatus and method for presenting media content
US8640182B2 (en) * 2010-06-30 2014-01-28 At&T Intellectual Property I, L.P. Method for detecting a viewing apparatus
US9787974B2 (en) 2010-06-30 2017-10-10 At&T Intellectual Property I, L.P. Method and apparatus for delivering media content
JP5637750B2 (ja) * 2010-06-30 2014-12-10 日立コンシューマエレクトロニクス株式会社 記録装置/方法/媒体、再生装置/方法
US8593574B2 (en) 2010-06-30 2013-11-26 At&T Intellectual Property I, L.P. Apparatus and method for providing dimensional media content based on detected display capability
US8918831B2 (en) 2010-07-06 2014-12-23 At&T Intellectual Property I, Lp Method and apparatus for managing a presentation of media content
US9049426B2 (en) 2010-07-07 2015-06-02 At&T Intellectual Property I, Lp Apparatus and method for distributing three dimensional media content
JP2012019386A (ja) * 2010-07-08 2012-01-26 Sony Corp 再生装置、再生方法、およびプログラム
JP2012023648A (ja) * 2010-07-16 2012-02-02 Sony Corp 再生装置、再生方法、およびプログラム
US9232274B2 (en) 2010-07-20 2016-01-05 At&T Intellectual Property I, L.P. Apparatus for adapting a presentation of media content to a requesting device
US9032470B2 (en) 2010-07-20 2015-05-12 At&T Intellectual Property I, Lp Apparatus for adapting a presentation of media content according to a position of a viewing apparatus
US9560406B2 (en) 2010-07-20 2017-01-31 At&T Intellectual Property I, L.P. Method and apparatus for adapting a presentation of media content
CN101959076B (zh) * 2010-07-23 2012-05-30 四川长虹电器股份有限公司 3d电视色差自动调试的方法
JP2012029216A (ja) * 2010-07-27 2012-02-09 Sony Corp 再生装置、再生方法、およびプログラム
WO2012014354A1 (ja) * 2010-07-27 2012-02-02 パナソニック株式会社 映像データの出力装置
US8994716B2 (en) 2010-08-02 2015-03-31 At&T Intellectual Property I, Lp Apparatus and method for providing media content
US8605136B2 (en) * 2010-08-10 2013-12-10 Sony Corporation 2D to 3D user interface content data conversion
US8438502B2 (en) 2010-08-25 2013-05-07 At&T Intellectual Property I, L.P. Apparatus for controlling three-dimensional images
JP5058316B2 (ja) * 2010-09-03 2012-10-24 株式会社東芝 電子機器、画像処理方法、及び画像処理プログラム
JP5610933B2 (ja) * 2010-09-03 2014-10-22 キヤノン株式会社 再生装置及びその制御方法
US8947511B2 (en) 2010-10-01 2015-02-03 At&T Intellectual Property I, L.P. Apparatus and method for presenting three-dimensional media content
KR20120042313A (ko) * 2010-10-25 2012-05-03 삼성전자주식회사 입체영상표시장치 및 그 영상표시방법
JP4908624B1 (ja) * 2010-12-14 2012-04-04 株式会社東芝 立体映像信号処理装置及び方法
KR101767045B1 (ko) 2010-12-17 2017-08-10 삼성전자 주식회사 영상처리장치 및 영상처리방법
JP5595946B2 (ja) * 2011-02-04 2014-09-24 日立コンシューマエレクトロニクス株式会社 デジタルコンテンツ受信装置、デジタルコンテンツ受信方法、およびデジタルコンテンツ送受信方法
KR101763944B1 (ko) * 2011-02-18 2017-08-01 엘지디스플레이 주식회사 영상표시장치
CN103202028A (zh) * 2011-03-11 2013-07-10 日立民用电子株式会社 记录装置/方法/介质、再现装置/方法
JP5238849B2 (ja) * 2011-05-16 2013-07-17 株式会社東芝 電子機器、電子機器の制御方法及び電子機器の制御プログラム
US9602766B2 (en) 2011-06-24 2017-03-21 At&T Intellectual Property I, L.P. Apparatus and method for presenting three dimensional objects with telepresence
US9030522B2 (en) 2011-06-24 2015-05-12 At&T Intellectual Property I, Lp Apparatus and method for providing media content
US9445046B2 (en) 2011-06-24 2016-09-13 At&T Intellectual Property I, L.P. Apparatus and method for presenting media content with telepresence
US8947497B2 (en) 2011-06-24 2015-02-03 At&T Intellectual Property I, Lp Apparatus and method for managing telepresence sessions
US8587635B2 (en) 2011-07-15 2013-11-19 At&T Intellectual Property I, L.P. Apparatus and method for providing media services with telepresence
JP5679578B2 (ja) * 2011-08-05 2015-03-04 株式会社ソニー・コンピュータエンタテインメント 画像処理装置
US20130038685A1 (en) * 2011-08-12 2013-02-14 Alcatel-Lucent Usa Inc. 3d display apparatus, method and structures
CN103210423A (zh) * 2011-11-15 2013-07-17 联发科技(新加坡)私人有限公司 立体图像处理装置及其相关方法
US9392210B2 (en) * 2012-03-22 2016-07-12 Broadcom Corporation Transcoding a video stream to facilitate accurate display
JP2013236357A (ja) * 2012-04-10 2013-11-21 Jvc Kenwood Corp 立体画像表示処理装置、立体画像表示処理方法及び立体画像表示処理プログラム
TWI555400B (zh) * 2012-05-17 2016-10-21 晨星半導體股份有限公司 應用於顯示裝置的字幕控制方法與元件
KR20150021487A (ko) * 2012-05-24 2015-03-02 파나소닉 주식회사 영상송신장치, 영상송신방법 및 영상재생장치
US9781409B2 (en) * 2012-08-17 2017-10-03 Nec Corporation Portable terminal device and program
CN104244069A (zh) * 2014-09-15 2014-12-24 深圳润华创视科技有限公司 网络视频播放方法、存储方法、播放设备、服务端及系统
US11087445B2 (en) 2015-12-03 2021-08-10 Quasar Blu, LLC Systems and methods for three-dimensional environmental modeling of a particular location such as a commercial or residential property
US10607328B2 (en) 2015-12-03 2020-03-31 Quasar Blu, LLC Systems and methods for three-dimensional environmental modeling of a particular location such as a commercial or residential property
US9965837B1 (en) 2015-12-03 2018-05-08 Quasar Blu, LLC Systems and methods for three dimensional environmental modeling
KR20210074880A (ko) * 2019-12-12 2021-06-22 삼성전자주식회사 디스플레이 장치 및 그 동작 방법
CN113923524A (zh) * 2021-08-24 2022-01-11 深圳市科伦特电子有限公司 播放模式切换方法、装置、播放设备及存储介质
CN113905225B (zh) * 2021-09-24 2023-04-28 深圳技术大学 裸眼3d显示装置的显示控制方法及装置

Citations (11)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
JPH07327242A (ja) * 1994-05-31 1995-12-12 Sanyo Electric Co Ltd 立体動画像圧縮符号化装置及び立体動画像復号再生装置
JPH11191895A (ja) * 1996-12-04 1999-07-13 Matsushita Electric Ind Co Ltd 高解像度および立体映像記録用光ディスク、光ディスク再生装置、および光ディスク記録装置
US5929859A (en) 1995-12-19 1999-07-27 U.S. Philips Corporation Parallactic depth-dependent pixel shifts
JP2002095017A (ja) 2000-09-11 2002-03-29 Canon Inc 立体映像表示システム
JP2003298938A (ja) 2002-04-01 2003-10-17 Canon Inc マルチ画面合成装置及びマルチ画面合成装置の制御方法及びマルチ画面合成装置の制御プログラム及び記憶媒体
JP2004274125A (ja) * 2003-03-05 2004-09-30 Sony Corp 画像処理装置および方法
WO2005019675A1 (de) 2003-07-23 2005-03-03 Zf Friedrichshafen Ag Kupplungsanordnung in einem automatgetriebe mit bauraumsparender kühlmittelversorgung
WO2005024828A1 (ja) * 2003-09-02 2005-03-17 Matsushita Electric Industrial Co., Ltd. 再生装置、システム集積回路、プログラム、再生方法、及び、情報記録媒体
JP2007221818A (ja) * 2004-06-02 2007-08-30 Matsushita Electric Ind Co Ltd 記録媒体、再生装置、記録方法、再生方法、プログラム
JP2008005203A (ja) * 2006-06-22 2008-01-10 Nikon Corp 画像再生装置
US20080036854A1 (en) 2006-08-08 2008-02-14 Texas Instruments Incorporated Method and system of communicating and rendering stereoscopic and dual-view images

Family Cites Families (26)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
TW436777B (en) * 1995-09-29 2001-05-28 Matsushita Electric Ind Co Ltd A method and an apparatus for reproducing bitstream having non-sequential system clock data seamlessly therebetween
US6502199B2 (en) * 1995-09-29 2002-12-31 Matsushita Electric Industrial Co., Ltd. Method and an apparatus for reproducing bitstream having non-sequential system clock data seamlessly therebetween
EP2259588B1 (en) * 1996-02-28 2013-12-11 Panasonic Corporation High-resolution optical disk for recording stereoscopic video, optical disk reproducing device and optical disk recording device
DE69841532D1 (de) * 1997-08-29 2010-04-15 Panasonic Corp Optische Platte mit hierarchisch codiertem digitalen Videosignal, Wiedergabevorrichtung und Aufnahmevorrichtung für die optische Platte
KR100884395B1 (ko) * 2002-02-05 2009-02-17 삼성전자주식회사 재생모드를 자동 설정할 수 있는 기록매체 재생장치 및 그제어방법
JP4165895B2 (ja) * 2003-01-20 2008-10-15 エルジー エレクトロニクス インコーポレーテッド 記録された静止映像の再生を管理するためのデータ構造を有する記録媒体、それによる記録と再生の方法及び装置
US8050538B2 (en) * 2003-01-20 2011-11-01 Lg Electronics Inc. Recording medium having data structure for managing reproduction of still pictures recorded thereon and recording and reproducing methods and apparatuses
CN100418140C (zh) * 2003-01-20 2008-09-10 Lg电子株式会社 具有用于管理记录在其上的静止图象的再现的数据结构的记录媒体以及记录和再现的方法和装置
US8068723B2 (en) * 2003-02-19 2011-11-29 Panasonic Corporation Recording medium, playback apparatus, recording method, program, and playback method
JP2004357156A (ja) * 2003-05-30 2004-12-16 Sharp Corp 映像受信装置および映像再生装置
JP4393151B2 (ja) * 2003-10-01 2010-01-06 シャープ株式会社 画像データ表示装置
US8391672B2 (en) * 2004-02-06 2013-03-05 Panasonic Corporation Recording medium, reproduction device, program, and reproduction method
EP2413323A1 (en) * 2004-06-03 2012-02-01 Panasonic Corporation Reproduction device and method
KR20070047825A (ko) 2004-09-08 2007-05-07 마츠시타 덴끼 산교 가부시키가이샤 영상데이터와 애플리케이션을 연동시켜서 재생하는재생장치, 재생방법 및 프로그램
JP4602737B2 (ja) * 2004-10-25 2010-12-22 シャープ株式会社 映像表示装置
US7720350B2 (en) * 2004-11-30 2010-05-18 General Instrument Corporation Methods and systems for controlling trick mode play speeds
KR100657322B1 (ko) * 2005-07-02 2006-12-14 삼성전자주식회사 로컬 3차원 비디오를 구현하기 위한 인코딩/디코딩 방법 및장치
WO2007111208A1 (ja) * 2006-03-24 2007-10-04 Matsushita Electric Industrial Co., Ltd. 再生装置、デバッグ装置、システムlsi、プログラム
JP4765733B2 (ja) * 2006-04-06 2011-09-07 ソニー株式会社 記録装置、記録方法および記録プログラム
US8982181B2 (en) * 2006-06-13 2015-03-17 Newbery Revocable Trust Indenture Digital stereo photographic system
US7755684B2 (en) * 2006-08-29 2010-07-13 Micron Technology, Inc. Row driver circuitry for imaging devices and related method of operation
JP4755565B2 (ja) * 2006-10-17 2011-08-24 Sharp Corp Stereoscopic image processing device
EP2395772A3 (en) * 2008-09-30 2013-09-18 Panasonic Corporation Glasses and display device
MX2011006360A (es) * 2008-09-30 2011-07-13 Panasonic Corp Dispositivo de reproduccion, medio de grabacion y circuito integrado.
WO2010052857A1 (ja) * 2008-11-06 2010-05-14 Panasonic Corporation Playback device, playback method, playback program, and integrated circuit
US8335425B2 (en) * 2008-11-18 2012-12-18 Panasonic Corporation Playback apparatus, playback method, and program for performing stereoscopic playback

Patent Citations (11)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
JPH07327242A (ja) * 1994-05-31 1995-12-12 Sanyo Electric Co Ltd Stereoscopic moving image compression encoding device and stereoscopic moving image decoding/playback device
US5929859A (en) 1995-12-19 1999-07-27 U.S. Philips Corporation Parallactic depth-dependent pixel shifts
JPH11191895A (ja) * 1996-12-04 1999-07-13 Matsushita Electric Ind Co Ltd Optical disc for high-resolution and stereoscopic video recording, optical disc playback device, and optical disc recording device
JP2002095017A (ja) 2000-09-11 2002-03-29 Canon Inc Stereoscopic video display system
JP2003298938A (ja) 2002-04-01 2003-10-17 Canon Inc Multi-screen composition device, control method for the multi-screen composition device, control program therefor, and storage medium
JP2004274125A (ja) * 2003-03-05 2004-09-30 Sony Corp Image processing device and method
WO2005019675A1 (de) 2003-07-23 2005-03-03 Zf Friedrichshafen Ag Clutch arrangement in an automatic transmission with space-saving coolant supply
WO2005024828A1 (ja) * 2003-09-02 2005-03-17 Matsushita Electric Industrial Co., Ltd. Playback device, system integrated circuit, program, playback method, and information recording medium
JP2007221818A (ja) * 2004-06-02 2007-08-30 Matsushita Electric Ind Co Ltd Recording medium, playback device, recording method, playback method, and program
JP2008005203A (ja) * 2006-06-22 2008-01-10 Nikon Corp Image playback device
US20080036854A1 (en) 2006-08-08 2008-02-14 Texas Instruments Incorporated Method and system of communicating and rendering stereoscopic and dual-view images

Non-Patent Citations (1)

* Cited by examiner, † Cited by third party
Title
See also references of EP2348747A4 *

Cited By (8)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
JP2012015671A (ja) * 2010-06-30 2012-01-19 Hitachi Consumer Electronics Co Ltd Recording device/method/medium and playback device/method
JP2012015672A (ja) * 2010-06-30 2012-01-19 Hitachi Consumer Electronics Co Ltd Recording device/method/medium and playback device/method
WO2012017643A1 (ja) * 2010-08-06 2012-02-09 Panasonic Corporation Encoding method, display device, and decoding method
CN102447922A (zh) * 2010-10-13 2012-05-09 Altek Corporation Method for synthesizing dual-lens images into a single-lens image
US8890933B2 (en) 2010-11-26 2014-11-18 Kabushiki Kaisha Toshiba Parallax image conversion apparatus
WO2012161077A1 (ja) * 2011-05-26 2012-11-29 Sony Corp Recording device, recording method, playback device, playback method, program, and recording/playback device
WO2013100641A1 (ko) * 2011-12-27 2013-07-04 LG Electronics Inc. Digital broadcast receiving method and receiving device capable of displaying stereoscopic images
JP2016528800A (ja) * 2013-07-19 2016-09-15 Koninklijke Philips N.V. HDR metadata transfer

Also Published As

Publication number Publication date
US20100150523A1 (en) 2010-06-17
CN102027749B (zh) 2014-02-19
EP2348747A1 (en) 2011-07-27
US8548308B2 (en) 2013-10-01
JP5632291B2 (ja) 2014-11-26
JPWO2010058547A1 (ja) 2012-04-19
KR20110095128A (ko) 2011-08-24
EP2348747A4 (en) 2013-08-28
CN102027749A (zh) 2011-04-20

Similar Documents

Publication Publication Date Title
JP5632291B2 (ja) Playback device, integrated circuit, and playback method considering special playback
JP4564107B2 (ja) Recording medium, playback device, system LSI, playback method, recording method, and recording medium playback system
JP4772163B2 (ja) Playback device that performs stereoscopic playback, playback method, and program
JP5416073B2 (ja) Semiconductor integrated circuit
JP5728649B2 (ja) Playback device, integrated circuit, playback method, and program
WO2010038409A1 (ja) Playback device, recording medium, and integrated circuit
JP5457513B2 (ja) Playback device capable of playing back stereoscopic video
WO2010010709A1 (ja) Playback device capable of stereoscopic playback, playback method, and program
WO2010095410A1 (ja) Recording medium, playback device, and integrated circuit
WO2010032403A1 (ja) Playback device, playback method, and playback program for stereoscopic playback of video content
JP5469125B2 (ja) Recording medium, playback device, playback method, and program
US20100303437A1 (en) Recording medium, playback device, integrated circuit, playback method, and program

Legal Events

Date Code Title Description
WWE Wipo information: entry into national phase
Ref document number: 200980117335.9
Country of ref document: CN

121 Ep: the EPO has been informed by WIPO that EP was designated in this application
Ref document number: 09827328
Country of ref document: EP
Kind code of ref document: A1

WWE Wipo information: entry into national phase
Ref document number: 2010539135
Country of ref document: JP

WWE Wipo information: entry into national phase
Ref document number: 2009827328
Country of ref document: EP

ENP Entry into the national phase
Ref document number: 20107025390
Country of ref document: KR
Kind code of ref document: A

WWE Wipo information: entry into national phase
Ref document number: 4293/KOLNP/2010
Country of ref document: IN

NENP Non-entry into the national phase
Ref country code: DE