US8019199B2 - Reproduction device, reproduction method, reproduction program, recording medium, and data structure - Google Patents
- Publication number
- US8019199B2 US11/573,696 US57369605A
- Authority
- US
- United States
- Prior art keywords
- stream
- audio
- subtitle
- streams
- language
- Prior art date
- Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
- Expired - Fee Related, expires
Classifications
-
- G—PHYSICS
- G11—INFORMATION STORAGE
- G11B—INFORMATION STORAGE BASED ON RELATIVE MOVEMENT BETWEEN RECORD CARRIER AND TRANSDUCER
- G11B27/00—Editing; Indexing; Addressing; Timing or synchronising; Monitoring; Measuring tape travel
- G11B27/10—Indexing; Addressing; Timing or synchronising; Measuring tape travel
-
- G—PHYSICS
- G11—INFORMATION STORAGE
- G11B—INFORMATION STORAGE BASED ON RELATIVE MOVEMENT BETWEEN RECORD CARRIER AND TRANSDUCER
- G11B27/00—Editing; Indexing; Addressing; Timing or synchronising; Monitoring; Measuring tape travel
- G11B27/10—Indexing; Addressing; Timing or synchronising; Measuring tape travel
- G11B27/102—Programmed access in sequence to addressed parts of tracks of operating record carriers
- G11B27/105—Programmed access in sequence to addressed parts of tracks of operating record carriers of operating discs
-
- A—HUMAN NECESSITIES
- A61—MEDICAL OR VETERINARY SCIENCE; HYGIENE
- A61P—SPECIFIC THERAPEUTIC ACTIVITY OF CHEMICAL COMPOUNDS OR MEDICINAL PREPARATIONS
- A61P31/00—Antiinfectives, i.e. antibiotics, antiseptics, chemotherapeutics
- A61P31/04—Antibacterial agents
-
- G—PHYSICS
- G11—INFORMATION STORAGE
- G11B—INFORMATION STORAGE BASED ON RELATIVE MOVEMENT BETWEEN RECORD CARRIER AND TRANSDUCER
- G11B20/00—Signal processing not specific to the method of recording or reproducing; Circuits therefor
- G11B20/10—Digital recording or reproducing
-
- G—PHYSICS
- G11—INFORMATION STORAGE
- G11B—INFORMATION STORAGE BASED ON RELATIVE MOVEMENT BETWEEN RECORD CARRIER AND TRANSDUCER
- G11B20/00—Signal processing not specific to the method of recording or reproducing; Circuits therefor
- G11B20/10—Digital recording or reproducing
- G11B20/12—Formatting, e.g. arrangement of data block or words on the record carriers
-
- H—ELECTRICITY
- H04—ELECTRIC COMMUNICATION TECHNIQUE
- H04N—PICTORIAL COMMUNICATION, e.g. TELEVISION
- H04N21/00—Selective content distribution, e.g. interactive television or video on demand [VOD]
- H04N21/40—Client devices specifically adapted for the reception of or interaction with content, e.g. set-top-box [STB]; Operations thereof
- H04N21/41—Structure of client; Structure of client peripherals
- H04N21/426—Internal components of the client ; Characteristics thereof
- H04N21/42646—Internal components of the client ; Characteristics thereof for reading from or writing on a non-volatile solid state storage medium, e.g. DVD, CD-ROM
-
- H—ELECTRICITY
- H04—ELECTRIC COMMUNICATION TECHNIQUE
- H04N—PICTORIAL COMMUNICATION, e.g. TELEVISION
- H04N21/00—Selective content distribution, e.g. interactive television or video on demand [VOD]
- H04N21/40—Client devices specifically adapted for the reception of or interaction with content, e.g. set-top-box [STB]; Operations thereof
- H04N21/43—Processing of content or additional data, e.g. demultiplexing additional data from a digital video stream; Elementary client operations, e.g. monitoring of home network or synchronising decoder's clock; Client middleware
- H04N21/432—Content retrieval operation from a local storage medium, e.g. hard-disk
- H04N21/4325—Content retrieval operation from a local storage medium, e.g. hard-disk by playing back content from the storage medium
-
- H—ELECTRICITY
- H04—ELECTRIC COMMUNICATION TECHNIQUE
- H04N—PICTORIAL COMMUNICATION, e.g. TELEVISION
- H04N21/00—Selective content distribution, e.g. interactive television or video on demand [VOD]
- H04N21/40—Client devices specifically adapted for the reception of or interaction with content, e.g. set-top-box [STB]; Operations thereof
- H04N21/45—Management operations performed by the client for facilitating the reception of or the interaction with the content or administrating data related to the end-user or to the client device itself, e.g. learning user preferences for recommending movies, resolving scheduling conflicts
- H04N21/4508—Management of client data or end-user data
-
- H—ELECTRICITY
- H04—ELECTRIC COMMUNICATION TECHNIQUE
- H04N—PICTORIAL COMMUNICATION, e.g. TELEVISION
- H04N21/00—Selective content distribution, e.g. interactive television or video on demand [VOD]
- H04N21/40—Client devices specifically adapted for the reception of or interaction with content, e.g. set-top-box [STB]; Operations thereof
- H04N21/47—End-user applications
- H04N21/488—Data services, e.g. news ticker
- H04N21/4884—Data services, e.g. news ticker for displaying subtitles
-
- H—ELECTRICITY
- H04—ELECTRIC COMMUNICATION TECHNIQUE
- H04N—PICTORIAL COMMUNICATION, e.g. TELEVISION
- H04N5/00—Details of television systems
- H04N5/76—Television signal recording
- H04N5/91—Television signal processing therefor
- H04N5/93—Regeneration of the television signal or of selected parts thereof
-
- H—ELECTRICITY
- H04—ELECTRIC COMMUNICATION TECHNIQUE
- H04N—PICTORIAL COMMUNICATION, e.g. TELEVISION
- H04N9/00—Details of colour television systems
- H04N9/79—Processing of colour television signals in connection with recording
- H04N9/80—Transformation of the television signal for recording, e.g. modulation, frequency changing; Inverse transformation for playback
- H04N9/82—Transformation of the television signal for recording, e.g. modulation, frequency changing; Inverse transformation for playback the individual colour picture signal components being recorded simultaneously only
- H04N9/8205—Transformation of the television signal for recording, e.g. modulation, frequency changing; Inverse transformation for playback the individual colour picture signal components being recorded simultaneously only involving the multiplexing of an additional signal and the colour video signal
- H04N9/8211—Transformation of the television signal for recording, e.g. modulation, frequency changing; Inverse transformation for playback the individual colour picture signal components being recorded simultaneously only involving the multiplexing of an additional signal and the colour video signal the additional signal being a sound signal
-
- H—ELECTRICITY
- H04—ELECTRIC COMMUNICATION TECHNIQUE
- H04N—PICTORIAL COMMUNICATION, e.g. TELEVISION
- H04N9/00—Details of colour television systems
- H04N9/79—Processing of colour television signals in connection with recording
- H04N9/80—Transformation of the television signal for recording, e.g. modulation, frequency changing; Inverse transformation for playback
- H04N9/82—Transformation of the television signal for recording, e.g. modulation, frequency changing; Inverse transformation for playback the individual colour picture signal components being recorded simultaneously only
- H04N9/8205—Transformation of the television signal for recording, e.g. modulation, frequency changing; Inverse transformation for playback the individual colour picture signal components being recorded simultaneously only involving the multiplexing of an additional signal and the colour video signal
- H04N9/8233—Transformation of the television signal for recording, e.g. modulation, frequency changing; Inverse transformation for playback the individual colour picture signal components being recorded simultaneously only involving the multiplexing of an additional signal and the colour video signal the additional signal being a character code signal
-
- G—PHYSICS
- G11—INFORMATION STORAGE
- G11B—INFORMATION STORAGE BASED ON RELATIVE MOVEMENT BETWEEN RECORD CARRIER AND TRANSDUCER
- G11B20/00—Signal processing not specific to the method of recording or reproducing; Circuits therefor
- G11B20/10—Digital recording or reproducing
- G11B20/10527—Audio or video recording; Data buffering arrangements
- G11B2020/1062—Data buffering arrangements, e.g. recording or playback buffers
- G11B2020/1075—Data buffering arrangements, e.g. recording or playback buffers the usage of the buffer being restricted to a specific kind of data
- G11B2020/10787—Data buffering arrangements, e.g. recording or playback buffers the usage of the buffer being restricted to a specific kind of data parameters, e.g. for decoding or encoding
Definitions
- the present invention relates to a reproducing apparatus, a reproducing method, a reproducing program, a recording medium, and a data structure that allow an audio stream and a subtitle stream to be properly and automatically selected when a program is reproduced from a large capacity recording medium.
- As a random-accessible, attachable/detachable recording medium, the digital versatile disc (DVD) has been used for years. In recent years, disc-shaped recording media having a larger recording capacity than the DVD, as well as disc-shaped recording media smaller than the DVD, have been developed.
- Such recording media containing video and audio content such as movies, dramas, and music concerts have been sold as packaged media.
- For one program of content, a plurality of audio streams and subtitle streams of different languages can be recorded.
- A video stream, a plurality of audio streams of different languages, and a plurality of subtitle streams of different languages are multiplexed as one Moving Picture Experts Group 2 (MPEG2) program stream.
- Movie content created in a foreign country may contain a plurality of audio streams of different languages, which are for example an audio stream of English as an original language and an audio stream of Japanese as a Japanese dubbed audio stream, and a plurality of subtitle streams, which are a subtitle stream of English and a subtitle stream of Japanese.
- the user can select streams to be reproduced.
- the user may be able to select an audio stream of a Japanese dubbed version and a subtitle stream of an English version.
- The user may be able to select an audio stream of a Japanese dubbed version and select no subtitle stream.
- A DVD player apparatus has an automatic selection function that allows a priority language to be initially set as language preset information and an audio stream and a subtitle stream to be automatically selected based on the language preset information without the need for the user's selection.
- Besides the initial setting, the DVD player is provided with a function that selects the next stream to be reproduced depending on the reproduction history and the reproduction path of the content.
- audio that is selected and reproduced by priority can be initially set.
- Audio may often be initially set to "Japanese".
- In that case, when the DVD player apparatus reproduces for example an American movie or a French movie, as long as an audio stream of Japanese has been recorded, the audio stream of the Japanese dubbed version is automatically selected by priority.
- However, some users may want to listen to the audio of the original language in which the content was created. For example, users may want to watch movie content with audio of the original language and a subtitle translated into their mother language. For example, when a Japanese user watches movie content created in the United States, the movie content may be reproduced with English audio, the original language in which the content was created, and a Japanese subtitle, the user's mother language. When the user selects audio and subtitle languages in such a manner, he or she can enjoy the content in an atmosphere close to the original.
- Japanese Patent Application Laid-Open No. 2003-46951 discloses a reproducing apparatus having a mode in which a language of audio to be reproduced is selected and no subtitle is displayed, a mode in which a language of audio to be reproduced is set by default and a language of a subtitle is selected, and a mode in which the languages of the audio, the subtitle, and the menu are selected, so that languages can be easily set.
- However, the existing DVD video standard does not have a mechanism that denotes whether or not the language of a recorded audio stream is the original language in which the content was created.
- Thus, the original language cannot be automatically selected when the content of a disc is reproduced. Consequently, as a problem of the related art, whenever content is reproduced from a disc, the user needs to change the audio setting.
- an object of the present invention is to provide a reproducing apparatus, a reproducing method, a reproducing program, a recording medium, and a data structure that allow audio and a subtitle to be properly and automatically selected when content is reproduced from a disc.
- The present invention is a reproducing apparatus for reproducing content data from a disc-shaped recording medium, comprising read means for reading data from a recording medium on which content data containing at least a video stream and one or a plurality of audio streams corresponding to the video stream, and a reproduction control program with which reproduction of the content data is controlled, have been recorded; player means for reproducing the content data according to the reproduction control program; and first mode setting means for setting a first mode to the player means such that it automatically selects an audio stream of an original language from the one or plurality of audio streams when the content data are reproduced.
- The present invention is a reproducing method for reproducing content data from a disc-shaped recording medium, the method comprising the steps of reading data from a recording medium on which content data containing at least a video stream and one or a plurality of audio streams corresponding to the video stream, and a reproduction control program with which reproduction of the content data is controlled, have been recorded; reproducing the content data according to the reproduction control program; and setting a first mode to the content reproduction step such that an audio stream of an original language is automatically selected from the one or plurality of audio streams when the content data are reproduced.
- The present invention is a reproducing program causing a computer device to execute a reproducing method for reproducing content data from a disc-shaped recording medium, the reproducing method comprising the steps of reading data from a recording medium on which content data containing at least a video stream and one or a plurality of audio streams corresponding to the video stream, and a reproduction control program with which reproduction of the content data is controlled, have been recorded; reproducing the content data according to the reproduction control program; and setting a first mode to the content reproduction step such that an audio stream of an original language is automatically selected from the one or plurality of audio streams when the content data are reproduced.
- The present invention is a recording medium on which content data containing at least a video stream and one or a plurality of audio streams corresponding to the video stream, a reproduction control program with which reproduction of the content data is controlled, and stream information describing at least information which identifies each of the one or plurality of audio streams such that the information which identifies the audio stream used as an original language comes first, have been recorded.
- the present invention is a data structure comprising a video stream; content data containing one or a plurality of audio streams corresponding to the video stream; a reproduction control program with which reproduction of the content data is controlled; and stream information containing at least information which identifies each of the one or plurality of audio streams such that the information which identifies the audio stream used as an original language comes first.
- At least content data containing a video stream and one or a plurality of audio streams corresponding to the video stream and a reproduction control program with which reproduction of the content data is controlled have been recorded on a recording medium.
- When a reproducing apparatus reproduces content data from the recording medium according to the reproduction control program reproduced therefrom, a mode in which an audio stream of an original language is automatically selected from the one or plurality of audio streams is set. The user can enjoy the content in the original language reproduced from the disc without needing to check the original language of the content and set it on the reproducing apparatus.
- At least content data containing a video stream and one or a plurality of audio streams corresponding to the video stream, a reproduction control program with which reproduction of the content data is controlled, and stream information representing at least information that identifies each of one or a plurality of audio streams in such a manner that information identifying an audio stream of the original language comes first in the stream information have been recorded on a recording medium.
- A reproducing apparatus that reproduces content from the disc can check the arrangement of the information that identifies the audio streams in the stream information and identify the audio stream of the original language.
- a data structure contains at least content data containing a video stream and one or a plurality of audio streams corresponding to the video stream, a reproduction control program with which reproduction of the content data is controlled, and stream information representing at least information that identifies each of one or a plurality of audio streams such that information identifying an audio stream of the original language comes first in the stream information.
- the audio stream of the original language can be identified.
- audio streams and subtitle streams of different languages can be properly and automatically selected.
- attribute “original language” can be set.
- a language in which content was created can be automatically selected.
- The subtitle stream is automatically set not to be displayed.
- the user does not need to manually operate the apparatus not to display an undesired subtitle.
- the user-friendliness improves.
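- As a purely illustrative sketch of the arrangement described above, the following ECMA-script-style fragment models stream information in which the entry identifying the original-language audio stream comes first; the object layout, field names, and helper functions are assumptions for illustration only, not the syntax defined by the standard.

```javascript
// Illustrative sketch only: the real stream information is recorded in a clip
// information file; the names and values below are assumptions.
var streamInfo = {
  // The entry identifying the audio stream of the original language is arranged first.
  audio: [
    { streamNumber: 1, languageCode: "en" }, // original language
    { streamNumber: 2, languageCode: "ja" }  // dubbed version
  ],
  subtitle: [
    { streamNumber: 1, languageCode: "ja" },
    { streamNumber: 2, languageCode: "en" }
  ]
};

// In the "original language" mode, the player simply takes the first entry.
function selectOriginalAudio(info) {
  return info.audio[0];
}

// Otherwise, the player looks for the audio language that is set to the player.
function selectAudioByLanguage(info, audioLanguageCode) {
  for (var i = 0; i < info.audio.length; i++) {
    if (info.audio[i].languageCode === audioLanguageCode) {
      return info.audio[i];
    }
  }
  return info.audio[0]; // fall back to the original language
}
```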
- FIG. 1 is a schematic diagram showing the structure of layers according to the UMD video standard
- FIG. 2 is a schematic diagram showing an example of a player model according to an embodiment of the present invention.
- FIG. 3 is a schematic diagram showing an example of the internal structure of a movie player
- FIG. 4 is a schematic diagram describing three statuses of the movie player
- FIG. 5 is a schematic diagram showing an event model of the movie player according to the embodiment of the present invention.
- FIG. 6 is a schematic diagram showing examples of events that occur while a play list is being reproduced
- FIG. 7 is a schematic diagram showing a list of examples of properties of a movie player object
- FIG. 8 is a schematic diagram showing a list of examples of methods of the movie player object
- FIG. 9 is a schematic diagram showing examples of key inputs as user's inputs.
- FIG. 10 is a schematic diagram showing examples of key inputs as user's inputs
- FIG. 11A , FIG. 11B and FIG. 11C are schematic diagrams showing examples of control commands according to key inputs
- FIG. 12 is a schematic diagram showing examples of events according to key inputs
- FIG. 13 is a schematic diagram showing examples of event handlers
- FIG. 14 is a schematic diagram showing examples of event handlers
- FIG. 15 is a flow chart showing an example of a process that executes a predetermined program according to a user's input event
- FIG. 16 is a flow chart showing a process performed after a disc is loaded into a UMD video player until the disc is ejected therefrom;
- FIG. 17 is a schematic diagram showing an example of the structure of a script file
- FIG. 18 is a flow chart showing an example of a procedure that executes event handler onAutoPlay( );
- FIG. 19 is a flow chart showing an example of a procedure that executes event handler onContinuePlay( );
- FIG. 20 is a flow chart showing an example of a process performed upon completion of reproduction
- FIG. 21 is a schematic diagram describing an example of a script program
- FIG. 22 is a schematic diagram showing an example of a script program
- FIG. 23 is a schematic diagram describing a file management structure according to the UMD video standard.
- FIG. 24 is a schematic diagram showing an example of syntax of the entire structure of file “PLAYLIST.DAT”;
- FIG. 25 is a schematic diagram showing an example of the internal structure of block PlayItem( );
- FIG. 26 is a schematic diagram showing an example of the internal structure of block PlayListMark( );
- FIG. 27 is a schematic diagram describing field mark_type of block Mark( );
- FIG. 28 is a schematic diagram describing designation of a mark time in a clip AV stream file
- FIG. 29 is a schematic diagram showing an example of syntax that represents the entire structure of clip AV stream file “XXXX.CLP”;
- FIG. 30 is a schematic diagram describing correlation of block StreamInfo( ) and an elementary stream
- FIG. 31 is a schematic diagram showing an example of the internal structure of block StaticInfo( );
- FIG. 32 is a schematic diagram showing an example of the internal structure of block DynamicInfo( );
- FIG. 33 is a schematic diagram showing an example of the internal structure of block EP_map( );
- FIG. 34 is a block diagram showing an example of the structure of a disc reproducing apparatus according to the present invention.
- FIG. 35A and FIG. 35B are a functional block diagram describing in detail the operation of the disc reproducing apparatus
- FIG. 36 is a schematic diagram describing automatic selection of audio and subtitle stream according to an embodiment of the present invention.
- FIG. 37 is a schematic diagram describing the automatic selection of audio and subtitle streams according to the embodiment of the present invention.
- FIG. 38 is a schematic diagram describing the automatic selection of audio and subtitle streams according to the embodiment of the present invention.
- FIG. 39 is a schematic diagram showing examples of values of property audioFlag
- FIG. 40 is a schematic diagram showing examples of values of property subtitleFlag
- FIG. 41A and FIG. 41B are schematic diagrams showing a list of examples of arguments of method play( );
- FIG. 42 is a flow chart showing a flow of a process of automatically selecting audio stream and subtitle stream from a plurality of types of audio streams and subtitle streams;
- FIG. 43 is a flow chart describing in detail an example of a process of automatically selecting an audio stream
- FIG. 44 is a flow chart describing in detail an example of a process of automatically selecting an audio stream
- FIG. 45 is a flow chart describing in detail an example of a process of automatically selecting a subtitle stream
- FIG. 46 is a flow chart describing in detail an example of a process of automatically setting property subtitleFlag.
- FIG. 47 is a flow chart describing another example of the process of automatically setting property subtitleFlag.
- 1. UMD video standard
- 2. Player model according to UMD video standard
- 3. Event model of movie player
- 4. Movie player object
- 5. Example of script program
- 6. File management structure
- 7. Disc reproducing apparatus
- 8. Automatic selection of audio and subtitle streams
- The player model is described with a script language called the ECMA script.
- The ECMA script is a cross-platform script language based on JavaScript (registered trademark) and standardized by the European Computer Manufacturers Association (ECMA).
- The ECMA script has high compatibility with HTML documents.
- Since the ECMA script allows original objects to be defined, it can be suitably used for the player model according to the present invention.
- UMD (Universal Media Disc: registered trademark)
- UMD video script standard: the script part of the UMD video standard.
- FIG. 1 shows the structure of layers of the UMD video standard.
- the UMD video standard defines a three-layer structure composed of a script layer, a play list layer, and a clip layer. Streams are managed according to this layer structure.
- Digitally encoded video data, audio data, and subtitle data are multiplexed as elementary streams into one MPEG2 stream according to the MPEG2 (Moving Picture Experts Group 2) system.
- An MPEG2 stream in which elementary streams of video data, audio data, and subtitle data have been multiplexed is referred to as a clip AV stream.
- a clip AV stream is stored in a clip AV stream file.
- A clip information file is created for each clip AV stream file in a one-to-one relationship.
- a pair of a clip information file and a clip AV stream file corresponding thereto is referred to as a clip.
- a clip is a recording unit of a disc.
- the reproduction order of clips is managed in the play list layer higher than the clip layer.
- the play list layer is a layer that designates the reproduction path of clips.
- the play list layer contains one or a plurality of play lists.
- a play list is composed of a set of play items.
- a play item contains a pair of an IN point and an OUT point that represent the reproduction range of a clip. When play items are placed, clips can be reproduced in any order.
- a play item can redundantly designate clips.
- the IN point and the OUT point of a clip AV stream are designated with time stamps (intra-clip times). Time stamps are converted into byte positions of a clip AV stream file according to information of a clip information file.
- A play list has a structure that reproduces play items, which represent all or part of clips, in a predetermined order. With a play list alone, the reproduction order of clips cannot be changed. In addition, a play list does not provide the user with an interactive function. According to the embodiment of the present invention, a plurality of play lists are collectively stored in one file "PLAYLIST.DAT."
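- The play list layer described above can be pictured roughly as in the following sketch; the field names are assumptions for illustration and are not the syntax of file "PLAYLIST.DAT".

```javascript
// Illustrative model of a play list: an ordered set of play items, each pairing
// a clip with an IN point and an OUT point given as time stamps (intra-clip times).
var playList = {
  playItems: [
    { clipName: "00001.CLP", inTime: 0,     outTime: 90000 },
    { clipName: "00002.CLP", inTime: 45000, outTime: 180000 },
    // A play item may redundantly designate a clip used by another play item.
    { clipName: "00001.CLP", inTime: 90000, outTime: 135000 }
  ]
};

// The play items are reproduced in the order in which they are placed.
playList.playItems.forEach(function (item) {
  console.log("reproduce " + item.clipName +
              " from " + item.inTime + " to " + item.outTime);
});
```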
- the script layer is a layer composed of UMD video scripts as an extension of ECMA scripts as language specifications.
- A UMD video script is a script in which the ECMA script is extended to accomplish special functions based on the UMD video standard.
- the script layer is an upper layer of the play list layer.
- The script layer is composed of a sequence of commands that designate the reproduction of play lists and that set the player. With the commands in the script layer, play list reproduction including conditional branches is accomplished, so that one of the streams of a plurality of languages can be selected, or a play list selected under a predetermined condition can be reproduced.
- An example of an application that uses play list reproduction including a conditional branch is a multi-story content.
- the script layer provides the user with an interactive function.
- the script layer is composed of one file “SCRIPT.DAT.”
- File “SCRIPT.DAT” is managed as a resource.
- File “SCRIPT.DAT” contains script data described according to a real ECMA script, sound data for sound effects and so forth in button operations, a screen design composed of image data used for a background image and so forth of a menu screen, and image data (bit map data) for GUI parts such as button images.
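- A conditional branch of the kind described above might be written in a script program roughly as in the following sketch; the object name "movieplayer", its play( ) method, and the use of a play list number as the argument are assumptions standing in for the script API.

```javascript
// Hypothetical script-layer branch: reproduce a different play list depending on
// the menu display language that is set to the player.
function playMainFeature() {
  if (movieplayer.languageCode === "ja") {
    movieplayer.play(1);   // play list prepared for Japanese menus (illustrative)
  } else {
    movieplayer.play(2);   // play list prepared for other languages (illustrative)
  }
}
```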
- the player reads a script program, a play list, and a clip information file from a disc. Thereafter, the player reads a clip AV stream file in the reproduction order according to those files and reproduces video data, audio data, subtitle data, and so forth.
- a functional block that reproduces a play list is implemented as an object in the script program.
- the object that reproduces the play list is referred to as the movie player object.
- Commands that designate the reproduction of the play list and set the player are methods of the movie player object.
- the movie player object is controlled by the methods of the script layer.
- The movie player object requires a function that informs the script layer of a state change, a reproduction position, and so forth. This function corresponds to an operation in which the movie player object issues an event to the script program.
- a process corresponding to the event is described as an event handler.
- the script program can control the reproduction of a clip AV stream.
- FIG. 2 schematically shows an example of the player model according to the embodiment of the present invention.
- a movie player 300 is a module that reproduces video data, audio data, and subtitle data according to the UMD video standard.
- The movie player object is an object in a script program; through this object, the script program operates the movie player.
- the movie player object is a script program that accomplishes the function of the movie player.
- The movie player 300 reads a clip AV stream file according to a database of play lists and clip information, triggered by a method from a lower layer (the native implementation platform 301 in the example shown in FIG. 2), such as the user's input 310, or by a method from the script layer 302 as an upper layer, and decodes and displays the clip AV stream.
- the inside of the movie player object 300 depends on the implementation of the UMD video player that reproduces data from the UMD video disc.
- To the script layer 302, the movie player 300 appears as a black-box object that provides APIs (Application Programming Interfaces), namely methods and properties.
- the UMD video player represents a real device that implements a movie player. All UMD video players implement a movie player according to the UMD video standard and have reproduction compatibility with other UMD video players.
- the movie player 300 has three input/output paths that are a path through which a control command 311 is received from the native implementation platform 301 , a path through which the script layer 302 is informed of an event 312 , and a path through which a method 313 is received from the script layer 302 .
- the control command 311 is a command that is received from the native implementation platform 301 and that controls the operation of the movie player object 300 .
- the native implementation platform 301 is an interface between an original portion of the UMD video player as a real device and the movie player 300 .
- the event 312 is a script event sent from the movie player 300 to the script layer 302 .
- the method 313 is a method that a script program of the script layer 302 designates to the movie player 300 .
- the movie player object 300 has a database 320 for play lists and clip information according to the UMD video standard.
- the movie player object 300 masks the user's input 310 .
- the movie player object 300 performs for example a process that converts the reproduction position designated by a time into a byte position of a clip AV stream with the database 320 .
- a playback module 321 of the movie player object 300 decodes a clip AV stream, which is an MPEG2 PS (Program Stream) of which video data, audio data, and subtitle data have been multiplexed.
- the playback module 321 has three states that are play, stop, and pause.
- the playback module 321 changes among these states with a control command and a method (see FIG. 3 ).
- the script layer 302 is a layer that executes a script program according to the UMD video script standard, controls the movie player object 300 , and displays data on the display.
- the script layer 302 accomplishes a scenario that the content creator side intends.
- the script layer 302 issues the method 313 to the movie player object 300 .
- the script layer 302 receives the event 312 from the movie player object 300 .
- the script layer 302 exchanges a key event 314 according to the user's input 310 and a method 315 that causes the native implementation platform 301 to display data on the display with the native implementation platform 301 .
- buttons on the menu screen are generated by the native implementation platform 301 according to the method 315 supplied from the script program of the script layer 302 to the native implementation platform 301 .
- the key event 314 according to the user's input 310 is sent from the native implementation platform 301 to the script layer 302 .
- the script program of the script layer 302 performs a process with the key event 314 according to the user's input 310 .
- the movie player 300 performs decode and display controls for video data, audio data, and subtitle data.
- the script layer 302 performs arrange and display processes for part images that compose graphical user interfaces such as buttons (hereinafter, these part images are referred to as GUI parts) and processes against selection and decision operations of the GUI parts.
- the native implementation platform 301 is a platform for operations of the movie player object 300 and the script program.
- When the native implementation platform 301 is implemented as hardware, for example, the native implementation platform 301 intermediates processes between the hardware and the player model.
- the native implementation platform 301 receives the user's input 310 from the user and determines whether the received user's input 310 is a command for the movie player 300 or a command for a button generated and displayed in the script layer 302 . When the determined result represents that the user's input 310 is a command for the movie player 300 , the native implementation platform 301 converts the user's input 310 into the control command 311 that is an internal control command for the movie player 300 and issues a control command to the movie player 300 .
- the native implementation platform 301 informs the script layer 302 of the key event 314 according to the user's input 310 .
- the native implementation platform 301 can display for example a button image on the display according to the method 315 that the script layer 302 designates according to the key event 314 .
- the native implementation platform 301 and the script layer 302 can directly exchange an event and a method not through the movie player 300 .
- FIG. 3 shows an example of the internal structure of the movie player 300 .
- the movie player 300 is composed of the database 320 and the playback module 321 .
- the database 320 is an area that stores information of a play list read from the disc and information of clips, namely clip information.
- the playback module 321 is composed of a decoder engine 322 and a property 323 .
- the property 323 is a value that represents the state of the playback module 321 .
- the property 323 has two types of a property 323 A (read-only parameter) whose value depends on the initial setting of the movie player 300 like a language code and a property 323 B (player status) whose value depends on the state of the playback module 321 .
- The value of the property 323 A, which depends on the initial setting, is set by the native device, namely the real device.
- the value of the property 323 A is not changed by a play list, clip information, and a script program.
- the value of the property 323 A can be read from a script program.
- the value of the property 323 B which represents the state of the playback module 321 , can be read from a script program.
- the value of the property 323 B can be written from some script programs.
- the movie player object 300 reproduces a play list designated by the script layer 302 or the native implementation platform 301 .
- the movie player 300 references the database 320 and obtains the reproduction position of the clip AV stream as the byte position of the file according to the designated play list.
- the decoder engine 322 controls the decoding of the clip AV stream according to the information of the reproduction position.
- the movie player 300 has three states of play, stop, and pause depending on the reproduction state of a play list.
- the play state represents that a play list is being reproduced and a time has elapsed.
- The play state includes regular reproduction, variable speed reproduction such as double speed reproduction and 1/2 speed reproduction, fast forward, and fast reverse.
- The pause state represents that a play list is being reproduced and the time axis is stopped. So-called frame-by-frame reproduction, in which frames are reproduced forward or backward one at a time, is a state in which the pause state and the play state are repeated.
- the stop state represents that a play list is not being reproduced.
- the state of the movie player 300 depends on the state change among play, pause, and stop of the decoder engine 322 of the movie player 300 .
- the value of the property 323 B is updated according to the state change of the decoder engine 322 .
- The resume information 324 stores the state that exists immediately before the stop state occurs. When the state of the movie player 300 is changed from the play state to the stop state while a play list is being reproduced, the resume information 324 stores the state that existed immediately before the stop state occurred. In addition, the resume information 324 can be stored for each title of the disc in a nonvolatile memory of the player hardware. The disc has unique identification information (referred to as a title ID) for each title. The resume information 324 and the identification information are stored in correlation with each other. Thus, when the state of the disc having the title corresponding to the identification information is changed from the stop state to the play state, reproduction can be resumed from the position at which the stop state occurred.
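- A minimal sketch of how the resume information 324 might be correlated with the title ID and held in a nonvolatile memory follows; the storage object and field names are assumptions.

```javascript
// Illustrative sketch: resume information stored per title ID.
var nonvolatileMemory = {};   // stands in for the player's nonvolatile memory

function storeResumeInfo(titleId, playerStatus) {
  // Store the state that exists immediately before the stop state occurs.
  nonvolatileMemory[titleId] = {
    playListNumber: playerStatus.playListNumber,
    chapterNumber:  playerStatus.chapterNumber,
    playListTime:   playerStatus.playListTime,
    audioNumber:    playerStatus.audioNumber,
    subtitleNumber: playerStatus.subtitleNumber
  };
}

function loadResumeInfo(titleId) {
  // Returns the stored state, or null when there is no resume information,
  // in which case reproduction starts from the beginning.
  return nonvolatileMemory[titleId] || null;
}
```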
- Next, an event model of the movie player 300 is described. In the play state, the movie player 300 reproduces a play list and generates various events.
- The events cause process programs, which are described as scripts and referred to as event handlers, to be executed.
- the event handlers are methods called upon occurrence of events.
- a program execution model that starts executing a process program upon occurrence of an event is referred to as an event driven model.
- In an event driven model, events occur at irregular times.
- a script program controls the operations of the movie player object 300 with an event handler group.
- FIG. 5 schematically shows an event model of the movie player 300 according to the embodiment of the present invention.
- event handlers onEventA( ), onEventB( ), and onEventC( ) are interfaces.
- the contents of the event handlers are described as scripts.
- the contents of the event handlers are created and implemented by for example the content creator side.
- an event handler is provided for each event of which the movie player object 300 informs the script program.
- it is decided that a process program executed upon occurrence of event A is event handler onEventA( ). This applies to event B and event C.
- event B occurs
- corresponding event handler onEventB( ) is executed.
- event C occurs, corresponding event handler onEventC( ) is executed.
- Since the system side selects the event handler to be called upon occurrence of an event, the content creator side does not need to describe, in a script program, a process that determines what event occurred.
- FIG. 6 shows examples of events that occur while a play list is being reproduced. Since chapter mark ChapterMark is described at the beginning of play list PlayList, when the play list is reproduced from the beginning, event Chapter corresponding to the chapter mark occurs. Whenever the chapter is changed to another chapter, the script layer 302 is informed of event Chapter and the corresponding event handler onChapter is executed. When reproduction time for event mark EventMark elapses, a corresponding mark event occurs. At the end of the play list, the movie player 300 pauses the reproduction of the play list and informs the script layer 302 of event PlayListEnd. The script layer 302 side causes the movie player 300 to start reproducing another play list in the corresponding event handler onPlayListEnd( ). In such a manner, the movie player 300 continues to reproduce a sequence of play lists in the order that the content creator side intended.
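- The behaviour just described could be written by the content creator roughly as in the following sketch; the handler names follow the event model above, while the movieplayer object, its methods, and the handler argument are assumed stand-ins for the script API.

```javascript
// Hypothetical event handlers of the script layer.
function onChapter(chapterNumber) {
  // Called whenever a chapter mark is passed (the argument is an assumption);
  // for example, update an on-screen chapter display here.
  console.log("now reproducing chapter " + chapterNumber);
}

function onPlayListEnd() {
  // The movie player pauses at the end of a play list and issues this event;
  // the script side starts the next play list so that a sequence of play lists
  // is reproduced in the order the content creator intended.
  movieplayer.play(movieplayer.playListNumber + 1); // illustrative sequencing only
}
```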
- When an event handler corresponding to an event has not been described, the upper program executes an operation built into the player and defined in the standard (a default event handler), or ignores the event.
- As other event models, there may be an event listener model, a single-method model, and so forth.
- In an event listener model, an object registers a listener for a predetermined event with the player object.
- When an event that occurs in the player object is an event that has been registered, the player object transmits the event to the object that registered it.
- the object executes a corresponding method.
- In the single-method model, one method is called whatever event occurs.
- the event model according to the embodiment of the present invention is simpler than an event listener model that requires processes such as event registration process and event deletion process.
- In contrast, with the single-method model, the method needs to determine what event occurred and to perform a pre-process that switches the process routine according to each event that occurs. Since the method is implemented by the content creator side, even if the model itself is simple, the load on the content creator side increases. In addition, whenever an event occurs, one large process program (method) is called, so a large memory area will be used and the execution speed will become slow. Thus, since the model according to the embodiment of the present invention provides process programs (event handlers) for individual events, the model is superior to the other models in these respects.
- an object defined according to the ECMA script language specifications has a property and a method.
- the movie player object 300 according to the embodiment of the present invention has a property and a method.
- When an external object designates an object name and a property name, it can directly read and write the property.
- method setXXX( ) (where “XXX” represents a property name) that sets a property value and method getXXX( ) that reads a property value are defined
- the methods can read and write properties of other objects.
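- The accessor convention mentioned above can be illustrated with a hypothetical object and a hypothetical property "speed"; this is a generic ECMA-script sketch, not part of the movie player object itself.

```javascript
// Illustrative object following the setXXX( ) / getXXX( ) convention.
var someObject = {
  speed: 1,
  setSpeed: function (value) { this.speed = value; }, // setXXX( ) sets a property value
  getSpeed: function ()      { return this.speed; }   // getXXX( ) reads a property value
};

someObject.setSpeed(2);             // write the property through the method
console.log(someObject.getSpeed()); // read the property through the method -> 2
console.log(someObject.speed);      // direct access with object name and property name
```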
- FIG. 7 shows a list of examples of properties that the movie player object 300 has. These properties correspond to the property 323 shown in FIG. 3 .
- the properties that belong to the read-only parameters 323 A shown in FIG. 3 are as follows.
- Property scriptVersion represents the version of the UMD video script.
- Property languageCode represents the language code of the menu display language that is set to the UMD video player.
- Property audioLanguageCode represents the language code of the audio language that is set to the UMD video player.
- Property subtitleLanguageCode represents the language code of the subtitle language that is set to the UMD video player.
- A script file that is read from the disc is decided according to the language code represented by property languageCode, which is set in the read-only parameters 323 A.
- a default script file is read from the disc. For example, a file recorded at the beginning of a plurality of script files is read as a default script file.
- Property playListNumber represents the play list number of a play list that is currently being reproduced.
- Property chapterNumber represents the chapter number of a chapter that is currently being reproduced.
- Property videoNumber represents the video stream number of a video stream that is currently being reproduced.
- Property audioNumber represents the audio stream number of an audio stream that is currently being reproduced.
- Property subtitleNumber represents the subtitle stream number of a subtitle stream that is currently being reproduced.
- Property playListTime represents the time of the play list when the beginning of the play list is 0.
- Property audioFlag designates ON/OFF of the audio reproduction and dual monaural LR.
- Property subtitleFlag represents ON/OFF of the subtitle indication.
- The dual monaural is a mode in which the left and right (L, R) channels of stereo audio are independently used as monaural audio channels.
- Each property that belongs to the player status 323 B represents this information.
- each property that belongs to the player status 323 B is backed up as the resume information 324 . At this point, the contents of the player status 323 B may be cleared.
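- As an illustration of the two groups of properties, the following sketch reads and writes a few of them on an assumed movieplayer object; which properties are actually writable from a script, and the concrete flag values, follow the standard (see FIG. 39 and FIG. 40), so the values below are placeholders.

```javascript
// Read-only parameters: set by the native device, not changed by scripts.
console.log(movieplayer.scriptVersion);
console.log(movieplayer.audioLanguageCode);
console.log(movieplayer.subtitleLanguageCode);

// Player status: reflects the state of the playback module.
console.log(movieplayer.playListNumber);
console.log(movieplayer.chapterNumber);
console.log(movieplayer.playListTime);

// Flags controlling audio reproduction and subtitle indication (placeholder values).
movieplayer.audioFlag = 1;     // e.g. audio reproduction ON
movieplayer.subtitleFlag = 0;  // e.g. subtitle indication OFF
```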
- FIG. 8 shows a list of examples of methods that the movie player object 300 has.
- the methods correspond to the method 313 shown in FIG. 3 .
- Method play( ) reproduces video data.
- Method playChapter( ) designates a chapter and reproduces video data of the designated chapter.
- Method stop( ) stops reproducing video data.
- Method pause( ) pauses the reproduction of video data.
- Method playStep( ) reproduces video data step by step.
- Method changeStream( ) changes a video stream, an audio stream, and/or a subtitle stream.
- Method getPlayerStatus( ) obtains the play state, the stop state, the pause state, or the like of the movie player 300 .
- Method reset( ) stops the reproduction of video data and clears the contents of the resume information 324 .
- video data can be displayed at a part of the display screen.
- the following four methods are methods that display video data at a part of the display screen.
- Method setPos( ) sets the display position of video data.
- Method getPos( ) obtains the display position of video data.
- Method setSize( ) sets the display size of video data.
- Method getSize( ) obtains the display size of video data.
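- A sketch of how a script might call the methods listed above on an assumed movieplayer object follows; the argument lists are simplified and are not the exact signatures shown in FIG. 41A and FIG. 41B.

```javascript
// Hypothetical calls to the movie player methods (arguments simplified).
movieplayer.play();                         // start reproducing video data
movieplayer.playChapter(3);                 // designate a chapter and reproduce it
movieplayer.pause();                        // pause the reproduction
movieplayer.changeStream(2);                // change e.g. the audio stream
var status = movieplayer.getPlayerStatus(); // play, stop, pause, ...

// Displaying video data at a part of the display screen.
movieplayer.setPos(100, 50);                // display position (arguments are assumptions)
movieplayer.setSize(480, 272);              // display size (arguments are assumptions)

movieplayer.stop();                         // stop reproducing video data
movieplayer.reset();                        // stop and clear the resume information
```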
- the movie player 300 and the native implementation platform 301 are integrated.
- In other words, the movie player 300 and the native implementation platform 301 correspond to a UMD video player, namely hardware that loads a disc and reproduces video data from the disc and software that controls the hardware.
- What portion is hardware and what portion is software depends on the implemented structure. For example, when the UMD player is a personal computer or the like, all portions except for the disc drive are composed of software.
- the video decoder, the audio decoder, and so forth may be composed of hardware.
- methods, commands, and events exchanged between the movie player 300 and the native implementation platform 301 are not limited to those explicitly shown in FIG. 2 .
- the user's input 310 is received first by the native implementation platform 301 .
- the native implementation platform 301 receives a key input of the user as the user's input 310 .
- the native implementation platform 301 determines whether the user's input 310 is a command to the movie player 300 or an event to a script program of the script layer 302 .
- the native implementation platform 301 generates the control command 311 or the key event 314 and informs the corresponding upper layer (movie player 300 or the script layer 302 ) of the generated control command 311 or key event 314 .
- FIG. 9 and FIG. 10 show examples of key inputs of the user's input 310 .
- Keys having prefix "VK" are abstracted virtual keys that do not depend on the implementation.
- FIG. 9 shows examples of key inputs with respect to the operations of the movie player 300 .
- Key VK_POWER provides a function corresponding to a power key.
- Key VK_POWER_ON provides a function corresponding to a power ON key.
- Key VK_POWER_OFF provides a function corresponding to a power OFF key.
- Key VK_MENU provides a function corresponding to a menu key that displays a menu.
- Key VK_ENTER provides a function corresponding to an enter key that ends a command or data input.
- Key VK_RETURN provides a function that returns the process by one step.
- Key VK_PLAY provides a function corresponding to a play key that starts the reproduction operation.
- Key VK_STOP provides a function corresponding to a stop key that stops the reproduction operation.
- Key VK_PAUSE provides a function corresponding to a pause key that pauses the reproduction operation.
- Key VK_FAST_FORWARD provides a function corresponding to a fast forward key that performs the fast forward reproduction operation.
- Key VK_FAST_REVERSE provides a function corresponding to a fast reverse key that performs the fast reverse reproduction operation.
- Key VK_SLOW_FORWARD provides a function corresponding to a slow (forward) key that performs the forward slow reproduction operation.
- Key VK_SLOW_REVERSE provides a function corresponding to a slow (reverse) key that performs the reverse slow reproduction operation.
- Key VK_STEP_FORWARD provides a function corresponding to a step (forward) key that performs the forward step reproduction operation.
- Key VK_STEP_REVERSE provides a function corresponding to a frame (reverse) key that performs the reverse step reproduction operation.
- FIG. 10 shows key inputs with respect to the menu operations.
- Key VK_NEXT provides a function corresponding to a next designation key that inputs a value that represents “next.”
- Key VK_PREVIOUS provides a function corresponding to a previous designation key that inputs a value that represents “previous.” With key VK_NEXT and key VK_PREVIOUS, the user can designate for example the movement to the next chapter and the previous chapter, respectively.
- Key VK_UP provides a function corresponding to an up direction designation key that inputs a value that represents “up.”
- Key VK_DOWN provides a function corresponding to a down direction designation key that inputs a value that represents “down.”
- Key VK_RIGHT provides a function corresponding to a right direction designation key that inputs a value that represents "right."
- Key VK_LEFT provides a function corresponding to a left direction designation key that inputs a value that represents “left.”
- Key VK_UP_RIGHT provides a function corresponding to an upper right direction designation key that inputs a value that represents “upper right.”
- Key VK_UP_LEFT provides a function corresponding to an upper left direction designation key that inputs a value that represents “upper left.”
- Key VK_DOWN_RIGHT provides a function corresponding to a down right direction designation key that inputs a value that represents “down right.”
- Key VK_DOWN_LEFT provides a function corresponding to a down left direction designation key that inputs a value that represents "down left."
- Key VK_ANGLE provides a function corresponding to an angle change key that designates an angle change operation for multi-angle video data.
- Key VK_SUBTITLE provides a function corresponding to a subtitle change key that designates English subtitle, Japanese subtitle, and subtitle ON/OFF.
- Key VK_AUDIO provides a function corresponding to an audio change key that designates an audio mode such as surround mode or bilingual mode.
- Key VK_VIDEO_ASPECT provides a function corresponding to an aspect change key that changes an aspect ratio of video data.
- Key VK_COLORED_KEY_1 provides a function corresponding to a colored function key 1.
- Key VK_COLORED_KEY_2 provides a function corresponding to a colored function key 2.
- Key VK_COLORED_KEY_3 provides a function corresponding to a colored function key 3.
- Key VK_COLORED_KEY_4 provides a function corresponding to a colored function key 4.
- Key VK_COLORED_KEY_5 provides a function corresponding to a colored function key 5.
- Key VK_COLORED_KEY_6 provides a function corresponding to a colored function key 6.
- Since the functions of the key inputs shown in FIG. 9 differ in their roles from those of the key inputs shown in FIG. 10, the native implementation platform 301 needs to select the destinations that are informed of the key inputs. As described above, the key inputs shown in FIG. 9 designate the reproduction operations for video data, audio data, and subtitle data. When the native implementation platform 301 receives one of the key inputs shown in FIG. 9 as the user's input 310, it converts the received key input into a command shown in FIG. 11A, FIG. 11B, and FIG. 11C and informs the movie player 300 of the converted command.
- On the other hand, the script layer 302, which structures a screen and generates buttons, needs to be informed of the key inputs shown in FIG. 10.
- When the native implementation platform 301 receives one of the key inputs shown in FIG. 10 as the user's input 310, it converts the key input into the key event 314 shown in FIG. 2 and informs the script layer 302 of the key event 314.
- FIG. 12 shows examples of the key event 314 according to the key inputs.
- FIG. 9 and FIG. 10 also show key inputs with respect to stream change operations, such as key VK_ANGLE, key VK_SUBTITLE, and key VK_AUDIO. These key inputs accomplish the same functions as the stream change methods that the script program issues to the movie player 300.
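- The selection the native implementation platform 301 has to make can be pictured roughly as follows: reproduction-related keys become control commands for the movie player 300, while menu-operation keys become key events for the script layer 302. The mapping below covers only a few keys, and the dispatch functions are assumptions.

```javascript
// Illustrative routing of the user's input 310 in the native implementation platform 301.
function dispatchUserInput(virtualKey) {
  switch (virtualKey) {
    // Keys of FIG. 9: converted into control commands for the movie player 300.
    case "VK_PLAY":  sendControlCommand("uo_play()");    break;
    case "VK_STOP":  sendControlCommand("uo_stop()");    break;
    case "VK_PAUSE": sendControlCommand("uo_pauseOn()"); break;
    // Keys of FIG. 10: converted into key events for the script layer 302.
    case "VK_UP":    sendKeyEvent("up");                 break;
    case "VK_ENTER": sendKeyEvent("push");               break;
    default:         /* other keys omitted in this sketch */ break;
  }
}

// The two destinations shown in FIG. 2; the bodies here are placeholders.
function sendControlCommand(command) { console.log("to movie player: " + command); }
function sendKeyEvent(eventName)     { console.log("to script layer: " + eventName); }
```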
- Command uo_timeSearch(playListTime) designates the reproduction of a play list that is being reproduced from a designated time.
- Argument playListTime represents the time of the play list when the beginning of the play list is 0. Since this command does not designate a play list number, the time represented by argument playListTime is a designated time in the range of the play list being reproduced.
- Command uo_play( ) designates the start of the reproduction at a predetermined reproduction speed such as regular reproduction speed. The start position of the play list is decided according to the resume information 324 . When there is no information corresponding to the resume information 324 , the user's operation is invalidated. This command corresponds to the execution of method play( ) without the play list number designated. With this command, the user cannot designate a play list number.
- Command uo_playChapter(chapterNumber) starts reproducing the play list being reproduced from a chapter designated by argument chapterNumber. Without the chapter number designated, this command starts reproducing the play list from the beginning of the chapter being reproduced. This command corresponds to method playChapter( ) without the chapter number designated.
- Command uo_playPrevChapter( ) starts reproducing the play list from the immediately previous chapter.
- Command uo_playNextChapter( ) starts reproducing the play list from the immediately next chapter.
- Command uo_stop( ) stops reproducing the play list.
- Command uo_jumpToEnd( ) jumps to the end of the play list.
- This command corresponds to a user's operation that causes the movie player 300 to stop the reproduction and generate event playListEnd.
- the script layer 302 executes event handler onPlayListEnd.
- Command uo_forwardScan(speed) forward reproduces the play list at a reproduction speed designated by argument speed.
- Command uo_backwardScan(speed) backward reproduces the play list at a reproduction speed designated by argument speed.
- Argument speed of these commands uo_forwardScan(speed) and uo_backwardScan(speed) depends on the implementation of the UMD video player.
- Command uo_playStep(forward) forward reproduces the play list step by step.
- Command uo_playStep(backward) backward reproduces the play list step by step.
- Command uo_pauseOn( ) pauses the reproduction of the play list according to a user's operation.
- Command uo_pauseOff( ) cancels the pause state of the reproduction of the play list according to a user's operation.
- Command uo_changeAudioChannel(value) changes the channel of audio data or one channel of dual monaural reproduction. When this command is executed, the value of flag audioFlag needs to be accordingly changed.
- Command uo_setAudioEnabled(Boolean) turns ON/OFF the audio stream. When this command is executed, the value of flag audioFlag needs to be accordingly changed.
- Command uo_setSubtitleEnabled(Boolean) turns ON/OFF the subtitle stream.
- the value of flag subtitleFlag needs to be accordingly changed.
- Command uo_angleChange( ) changes the display angle. When the movie player 300 is informed of the user's operation for this command, the movie player 300 informs the script layer 302 of event angleChange.
- Command uo_audioChange(audioStreamNumber) changes the audio stream to be reproduced.
- Command uo_subtitleChange(subtitleStreamNumber) changes the subtitle stream to be reproduced.
- Event menu jumps to a menu.
- the native implementation platform 301 informs the script layer 302 rather than the movie player 300 of this event.
- the script layer 302 executes event handler onMenu.
- Event exit is an event that the native implementation platform 301 issues when it completes the UMD video application.
- the script layer 302 executes event handler onExit.
- Event up, event down, event left, event right, event focusIn, event focusOut, event push, and event cancel are events that occur when button images as GUI parts on the screen are focused.
- the native implementation platform 301 informs the script layer 302 rather than the movie player 300 of these events.
- Event up, event down, event left, and event right occur when an up button image, a down button image, a left button image, and a right button image are focused, respectively.
- Event focusIn occurs when any button image is focused.
- Event focusOut occurs when any focused button image is defocused.
- Event push occurs when a press operation is performed for any focused button image.
- Event cancel occurs when a cancel operation is performed against the press operation for any button image.
- Event autoPlay and event continuePlay are events that cause the script layer 302 to start executing a script.
- Event autoPlay is an event that causes a script to automatically start executing when a disc is loaded.
- Event continuePlay causes a script to resume executing, when a disc is loaded, from the position at which the script was stopped, according to, for example, the resume information 324.
- Event handlers: there are programs that are executed when the events shown in FIG. 12 occur. These programs corresponding to the events are referred to as event handlers. Events and event handlers can be correlated, for example, by their names: an event handler name is created by adding the prefix "on" to the event name. FIG. 13 and FIG. 14 show examples of event handlers. When the content creator describes the contents of the event handlers, the UMD video player can perform the various operations that the content creator intends.
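- As a small illustration of this naming convention only (the lookup helper itself is an assumption, not a mechanism defined in this document), an event handler name can be derived from an event name by prefixing "on" and capitalizing the first letter:

    #include <ctype.h>
    #include <stdio.h>

    /* "mark" -> "onMark", "playListEnd" -> "onPlayListEnd" */
    static void handler_name_for_event(const char *event, char *out, size_t out_size)
    {
        snprintf(out, out_size, "on%c%s", toupper((unsigned char)event[0]), event + 1);
    }

    int main(void)
    {
        char name[64];
        handler_name_for_event("playListEnd", name, sizeof(name));
        printf("%s\n", name);   /* prints "onPlayListEnd" */
        return 0;
    }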
- FIG. 13 shows examples of events that the movie player 300 has and corresponding event handlers. Events shown in FIG. 13 correspond to the event 312 shown in FIG. 2 .
- the movie player 300 informs the script layer 302 of the events shown in FIG. 13 .
- the event handlers are kinds of interfaces. The contents of the event handlers are implemented by the content creator using the script language. Since the event handlers have such a structure, when events occur, operations that the content creator intends can be accomplished.
- Event mark and event handler onMark( ) are executed when an event mark is detected.
- An event mark is embedded in, for example, a play list. While the movie player 300 is reproducing the play list, the movie player 300 detects the event mark from the play list. When the movie player 300 detects an event mark, the movie player 300 informs the script layer 302 of event mark. The script layer 302 executes event handler onMark( ) corresponding to event mark. Likewise, event playListEnd and event handler onPlayListEnd( ) are executed when the reproduction of a play list is completed. Event chapter and event handler onChapter( ) are executed when a chapter mark is detected. A chapter mark is embedded in, for example, a play list and detected by the movie player 300 while it is reproducing the play list.
- Event angleChange and event handler onAngleChange( ) are executed when the angle change is designated by a user's operation. For example, when key input VK_ANGLE is input to the native implementation platform 301 by the user's operation as the user's input 310 , the native implementation platform 301 converts the user's input 310 into command uo_angleChange( ) and supplies it to the movie player 300 . The movie player 300 generates event angleChange corresponding to command uo_angleChange and supplies event angleChange to the script layer 302 . The script layer 302 executes event handler onAngleChange( ) corresponding to event angleChange.
- Event audioChange and event handler onAudioChange( ) are executed when an audio change is designated by a user's operation.
- Event subtitleChange and event handler onSubtitleChange( ) are executed when a subtitle change is designated by a user's operation.
- FIG. 14 shows examples of event handlers that the system object has.
- the event handlers shown in FIG. 14 are event handlers that the native implementation platform 301 has in advance.
- the native implementation platform 301 informs the script layer 302 of the event handlers.
- Event menu and event handler onMenu( ) jump to a menu.
- Event menu is an event of which the native implementation platform 301 informs the script layer 302 when the menu key is pressed by a user's operation.
- the script layer 302 receives the event, executes the corresponding event handler onMenu( ), and arranges and displays GUI parts that compose a menu screen with event handler onMenu( ).
- Event exit and event handler onExit( ) are an event and a corresponding event handler that the native implementation platform 301 generates when it completes the UMD video application.
- the native implementation platform 301 informs the script layer 302 of event exit.
- When the script layer 302 receives event exit, the script performs an exit process with event handler onExit( ). Event autoPlay, event handler onAutoPlay( ), event continuePlay, and event handler onContinuePlay( ) start executing the corresponding scripts.
- Besides the event handlers for the system object, there are event handlers for buttons.
- Since the event handlers for buttons do not closely relate to the present invention, their description will be omitted.
- FIG. 15 shows an example in which, while the UMD video player is normally reproducing data from a disc, the user presses the "next" key to cause the UMD video player to reproduce the next chapter; the UMD video player jumps to the next chapter according to the key input, starts reproducing it, and displays a prepared message on the screen.
- the native implementation platform 301 generates user command uo_playNextChapter( ) corresponding to the user's input 310 (at step S 11 ).
- the native implementation platform 301 informs the movie player 300 of user command uo_playNextChapter( ).
- When the movie player 300 receives command uo_playNextChapter( ), it searches the database 320 for the position of the next chapter mark based on the current reproduction position corresponding to the play list information (at step S 12 ). At step S 13 , it is determined whether the next chapter mark exists. When the determined result represents that the next chapter mark does not exist, the movie player 300 does not perform the chapter jump operation, but continues the current reproduction operation.
- At step S 14 , the movie player 300 stops the current reproduction operation and obtains the byte position of the next chapter mark in the clip AV stream file from the feature point information of the clip information file in the database 320 .
- At step S 15 , the movie player 300 accesses the obtained byte position of the file and starts reproducing the stream from that position.
- At step S 16 , a process is performed that displays, on the screen, a message informing the user that the chapter has been changed.
- event chapter occurs (at step S 16 ).
- When the movie player 300 detects the chapter mark at the beginning of the chapter, event chapter occurs.
- the movie player 300 informs the script layer 302 of event chapter.
- the movie player 300 also informs the script layer 302 of the chapter number of the chapter to be jumped.
- the script layer 302 starts executing an event handler corresponding to the informed event, for example event handler onChapter( ) (at step S 17 ).
- a script of the script layer 302 executes the event handler, obtains the chapter number of which the movie player 300 informed the script layer 302 when the event occurred (at step S 18 ), and causes the native implementation platform 301 to display a predetermined message that represents for example the beginning of the obtained chapter number on the screen.
- the native implementation platform 301 displays the message on the screen (at step S 19 ) and completes the process of the event handler (at step S 20 ).
- Thus, when the user operates the "next" key to cause the movie player 300 to start reproducing the next chapter, the movie player 300 performs the chapter jump operation and, when it starts reproducing the jumped-to chapter, displays on the screen a message that represents the beginning of that chapter.
- In this manner, a user's input event causes the state of the movie player 300 to change and can also cause a new event to occur.
- Using these events, the movie player 300 can perform various processes.
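- The chapter-jump flow of FIG. 15 can be condensed into the following self-contained C sketch. The chapter-mark tables and helper names are hypothetical stand-ins for the play list information and the feature point information of the clip information file in the database 320 ; only the control flow of steps S 12 to S 15 is illustrated, not an actual implementation.

    #include <stdio.h>

    #define NUM_CHAPTER_MARKS 3
    static const long chapter_mark_time[NUM_CHAPTER_MARKS] = {0, 1800, 3600};       /* seconds (sample data) */
    static const long chapter_mark_byte[NUM_CHAPTER_MARKS] = {0, 90000000, 180000000};

    /* S12: search for the next chapter mark after the current reproduction time */
    static int find_next_chapter_mark(long current_time)
    {
        for (int i = 0; i < NUM_CHAPTER_MARKS; i++)
            if (chapter_mark_time[i] > current_time)
                return i;
        return -1;                                   /* S13: no next chapter mark */
    }

    static void uo_playNextChapter(long current_time)
    {
        int next = find_next_chapter_mark(current_time);
        if (next < 0) {
            printf("no next chapter; continue current reproduction\n");
            return;
        }
        /* S14: stop reproduction, obtain the byte position of the next chapter mark */
        long byte_pos = chapter_mark_byte[next];
        /* S15: access that byte position and restart reproduction from there;
         * detecting the chapter mark then raises event chapter (S16 onward). */
        printf("jump to chapter %d at byte %ld\n", next, byte_pos);
    }

    int main(void) { uo_playNextChapter(2000); return 0; }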
- FIG. 16 shows a process after a disc is loaded into the UMD video player until the disc is ejected therefrom.
- hatched steps represent states in which a script is being executed.
- the native implementation platform 301 references the resume information 324 and loads continuous reproduction information corresponding to the disc from the resume information 324 (at step S 31 ).
- the resume information 324 corresponding to the disc is referenced. It is determined whether the continuous reproduction information exists (at step S 32 ). When the continuous reproduction information exists, the native implementation platform 301 informs the script layer of event continuePlay. The script layer 302 executes event handler onContinuePlay corresponding to the informed event continuePlay (at step S 33 ). When the determined result at step S 32 represents that the continuous reproduction information corresponding to the disc does not exist, the flow advances to step S 34 . At step S 34 , the native implementation platform 301 informs the script layer 302 of event autoPlay. The script layer 302 executes event handler onAutoPlay corresponding to event autoPlay.
- At step S 35 , the reproduction operation for the disc and other operations are performed according to the contents of event handler onAutoPlay and event handler onContinuePlay. An event that occurs corresponding to the reproduction operation for the disc and the event handler corresponding to that event are executed.
- At step S 36 , the script layer 302 executes event handler onExit corresponding to event exit.
- Event handler onExit executes a process that completes the UMD video application.
- Event exit is generated by the native implementation platform 301 according to the user's input 310 as a predetermined operation on for example the remote control commander.
- When the script process according to event handler onExit is completed, the native implementation platform 301 continues the process. At step S 37 , the movie player 300 executes a process that stops the reproduction operation. At this point, the state that exists immediately before the movie player 300 stops the reproduction operation is stored as continuous reproduction information in the resume information 324 . The reproduction operation for the disc is completed (at step S 38 ). When the reproduction operation for the same disc is not performed (at step S 39 ), the flow advances to step S 40 . At step S 40 , the native implementation platform 301 ejects the disc and completes the sequence of steps of the process. When the reproduction operation for the same disc is performed, the flow returns to step S 31 .
- FIG. 17 shows an example of the structure of a script file.
- a script file is file “SCRIPT.DAT” that composes the script layer 302 .
- a script file is composed of an event handler group and a main process portion.
- the event handler group is composed of one or a plurality of event handlers. Whenever the script layer 302 is informed of occurrence of an event, an event handler corresponding to the informed event is retrieved and executed.
- the main process portion describes definitions of global variables used in event handlers. The main process portion is initially executed one time.
- FIG. 18 shows an example of a procedure that executes event handler onAutoPlay( ).
- the movie player 300 performs this procedure.
- the native implementation platform 301 determines whether the script contains event handler onAutoPlay( ).
- the native implementation platform 301 informs the script layer 302 of event autoPlay (at step S 52 ).
- the script layer 302 executes event handler onAutoPlay( ).
- the movie player 300 automatically starts reproducing data from the loaded disc.
- When the determined result at step S 51 represents that the script does not contain event handler onAutoPlay( ), the flow advances to step S 53 .
- At step S 53 , the native implementation platform 301 informs the script layer 302 of event exit. In this case, when the user operates the menu key for the reproduction operation on the menu screen implemented in the native implementation platform 301 , the movie player 300 starts reproducing data from the disc.
- When the script layer 302 has event handler onExit( ), the script layer 302 executes event handler onExit( ).
- FIG. 19 shows an example of a procedure that executes event handler onContinuePlay( ).
- the movie player 300 performs this procedure.
- the native implementation platform 301 determines whether the resume information 324 corresponding to the loaded disc exists. When the resume information 324 does not exist, the flow advances to step S 62 .
- the movie player 300 performs the reproduction operation for the disc from the beginning.
- At step S 63 , the native implementation platform 301 determines whether the script contains event handler onContinuePlay( ).
- When the script contains event handler onContinuePlay( ), the native implementation platform 301 informs the script layer 302 of event continuePlay. Accordingly, the script layer 302 executes event handler onContinuePlay( ) (at step S 64 ).
- the movie player 300 resumes the reproduction for the loaded disc according to event handler onContinuePlay( ).
- At step S 65 , the native implementation platform 301 executes the default event handler onContinuePlay( ).
- the default event handler onContinuePlay( ) simply starts the reproduction operation from the last reproduction end position according to for example the resume information 324 .
- In this example, at step S 60 , after the user causes the movie player 300 to perform the continuous reproduction operation, the native implementation platform 301 determines whether the resume information 324 corresponding to the loaded disc exists. Instead, the order may be reversed: the native implementation platform 301 may first determine whether the resume information 324 corresponding to the loaded disc exists and, when the resume information 324 exists, ask the user whether to perform the continuous reproduction operation.
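- The decision tree of FIG. 19 (steps S 60 to S 65 ) can be summarized by the following C sketch; the two boolean helpers are hypothetical stubs standing in for the checks on the resume information 324 and on the script, and the printed strings stand in for the actual operations.

    #include <stdbool.h>
    #include <stdio.h>

    static bool resume_info_exists(void)        { return true;  }   /* stub */
    static bool script_has_onContinuePlay(void) { return false; }   /* stub */

    static void continue_play(void)
    {
        if (!resume_info_exists()) {                                            /* S61 */
            printf("reproduce the disc from the beginning\n");                  /* S62 */
            return;
        }
        if (script_has_onContinuePlay())                                        /* S63 */
            printf("inform the script layer; execute onContinuePlay()\n");      /* S64 */
        else
            printf("execute default onContinuePlay(): resume from last position\n"); /* S65 */
    }

    int main(void) { continue_play(); return 0; }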
- FIG. 20 shows an example of a process performed upon completion of the reproduction operation. While the movie player 300 is performing the reproduction operation for a disc, when the user causes the movie player 300 to stop the reproduction operation (at step S 70 ), this process is performed. When the user's input 310 that causes the movie player 300 to stop the reproduction operation is supplied to the native implementation platform 301 , the native implementation platform 301 starts an exit process (at step S 71 ).
- The exit process is composed of, for example, the following three steps:
- the native implementation platform 301 executes the exit process at step S 71 .
- the flow advances to step S 73 .
- the native implementation platform 301 informs the script layer 302 of event exit. Accordingly, the script layer 302 executes onExit( ) (at step S 74 ).
- Event handler onExit( ) executes for example a predetermined post process performed upon completion of the reproduction operation and method setUserData that stores user's setting data.
- the native implementation platform 301 performs the exit process.
- In the exit process, the native implementation platform 301 stores the continuous reproduction information in, for example, a nonvolatile memory (namely, it backs up the state that exists immediately before the reproduction operation is completed to the resume information 324 ) and causes the system menu to appear on the screen.
- As described above, the player model can reproduce video data, audio data, and subtitle data. Since the events that the content creator intended occur at the reproduction times that he or she intended and the corresponding event handlers are executed, the operations that he or she intended can be accomplished.
- the native implementation platform 301 informs the movie player 300 of a command corresponding to the user's operation so that the state of the player is changed to the state that the user intended.
- the native implementation platform 301 informs the script layer 302 of an event corresponding to the user's input. As a result, the script layer 302 can accomplish the operations that the content creator intended corresponding to user's operations.
- the user can interactively operate the video player to reproduce video data, audio data, and subtitle data.
- the content shown in FIG. 21 has as display elements play lists 400 and 401 , a top menu 402 , and a message 403 .
- the play list 400 is used to display a warning message that is automatically displayed when a disc is loaded.
- the play list 401 is a main part of a movie as an example of the content.
- the top menu 402 has GUI parts such as buttons with which the user causes the play list 401 to be reproduced.
- the message 403 is displayed at any reproduction time of the play list 401 .
- Event handler onPlayListEnd( ) is an event handler that is called when the reproduction of the play list is completed.
- Event handler onPlayListEnd( ) determines which play list's reproduction has been completed.
- When the play list 400 has been completely reproduced, event handler onPlayListEnd( ) starts the reproduction of the play list 401 .
- When the play list 401 has been completely reproduced, event handler onPlayListEnd( ) calls the top menu 402 .
- Event handler onMenu( ) is called when the user operates the menu key.
- Event handler onMenu( ) calls the top menu 402 and displays it on the screen.
- Event handler onMark( ) is executed when the reproduction time designated by mark Mark has elapsed.
- mark Mark is set in the play list 401 .
- When the play list 401 is reproduced and the reproduction time designated by mark Mark elapses, the message 403 is displayed on the screen.
- When the disc is loaded into the UMD video player, event handler onAutoPlay is called.
- Event handler onAutoPlay reproduces the play list 400 and displays a warning message.
- After the reproduction time of the play list 400 has elapsed, at the end of the play list 400 , event handler onPlayListEnd is called.
- Event handler onPlayListEnd determines that the play list 400 has been completely reproduced and reproduces the next play list 401 .
- When the user operates the menu key while the play list 401 is being reproduced, event handler onMenu is called.
- Event handler onMenu displays the top menu 402 on the screen.
- Event handler onMenu starts reproducing the play list 401 from the beginning corresponding to a predetermined operation on the top menu 402 .
- When the reproduction of the play list 401 reaches the time designated by mark Mark, event handler onMark is called.
- Event handler onMark displays the message 403 on the screen.
- When the play list 401 has been completely reproduced, event handler onPlayListEnd is called.
- Event handler onPlayListEnd determines that the play list 401 has been completely reproduced and displays the top menu 402 on the screen.
- FIG. 22 shows an example of a script program that accomplishes the operation shown in FIG. 21 .
- the script program has event handlers and executes them upon occurrence of corresponding events.
- the script program is stored in file “SCRIPT.DAT” that will be described later.
- Method “movieplayer.play( )” causes the movie player 300 to reproduce a play list.
- a play list number to be reproduced is described in parentheses ( ) as an argument.
- When the play list has been completely reproduced, event playListEnd occurs.
- When event playListEnd occurs, the script calls event handler movieplayer.onPlayListEnd( ).
- object event_info is supplied to the script.
- the play list number of the play list that has been completely reproduced and so forth are stored in object event_info.
- the script can change the next operation corresponding to the content of object event_info.
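- For illustration, the branching that such a script would perform in event handler onPlayListEnd( ) for the content of FIG. 21 can be sketched as follows. The sketch is written in C for consistency with the C-style descriptions used in this document; the actual program is a script, and the constants and helper names here are assumptions.

    #include <stdio.h>

    #define PLAYLIST_WARNING 400   /* warning message play list */
    #define PLAYLIST_MAIN    401   /* main part of the movie    */

    static void play_playlist(int n) { printf("movieplayer.play(%d)\n", n); }
    static void show_top_menu(void)  { printf("display top menu 402\n");    }

    /* the finished play list number is taken from object event_info */
    static void onPlayListEnd(int finished_playlist_number)
    {
        if (finished_playlist_number == PLAYLIST_WARNING)
            play_playlist(PLAYLIST_MAIN);   /* warning finished: start the movie  */
        else
            show_top_menu();                /* movie finished: back to top menu   */
    }

    int main(void) { onPlayListEnd(PLAYLIST_WARNING); return 0; }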
- TITLEID.DAT is a file that stores a title identifier that differs in each title (type of content).
- One disc has one file "TITLEID.DAT."
- File “SCRIPT.DAT” is placed under directory “RESOURCE.” As described above, file “SCRIPT.DAT” stores a script program that composes the script layer 302 . Normally, file “SCRIPT.DAT” as one file is placed under directory “RESOURCE.” Instead, a plurality of files “SCRIPT.DAT” may be placed under directory “RESOURCE.” In this case, parts of the file names are changed so that they become unique. A plurality of files “SCRIPT.DAT” are used for different display languages. In this case, however, one file “SCRIPT.DAT” is used at a time.
- At least one clip information file is placed under directory “CLIP.”
- a clip information file has a file name composed of a character string portion having several to five characters such as “00001” (in this example, numerals), a period as a delimiter, and an extension portion such as “CLP.” Extension portion “CLP” represents that the file is a clip information file.
- At least one clip AV stream file is placed under directory “STREAM.”
- a clip AV stream file has a file name composed of a character string portion having several to five characters such as “00001” (in this example, numerals), a period as a delimiter, and an extension portion such as “PS.” Extension portion “PS” represents that the file is a clip AV stream file.
- a clip AV stream file is an MPEG2 (Moving Picture Experts Group 2) program stream in which a video stream, an audio stream, and a subtitle stream are multiplexed, stored in a file identified by extension portion "PS."
- a clip AV stream file is a file of which video data and audio data are compression-encoded and time-division multiplexed.
- a clip information file is a file that describes the characteristics of a clip AV stream file.
- a clip information file and a clip AV stream file are correlated. According to the embodiment of the present invention, since the character string portions having several to five characters of the file names of the clip information file and the clip AV stream file are the same, the relationship therebetween can be easily obtained.
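- Under that naming rule, a player can derive the clip AV stream file name from the clip information file name simply by replacing the extension. The following is a minimal sketch; the helper name is hypothetical.

    #include <stdio.h>
    #include <string.h>

    /* "00001.CLP" -> "00001.PS": same character string portion, "PS" extension */
    static void clip_av_stream_name(const char *clip_info_name, char *out, size_t out_size)
    {
        const char *dot = strrchr(clip_info_name, '.');
        size_t stem = dot ? (size_t)(dot - clip_info_name) : strlen(clip_info_name);
        snprintf(out, out_size, "%.*s.PS", (int)stem, clip_info_name);
    }

    int main(void)
    {
        char name[32];
        clip_av_stream_name("00001.CLP", name, sizeof(name));
        printf("%s\n", name);   /* prints "00001.PS" */
        return 0;
    }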
- File “SCRIPT.DAT” is a script file that describes a script program.
- File “SCRIPT.DAT” stores a program that causes reproduction states for a disc to be interactively changed according to the embodiment of the present invention.
- File “SCRIPT.DAT” is read before other files are read from the disc.
- File “PLAYLIST.DAT” is a play list file that describes a play list that designates the reproduction order of a clip AV stream.
- FIG. 24 shows an example of syntax that represents the entire structure of file “PLAYLIST.DAT.”
- the syntax is described in the C language, which is used as a descriptive language for programs of computer devices. This applies to tables that represent other syntaxes.
- Field name_length has a data length of 8 bits and represents the length of the name assigned to the play list file.
- Field name_string has a data length of 255 bytes and represents the name assigned to the play list file. In field name_string, the area from the beginning for the byte length represented by field name_length is used as a valid name. When the value of field "name_length" is "10," 10 bytes from the beginning of field name_string are interpreted as a valid name.
- Field number_of_PlayLists has a data length of 16 bits and represents the number of blocks PlayList( ) that follow. Field number_of_PlayLists is followed by a for loop. The for loop describes blocks PlayList( ) corresponding to field number_of_PlayLists. Block PlayList( ) is a play list itself.
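- For readability only, the fixed-size header of file "PLAYLIST.DAT" described above can be summarized by the following C declaration; the struct itself is an assumption for illustration and is not the syntax of FIG. 24 or the on-disc encoding.

    #include <stdint.h>

    struct PlayListFileHeader {
        uint8_t  name_length;           /* valid length, in bytes, of name_string       */
        char     name_string[255];      /* only the first name_length bytes are valid   */
        uint16_t number_of_PlayLists;   /* number of blocks PlayList() that follow      */
    };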
- Block PlayList( ) starts with field PlayList_data_length.
- Field PlayList_data_length has a data length of 32 bits and represents the data length of block PlayList( ), including field PlayList_data_length.
- Field PlayList_data_length is followed by field reserved_for_word_alignment having a data length of 15 bits and flag capture_enable_flag_PlayList having a data length of 1 bit.
- Field reserved_for_word_alignment and flag capture_enable_flag_PlayList having a data length of 1 bit align data at a 16-bit position in block PlayList( ).
- Flag capture_enable_flag_PlayList is a flag that represents whether a moving picture that belongs to block PlayList( ) including flag capture_enable_flag_PlayList is permitted to be secondarily used.
- When the value of flag capture_enable_flag_PlayList is, for example, "1," it represents that the moving picture that belongs to block PlayList( ) is permitted to be secondarily used in the player.
- In this example, flag capture_enable_flag_PlayList has a data length of 1 bit.
- Instead, flag capture_enable_flag_PlayList may have a data length of a plurality of bits that describe a plurality of secondary use permission levels.
- For example, flag capture_enable_flag_PlayList may have a data length of 2 bits.
- When the value of the flag is "2," the moving picture may be permitted to be secondarily used without any restriction. When the value of bit 0 of the flag is "0," the moving picture may be permitted to be secondarily used in the content reproduction application. When the value of bit 1 of the flag is "1," the moving picture may be permitted to be secondarily used in another application (for example, as a wallpaper image or a screen saver) in the movie player. In this case, the values of bits 0 and 1 of the flag may be used in combination.
- Field PlayList_name_length has a data length of 8 bits and represents the length of the name assigned to block PlayList( ).
- Field PlayList_name_string has a data length of 255 bytes and represents the name assigned to block PlayList( ).
- In field PlayList_name_string, the area from the beginning for the byte length represented by field PlayList_name_length is used as a valid name.
- Field number_of_PlayItems has a data length of 16 bits and represents the number of blocks PlayItem( ) that follow. Field number_of_PlayItems is followed by a for loop. The for loop describes blocks PlayItem( ) corresponding to field number_of_PlayItems. Block PlayItem( ) is a play item itself.
- Blocks PlayItem( ) of block PlayList are assigned identification information (ID). For example, block PlayItem( ) described at the beginning of block PlayList( ) is assigned for example 0. Blocks PlayItem( ) are assigned serial numbers in the order of appearance such as 1, 2, and so forth. The serial numbers are used as identification information of blocks PlayItem( ). Argument i of the for loop repeated for blocks PlayItem( ) can be used as identification information for blocks PlayItem( ). Block PlayItem( ) is followed by block PlayListMark( ).
- Block PlayItem( ) starts with field length.
- Field length has a data length of 16 bits and represents the length of block PlayItem( ).
- Field length is followed by field Clip_Information_file_name_length.
- Field Clip_Information_file_name_length has a data length of 16 bits and represents the length of the name of the clip information file corresponding to block PlayItem( ).
- Field Clip_Information_file_name has a variable data length in bytes and represents the name of the clip information file corresponding to block PlayItem( ).
- In field Clip_Information_file_name, the area from the beginning for the byte length represented by field Clip_Information_file_name_length is used as a valid name.
- When a clip information file is designated by field Clip_Information_file_name, the clip AV stream file corresponding to that clip information file can be identified according to the above-described relationship of the file names.
- Field IN_time and field OUT_time have a data length of 32 bits each.
- Field IN_time and field OUT_time are time information that designate the reproduction start position and the reproduction end position of a clip AV stream file corresponding to the clip information file designated by field Clip_Information_file_name in block PlayItem( ).
- With field IN_time, a reproduction start position other than the beginning of the clip AV stream file can be designated.
- Likewise, with field OUT_time, a reproduction end position other than the end of the clip AV stream file can be designated.
- Block PlayListMark( ) starts with field length.
- Field length has a data length of 32 bits and represents the length of block PlayListMark( ).
- Field length is followed by field number_of_PlayList_marks.
- Field number_of_PlayList_marks has a data length of 16 bits and represents the number of blocks Mark( ).
- Field number_of_PlayList_marks is followed by a for loop. The for loop describes blocks Mark( ) corresponding to field number_of_PlayList_marks.
- Block Mark( ) starts with field mark_type.
- Field mark_type has a data length of 8 bits and represents the type of block Mark( ) including field mark_type.
- As shown in FIG. 27 , three types of marks are defined: a chapter mark, an index mark, and an event mark.
- a chapter is a search unit that divides a play list (block PlayList( )).
- An index is a search unit that divides a chapter.
- a chapter mark and an index mark respectively represent a chapter position and an index position as time information.
- An event mark is a mark that causes an event to occur.
- Field mark_name_length has a data length of 8 bits and represents the length of the name assigned to block Mark( ).
- Field mark_name_string at the last line of block Mark( ) represents the name assigned to block Mark( ).
- In field mark_name_string, the area from the beginning for the byte length represented by field mark_name_length is used as a valid name.
- Four elements, field ref_to_PlayItem_id, field mark_time_stamp, field entry_ES_stream_id, and field entry_ES_private_stream_id, correlate block Mark( ) defined in block PlayList( ) with a clip AV stream file.
- field ref_to_PlayItem_id has a data length of 16 bits and represents identification information of block PlayItem( ).
- field ref_to_PlayItem_id identifies a clip information file and a clip AV stream file.
- Field mark_time_stamp has a data length of 32 bits and designates the time of a mark in a clip AV stream file.
- field mark_time_stamp will be described in brief.
- a play list is composed of three play items assigned 0, 1, and 2 (PlayItem(#0), PlayItem(#1), and PlayItem(#2)).
- Time t 0 of the play list is included in play item 1 (PlayItem(#1)).
- Play items 0, 1, and 2 correspond to program streams A, B, and C of clip AV stream files through clip information files, respectively.
- field mark_time_stamp is followed by field entry_ES_stream_id and field entry_ES_private_stream_id.
- Field entry_ES_stream_id and field entry_ES_private_stream_id have a data length of 8 bits each.
- When block Mark( ) is correlated with a predetermined elementary stream, field entry_ES_stream_id and field entry_ES_private_stream_id identify that elementary stream.
- Field entry_ES_stream_id and field entry_ES_private_stream_id represent a stream ID (stream_id) of packets (packet( )) in which elementary streams are multiplexed and a private stream ID (private_stream_id) of a private packet header (private_packet_header( )), respectively.
- the stream ID (stream_id) of the packets (packet( )) and the private stream ID (private_stream_id) of the private packet header (private_packet_header( )) are based on provisions on a program stream of the MPEG2 system.
- Field entry_ES_stream_id and field entry_ES_private_stream_id are used when the chapter structure of clip AV stream #0 is different from that of clip AV stream #1. When block Mark( ) is not correlated with a predetermined elementary stream, the values of these two fields are “0.”
- clip information file “XXXXX.CLP” describes the characteristics and so forth of corresponding clip AV stream file “XXXXX.PS” placed under directory “STREAM.”
- FIG. 29 shows an example of syntax that represents the entire structure of clip information file "XXXXX.CLP."
- Clip information file "XXXXX.CLP" starts with field presentation_start_time and field presentation_end_time.
- Field presentation_start_time and field presentation_end_time have a data length of 32 bits each and represent the times of the beginning and end of the corresponding clip AV stream file.
- the presentation time stamp (PTS) of the MPEG2 system may be used as time information.
- PTS has an accuracy of 90 kHz.
- Field presentation_start_time and field presentation_end_time are followed by field reserved_for_word_alignment that has a data length of 7 bits and flag capture_enable_flag_Clip that has a data length of 1 bit.
- Field reserved_for_word_alignment and flag capture_enable_flag_Clip having a data length of 1 bit align data at a 16-bit position in file “XXXXX.CLP.”
- Flag capture_enable_flag_Clip is a flag that represents whether a moving picture contained in a clip AV stream file corresponding to file “XXXXX.CLP” is permitted to be secondarily used.
- When the value of flag capture_enable_flag_Clip is, for example, "1," it represents that the moving picture of the clip AV stream file corresponding to file "XXXXX.CLP" is permitted to be secondarily used in the video player.
- Field number_of_streams has a data length of 8 bits and represents the number of blocks StreamInfo( ) that follow.
- Field number_of_streams is followed by a for loop.
- the for loop describes blocks StreamInfo( ) corresponding to field number_of_streams.
- the for loop is followed by block EP_map( ).
- Block StreamInfo( ) starts with field length.
- Field length has a data length of 16 bits and represents the length of block StreamInfo( ).
- Field length is followed by field stream_id and field private_stream that have a data length of 8 bits each.
- block StreamInfo( ) is correlated with elementary streams. In the example shown in FIG. 30 , when the value of field stream_id of block StreamInfo( ) is in the range from “0xE0” to “0xEF,” block StreamInfo( ) is correlated with a video stream.
- Otherwise, depending on the values of field stream_id and field private_stream_id, block StreamInfo( ) is correlated with an Adaptive Transform Acoustic Coding (ATRAC) audio stream, a Linear Pulse Code Modulation (LPCM) audio stream, or a subtitle stream.
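- The stream_id test described above can be expressed as a short C sketch; the function name is hypothetical, and only the video-stream range stated in the text is encoded.

    #include <stdbool.h>
    #include <stdio.h>

    /* stream_id values from 0xE0 to 0xEF identify a video stream; other
     * elementary streams are further distinguished by private_stream_id */
    static bool is_video_stream(unsigned char stream_id)
    {
        return stream_id >= 0xE0 && stream_id <= 0xEF;
    }

    int main(void)
    {
        printf("%d\n", is_video_stream(0xE3));   /* prints 1 */
        return 0;
    }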
- Block StreamInfo( ) mainly describes two types of information, the first type not varying in a stream, the second type varying in a stream. Information that does not vary in a stream is described in block StaticInfo( ), whereas information that varies in a stream is described in block DynamicInfo( ) with change points designated with time information.
- Block StaticInfo( ) is followed by field reserved_for_word_alignment that has a data length of 8 bits.
- Field reserved_for_word_alignment aligns data in a byte in block StreamInfo( ).
- Field reserved_for_word_alignment is followed by field number_of_DynamicInfo.
- Field number_of_DynamicInfo has a data length of 8 bits and represents the number of blocks DynamicInfo( ) that follow.
- Field number_of_DynamicInfo is followed by a for loop. The for loop describes fields pts_change_point and blocks DynamicInfo( ) corresponding to field number_of_DynamicInfo.
- Field pts_change_point has a data length of 32 bits and represents a time at which information of block DynamicInfo( ) becomes valid with PTS.
- The time at which each stream starts is represented by field pts_change_point and is equal to field presentation_start_time defined in file "XXXXX.CLP."
- block StaticInfo( ) depends on the type of the corresponding elementary stream.
- the type of the corresponding elementary stream can be identified by the values of field stream_id and field private_stream_id as shown in FIG. 30 .
- FIG. 31 shows that the content of block StaticInfo( ) varies, using an if statement, depending on whether the type of the elementary stream is a video stream, an audio stream, or a subtitle stream.
- block StaticInfo( ) will be described according to the types of elementary streams.
- When the elementary stream is a video stream, block StaticInfo( ) is composed of field picture_size, field frame_rate, and flag cc_flag.
- Field picture_size and field frame_rate have a data length of 4 bits each.
- Flag cc_flag has a data length of 1 bit.
- Field picture_size and field frame_rate represent the picture size and the frame frequency of the video stream.
- Flag cc_flag represents whether the video stream contains a closed caption. When the value of flag cc_flag is for example “1,” the video stream contains a closed caption.
- Field reserved_for_word_alignment aligns data in 16 bits.
- When the elementary stream is an audio stream, block StaticInfo( ) is composed of field audio_language_code having a data length of 16 bits, field channel_configuration having a data length of 8 bits, flag lfe_existance having a data length of 1 bit, and field sampling_frequency having a data length of 4 bits.
- Field audio_language_code represents a language code contained in the audio stream.
- Field channel_configuration represents a channel attribute of audio data such as monaural, stereo, multi-channel, or the like.
- Field lfe_existance represents whether the audio stream contains a low frequency emphasis channel. When the value of field lfe_existance is for example “1,” the audio stream contains the low frequency emphasis channel.
- Field sampling_frequency represents the sampling frequency of audio data.
- Field reserved_for_word_alignment is aligned at a 16-bit position.
- When the elementary stream is a subtitle stream, block StaticInfo( ) is composed of field subtitle_language_code having a data length of 16 bits and flag configurable_flag having a data length of 1 bit.
- Field subtitle_language_code represents a language code contained in the subtitle stream.
- Field subtitle_language_code represents whether the subtitle stream is a normal subtitle or a subtitle for commentary (for example, special subtitle for description of pictures).
- Flag configurable_flag represents whether the size and position of characters of the subtitle stream that is displayed are permitted to be changed. When the value of flag configurable_flag is for example “1,” it represents that the size and position of characters of the subtitle stream that is displayed are permitted to be changed.
- Field reserved_for_word_alignment is aligned at a 16-bit position.
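- For readability, the per-stream contents of block StaticInfo( ) described above can be summarized by the following C declarations. The bit widths follow the text; the struct and bit-field layout is an assumption for illustration, not the on-disc encoding.

    #include <stdint.h>

    struct StaticInfoVideo {
        unsigned int picture_size : 4;       /* picture size of the video stream              */
        unsigned int frame_rate   : 4;       /* frame frequency of the video stream           */
        unsigned int cc_flag      : 1;       /* 1: the video stream contains a closed caption */
    };

    struct StaticInfoAudio {
        uint16_t     audio_language_code;    /* language code of the audio stream             */
        uint8_t      channel_configuration;  /* monaural, stereo, multi-channel, ...          */
        unsigned int lfe_existance      : 1; /* 1: low frequency emphasis channel present     */
        unsigned int sampling_frequency : 4; /* sampling frequency of the audio data          */
    };

    struct StaticInfoSubtitle {
        uint16_t     subtitle_language_code; /* language code of the subtitle stream          */
        unsigned int configurable_flag : 1;  /* 1: character size/position may be changed     */
    };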
- Block DynamicInfo( ) starts with field reserved_for_word_alignment having a data length of 8 bits. Elements preceded by field reserved_for_word_alignment depend on the type of the elementary stream. The type of the elementary stream can be identified by field stream_id and field private_stream_id described in FIG. 30 .
- The content of block DynamicInfo( ) likewise varies, using an if statement, depending on whether the type of the elementary stream is a video stream, an audio stream, or a subtitle stream.
- block DynamicInfo( ) will be described according to the type of elementary streams.
- When the elementary stream is a video stream, block DynamicInfo( ) is composed of field display_aspect_ratio having a data length of 4 bits.
- Field display_aspect_ratio represents whether the display output aspect ratio of video data is 16:9 or 4:3.
- Field reserved_for_word_alignment aligns data in 16 bits.
- When the elementary stream is an audio stream, block DynamicInfo( ) is composed of field channel_assignment having a data length of 4 bits.
- Field channel_assignment represents whether the output is stereo or dual monaural.
- the dual monaural is used to reproduce audio data for example in two languages.
- Field reserved_for_word_alignment aligns data in 16 bits.
- When the elementary stream is a subtitle stream, block DynamicInfo( ) is composed of field reserved_for_word_alignment.
- Field reserved_for_word_alignment aligns data in 16 bits.
- In other words, for a subtitle stream, block DynamicInfo( ) does not define an attribute that dynamically varies.
- Block EP_map( ) represents a valid decode start position (entry point) of a bit stream of each elementary stream with time information and position information.
- the position information may be the minimum access unit for a recording medium on which an elementary stream is recorded.
- Each elementary stream can be decoded from the position represented by block EP_map( ).
- Since the valid decode start position of a fixed rate stream can be calculated, information such as block EP_map( ) is not necessary for such a stream. On the other hand, for a variable rate stream or a stream whose data size varies in each access unit, such as a stream according to the MPEG video compression-encoding system, block EP_map( ) is necessary information for randomly accessing data.
- Block EP_map( ) starts with field reserved_for_word_alignment having a data length of 8 bits.
- Field reserved_for_word_alignment aligns data in 16 bits.
- Field reserved_for_word_alignment is followed by field number_of_stream_id_entries.
- Field number_of_stream_id_entries has a data length of 8 bits and represents the number of elementary streams described in block EP_map( ).
- a first for loop describes fields stream_id, fields private_stream_id, and fields number_of_EP_entries corresponding to field number_of_stream_id_entries.
- a second for loop describes fields PTS_EP_start and fields RPN_EP_start corresponding to field number_of_EP_entries.
- the first for loop describes field stream_id and field private_stream_id that have a data length of 8 bits each and identify the type of the elementary stream as shown in FIG. 30 .
- Field stream_id and field private_stream_id are followed by field number_of_EP_entries.
- Field number_of_EP_entries has a data length of 32 bits and represents the number of entry points described in the elementary stream.
- the second for loop describes fields PTS_EP_start and fields RPN_EP_start corresponding to field number_of_EP_entries.
- Field PTS_EP_start and field RPN_EP_start have a data length of 32 bits each and represent entry points themselves.
- Field PTS_EP_start represents a time of an entry point in a clip AV stream file with PTS.
- field RPN_EP_start represents the position of an entry point in a clip AV stream file in the unit of 2048 bytes.
- one sector as a disc access unit is 2048 bytes.
- field RPN_EP_start represents the position of an entry point of a clip AV stream file in sectors.
- Packet private_stream_2 is a packet that stores information that can be used to decode a video stream.
- The position of an entry point of a video stream is the position of pack pack( ) that stores packet private_stream_2.
- Block EP_map correlates times of a clip AV stream and positions of a clip AV stream file.
- In block EP_map( ), sets of time information and position information (sets of field PTS_EP_start and field RPN_EP_start in the second for loop) for each elementary stream are pre-registered in ascending (or descending) order. In other words, the time information and position information have been rearranged in a predetermined direction. Thus, a binary search can be performed on the data.
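- The random access that block EP_map( ) enables can be sketched as a binary search over the (PTS_EP_start, RPN_EP_start) pairs; multiplying RPN_EP_start by 2048 converts the sector-based position into a byte position. The entry table, the function name, and the "last entry point at or before the target time" policy below are illustrative assumptions.

    #include <stdint.h>
    #include <stdio.h>

    struct ep_entry { uint32_t PTS_EP_start; uint32_t RPN_EP_start; };

    /* returns the byte position of the last entry point whose PTS <= target_pts,
     * or -1 when no such entry point exists */
    static long find_entry_point(const struct ep_entry *ep, int n, uint32_t target_pts)
    {
        int lo = 0, hi = n - 1, best = -1;
        while (lo <= hi) {
            int mid = (lo + hi) / 2;
            if (ep[mid].PTS_EP_start <= target_pts) { best = mid; lo = mid + 1; }
            else                                    { hi = mid - 1; }
        }
        return best < 0 ? -1L : (long)ep[best].RPN_EP_start * 2048L;
    }

    int main(void)
    {
        const struct ep_entry ep[] = { {0, 0}, {90000, 16}, {180000, 40} };  /* 90 kHz PTS sample data */
        printf("%ld\n", find_entry_point(ep, 3, 120000));   /* prints 32768 (16 * 2048) */
        return 0;
    }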
- an elementary stream of a video stream is an elementary stream on the basis of the MPEG2-Video standard.
- an elementary stream of a video stream may be an elementary stream according to the MPEG4-Visual standard or MPEG4-AVC standard.
- an elementary stream of an audio stream is an elementary stream on the basis of the ATRAC audio system.
- the present invention is not limited to such an example.
- an elementary stream of an audio stream may be an elementary stream on the basis of for example MPEG1/2/4 audio system.
- FIG. 34 shows an example of the structure of a disc reproducing apparatus 100 according to the present invention.
- Connected to a bus 111 are a central processing unit (CPU) 112 , a memory 113 , a drive interface 114 , an input interface 115 , a video decoder 116 , an audio decoder 117 , a video output interface 118 , and an audio output interface 119 .
- a disc drive 102 is connected to the drive interface 114 .
- the disc drive 102 exchanges data and commands with the bus 111 through the drive interface 114 .
- the CPU 112 has a read-only memory (ROM) and a random access memory (RAM) (not shown).
- the CPU 112 exchanges data and commands with each section of the disc reproducing apparatus 100 through the bus 111 according to a program and data pre-stored in the ROM and controls the entire disc reproducing apparatus 100 .
- the RAM is used as a work memory of the CPU 112 .
- Supplied to the input interface 115 is an input signal that is input from an input device with which the user performs an input operation.
- The input device is, for example, a remote control commander with which the user remotely operates the disc reproducing apparatus 100 using, for example, an infrared signal, and keys disposed on the disc reproducing apparatus 100 .
- the input interface 115 converts an input signal supplied from the input device into a control signal for the CPU 112 and outputs the control signal.
- Recorded on a disc 101 in the format shown in FIG. 23 to FIG. 33 are a play list, a script program, a clip information file, a clip AV stream file, and so forth.
- When the disc 101 is loaded into the disc drive 102 , the disc reproducing apparatus 100 reproduces data from the disc 101 automatically or according to a user's input operation.
- a script file, a play list file, and a clip information file that are read from the disc 101 are supplied to the CPU 112 and stored in for example a RAM of the CPU 112 .
- the CPU 112 reads a clip AV stream file from the disc 101 according to data and a script program stored in the RAM.
- the clip AV stream file that is read from the disc 101 is temporarily stored in the memory 113 .
- the video decoder 116 decodes a video stream and a subtitle stream of the clip AV stream file stored in the memory 113 according to a command received from the CPU 112 .
- the CPU 112 performs image processes such as an enlargement process or a reduction process for the decoded video data and subtitle data, performs a synthesis process or an addition process for the video stream and subtitle stream, and obtains one stream of video data.
- the image process may be performed by the video decoder 116 and the video output interface 118 .
- the video data are buffered by the memory 113 and supplied to the video output interface 118 .
- the video output interface 118 converts the supplied video data into an analog video signal and supplies the analog video signal to a video output terminal 120 .
- the audio decoder 117 decodes an audio stream of the clip AV stream file stored in the memory 113 according to a command received from the CPU 112 .
- the decoded audio data are buffered in the memory 113 and supplied to the audio output interface 119 .
- the audio output interface 119 converts the supplied audio data into for example an analog audio signal and supplies the analog audio signal to an audio output terminal 121 .
- each section shown in FIG. 34 is composed of independent hardware.
- the present invention is not limited to this example.
- the video decoder 116 and/or the audio decoder 117 may be composed of software that operates on the CPU 112 .
- FIG. 35A and FIG. 35B are functional block diagrams describing the operation of the disc reproducing apparatus 100 shown in FIG. 34 in detail.
- the disc reproducing apparatus 100 is mainly composed of an operation system 201 and a video content reproduction section 210 .
- the video content reproduction section 210 is substantially a software program that operates on the operation system 201 . Instead, the video content reproduction section 210 may be composed of software and hardware that integrally operate. In the following description, it is assumed that the video content reproduction section 210 is composed of software.
- the disc drive 102 is omitted.
- When the power of the disc reproducing apparatus 100 is turned on, the operation system 201 initially starts up on the CPU 112 , performs necessary processes such as initial settings for each section, and reads an application program (in this example, the video content reproduction section 210 ) from the ROM.
- the operation system 201 provides basic services such as reading of a file from the disc 101 and interpreting of a file system for the video content reproduction section 210 while the video content reproduction section 210 is operating.
- the operation system 201 controls the disc drive 102 through the drive interface 114 corresponding to a file read request supplied from the video content reproduction section 210 and reads data from the disc 101 .
- the data that are read from the disc 101 are supplied to the video content reproduction section 210 under the control of the operation system 201 .
- the operation system 201 has a multitask process function that controls a plurality of software modules virtually in parallel by for example time-division control.
- each module that composes the video content reproduction section 210 shown in FIG. 35A and FIG. 35B can be operated in parallel by the multitask process function of the operation system 201 .
- the video content reproduction section 210 has more internal modules and accomplishes the following functions.
- the video content reproduction section 210 determines whether the loaded disc 101 is a disc according to the UMD video standard (hereinafter this disc is referred to as the UMD video disc).
- the video content reproduction section 210 reads a script file from the disc 101 and supplies the script file to a script control module 211 .
- the video content reproduction section 210 also reads files that compose a database (namely, a play list file, a clip information file, and so forth) and supplies the files to a player control module 212 .
- the script control module 211 interprets a script program described in script file “SCRIPT.DAT” and executes it. As described in the player model, GUIs that create and output images of the menu screen, move the cursor corresponding to a user's input, and change the menu screen are accomplished by a graphics process module 219 controlled according to the script program. By executing the script program, the script control module 211 can control the player control module 212 .
- the player control module 212 references database information stored in files such as play list file “PLAYLIST.DAT” and clip information file “XXXXX.CLP” that are read from the disc 101 and performs the following controls to reproduce video contents recorded on the disc 101 .
- the player control module 212 analyzes database information such as a play list and clip information.
- the player control module 212 controls a content data supply module 213 , a decode control module 214 , and a buffer control module 215 .
- the player control module 212 performs player state change controls such as reproduction, reproduction stop, and reproduction pause and a reproduction control process such as stream change according to a command received from the script control module 211 or the input interface 115 .
- the player control module 212 obtains time information of a video stream that is being reproduced from the decode control module 214 , displays time, and generates a mark event.
- the content data supply module 213 reads content data such as a clip AV stream file from the disc 101 according to a command received from the player control module 212 and supplies the content data to the buffer control module 215 .
- the buffer control module 215 stores the content data in the memory 113 as a substance 215 A of the buffer.
- the content data supply module 213 controls the buffer control module 215 to supply the content data stored in the memory 113 to a video decoder control module 216 , an audio decoder control module 217 , and a subtitle decoder control module 218 according to requests therefrom.
- the content data supply module 213 reads content data from the disc 101 so that the amount of content data stored under the control of the buffer control module 215 becomes a predetermined amount.
- the decode control module 214 controls the operations of the video decoder control module 216 , the audio decoder control module 217 , and the subtitle decoder control module 218 according to a command received from the player control module 212 .
- the decode control module 214 has an internal clock function and controls the operations of the video decoder control module 216 , the audio decoder control module 217 , and the subtitle decoder control module 218 so that video data and audio data are synchronously output.
- the buffer control module 215 exclusively uses a part of the memory 113 as the substance 215 A of the buffer.
- the buffer control module 215 stores a data start pointer and a data write pointer.
- the buffer control module 215 also has as internal modules a video read function, an audio read function, and a subtitle read function.
- the video read function has a video read pointer.
- the video read function has a register that stores information au_information( ) as access unit information.
- the audio read function has an audio read pointer.
- the subtitle read function has a subtitle read pointer and a subtitle read function flag.
- the subtitle read function flag controls enabling/disabling of the subtitle read function according to a write value. When for example “1” is written to the subtitle read function flag, the subtitle read function becomes enabled. When for example “0” is written to the subtitle read function flag, the subtitle read function becomes disabled.
- the video read function, the audio read function, and the subtitle read function which are internal modules of the buffer control module 215 , have demultiplexer functions that demultiplex a multiplexed clip AV stream, of which a video stream, an audio stream, and a subtitle stream have been multiplexed, into these streams.
- A plurality of elementary streams are multiplexed according to a time-division multiplexing system in the MPEG2 system program stream format, thereby forming a clip AV stream.
- the video read function, the audio read function, and the subtitle read function have demultiplexer functions for the MPEG2 system program streams.
- the video read function reads the value of field stream_id (see FIG. 30 ) placed at a predetermined position of the video stream and holds the value.
- the audio read function and the subtitle read function read the values of field stream_id and field private_stream_id (see FIG. 30 ) and hold the values.
- the values of field stream_id and field private_stream_id are used to analyze the supplied bit stream.
- the video decoder control module 216 causes the video read function of the buffer control module 215 to read one video access unit of the video stream from the memory 113 and supply the video access unit to the video decoder 116 .
- the video decoder control module 216 controls the video decoder 116 to decode the video stream supplied to the video decoder 116 in the access unit and generate video data.
- the video data are supplied to the graphics process module 219 .
- the audio decoder control module 217 causes the audio read function of the buffer control module 215 to read one audio access unit of the audio stream from the memory 113 and supply the audio stream unit to the audio decoder 117 .
- the access unit (audio frame) that composes an audio stream has a predetermined fixed length.
- the audio decoder control module 217 controls the audio decoder 117 to decode the audio stream supplied to the audio decoder 117 in the access unit and generate audio data.
- the audio data are supplied to an audio output module 242 .
- the subtitle decoder control module 218 causes the subtitle read function of the buffer control module 215 to read one subtitle access unit of the subtitle stream from the memory 113 and supply the subtitle access unit to the subtitle decoder control module 218 .
- the subtitle access unit that composes the subtitle stream contains length information at the beginning.
- the subtitle decoder control module 218 has a subtitle decode function that can decode the supplied subtitle stream and generate subtitle image data.
- the subtitle image data are supplied to the graphics process module 219 .
- the video data decoded by the video decoder 116 under the control of the video decoder control module 216 and the subtitle image data decoded by the subtitle decoder control module 218 are supplied to the graphics process module 219 .
- the graphics process module 219 adds the subtitle image data to the supplied video data and generates a video signal that is output.
- the graphics process module 219 generates the menu image and the message image corresponding to a command received from the script control module 211 and the player control module 212 and overlays them on the output video signal.
- the graphics process module 219 performs an enlargement process and a reduction process for the supplied subtitle image data and adds the processed image data to the video data according to a command received from the script control module 211 .
- the graphics process module 219 converts the aspect ratio of the output signal according to the aspect ratio of the predetermined output video device and the output aspect ratio designated in the content reproduced from the disc 101 .
- when the aspect ratio of the output video device is 16:9 and the output aspect ratio is 16:9, the graphics process module 219 directly outputs the video data.
- when the aspect ratio of the output video device is 16:9 and the output aspect ratio is 4:3, the graphics process module 219 performs a squeezing process that matches the height of the image with the height of the screen of the output video device, inserts black portions into the left and right sides of the image, and outputs the resultant image.
- when the aspect ratio of the output video device is 4:3 and the output aspect ratio is 4:3, the graphics process module 219 directly outputs the video data.
- when the aspect ratio of the output video device is 4:3 and the output aspect ratio is 16:9, the graphics process module 219 performs a squeezing process that matches the width of the image with the width of the screen of the output video device, inserts black portions into the upper and lower portions of the image, and outputs the resultant image.
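- As a rough illustration of the aspect-ratio handling described above, the following sketch maps the device aspect ratio and the output aspect ratio to an output mode; the string labels and the function name are illustrative assumptions.

```python
# Minimal sketch of the aspect-ratio conversion described above (assumption: the
# real module operates on decoded frames; here only the chosen mode is returned).
def convert_aspect(device_aspect, output_aspect):
    if device_aspect == output_aspect:
        return "pass-through"                     # directly output the video data
    if device_aspect == "16:9" and output_aspect == "4:3":
        # squeeze to match the height, insert black portions on the left and right
        return "squeeze-height + side black portions"
    if device_aspect == "4:3" and output_aspect == "16:9":
        # squeeze to match the width, insert black portions at the top and bottom
        return "squeeze-width + top/bottom black portions"
    return "pass-through"

print(convert_aspect("16:9", "4:3"))  # -> squeeze-height + side black portions
```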
- the graphics process module 219 also performs a process that captures the video signal that is being processed according to a request from the player control module 212 and supplies the requested video signal thereto.
- a video output module 241 exclusively uses a part of the memory 113 as a first-in first-out (FIFO) buffer.
- the video output module 241 temporarily stores video data processed by the graphics process module 219 in the buffer and reads the video data therefrom at predetermined timing.
- the video data that are read from the buffer are output from the video output interface 118 .
- the audio output module 242 exclusively uses a part of the memory 113 as a FIFO buffer.
- the audio output module 242 temporarily stores audio data supplied from the audio decoder control module 217 in the buffer and reads the audio data therefrom at predetermined timing.
- the audio data that are read from the buffer are output from the audio output interface 119 .
- when the audio mode of the content is dual monaural (for example, bilingual), the audio output module 242 outputs the audio data according to a predetermined audio output mode.
- when the audio output mode is “main audio,” the audio output module 242 copies audio data of the left channel to, for example, the memory 113 and outputs the left-channel audio data together with the copied audio data from the memory 113 .
- in other words, the audio output module 242 outputs audio data of only the left channel.
- when the audio output mode is “sub audio,” the audio output module 242 copies audio data of the right channel to, for example, the memory 113 and outputs the right-channel audio data together with the copied audio data from the memory 113 .
- in other words, the audio output module 242 outputs audio data of only the right channel.
- when the audio output mode is “main and sub audio” or the content is stereo, the audio output module 242 directly outputs the audio data.
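- The dual-monaural output modes described above can be sketched as follows; this is a simplified illustration that treats audio as interleaved (left, right) sample pairs, and the function name is an assumption.

```python
# Minimal sketch of the dual-monaural output modes described above (assumption:
# audio is modeled as interleaved (left, right) sample pairs; names are illustrative).
def apply_audio_output_mode(samples, mode, dual_mono):
    if not dual_mono or mode == "main and sub audio":
        return samples                            # output the audio data directly
    if mode == "main audio":
        return [(l, l) for l, _ in samples]       # left channel copied to both outputs
    if mode == "sub audio":
        return [(r, r) for _, r in samples]       # right channel copied to both outputs
    return samples

print(apply_audio_output_mode([(1, 9), (2, 8)], "main audio", dual_mono=True))
# -> [(1, 1), (2, 2)]
```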
- the user can interactively set the audio output mode on, for example, the menu screen that the video content reproduction section 210 generates.
- a nonvolatile memory control module 250 writes data to an area whose data are not erased after the operation of the video content reproduction section 210 is completed (this area is referred to as a nonvolatile area) and reads data therefrom according to a command received from the player control module 212 .
- the nonvolatile memory control module 250 has a function that stores a plurality of sets of data Saved_Player_Status and data Saved_User_Data with a key of a title ID (Title_ID).
- the nonvolatile memory control module 250 stores as data Saved_Player_Status data Backup_Player_Status that the player control module 212 has.
- Data Backup_Player_Status corresponds to data of for example the player status 323 B that exist immediately before the operation of the player control module 212 is completed.
- Data Saved_Player_Status corresponds to the resume information 324 .
- the nonvolatile memory control module 250 stores as data Saved_User_Data data User_Data that the player control module 212 has.
- Data User_Data are predetermined data that the user sets to the player control module 212 .
- the nonvolatile memory control module 250 correlatively stores a set of data Saved_Player_Status and data Saved_User_Data with the title ID of the disc 101 in a predetermined region of the flash memory.
- the storage medium to which the nonvolatile memory control module 250 stores data is not limited to a flash memory; a hard disk or the like may be used instead.
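- A minimal sketch of the nonvolatile storage described above, keyed by title ID, might look as follows; the class name, the in-memory dictionary standing in for the flash memory or hard disk region, and the example values are assumptions.

```python
# Minimal sketch of the nonvolatile storage described above (assumption: a dict
# stands in for the flash memory or hard disk region; names and values are illustrative).
class NonvolatileMemoryControl:
    def __init__(self):
        self._area = {}  # nonvolatile area: Title_ID -> saved record

    def save(self, title_id, saved_player_status, saved_user_data):
        # store the pair correlated with the title ID
        self._area[title_id] = {
            "Saved_Player_Status": saved_player_status,  # e.g. resume information
            "Saved_User_Data": saved_user_data,          # user settings
        }

    def load(self, title_id):
        return self._area.get(title_id)

nvm = NonvolatileMemoryControl()
nvm.save("TITLE_0001",
         {"playListNumber": 1, "playListTime": 5400},
         {"audioLanguageCode": "ja"})
print(nvm.load("TITLE_0001")["Saved_Player_Status"]["playListTime"])  # -> 5400
```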
- according to this embodiment, an audio stream of a proper language is automatically selected.
- likewise, a subtitle stream of a proper language is automatically selected.
- in addition, a status “original language” can be set as the audio language; in this case, an audio stream of the original language is automatically selected.
- when subtitle and audio streams are automatically selected, the case in which the languages of the audio and subtitle streams are the same is prevented.
- the “original language” represents a language in which content was created. For example, if content was created in Japan and the language thereof is mainly Japanese, the original language is Japanese.
- the original language can be set on the content creator side.
- in this example, the audio stream denoted by (1) is the audio stream of the original language.
- the original language of a subtitle is not defined.
- the language of audio is set to “original language” and the language of the subtitle is set to “Japanese”, the same settings as in the first example.
- the streams selected immediately after the player starts reproducing content from the disc are the audio stream (1) Japanese, which is the original language, and the subtitle stream (2) Japanese.
- thus, the same language is selected for both audio and subtitle.
- since a Japanese subtitle is unnecessary when Japanese audio is reproduced, this selection is not user-friendly.
- in this case, property subtitleFlag, which is contained in attribute information of the player (player status 323 B of property 323 , see FIG. 3 ) and determines whether a subtitle is displayed or not displayed, is set to a value that represents subtitle OFF.
- property audioLanguageCode represents the language code of the audio language that has been set in the UMD video player.
- in other words, property audioLanguageCode represents the language to be selected for audio.
- as a value of property audioLanguageCode, a language code defined, for example, in International Organization for Standardization (ISO) 639-1 may be used. In ISO 639-1, for example, English is abbreviated as “en” and Japanese as “ja”. According to this embodiment, as a value of property audioLanguageCode, “00”, which is not defined in ISO 639-1, is added. The value “00” represents “original language”.
- property audioNumber represents the stream number of an audio stream that is being currently reproduced.
- Property audioNumber is a 16-bit value composed of an audio stream ID (stream_id) and a private stream ID (private_stream_id) (see FIG. 30 ). Of the 16 bits, the high-order eight bits are used for the stream ID, whereas the low-order eight bits are used for the private stream ID. An audio stream of a clip AV stream file can be uniquely identified by this 16-bit value.
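- The 16-bit composition of property audioNumber can be illustrated by the following sketch; the helper names are assumptions, and the example IDs are illustrative.

```python
# Minimal sketch of the 16-bit packing described above (high-order eight bits:
# stream_id, low-order eight bits: private_stream_id); the helper names are assumptions.
def pack_audio_number(stream_id: int, private_stream_id: int) -> int:
    return ((stream_id & 0xFF) << 8) | (private_stream_id & 0xFF)

def unpack_audio_number(audio_number: int) -> tuple[int, int]:
    return (audio_number >> 8) & 0xFF, audio_number & 0xFF

number = pack_audio_number(0xBD, 0x00)           # illustrative ID values
print(hex(number), unpack_audio_number(number))  # -> 0xbd00 (189, 0)
```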
- Property audioFlag represents reproduction information of an audio stream.
- FIG. 39 shows examples of values that property audioFlag can have.
- when the value of property audioFlag is “0”, the reproduction of audio is turned off, causing audio not to be reproduced.
- when the value is “1”, an audio stream represented by property audioNumber is reproduced.
- when an audio stream represented by property audioNumber is dual mono (namely, the contents of the left and right channels of stereo are different), a value of “2” causes only the left channel of dual mono to be reproduced, whereas a value of “3” causes only the right channel of dual mono to be reproduced.
- when an audio stream is dual mono and the value of property audioFlag is “1”, both the left and right channels of dual mono are reproduced.
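- The interpretation of property audioFlag shown in FIG. 39 can be sketched as follows, again treating audio as (left, right) sample pairs for simplicity; the function name is an assumption.

```python
# Minimal sketch of the audioFlag interpretation of FIG. 39 (assumption: audio is
# modeled as interleaved (left, right) sample pairs; the function name is illustrative).
def reproduce_audio(samples, audio_flag, dual_mono):
    if audio_flag == 0:
        return []                                 # audio reproduction turned off
    if dual_mono and audio_flag == 2:
        return [(l, l) for l, _ in samples]       # left channel of dual mono only
    if dual_mono and audio_flag == 3:
        return [(r, r) for _, r in samples]       # right channel of dual mono only
    return samples                                # flag 1: reproduce the stream as-is

print(reproduce_audio([(1, 9), (2, 8)], 2, dual_mono=True))  # -> [(1, 1), (2, 2)]
```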
- Property subtitleLanguageCode represents the language code of the subtitle language that has been set to the UMD video player.
- in other words, property subtitleLanguageCode represents the language to be selected for a subtitle.
- as the value, a language code defined, for example, in International Organization for Standardization (ISO) 639-1 can be used, as for audio streams.
- property subtitleNumber represents the number of a subtitle stream that is currently being reproduced.
- a value of property subtitleNumber is a 16-bit value composed of a stream ID (stream_id) of a subtitle stream and a private stream ID (private_stream_id) (see FIG. 30 ). Of 16 bits, high order eight bits are used as a stream ID, whereas the low order eight bits are used for a private stream ID.
- a subtitle stream of a clip AV stream can be uniquely identified by this 16-bit value.
- Property subtitleFlag represents a reproduction status of a subtitle stream.
- FIG. 40 shows examples of values of property subtitleFlag.
- when the value of property subtitleFlag is “0”, the subtitle display is turned off, causing a subtitle not to be displayed.
- when the value is “1”, the subtitle display is turned on, causing a subtitle to be displayed.
- the content side describes attribute information of audio streams and attribute information of subtitle streams in a clip information file.
- the clip information file contains information of field stream_id and field private_stream_id that uniquely identify each elementary stream multiplexed in a corresponding AV stream and attribute information shown in FIG. 31 .
- field audio_language_code for an audio stream and field subtitle_language_code for a subtitle stream in particular relate to this embodiment of the present invention.
- Audio streams of a plurality of languages can be multiplexed into a clip AV stream file.
- thus, it is necessary to indicate which of the multiplexed audio streams of a plurality of languages is the audio stream of the original language.
- according to this embodiment, the language of the audio stream that comes first among the audio streams arranged by the for loop is defined as the original language.
- a language code of the original language is represented by field audio_language_code (see FIG. 31 ) corresponding to the audio stream.
- the property 323 (player status 323 B) of the movie player 300 can be set with methods described with reference to FIG. 8 on the script layer 302 .
- Audio streams and subtitle streams are set with method play( ), method playChapter( ), and method changeStream( ).
- method changeStream( ) is a method of changing a current stream to a desired stream, not a mechanism of automatically selecting a stream. Thus, details of method changeStream( ) will be omitted.
- Method playChapter( ) is equivalent to method play( ) in which argument playListTime, which is one of the arguments of method play( ), is substituted with argument chapterNumber; thus, only method play( ) will be described in the following.
- FIG. 41A and FIG. 41B list examples of arguments of method play( ).
- when arguments are given to method play( ) in a predetermined manner, reproduction of a video stream, an audio stream, and a subtitle stream corresponding to the designated stream numbers can be started.
- the syntax of method play( ) is represented, for example, by formula (1).
- arguments are delimited by delimiters (in this example, commas “,”) and arranged in a predetermined order.
- when an argument is omitted, only the delimiter is described.
- movieplayer.play (playListNumber,playListTime,menuMode,pauseMode,video_strm,audio_strm,audio_status,subtitle_strm,subtitle_status) (1)
- Argument playListNumber represents a play list number of a play list to be reproduced.
- Argument playListTime represents a time from the beginning of a play list. When content is reproduced from the beginning of a play list, the value of argument playListTime is set to “0”.
- the value of argument menuMode is either “true” or “false”. When the value of argument menuMode is “true”, it denotes that content is reproduced in the menu mode. When the value of argument menuMode is “false”, it denotes that content is reproduced in the normal mode.
- the value of argument pauseMode is “true” or “false”. When the value of argument pauseMode is “true”, it represents standby in a paused state.
- Argument video_strm represents a video stream to be reproduced.
- the value of argument video_strm is “−1” or “−2”.
- when the value of argument video_strm is “−1”, it denotes that a video stream to be reproduced is automatically selected by the movie player 300 .
- when the value of argument video_strm is “−2”, it denotes that the video stream to be reproduced is not changed.
- Argument audio_strm and the subsequent arguments are arguments for reproduction of audio streams and subtitle streams. These arguments in particular relate to this embodiment of the present invention.
- Argument audio_strm is an argument with which a value is set to property audioNumber (see FIG. 7 ) of the movie player 300 .
- Argument audio_strm is used to set an audio stream number represented by 16 bits of a stream ID (stream_id) and a private stream ID (private_stream_id) to property audioNumber.
- in addition, the value of argument audio_strm may be “−1” or “−2”.
- Argument audio_status is a parameter with which property audioFlag (see FIG. 7 ) of the movie player 300 is set.
- when the value of argument audio_status is “0”, “1”, “2”, or “3”, the value is directly set to property audioFlag.
- when the value of argument audio_status is “−2”, it denotes that the current value of property audioFlag is kept.
- Argument subtitle_strm is an argument with which a value is set to property subtitleNumber (see FIG. 7 ) of the movie player 300 .
- Argument subtitle_strm is used to set a subtitle stream number represented by 16 bits of a stream ID (stream_id) and a private stream ID (private_stream_id) to property subtitleNumber.
- in addition, the value of argument subtitle_strm may be “−1” or “−2”.
- when the value of argument subtitle_strm is “−1”, it denotes that a proper subtitle stream is automatically selected in the movie player 300 and the subtitle stream number of the selected subtitle stream is set to property subtitleNumber.
- when the value of argument subtitle_strm is “−2”, it denotes that reproduction is started with the value of property subtitleNumber that is currently set, not changed.
- Argument subtitle_status is a parameter with which property subtitleFlag (see FIG. 7 ) of the movie player 300 is set.
- when the value of argument subtitle_status is “0” or “1”, the value of argument subtitle_status is directly set to property subtitleFlag.
- when the value of argument subtitle_status is “−1”, the value of property subtitleFlag is automatically set to a proper value in the movie player 300 .
- when the value of argument subtitle_status is “−2”, the current value of property subtitleFlag is kept.
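- As an illustration of the argument conventions above, the following sketch interprets the audio and subtitle arguments of method play( ); the function name and the returned dictionary are assumptions, and an omitted audio_status is treated here like “−2” for simplicity.

```python
# Hedged sketch of how the audio and subtitle arguments of method play( ) map to
# the player properties described above (assumption: None stands for an omitted
# argument; the function and dictionary names are illustrative, not the script API).
def interpret_play_args(audio_strm=None, audio_status=None,
                        subtitle_strm=None, subtitle_status=None):
    decisions = {}
    # stream number arguments: a 16-bit number, -1 (automatic) or -2 (keep current)
    for prop, value in (("audioNumber", audio_strm), ("subtitleNumber", subtitle_strm)):
        if value in (None, -1):
            decisions[prop] = "automatic selection"
        elif value == -2:
            decisions[prop] = "keep current"
        else:
            decisions[prop] = value              # set the stream number directly
    # audio_status: 0-3 set audioFlag directly; -2 keeps it (no automatic mode);
    # an omitted value is treated here like -2 (an assumption)
    decisions["audioFlag"] = "keep current" if audio_status in (None, -2) else audio_status
    # subtitle_status: 0/1 set subtitleFlag directly; -1 automatic; -2 keeps it
    if subtitle_status in (None, -1):
        decisions["subtitleFlag"] = "automatic setting"
    elif subtitle_status == -2:
        decisions["subtitleFlag"] = "keep current"
    else:
        decisions["subtitleFlag"] = subtitle_status
    return decisions

print(interpret_play_args(audio_strm=0xBD00, subtitle_status=-1))
```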
- Property audioFlag does not have an automatic setting in the movie player 300 , whereas property subtitleFlag does. This is because when audio and a subtitle are set to the same language, the display of the subtitle is automatically turned off, causing the subtitle not to be displayed.
- FIG. 42 shows a flow of a process of automatically selecting an audio stream and a subtitle stream from a plurality of types of audio streams and subtitle streams.
- reference letter A denotes that the flow advances to a portion represented by the same letter A.
- when the selection process of an audio stream is started (at step S 90 ), it is determined whether or not the value of argument audio_strm, which represents the stream number of an audio stream to be reproduced in method play( ), has been set to “−1” or whether the value has been omitted (at step S 91 ). As described above, when the value of argument audio_strm is “−1”, the automatic selection of an audio stream is designated.
- when the determined result at step S 91 denotes that the value of argument audio_strm has been set to a value other than “−1” and has not been omitted, the flow advances to step S 92 . At step S 92 , the value of argument audio_strm is set to property audioNumber and an audio stream represented by argument audio_strm is selected. Thereafter, the flow advances to step S 82 shown in FIG. 42 .
- at step S 82 , property audioFlag is set in the predetermined manner corresponding to the selected audio stream in the movie player 300 . When the number of audio channels of the selected audio stream is, for example, 5.1 channels, the value of property audioFlag is set to, for example, “1”, denoting whether or not to reproduce the audio stream. When an audio stream has not been selected at step S 92 , the process is terminated as an error.
- when the determined result at step S 91 denotes that the value of argument audio_strm is “−1” or has been omitted, the flow advances to step S 93 . At step S 93 , it is determined whether there is no audio stream identified by property audioNumber of the movie player 300 or whether the value of property audioNumber is undefined.
- when the determined result at step S 93 denotes that an audio stream identified by property audioNumber exists, the flow advances to step S 94 , where the audio stream identified by property audioNumber is selected.
- thereafter, the flow advances to step S 82 shown in FIG. 42 .
- Property audioFlag is set in the predetermined manner corresponding to the selected audio stream in the movie player 300 .
- when the determined result at step S 93 denotes that there is no audio stream identified by property audioNumber or that the value of property audioNumber is undefined, the flow advances to step S 95 , at which the automatic selection process for an audio stream is specifically performed.
- the automatic selection process of an audio stream at step S 81 shown in FIG. 42 represents the entire process of the flow chart shown in FIG. 43 .
- the process after step S 95 of the flow chart shown in FIG. 43 is the automatic selection process of an audio stream executed when the value of argument audio_strm is “−1”, which represents the automatic selection.
- the automatic selection process based on (1) the language code is performed. This process is performed so that an audio stream whose language is the same as the audio language setting of the player is preferentially selected.
- at step S 96 , it is determined whether or not the value of property audioLanguageCode in the movie player 300 is “00”, which represents “original language”. When the determined result denotes that the value of property audioLanguageCode represents “original language”, the flow advances to step S 101 .
- at step S 101 , the arrangement of audio streams in block StreamInfo( ) of the clip information file (see FIG. 29 ) is checked and the language code of the audio stream that comes first in the arrangement is obtained. Thereafter, it is determined whether or not there is an audio stream whose language code is the same as the obtained language code and whose number of audio channels is equal to or smaller than the number of audio channels that has been set to the UMD video player.
- when the determined result at step S 101 denotes that there is such an audio stream, the flow advances to step S 102 .
- when the determined result denotes that there is no such audio stream, the flow advances to step S 103 .
- at step S 102 , an audio stream whose number of audio channels is maximum among the audio streams that satisfy the conditions at step S 101 is selected.
- when the audio channel setting of the UMD video player is, for example, 5.1 channels and there are two audio streams whose language code is the same as that of the audio stream that comes first in block StreamInfo( ) and whose numbers of audio channels are 2 channels and 5.1 channels, respectively, the audio stream whose number of channels is 5.1 channels is selected.
- when there are a plurality of such audio streams, the audio stream that comes earlier in block StreamInfo( ) is selected.
- the flow advances to step S 82 shown in FIG. 42 .
- property audioFlag is set in the predetermined manner corresponding to the selected audio stream in the movie player 300 .
- at step S 103 , it is determined whether or not there is an audio stream whose number of audio channels is equal to or smaller than the number of audio channels that has been set to the UMD video player.
- when the determined result denotes that there is such an audio stream, an audio stream whose number of audio channels is maximum among these audio streams is selected (at step S 104 ).
- when there are a plurality of such audio streams, the audio stream that comes earlier in block StreamInfo( ) is selected.
- the flow advances to step S 82 shown in FIG. 42 .
- Property audioFlag is set in the predetermined manner corresponding to the selected audio stream in the movie player 300 .
- when the determined result at step S 103 denotes that there is no such audio stream, the process according to the flow chart shown in FIGS. 43 and 44 is completed. In this case, although an audio stream has not been selected, the flow advances to step S 82 .
- at step S 82 , property audioFlag is set in the predetermined manner.
- when the determined result at step S 96 denotes that the value of property audioLanguageCode does not represent “original language”, the flow advances to step S 97 . At step S 97 , it is determined whether or not there is an audio stream whose language code is the same as that of property audioLanguageCode of the movie player 300 and whose number of audio channels is equal to or smaller than the number of audio channels that has been set to the UMD video player.
- when the determined result denotes that there is such an audio stream, the flow advances to step S 98 .
- when the determined result denotes that there is no such audio stream, the flow advances to step S 99 .
- at step S 98 , like step S 102 , an audio stream whose number of audio channels is maximum among the audio streams that satisfy the conditions at step S 97 is selected.
- when there are a plurality of such audio streams, the audio stream that comes earlier in block StreamInfo( ) is selected.
- thereafter, the flow advances to step S 82 shown in FIG. 42 , where property audioFlag is set in the predetermined manner corresponding to the selected audio stream in the movie player 300 .
- at step S 99 , it is determined whether or not there is an audio stream whose number of audio channels is equal to or smaller than the number of audio channels that has been set to the UMD video player.
- when the determined result denotes that there is such an audio stream, an audio stream whose number of audio channels is maximum among these audio streams is selected (at step S 100 ).
- when there are a plurality of such audio streams, the audio stream that comes earlier in block StreamInfo( ) is selected.
- the flow advances to step S 82 shown in FIG. 42 .
- Property audioFlag is set in the predetermined manner corresponding to the selected audio stream in the movie player 300 .
- when the determined result at step S 99 denotes that there is no such audio stream, the process according to the flow chart shown in FIG. 43 and FIG. 44 is completed. In this case, although an audio stream has not been selected, the flow advances to step S 82 .
- at step S 82 , property audioFlag is set in the predetermined manner.
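- The automatic audio stream selection of FIG. 43 and FIG. 44 can be summarized by the following sketch; it assumes that each stream is described by a dictionary with the StreamInfo( ) fields used above and that streams are given in block StreamInfo( ) order.

```python
# Hedged sketch of the audio stream selection of FIG. 43 and FIG. 44 (assumption:
# each stream is a dict with an "audio_language_code" and a "channels" field,
# listed in block StreamInfo( ) order).
def select_audio_stream(streams, audio_language_code, player_channels):
    if not streams:
        return None
    if audio_language_code == "00":                  # "original language"
        target = streams[0]["audio_language_code"]   # language of the first stream
    else:
        target = audio_language_code
    # steps S97/S101: same language and channel count within the player setting
    candidates = [s for s in streams
                  if s["audio_language_code"] == target
                  and s["channels"] <= player_channels]
    if not candidates:
        # steps S99/S103: drop the language condition, keep the channel condition
        candidates = [s for s in streams if s["channels"] <= player_channels]
    if not candidates:
        return None                                  # nothing selectable
    # steps S98/S100/S102/S104: maximum channel count; earlier streams win ties,
    # because max( ) keeps the first of equal maxima
    return max(candidates, key=lambda s: s["channels"])

streams = [{"audio_language_code": "ja", "channels": 2},
           {"audio_language_code": "ja", "channels": 5.1},
           {"audio_language_code": "en", "channels": 2}]
print(select_audio_stream(streams, "00", 5.1))  # -> the 5.1-channel Japanese stream
```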
- when the selection process of a subtitle stream is started (at step S 110 ), the flow advances to step S 111 .
- at step S 111 , it is determined whether the value of argument subtitle_strm, which represents the stream number of a subtitle stream to be reproduced, has been set to “−1” in method play( ) or whether the value has been omitted. As described above, when the value of argument subtitle_strm is “−1”, automatic selection of a subtitle stream is designated.
- when the determined result at step S 111 denotes that the value of argument subtitle_strm has been set to a value other than “−1” and has not been omitted, the flow advances to step S 112 . At step S 112 , the value of argument subtitle_strm is set to property subtitleNumber and a subtitle stream identified by argument subtitle_strm is selected. Thereafter, the flow advances to step S 84 shown in FIG. 42 .
- at step S 84 , property subtitleFlag is set in the movie player 300 .
- in the case of an error, the operation that the UMD video player performs depends on its implementation.
- for example, the process according to the flow chart shown in FIG. 45 may be terminated and the next process performed although a subtitle stream has not been selected. This error handling applies to the other error cases in FIG. 45 .
- when the determined result at step S 111 denotes that the value of argument subtitle_strm is “−1” or has been omitted, the flow advances to step S 113 . At step S 113 , it is determined whether there is no subtitle stream identified by property subtitleNumber of the movie player 300 or whether the value of property subtitleNumber is undefined.
- when the determined result at step S 113 denotes that a subtitle stream identified by property subtitleNumber exists, the flow advances to step S 114 , where the subtitle stream identified by property subtitleNumber is selected. Thereafter, the flow advances to step S 84 shown in FIG. 42 .
- when the determined result at step S 113 denotes that there is no such subtitle stream or that the value is undefined, the flow advances to step S 115 . Steps after step S 115 constitute the real automatic selection process of a subtitle stream.
- at step S 115 , the subtitle streams in block StreamInfo( ) of the clip information file are checked and their language codes are obtained. At step S 116 , it is determined whether or not there is a subtitle stream whose language code is the same as that of property subtitleLanguageCode of the movie player 300 . When the determined result denotes that there is such a subtitle stream, the flow advances to step S 117 .
- at step S 117 , the subtitle stream is selected. When there are a plurality of subtitle streams that satisfy the condition at step S 116 , the subtitle stream that comes earlier in block StreamInfo( ) of the clip information file is selected. Thereafter, the flow advances to step S 84 shown in FIG. 42 .
- when the determined result at step S 116 denotes that there is no subtitle stream that satisfies the condition, the flow advances to step S 118 .
- at step S 118 , since there is no subtitle stream that can be selected, the value of property subtitleFlag is set to “0”, which denotes that no subtitle is displayed.
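- The subtitle stream selection of FIG. 45 reduces to the following sketch; the dictionary representation of the streams is an assumption, and a return value of None stands for the case in which no subtitle stream is selected.

```python
# Hedged sketch of the subtitle stream selection of FIG. 45 (assumption: streams
# are dicts in block StreamInfo( ) order; None means no stream was selected and,
# at step S118, property subtitleFlag would be set to "0").
def select_subtitle_stream(streams, subtitle_language_code):
    for stream in streams:                     # earlier streams take precedence
        if stream["subtitle_language_code"] == subtitle_language_code:
            return stream                      # step S117: select this stream
    return None                                # step S118: nothing selectable

subs = [{"subtitle_language_code": "ja"}, {"subtitle_language_code": "en"}]
print(select_subtitle_stream(subs, "en"))  # -> {'subtitle_language_code': 'en'}
```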
- at step S 121 , it is determined whether or not the value of argument subtitle_status, which represents the reproduction status of a subtitle in method play( ), has been set to “−1”, which represents automatic setting, or whether the value has been omitted.
- when the value of argument subtitle_status has been set to a value other than “−1”, the flow advances to step S 122 .
- at step S 122 , the value that has been set to argument subtitle_status is set to property subtitleFlag.
- when the value of argument subtitle_status is “−1” or has been omitted, the flow advances to step S 123 . At step S 123 , it is determined whether or not the language code identified by property subtitleLanguageCode of the movie player 300 matches the language code (property audioLanguageCode of the movie player 300 ) of the audio stream that is currently selected and whether the value of property audioFlag has been set to a value other than “0”, causing an audio stream to be reproduced.
- when the determined result denotes that the conditions at step S 123 are satisfied, the flow advances to step S 124 .
- at step S 124 , the value of property subtitleFlag is set to “0”, causing a subtitle not to be displayed.
- when the conditions at step S 123 are satisfied, the language of the audio reproduced by the audio stream is the same as the language of the subtitle displayed by the subtitle stream.
- thus, the value of property subtitleFlag is set to “0”, causing the subtitle not to be displayed.
- when the conditions at step S 123 are not satisfied, the flow advances to step S 125 , where the value of property subtitleFlag is set to “1”, causing a subtitle to be displayed.
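- The automatic setting of property subtitleFlag in FIG. 46 can be sketched as follows; None stands for an omitted argument, and the function name is an assumption.

```python
# Hedged sketch of the automatic subtitleFlag setting of FIG. 46 (assumption:
# None stands for an omitted subtitle_status argument; return values follow FIG. 40).
def set_subtitle_flag(subtitle_status, subtitle_language_code,
                      audio_language_code, audio_flag):
    if subtitle_status not in (None, -1):
        return subtitle_status                   # step S122: use the argument value
    # step S123: audio is reproduced and its language matches the subtitle language
    if audio_flag != 0 and subtitle_language_code == audio_language_code:
        return 0                                 # step S124: subtitle not displayed
    return 1                                     # step S125: subtitle displayed

print(set_subtitle_flag(-1, "ja", "ja", 1))  # -> 0 (subtitle suppressed)
```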
- in the foregoing example, the display of a subtitle is turned on or off corresponding to the language code of a subtitle stream.
- instead, the display of a subtitle may be turned on or off corresponding to the attribute of the subtitle stream.
- for example, when the attribute of the subtitle stream does not represent a normal subtitle, the display of the subtitle may not be suppressed (by setting the value of property subtitleFlag to “1”).
- An attribute of a subtitle stream is represented by the value of field subtitle_presentation_type of block StaticInfo( ) of the clip information file “XXXXX.CLP”, as was described with reference to FIG. 31 .
- in FIG. 47 , portions similar to those in FIG. 46 are denoted by similar reference numerals and their description will be omitted.
- at step S 126 , it is determined whether or not the attribute of the subtitle stream represents an attribute of a normal subtitle (Normal).
- when the determined result denotes that the attribute represents a normal subtitle, the flow advances to step S 124 .
- at step S 124 , the value of property subtitleFlag is set to “0”, causing the subtitle not to be displayed.
- when the determined result denotes that the attribute does not represent a normal subtitle, the flow advances to step S 125 .
- at step S 125 , the value of property subtitleFlag is set to “1”, causing the subtitle to be displayed.
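- The variation of FIG. 47 additionally consults the attribute of the subtitle stream (field subtitle_presentation_type), as in the following sketch; attribute values other than “Normal” are assumptions used only for illustration.

```python
# Hedged sketch of the FIG. 47 variation, which also checks the attribute of the
# subtitle stream (field subtitle_presentation_type); the attribute value "Forced"
# is an assumption used only to illustrate a non-normal subtitle.
def set_subtitle_flag_with_attribute(subtitle_language_code, audio_language_code,
                                     audio_flag, presentation_type):
    same_language = (subtitle_language_code == audio_language_code and audio_flag != 0)
    if same_language and presentation_type == "Normal":   # step S126 -> step S124
        return 0                                          # subtitle not displayed
    return 1                                              # step S125: subtitle displayed

print(set_subtitle_flag_with_attribute("ja", "ja", 1, "Forced"))  # -> 1
```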
- audio and a subtitle are properly and automatically selected according to an embodiment of the present invention.
- in the foregoing examples, audio streams of a plurality of different languages are multiplexed.
- however, the present invention is not limited to such an example.
- audio streams whose language is the same and whose contents are different may be multiplexed.
- for example, audio stream A and audio stream B, whose languages are the same and whose contents are different, may be multiplexed, and one stream (for example, audio stream A) may be used as the original language.
Applications Claiming Priority (3)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
JP2004239347A JP4339206B2 (ja) | 2004-08-19 | 2004-08-19 | Reproduction device, reproduction method, reproduction program, and recording medium |
JP2004-239347 | 2004-08-19 | ||
PCT/JP2005/014491 WO2006019000A1 (ja) | 2004-08-19 | 2005-08-02 | Reproduction device, reproduction method, reproduction program, recording medium, and data structure |
Publications (2)
Publication Number | Publication Date |
---|---|
US20080063373A1 US20080063373A1 (en) | 2008-03-13 |
US8019199B2 true US8019199B2 (en) | 2011-09-13 |
Family
ID=35907390
Family Applications (1)
Application Number | Title | Priority Date | Filing Date |
---|---|---|---|
US11/573,696 Expired - Fee Related US8019199B2 (en) | 2004-08-19 | 2005-08-02 | Reproduction device, reproduction method, reproduction program, recording medium, and data structure |
Country Status (9)
Country | Link |
---|---|
US (1) | US8019199B2 (ko) |
JP (1) | JP4339206B2 (ko) |
KR (1) | KR101237160B1 (ko) |
CN (1) | CN101044573B (ko) |
MX (1) | MX2007001866A (ko) |
RU (1) | RU2381574C2 (ko) |
TR (1) | TR200700950T1 (ko) |
TW (1) | TW200608362A (ko) |
WO (1) | WO2006019000A1 (ko) |
Cited By (1)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
US20080056676A1 (en) * | 2005-08-22 | 2008-03-06 | Kim Kun S | Apparatus for reproducing data, method thereof, apparatus for recording the same, method thereof and recording medium |
Families Citing this family (12)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
JP2009180972A (ja) * | 2008-01-31 | 2009-08-13 | Panasonic Corp | Audio resume reproduction device and audio resume reproduction method |
MX2010013448A (es) * | 2008-06-26 | 2010-12-22 | Panasonic Corp | Recording medium, reproduction device, recording device, reproduction method, recording method, and program |
KR101486354B1 (ko) * | 2008-07-02 | 2015-01-26 | LG Electronics Inc | Broadcast receiver and broadcast data processing method |
JP2010187158A (ja) * | 2009-02-12 | 2010-08-26 | Funai Electric Co Ltd | Content processing device |
RU2543936C2 (ru) * | 2010-03-31 | 2015-03-10 | Thomson Licensing | Reproduction with fast access to video data objects |
KR20110138877A (ko) * | 2010-06-22 | 2011-12-28 | Samsung Electronics Co Ltd | Audio stream transmitting apparatus, audio stream receiving apparatus, and transmitting/receiving method thereof |
US20130058621A1 (en) * | 2011-09-07 | 2013-03-07 | Vesstech, Inc. | Video warning systems for devices, products, containers, and other items |
US9571870B1 (en) * | 2014-07-15 | 2017-02-14 | Netflix, Inc. | Automatic detection of preferences for subtitles and dubbing |
JP6733730B2 (ja) * | 2016-05-25 | 2020-08-05 | Yamaha Corp | Content reproduction device and content reproduction method |
CN106060266B (zh) * | 2016-06-28 | 2019-06-21 | Guangdong Oppo Mobile Telecommunications Corp Ltd | Control method, control device, and electronic device |
CN108174308B (zh) * | 2017-12-28 | 2020-06-16 | Guangdong Oppo Mobile Telecommunications Corp Ltd | Video playback method, video playback device, storage medium, and electronic device |
CN108259986B (zh) * | 2018-01-17 | 2021-06-01 | Hisense Electronic Technology (Shenzhen) Co Ltd | Multi-channel audio playback method and device, electronic device, and storage medium |
Citations (9)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
JPH0950675A (ja) | 1995-08-04 | 1997-02-18 | Sony Corp | Data recording method and device, data recording medium, and data reproduction method and device |
US6393201B1 (en) * | 1998-01-07 | 2002-05-21 | Hitachi, Ltd. | Reproducing apparatus and reproducing/recording apparatus memorizing identification information of optical information meda and method thereof |
US6424794B1 (en) * | 1993-10-29 | 2002-07-23 | Time Warner Entertainment Company, L.P. | Data structure for representing a program containing components organized in a series of data blocks |
US6442333B1 (en) * | 1997-12-25 | 2002-08-27 | Pioneer Electronic Corporation | Information reproducing apparatus |
US6542200B1 (en) * | 2001-08-14 | 2003-04-01 | Cheldan Technologies, Inc. | Television/radio speech-to-text translating processor |
JP2004023646A (ja) | 2002-06-19 | 2004-01-22 | Sanyo Electric Co Ltd | Optical disc reproduction device |
JP2004112207A (ja) | 2002-09-17 | 2004-04-08 | Sanyo Electric Co Ltd | Optical disc reproduction device |
JP2004207904A (ja) | 2002-12-24 | 2004-07-22 | Alpine Electronics Inc | DVD reproduction device and audio/subtitle setting method thereof |
US20060210245A1 (en) * | 2003-02-21 | 2006-09-21 | Mccrossan Joseph | Apparatus and method for simultaneously utilizing audio visual data |
- 2004
  - 2004-08-19 JP JP2004239347A patent/JP4339206B2/ja not_active Expired - Fee Related
- 2005
  - 2005-08-02 CN CN2005800358504A patent/CN101044573B/zh not_active Expired - Fee Related
  - 2005-08-02 KR KR1020077003876A patent/KR101237160B1/ko not_active IP Right Cessation
  - 2005-08-02 TR TR2007/00950T patent/TR200700950T1/xx unknown
  - 2005-08-02 WO PCT/JP2005/014491 patent/WO2006019000A1/ja active Application Filing
  - 2005-08-02 US US11/573,696 patent/US8019199B2/en not_active Expired - Fee Related
  - 2005-08-02 MX MX2007001866A patent/MX2007001866A/es active IP Right Grant
  - 2005-08-02 RU RU2007106078/28A patent/RU2381574C2/ru not_active IP Right Cessation
  - 2005-08-04 TW TW094126597A patent/TW200608362A/zh not_active IP Right Cessation
Also Published As
Publication number | Publication date |
---|---|
TWI319567B (ko) | 2010-01-11 |
WO2006019000A1 (ja) | 2006-02-23 |
CN101044573A (zh) | 2007-09-26 |
CN101044573B (zh) | 2010-11-17 |
JP4339206B2 (ja) | 2009-10-07 |
TW200608362A (en) | 2006-03-01 |
TR200700950T1 (tr) | 2007-04-24 |
KR20070043011A (ko) | 2007-04-24 |
US20080063373A1 (en) | 2008-03-13 |
RU2007106078A (ru) | 2008-08-27 |
KR101237160B1 (ko) | 2013-02-25 |
MX2007001866A (es) | 2007-07-24 |
RU2381574C2 (ru) | 2010-02-10 |
JP2006059435A (ja) | 2006-03-02 |
Similar Documents
Publication | Publication Date | Title |
---|---|---|
US8019199B2 (en) | Reproduction device, reproduction method, reproduction program, recording medium, and data structure | |
EP1783771B1 (en) | Reproduction device, reproduction method and reproduction program | |
US8139926B2 (en) | Reproduction apparatus, reproduction method, reproduction program, record medium, and data structure | |
US8005339B2 (en) | Reproduction apparatus, reproduction method, reproduction program, record medium, and data structure | |
US7821881B2 (en) | Reproduction device, reproduction method, reproduction program, and recording medium | |
TWI328801B (ko) | ||
JP5655478B2 (ja) | Information processing device and information processing method | |
AU2012200803A1 (en) | Reproduction device, reproduction method, reproduction program, recording medium, and data structure | |
JP2003018505A (ja) | Information reproduction device and conversation scene detection method |
Legal Events
Date | Code | Title | Description |
---|---|---|---|
AS | Assignment |
Owner name: SONY CORPORATION, JAPAN Free format text: ASSIGNMENT OF ASSIGNORS INTEREST;ASSIGNORS:HAMADA, TOSHIYA;KAKUMU, TATSUYA;FUJINAMI, YASUSHI;REEL/FRAME:020293/0953;SIGNING DATES FROM 20070116 TO 20070125 Owner name: SONY COMPUTER ENTERTAINMENT INC., JAPAN Free format text: ASSIGNMENT OF ASSIGNORS INTEREST;ASSIGNORS:HAMADA, TOSHIYA;KAKUMU, TATSUYA;FUJINAMI, YASUSHI;REEL/FRAME:020293/0953;SIGNING DATES FROM 20070116 TO 20070125 Owner name: SONY CORPORATION, JAPAN Free format text: ASSIGNMENT OF ASSIGNORS INTEREST;ASSIGNORS:HAMADA, TOSHIYA;KAKUMU, TATSUYA;FUJINAMI, YASUSHI;SIGNING DATES FROM 20070116 TO 20070125;REEL/FRAME:020293/0953 Owner name: SONY COMPUTER ENTERTAINMENT INC., JAPAN Free format text: ASSIGNMENT OF ASSIGNORS INTEREST;ASSIGNORS:HAMADA, TOSHIYA;KAKUMU, TATSUYA;FUJINAMI, YASUSHI;SIGNING DATES FROM 20070116 TO 20070125;REEL/FRAME:020293/0953 |
|
FEPP | Fee payment procedure |
Free format text: PAYOR NUMBER ASSIGNED (ORIGINAL EVENT CODE: ASPN); ENTITY STATUS OF PATENT OWNER: LARGE ENTITY |
|
AS | Assignment |
Owner name: SONY NETWORK ENTERTAINMENT PLATFORM INC., JAPAN Free format text: CHANGE OF NAME;ASSIGNOR:SONY COMPUTER ENTERTAINMENT INC.;REEL/FRAME:027449/0001 Effective date: 20100401 |
|
AS | Assignment |
Owner name: SONY COMPUTER ENTERTAINMENT INC., JAPAN Free format text: ASSIGNMENT OF ASSIGNORS INTEREST;ASSIGNOR:SONY NETWORK ENTERTAINMENT PLATFORM INC.;REEL/FRAME:027449/0640 Effective date: 20100401 |
|
REMI | Maintenance fee reminder mailed | ||
LAPS | Lapse for failure to pay maintenance fees |
Free format text: PATENT EXPIRED FOR FAILURE TO PAY MAINTENANCE FEES (ORIGINAL EVENT CODE: EXP.) |
|
FP | Lapsed due to failure to pay maintenance fee |
Effective date: 20150913 |
|
STCH | Information on status: patent discontinuation |
Free format text: PATENT EXPIRED DUE TO NONPAYMENT OF MAINTENANCE FEES UNDER 37 CFR 1.362 |