WO2006019108A1 - Information recording medium and data reproduction device - Google Patents
Information recording medium and data reproduction device
- Publication number
- WO2006019108A1 (PCT/JP2005/014986, JP2005014986W)
- Authority
- WO
- WIPO (PCT)
- Prior art keywords
- data
- information
- event
- stream
- access unit
- Prior art date
Classifications
-
- G—PHYSICS
- G11—INFORMATION STORAGE
- G11B—INFORMATION STORAGE BASED ON RELATIVE MOVEMENT BETWEEN RECORD CARRIER AND TRANSDUCER
- G11B20/00—Signal processing not specific to the method of recording or reproducing; Circuits therefor
- G11B20/10—Digital recording or reproducing
- G11B20/12—Formatting, e.g. arrangement of data block or words on the record carriers
- G11B20/1217—Formatting, e.g. arrangement of data block or words on the record carriers on discs
-
- G—PHYSICS
- G11—INFORMATION STORAGE
- G11B—INFORMATION STORAGE BASED ON RELATIVE MOVEMENT BETWEEN RECORD CARRIER AND TRANSDUCER
- G11B20/00—Signal processing not specific to the method of recording or reproducing; Circuits therefor
- G11B20/10—Digital recording or reproducing
- G11B20/12—Formatting, e.g. arrangement of data block or words on the record carriers
-
- G—PHYSICS
- G11—INFORMATION STORAGE
- G11B—INFORMATION STORAGE BASED ON RELATIVE MOVEMENT BETWEEN RECORD CARRIER AND TRANSDUCER
- G11B20/00—Signal processing not specific to the method of recording or reproducing; Circuits therefor
- G11B20/10—Digital recording or reproducing
-
- G—PHYSICS
- G11—INFORMATION STORAGE
- G11B—INFORMATION STORAGE BASED ON RELATIVE MOVEMENT BETWEEN RECORD CARRIER AND TRANSDUCER
- G11B20/00—Signal processing not specific to the method of recording or reproducing; Circuits therefor
- G11B20/10—Digital recording or reproducing
- G11B20/10527—Audio or video recording; Data buffering arrangements
-
- G—PHYSICS
- G11—INFORMATION STORAGE
- G11B—INFORMATION STORAGE BASED ON RELATIVE MOVEMENT BETWEEN RECORD CARRIER AND TRANSDUCER
- G11B27/00—Editing; Indexing; Addressing; Timing or synchronising; Monitoring; Measuring tape travel
- G11B27/10—Indexing; Addressing; Timing or synchronising; Measuring tape travel
-
- G—PHYSICS
- G11—INFORMATION STORAGE
- G11B—INFORMATION STORAGE BASED ON RELATIVE MOVEMENT BETWEEN RECORD CARRIER AND TRANSDUCER
- G11B27/00—Editing; Indexing; Addressing; Timing or synchronising; Monitoring; Measuring tape travel
- G11B27/10—Indexing; Addressing; Timing or synchronising; Measuring tape travel
- G11B27/102—Programmed access in sequence to addressed parts of tracks of operating record carriers
- G11B27/105—Programmed access in sequence to addressed parts of tracks of operating record carriers of operating discs
-
- G—PHYSICS
- G11—INFORMATION STORAGE
- G11B—INFORMATION STORAGE BASED ON RELATIVE MOVEMENT BETWEEN RECORD CARRIER AND TRANSDUCER
- G11B27/00—Editing; Indexing; Addressing; Timing or synchronising; Monitoring; Measuring tape travel
- G11B27/10—Indexing; Addressing; Timing or synchronising; Measuring tape travel
- G11B27/19—Indexing; Addressing; Timing or synchronising; Measuring tape travel by using information detectable on the record carrier
- G11B27/28—Indexing; Addressing; Timing or synchronising; Measuring tape travel by using information detectable on the record carrier by using information signals recorded by the same method as the main recording
- G11B27/30—Indexing; Addressing; Timing or synchronising; Measuring tape travel by using information detectable on the record carrier by using information signals recorded by the same method as the main recording on the same track as the main recording
- G11B27/3027—Indexing; Addressing; Timing or synchronising; Measuring tape travel by using information detectable on the record carrier by using information signals recorded by the same method as the main recording on the same track as the main recording used signal is digitally coded
-
- G—PHYSICS
- G11—INFORMATION STORAGE
- G11B—INFORMATION STORAGE BASED ON RELATIVE MOVEMENT BETWEEN RECORD CARRIER AND TRANSDUCER
- G11B27/00—Editing; Indexing; Addressing; Timing or synchronising; Monitoring; Measuring tape travel
- G11B27/10—Indexing; Addressing; Timing or synchronising; Measuring tape travel
- G11B27/19—Indexing; Addressing; Timing or synchronising; Measuring tape travel by using information detectable on the record carrier
- G11B27/28—Indexing; Addressing; Timing or synchronising; Measuring tape travel by using information detectable on the record carrier by using information signals recorded by the same method as the main recording
- G11B27/32—Indexing; Addressing; Timing or synchronising; Measuring tape travel by using information detectable on the record carrier by using information signals recorded by the same method as the main recording on separate auxiliary tracks of the same or an auxiliary record carrier
- G11B27/327—Table of contents
- G11B27/329—Table of contents on a disc [VTOC]
-
- H—ELECTRICITY
- H04—ELECTRIC COMMUNICATION TECHNIQUE
- H04N—PICTORIAL COMMUNICATION, e.g. TELEVISION
- H04N19/00—Methods or arrangements for coding, decoding, compressing or decompressing digital video signals
- H04N19/20—Methods or arrangements for coding, decoding, compressing or decompressing digital video signals using video object coding
- H04N19/29—Methods or arrangements for coding, decoding, compressing or decompressing digital video signals using video object coding involving scalability at the object level, e.g. video object layer [VOL]
-
- H—ELECTRICITY
- H04—ELECTRIC COMMUNICATION TECHNIQUE
- H04N—PICTORIAL COMMUNICATION, e.g. TELEVISION
- H04N19/00—Methods or arrangements for coding, decoding, compressing or decompressing digital video signals
- H04N19/30—Methods or arrangements for coding, decoding, compressing or decompressing digital video signals using hierarchical techniques, e.g. scalability
- H04N19/31—Methods or arrangements for coding, decoding, compressing or decompressing digital video signals using hierarchical techniques, e.g. scalability in the temporal domain
-
- H—ELECTRICITY
- H04—ELECTRIC COMMUNICATION TECHNIQUE
- H04N—PICTORIAL COMMUNICATION, e.g. TELEVISION
- H04N19/00—Methods or arrangements for coding, decoding, compressing or decompressing digital video signals
- H04N19/30—Methods or arrangements for coding, decoding, compressing or decompressing digital video signals using hierarchical techniques, e.g. scalability
- H04N19/33—Methods or arrangements for coding, decoding, compressing or decompressing digital video signals using hierarchical techniques, e.g. scalability in the spatial domain
-
- H—ELECTRICITY
- H04—ELECTRIC COMMUNICATION TECHNIQUE
- H04N—PICTORIAL COMMUNICATION, e.g. TELEVISION
- H04N21/00—Selective content distribution, e.g. interactive television or video on demand [VOD]
- H04N21/40—Client devices specifically adapted for the reception of or interaction with content, e.g. set-top-box [STB]; Operations thereof
- H04N21/41—Structure of client; Structure of client peripherals
- H04N21/426—Internal components of the client ; Characteristics thereof
- H04N21/42646—Internal components of the client ; Characteristics thereof for reading from or writing on a non-volatile solid state storage medium, e.g. DVD, CD-ROM
-
- H—ELECTRICITY
- H04—ELECTRIC COMMUNICATION TECHNIQUE
- H04N—PICTORIAL COMMUNICATION, e.g. TELEVISION
- H04N21/00—Selective content distribution, e.g. interactive television or video on demand [VOD]
- H04N21/40—Client devices specifically adapted for the reception of or interaction with content, e.g. set-top-box [STB]; Operations thereof
- H04N21/43—Processing of content or additional data, e.g. demultiplexing additional data from a digital video stream; Elementary client operations, e.g. monitoring of home network or synchronising decoder's clock; Client middleware
- H04N21/432—Content retrieval operation from a local storage medium, e.g. hard-disk
- H04N21/4325—Content retrieval operation from a local storage medium, e.g. hard-disk by playing back content from the storage medium
-
- H—ELECTRICITY
- H04—ELECTRIC COMMUNICATION TECHNIQUE
- H04N—PICTORIAL COMMUNICATION, e.g. TELEVISION
- H04N21/00—Selective content distribution, e.g. interactive television or video on demand [VOD]
- H04N21/40—Client devices specifically adapted for the reception of or interaction with content, e.g. set-top-box [STB]; Operations thereof
- H04N21/45—Management operations performed by the client for facilitating the reception of or the interaction with the content or administrating data related to the end-user or to the client device itself, e.g. learning user preferences for recommending movies, resolving scheduling conflicts
- H04N21/462—Content or additional data management, e.g. creating a master electronic program guide from data received from the Internet and a Head-end, controlling the complexity of a video stream by scaling the resolution or bit-rate based on the client capabilities
- H04N21/4621—Controlling the complexity of the content stream or additional data, e.g. lowering the resolution or bit-rate of the video stream for a mobile client with a small screen
-
- H—ELECTRICITY
- H04—ELECTRIC COMMUNICATION TECHNIQUE
- H04N—PICTORIAL COMMUNICATION, e.g. TELEVISION
- H04N5/00—Details of television systems
- H04N5/76—Television signal recording
- H04N5/84—Television signal recording using optical recording
- H04N5/85—Television signal recording using optical recording on discs or drums
-
- H—ELECTRICITY
- H04—ELECTRIC COMMUNICATION TECHNIQUE
- H04N—PICTORIAL COMMUNICATION, e.g. TELEVISION
- H04N9/00—Details of colour television systems
- H04N9/79—Processing of colour television signals in connection with recording
- H04N9/80—Transformation of the television signal for recording, e.g. modulation, frequency changing; Inverse transformation for playback
- H04N9/804—Transformation of the television signal for recording, e.g. modulation, frequency changing; Inverse transformation for playback involving pulse code modulation of the colour picture signal components
- H04N9/8042—Transformation of the television signal for recording, e.g. modulation, frequency changing; Inverse transformation for playback involving pulse code modulation of the colour picture signal components involving data reduction
-
- G—PHYSICS
- G10—MUSICAL INSTRUMENTS; ACOUSTICS
- G10L—SPEECH ANALYSIS TECHNIQUES OR SPEECH SYNTHESIS; SPEECH RECOGNITION; SPEECH OR VOICE PROCESSING TECHNIQUES; SPEECH OR AUDIO CODING OR DECODING
- G10L19/00—Speech or audio signals analysis-synthesis techniques for redundancy reduction, e.g. in vocoders; Coding or decoding of speech or audio signals, using source filter models or psychoacoustic analysis
- G10L19/04—Speech or audio signals analysis-synthesis techniques for redundancy reduction, e.g. in vocoders; Coding or decoding of speech or audio signals, using source filter models or psychoacoustic analysis using predictive techniques
- G10L19/16—Vocoder architecture
- G10L19/18—Vocoders using multiple modes
- G10L19/24—Variable rate codecs, e.g. for generating different qualities using a scalable representation such as hierarchical encoding or layered encoding
-
- G—PHYSICS
- G11—INFORMATION STORAGE
- G11B—INFORMATION STORAGE BASED ON RELATIVE MOVEMENT BETWEEN RECORD CARRIER AND TRANSDUCER
- G11B20/00—Signal processing not specific to the method of recording or reproducing; Circuits therefor
- G11B20/10—Digital recording or reproducing
- G11B20/10527—Audio or video recording; Data buffering arrangements
- G11B2020/10537—Audio or video recording
- G11B2020/10546—Audio or video recording specifically adapted for audio data
-
- G—PHYSICS
- G11—INFORMATION STORAGE
- G11B—INFORMATION STORAGE BASED ON RELATIVE MOVEMENT BETWEEN RECORD CARRIER AND TRANSDUCER
- G11B2220/00—Record carriers by type
- G11B2220/20—Disc-shaped record carriers
- G11B2220/25—Disc-shaped record carriers characterised in that the disc is based on a specific recording technology
- G11B2220/2537—Optical discs
- G11B2220/2541—Blu-ray discs; Blue laser DVR discs
-
- H—ELECTRICITY
- H04—ELECTRIC COMMUNICATION TECHNIQUE
- H04N—PICTORIAL COMMUNICATION, e.g. TELEVISION
- H04N5/00—Details of television systems
- H04N5/76—Television signal recording
- H04N5/765—Interface circuits between an apparatus for recording and another apparatus
- H04N5/775—Interface circuits between an apparatus for recording and another apparatus between a recording apparatus and a television receiver
-
- H—ELECTRICITY
- H04—ELECTRIC COMMUNICATION TECHNIQUE
- H04N—PICTORIAL COMMUNICATION, e.g. TELEVISION
- H04N5/00—Details of television systems
- H04N5/76—Television signal recording
- H04N5/91—Television signal processing therefor
-
- H—ELECTRICITY
- H04—ELECTRIC COMMUNICATION TECHNIQUE
- H04N—PICTORIAL COMMUNICATION, e.g. TELEVISION
- H04N9/00—Details of colour television systems
- H04N9/79—Processing of colour television signals in connection with recording
- H04N9/7921—Processing of colour television signals in connection with recording for more than one processing mode
-
- H—ELECTRICITY
- H04—ELECTRIC COMMUNICATION TECHNIQUE
- H04N—PICTORIAL COMMUNICATION, e.g. TELEVISION
- H04N9/00—Details of colour television systems
- H04N9/79—Processing of colour television signals in connection with recording
- H04N9/80—Transformation of the television signal for recording, e.g. modulation, frequency changing; Inverse transformation for playback
- H04N9/804—Transformation of the television signal for recording, e.g. modulation, frequency changing; Inverse transformation for playback involving pulse code modulation of the colour picture signal components
- H04N9/806—Transformation of the television signal for recording, e.g. modulation, frequency changing; Inverse transformation for playback involving pulse code modulation of the colour picture signal components with processing of the sound signal
- H04N9/8063—Transformation of the television signal for recording, e.g. modulation, frequency changing; Inverse transformation for playback involving pulse code modulation of the colour picture signal components with processing of the sound signal using time division multiplex of the PCM audio and PCM video signals
-
- H—ELECTRICITY
- H04—ELECTRIC COMMUNICATION TECHNIQUE
- H04N—PICTORIAL COMMUNICATION, e.g. TELEVISION
- H04N9/00—Details of colour television systems
- H04N9/79—Processing of colour television signals in connection with recording
- H04N9/80—Transformation of the television signal for recording, e.g. modulation, frequency changing; Inverse transformation for playback
- H04N9/82—Transformation of the television signal for recording, e.g. modulation, frequency changing; Inverse transformation for playback the individual colour picture signal components being recorded simultaneously only
- H04N9/8205—Transformation of the television signal for recording, e.g. modulation, frequency changing; Inverse transformation for playback the individual colour picture signal components being recorded simultaneously only involving the multiplexing of an additional signal and the colour video signal
-
- H—ELECTRICITY
- H04—ELECTRIC COMMUNICATION TECHNIQUE
- H04N—PICTORIAL COMMUNICATION, e.g. TELEVISION
- H04N9/00—Details of colour television systems
- H04N9/79—Processing of colour television signals in connection with recording
- H04N9/80—Transformation of the television signal for recording, e.g. modulation, frequency changing; Inverse transformation for playback
- H04N9/82—Transformation of the television signal for recording, e.g. modulation, frequency changing; Inverse transformation for playback the individual colour picture signal components being recorded simultaneously only
- H04N9/8205—Transformation of the television signal for recording, e.g. modulation, frequency changing; Inverse transformation for playback the individual colour picture signal components being recorded simultaneously only involving the multiplexing of an additional signal and the colour video signal
- H04N9/8227—Transformation of the television signal for recording, e.g. modulation, frequency changing; Inverse transformation for playback the individual colour picture signal components being recorded simultaneously only involving the multiplexing of an additional signal and the colour video signal the additional signal being at least another television signal
Definitions
- the present invention relates to an information recording medium on which video and audio data is recorded, a data reproduction device for reproducing data, and the like.
- The DVD (DVD-Video disc) is described below as a typical example of such a medium.
- FIG. 1 is a diagram showing the structure of a DVD. As shown in the lower part of Fig. 1, the DVD has a logical address space from the lead-in part to the lead-out part. In the logical address space, the volume information of the file system is recorded at the head, followed by application data such as video and audio.
- the DVD file system is an ISO9660 or Universal Disc Format (UDF) file system.
- a file system is a mechanism for representing data on a disk in units called directories or files.
- A personal computer (PC) uses a file system such as FAT or NTFS. Through the file system, data recorded on the hard disk in a structure of directories and files can be handled by the computer, which improves usability for the user.
- A DVD uses both the UDF and ISO9660 file systems; the combination of the two is sometimes called a "UDF bridge". Data recorded on a DVD can be read by either a UDF or an ISO9660 file system driver. Of course, DVD-RAM/R/RW, the rewritable DVD family, can physically read, write, and delete data via these file systems.
- Data recorded on a DVD exists as a directory or file as shown in the upper left of FIG. 1 via a file system.
- A directory called "VIDEO_TS" is placed directly under the root directory ("ROOT" in Fig. 1), and DVD application data is recorded there.
- Application data is divided into and recorded as multiple files.
- the main files are as follows.
- Playback control information includes information for realizing the interactivity adopted for DVD (technology for dynamically changing the playback state in response to user operations) and information attached to titles and AV streams, such as metadata. On DVD, playback control information is generally called navigation information.
- "VIDEO_TS.IFO", which manages the entire disc
- "VTS_01_0.IFO", which is playback control information for an individual video title set
- The "01" in the file name indicates the number of the video title set. For example, for video title set #2, the file name is "VTS_02_0.IFO".
- On a DVD, it is possible to record multiple titles, in other words, multiple movies with different contents, or multiple versions of the same movie, on a single disc.
- the upper right part of FIG. 1 shows a DVD navigation space in the DVD application layer, that is, a logical structure space in which the playback control information described above is expanded.
- The information in "VIDEO_TS.IFO" is expanded in the DVD navigation space as Video Manager Information (VMGI).
- The playback control information that exists for each video title set, such as "VTS_01_0.IFO", is expanded in the DVD navigation space as Video Title Set Information (VTSI).
- VTSI describes Program Chain Information (PGCI), which is information on a playback sequence called Program Chain (PGC).
- PGCI consists of a set of cells and a kind of programming information called commands.
- A Cell is a part or the whole of a VOB (Video Object; an MPEG stream).
- Cell playback means playing the section specified by Cell in the VOB.
- the command is processed by a DVD virtual machine, and is similar to a Java (registered trademark) script or the like executed on a browser.
- Java (registered trademark) scripts control windows and browsers (for example, open new browser windows).
- DVD commands, aside from logical operations, only control the playback of AV titles (for example, specifying the chapter to be played back); in this respect, DVD commands differ from Java scripts.
- A Cell holds the start address and end address (logical recording addresses on the disc) of its VOB as recorded on the disc.
- the player reads and reproduces the data using the information of the start address and end address of the VOB described in the Cell.
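As an illustration, the Cell-to-VOB addressing described above can be sketched as follows; the class and field names are hypothetical and not taken from the DVD specification.

```python
from dataclasses import dataclass

@dataclass
class Cell:
    """Illustrative Cell: holds the VOB section it designates (names are hypothetical)."""
    vob_id: int
    start_address: int  # logical recording address on the disc (sector)
    end_address: int    # logical recording address on the disc (sector)

def sectors_to_read(cell: Cell) -> range:
    """Return the logical sector range a player would read to play this Cell."""
    return range(cell.start_address, cell.end_address + 1)

cell = Cell(vob_id=1, start_address=1000, end_address=1003)
print(list(sectors_to_read(cell)))  # → [1000, 1001, 1002, 1003]
```

The player's read step then amounts to looking up this range and fetching those sectors in order.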
- FIG. 2 is a diagram for explaining the navigation information embedded in the AV stream.
- The interactivity that is characteristic of DVD is not realized solely by the information recorded in the "VIDEO_TS.IFO" and "VTS_01_0.IFO" files described above.
- Several important pieces of information for realizing interactivity are carried in a dedicated carrier called a navigation pack ("Navi Pack" or "NV-PCK") and multiplexed with the video data and audio data in the VOB.
- Buttons appear on a menu screen, and each button defines the processing to be executed when it is selected and activated; one button on the menu is selected at a time.
- A highlight is a translucent image overlaid on the selected button, showing the user which button is currently selected.
- Using the up/down/left/right keys on the remote control, the user moves the highlight to the button corresponding to the process to be executed, and then presses the enter key.
- The command program corresponding to the selected button is then executed. For example, playback of a title or chapter is executed by a command (see, for example, Patent Document 1).
- FIG. 2 shows an outline of control information stored in the NV-PCK.
- The NV-PCK includes highlight color information and individual button information.
- Color palette information is described in the highlight color information.
- The color palette information specifies the translucent color used for the highlight overlay.
- The button information describes rectangular area information (the position of each button), movement information to other buttons (which button becomes the destination when the user presses the up/down/left/right keys on the remote control), and button command information (the command executed when the button is activated).
- the highlight on the menu is created as an overlay image as shown in the upper right part of the center of FIG.
- An overlay image is an image in which a color specified by color palette information is added to a button specified by rectangular area information in button information.
- the overlay image is combined with the background image shown in the right part of Fig. 2 and displayed on the screen.
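The overlay composition described above can be sketched as a simple alpha blend: the palette color is applied at a fixed translucency inside the button's rectangular area, and the result is merged with the background. The function and its parameters are illustrative assumptions, not part of the DVD specification.

```python
def composite_highlight(background, rect, color, alpha):
    """Blend `color` over `background` inside `rect` = (x, y, w, h).

    `background` is a list of rows of (R, G, B) tuples; `alpha` is the
    translucency of the highlight (0.0 = invisible, 1.0 = opaque).
    """
    x, y, w, h = rect
    out = [row[:] for row in background]  # copy so the background is unchanged
    for j in range(y, y + h):
        for i in range(x, x + w):
            out[j][i] = tuple(
                round(alpha * c + (1 - alpha) * b)
                for c, b in zip(color, out[j][i])
            )
    return out

bg = [[(0, 0, 0)] * 4 for _ in range(3)]               # 4x3 black background
result = composite_highlight(bg, (1, 1, 2, 1), (200, 0, 0), 0.5)
print(result[1][1])  # → (100, 0, 0): half-strength red inside the button area
```

Pixels outside the rectangle are left untouched, which matches the description of the highlight affecting only the selected button's area.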
- In this way, a menu is displayed on a DVD.
- the reason why part of the navigation data is embedded in the stream using NV-PCK is to enable the menu information to be updated dynamically in synchronization with the stream. For example, the menu can be displayed only for 5 to 10 minutes while the movie is being played.
- The second reason is that, even for application data for which synchronizing the stream with the menu information is difficult, the stream and the menu information can be displayed in synchronization.
- Another major reason is to improve user operability. For example, by storing information that supports special playback in the NV-PCK, AV data recorded on a DVD can be decoded and reproduced smoothly even during special playback such as fast forward or rewind.
- FIG. 3 is a diagram showing an image of a VOB that is a DVD stream.
- the video, audio, and subtitle data shown in Fig. 3 (A) is packetized and packed based on the MPEG system standard (ISO / IEC13818-1) as shown in Fig. 3 (B).
- As shown in Fig. 3 (C), the packetized data are multiplexed to generate a single MPEG program stream.
- The NV-PCK, which includes button commands for realizing interactivity, is also multiplexed together with these packets and packs.
- A characteristic of multiplexing in the MPEG system is that, while each individual stream to be multiplexed forms a bit string in its own decoding order, the multiplexed streams are not necessarily arranged relative to one another in playback order, in other words, in decoding order.
- This is because the decoder model of the MPEG system stream (commonly called the System Target Decoder, or STD; see Fig. 3 (D)) has a decoder buffer for each elementary stream, in which data is stored temporarily until its decoding time.
- The size of the decoder buffer specified by DVD-Video differs for each elementary stream.
- the buffer size for video is 232 KB
- the buffer size for audio is 4 KB
- the buffer size for subtitles is 52 KB.
- subtitle data multiplexed side by side with video data is not necessarily decoded or reproduced at the same timing as video data.
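The per-elementary-stream buffering of the System Target Decoder can be sketched as follows, using the DVD-Video buffer sizes quoted above. The class is an illustrative model of fill-and-drain behavior only, not an implementation of the MPEG-2 STD timing rules.

```python
# DVD-Video decoder buffer sizes per elementary stream, as quoted in the text.
BUFFER_SIZE = {"video": 232 * 1024, "audio": 4 * 1024, "subtitle": 52 * 1024}

class STDBuffer:
    """Toy model of one elementary-stream decoder buffer in the STD."""

    def __init__(self, stream: str):
        self.capacity = BUFFER_SIZE[stream]
        self.fill = 0

    def push(self, nbytes: int) -> None:
        """Multiplexed data arrives and waits here until its decode time."""
        if self.fill + nbytes > self.capacity:
            raise OverflowError("decoder buffer overflow")
        self.fill += nbytes

    def decode(self, nbytes: int) -> None:
        """At decode time, data is removed from the buffer."""
        self.fill -= nbytes

video = STDBuffer("video")
video.push(100 * 1024)   # data may sit buffered well before it is decoded
video.push(100 * 1024)
video.decode(150 * 1024)
print(video.fill // 1024)  # → 50
```

Because each stream drains from its own buffer at its own timing, subtitle data multiplexed alongside video need not be decoded at the same moment as that video, which is the point made above.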
- Patent Document 1 Japanese Patent Laid-Open No. 9-282848
- There are four audio codec standards adopted by DVD: "Dolby Digital (AC-3)", "MPEG Audio", "LPCM", and "dts". "dts" is a player option function, so some DVD players have a built-in dts decoder and some do not. Furthermore, among dts-compatible players, some have a function for outputting dts digital data to an AV amplifier and some do not.
- DVD players with a dts digital data output function output the data based on the digital I/F standard called the Sony/Philips Digital Interconnect Format (SPDIF; the consumer standard stipulated in IEC 60958-3).
- However, SPDIF can only support a bandwidth of up to 1.5 Mbps, so it cannot carry "dts++" (lossless compression), an extended codec standard of "dts" that requires a bandwidth of nearly 20 Mbps. Therefore, even if the next-generation HD DVD standard (the BD standard) supports "dts++", it is not possible to output a dts++ stream to an AV amplifier compliant with the SPDIF standard.
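The bandwidth mismatch can be checked with simple arithmetic, using the figures quoted above (up to 1.5 Mbps for SPDIF, nearly 20 Mbps for "dts++"). The function below is an illustrative sketch, not part of any standard.

```python
# Maximum compressed-data bandwidth of consumer SPDIF, per the figure in the text.
SPDIF_MAX_BPS = 1_500_000

def fits_spdif(stream_bps: int) -> bool:
    """Check whether a compressed audio stream fits within SPDIF bandwidth."""
    return stream_bps <= SPDIF_MAX_BPS

print(fits_spdif(1_400_000))   # core dts stream within 1.5 Mbps → True
print(fits_spdif(20_000_000))  # dts++ lossless at ~20 Mbps → False
```

A core dts stream fits, while a dts++ lossless stream exceeds the limit by more than an order of magnitude; this is exactly why only the core portion can be sent to a legacy SPDIF amplifier.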
- It is an object of the present invention to provide an information recording medium on which access units, each including basic data and next-generation extension data, are recorded in such a form that even a decoder that decodes only the basic data can process them. Another object of the present invention is to provide a data reproduction device for processing the access units of the information recording medium of the present invention.
- An information recording medium of the present invention is an information recording medium on which a stream including at least one of video and audio is recorded, the stream comprising a plurality of access units.
- Each of the access units has a first packet including basic data and a second packet including extension data associated with the basic data, and the basic data is complete in itself without requiring the extension data.
- the extension data is data for improving the quality of data generated from the basic data
- The stream also has information indicating attributes of the first packet and the second packet.
- The access unit may be data related to audio, in which case the attribute specifies at least one of the channel count, the sampling frequency, the bit rate, and the presence or absence of 2-channel downmix data.
- each access unit is data for a certain period of time.
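The base-plus-extension access unit described above can be sketched as follows: a legacy decoder keeps only the first (basic) packets, while a next-generation decoder uses both. The packet representation and field names are hypothetical, chosen for illustration rather than taken from the actual format.

```python
def select_packets(access_unit, legacy_decoder=True):
    """Keep only base packets for a legacy decoder; keep all packets otherwise.

    `access_unit` is a list of packet dicts with a hypothetical "kind" field
    that is either "base" (basic data) or "extension" (extension data).
    """
    if legacy_decoder:
        return [p for p in access_unit if p["kind"] == "base"]
    return list(access_unit)

unit = [
    {"kind": "base", "payload": b"core audio frame"},
    {"kind": "extension", "payload": b"quality-enhancement residual"},
]
print([p["kind"] for p in select_packets(unit)])         # → ['base']
print([p["kind"] for p in select_packets(unit, False)])  # → ['base', 'extension']
```

Because the basic data is complete on its own, the legacy path still yields a decodable frame; the extension packet only improves the quality of what the basic data produces.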
- A data reproduction device of the present invention comprises acquisition means for acquiring the access units and the information recorded on the information recording medium of the present invention, and reproduction means for reproducing the access units using the information.
- The present invention can also be realized as a data reproduction method having, as its steps, the characteristic constituent means of the data reproduction device of the present invention, or as a program that causes a computer to execute these steps.
- the program can also be distributed via a recording medium such as a CD-ROM or a transmission medium such as a communication network.
- As described above, the present invention can provide an information recording medium on which access units, each including basic data and next-generation extension data, are recorded in such a form that a decoder that decodes only the basic data can still process them. The present invention can also provide a data reproduction device that processes the access units of the information recording medium of the present invention.
- Furthermore, even when audio data using a new audio codec exceeding the bandwidth of an existing digital I/F is recorded on the recording medium, the audio data can be extracted and output through the existing digital I/F as usual.
- FIG. 1 is a diagram showing the structure of a DVD.
- FIG. 2 is a diagram for explaining navigation information.
- FIG. 3 (A) is a diagram showing data such as video, audio, and captions.
- Fig. 3 (B) is a diagram showing how each data is packetized and packed.
- FIG. 3 (C) is a diagram showing packet data and packed data.
- Fig. 3 (D) shows the MPEG system stream decoder model.
- FIG. 4 is a diagram showing a configuration of a next-generation DVD.
- FIG. 5 is a diagram showing the structure of logical data directories and files recorded on a BD disc.
- FIG. 6 is a block diagram showing functions of the player.
- FIG. 7 is a detailed block diagram of the player configuration.
- FIG. 8 is a diagram showing a BD application space.
- FIG. 9 is a configuration diagram of an MPEG stream (VOB).
- FIG. 10 is a diagram showing the configuration of a pack.
- FIG. 11 is a diagram for explaining interleaved recording.
- FIG. 12 is a diagram for explaining a VOB data continuous supply model.
- FIG. 13 is a diagram showing an internal structure of a VOB management information file.
- FIG. 14 is a diagram for explaining details of VOBU information.
- FIG. 15 is a diagram for explaining the details of the time map.
- FIG. 16 is a diagram showing an internal structure of playlist information.
- FIG. 17 is a diagram showing an event handler table.
- FIG. 18 is a diagram showing an internal structure of information relating to the entire BD disc.
- FIG. 19 is a diagram showing a program table of global event handlers.
- FIG. 20 is a diagram showing an example of a time event.
- FIG. 21 is a diagram showing an example of a user event.
- FIG. 22 is a diagram showing an example of a global event.
- FIG. 23 is a diagram for explaining the function of the program processor.
- FIG. 24 is a diagram showing a list of system parameters.
- FIG. 25 is a diagram showing an example of a menu program having two selection buttons.
- FIG. 26 is a diagram illustrating an example of an event handler for a user event.
- FIG. 27 is a diagram showing a basic processing flow until AV playback.
- FIG. 28 is a diagram showing the processing flow from the start of PL (playlist) playback to the start of VOB playback.
- FIG. 29 is a diagram showing an event processing flow after the start of AV playback.
- FIG. 30 is a diagram showing a flow of subtitle processing.
- FIG. 31 is a diagram showing a structure of an access unit having no hierarchical structure.
- FIG. 32 is a diagram showing the structure of an access unit having two hierarchical structures.
- FIG. 33 is a diagram showing the structure of an access unit having three hierarchical structures.
- FIG. 34 is a diagram for explaining how the operation of a stream reading/supply unit, which outputs data having a hierarchical structure to decoders corresponding to the various hierarchies, differs for each data output destination.
- FIG. 35 is a diagram showing the structure of an ideal access unit for introducing Level2-EXT once devices compatible with Base and Level1-EXT are in widespread use.
- FIG. 36 is a diagram showing a data structure of a data stream including Level 2.
- FIG. 37 is a diagram for explaining the processing of Base/Level1-EXT access units, which can be decoded by existing players and decoders, and Level2-EXT, which cannot be decoded by existing players and decoders.
- FIG. 38 is a diagram showing a method of storing access units having a hierarchical structure in MPEG2-TS.
- FIG. 39 is a diagram illustrating an example of items described in the descriptor.
- FIG. 40 (A) is a diagram showing a 5.1 channel speaker layout
- FIG. 40 (B) is a diagram showing a 7.1 channel speaker layout.
- FIG. 41 is a diagram showing a channel structure.
- FIG. 42 is a diagram showing the MPEG2-TS file format when recording on an optical disc.
- FIG. 43 is a diagram for explaining the details of DTS defined by DVD-Video.
- FIG. 44 is a flowchart showing processing of a demultiplexer and a stream reading Z supply unit.
- FIG. 45 is a block diagram of the input time management device 2000 and the decoder model 3000.
- FIG. 46 is a diagram for explaining a method of multiplexing Base and Level2 so that neither the decoder model that decodes only Base nor the decoder model that decodes Base and Level2 breaks down.
- FIG. 4 is a diagram showing a configuration of a next-generation DVD (hereinafter referred to as “BD”), particularly a configuration of a BD disc 104 as a disc medium and data 101, 102, and 103 recorded on the disc 104.
- the BD disc 104 is recorded with AV data 103, BD management information 102 including AV data management information and an AV playback sequence, and a BD playback program 101 for realizing interactivity.
- AV application data for reproducing AV content of a movie is recorded on the BD disc 104.
- the BD disc 104 may be used for other purposes.
- FIG. 5 is a diagram showing the structure of directories and files of logical data recorded on the BD disc.
- a BD disc has a recording area that spirals from the inner periphery to the outer periphery, and has a lead-in portion on the inner periphery and a lead-out portion on the outer periphery.
- A BD disc has a special area inside the lead-in, called the Burst Cutting Area (BCA), from which data can be read only by the drive. Since this area cannot be read from application data, it may be used, for example, for copyright protection technology.
- File system information (volume) is recorded at the head of the recording area.
- the BDVIDEO directory is placed directly under the root directory (ROOT).
- The BDVIDEO directory is a directory in which data such as AV content and management information recorded on the BD (data 101, 102, and 103 in FIG. 4) is stored.
- The “BD. INFO” file is one of the “BD management information” files, and records information relating to the entire BD disc. The BD player reads this file first.
- the “BD. PROG” file is one of “BD playback programs”, and is a file in which playback control information regarding the entire BD disc is recorded.
- the “XXX. PL” file is one of “BD management information”, and is a file in which playlist information that is a scenario (playback sequence) is recorded. There is one file for each playlist.
- the “XXX. PROG” file is one of “BD playback programs”, and is a file in which the playback control information for each playlist described above is recorded.
- the playlist corresponding to the “XXX. PROG” file is a playlist that matches the file body name (“XXX”).
- VOB file (“YYY” is variable, extension "VOB” is fixed)
- the “YYY. VOB” file is one of “AV data”, and is a file in which a VOB (same as the VOB described in “Background Technology”) is recorded. There is one file for each VOB.
- the “YYY. VOBI” file is one of “BD management information”, and is a file in which stream management information related to VOB as AV data is recorded.
- the VOB corresponding to the “YYY. VOBI” file is the VOB that matches the file body name (“YYY”).
- The “ZZZ. PNG” file is one of the “AV data” files, and records image data in PNG format (an image format standardized by the W3C, pronounced “ping”) for subtitles and menus. There is one file for each PNG image.
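As a rough illustration, the file naming rules above can be captured in a short sketch. This is only an aid to reading, assuming Python; the function name and return strings are hypothetical, while the name patterns follow the file descriptions in this section ("XXX"/"YYY"/"ZZZ" parts are variable, extensions are fixed):

```python
import re

def classify_bd_file(name: str) -> str:
    # The two fixed-name files for the entire disc are checked first,
    # so that "BD.PROG" is not caught by the generic ".PROG" rule below.
    if name == "BD.INFO":
        return "BD management information (entire disc)"
    if name == "BD.PROG":
        return "BD playback program (entire disc)"
    if re.fullmatch(r".+\.PL", name):
        return "playlist information"
    if re.fullmatch(r".+\.PROG", name):
        return "playback control information for a playlist"
    if re.fullmatch(r".+\.VOBI", name):      # checked before ".VOB"
        return "stream management information for a VOB"
    if re.fullmatch(r".+\.VOB", name):
        return "AV data (VOB)"
    if re.fullmatch(r".+\.PNG", name):
        return "image data for subtitles/menus"
    return "unknown"
```

Because `re.fullmatch` requires the whole name to match, "YYY.VOBI" is never misclassified by the ".VOB" rule even though ".VOB" is a prefix of ".VOBI".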
- Next, a player that plays the above-described BD disc will be described with reference to FIGS. 6 and 7.
- FIG. 6 is a block diagram showing the general functions of the player.
- the data on the BD disc 201 is read through the optical pickup 202.
- the read data is transferred to a dedicated memory according to the data type.
- the BD playback program (“BD. PROG” file or “XXX. PROG” file) is transferred to the program recording memory 203.
- the BD management information (“BD. INFO” file, “XXX. PL” file, or “YYY. VOBI” file) is transferred to the management information recording memory 204.
- AV data (“YYY. VOB” file or “ZZZ. PNG” file) is transferred to the AV recording memory 205.
- the BD playback program recorded in the program recording memory 203 is processed by the program processing unit 206.
- the BD management information recorded in the management information recording memory 204 is processed by the management information processing unit 207.
- AV data recorded in the AV recording memory 205 is processed by the presentation processing unit 208.
- the program processing unit 206 receives event information such as the execution timing of the program information to be reproduced from the management information processing unit 207, and executes the program.
- the program can dynamically change the playlist to be played.
- When the program processing unit 206 sends a playlist playback command to the management information processing unit 207, the playlist to be played is dynamically changed.
- the program processing unit 206 receives a request from the user, that is, a request from the remote control key, and executes a program corresponding to the event (request), if any.
- Upon receiving an instruction from the program processing unit 206, the management information processing unit 207 analyzes the management information of the corresponding playlist and of the VOB corresponding to that playlist, and instructs the presentation processing unit 208 to play the target AV data. The management information processing unit 207 also receives reference time information from the presentation processing unit 208, instructs the presentation processing unit 208 to stop AV data playback based on that time information, and generates events for the program processing unit 206 indicating the program execution timing.
- The presentation processing unit 208 has decoders for video, audio, and subtitle/images (still images), and decodes and outputs AV data in accordance with instructions from the management information processing unit 207. Decoded video data and subtitle/images are drawn on their dedicated planes, the video plane 210 and the image plane 209, respectively. The images drawn on the video plane 210 and the image plane 209 are combined by the synthesis processing unit 211 and output to a display device such as a TV.
- the BD player has components corresponding to the structure of data recorded on the BD disc shown in FIG.
- FIG. 7 is a detailed block diagram of the configuration of the player described above.
- the AV recording memory 205 is expressed as an image memory 308 and a track buffer 309.
- the program processing unit 206 is expressed as a program processor 302 and a UOP manager 303.
- the management information processing unit 207 is expressed as a scenario processor 305 and a presentation controller 306.
- the presentation processing unit 208 is expressed as a clock 307, a demultiplexer 310, an image processor 311, a video processor 312, and a sound processor 313.
- the VOB data (MPEG stream) read from the BD disc 201 is recorded in the track buffer 309, and the image data (PNG) is recorded in the image memory 308.
- the demultiplexer 310 extracts the VOB data recorded in the track buffer 309 based on the time of the clock 307, sends video data to the video processor 312, and sends audio data to the sound processor 313.
- Each of the video processor 312 and the sound processor 313 is composed of a decoder buffer and a decoder as defined by the MPEG system standard. In other words, the video and audio data sent from the demultiplexer 310 are temporarily recorded in the respective decoder buffers and decoded by the corresponding decoder according to the time of the clock 307.
- PNG recorded in the image memory 308 is processed by the following two processing methods.
- If the image data is for subtitles, the presentation controller 306 instructs the decoding timing.
- The scenario processor 305 receives time information from the clock 307 and, so that subtitles are displayed appropriately, instructs the presentation controller 306 to display a subtitle when its display start time is reached. Similarly, based on the time information from the clock 307, the scenario processor 305 instructs the presentation controller 306 to stop displaying the subtitle when its display end time is reached.
- the image processor 311 extracts the corresponding PNG data from the image memory 308, decodes it, and draws it on the image plane 314.
- If the image data is for a menu, the decode timing is instructed by the program processor 302.
- The timing at which the program processor 302 instructs image decoding depends on the BD program being processed by the program processor 302, and is generally not predetermined.
- the image data and the video data are each decoded, drawn on the image plane 314 or the video plane 315, synthesized by the synthesis processing unit 316, and then output.
- Management information (scenario information and AV management information) read from the BD disc 201 is stored in the management information recording memory 304. Thereafter, the scenario information (“BD. INFO” file and “XXX. PL” file) is read by the scenario processor 305. AV management information (“YYY. VOBI” file) is read by the presentation controller 306.
- the scenario processor 305 analyzes the information of the playlist and notifies the presentation controller 306 of the VOB referenced by the playlist and its playback position.
- the presentation controller 306 analyzes the management information (“YYY. VOBI” file) of the target VOB and instructs the drive controller 317 to read out the target VOB.
- the drive controller 317 moves the optical pickup and reads out target AV data.
- the read AV data is stored in the image memory 308 or the track buffer 309 as described above.
- the scenario processor 305 monitors the time of the clock 307 and outputs an event to the program processor 302 at the timing set in the management information.
- the BD program (“BD. PROG” file or “XXX. PROG” file) recorded in the program recording memory 301 is processed by the program processor 302.
- the program processor 302 is activated when an event is sent from the scenario processor 305.
- The program processor 302 also processes the BD program when an event is sent from the UOP manager 303.
- The UOP manager 303 generates an event for the program processor 302 when a request is sent from the user via a remote control key.
- FIG. 8 shows a BD application space.
- a play list is one playback unit.
- a playlist has a static scenario that is a concatenation of cells and is a reproduction sequence determined by the order of concatenation, and a dynamic scenario described by a program. Unless there is a dynamic scenario change by the program, the playlist will play back each cell in turn. Playback of the playlist ends when playback of all cells has been completed.
- the program can describe contents that change the playback order of cells. Also, the program can dynamically change the playback target according to the user's selection or the player's state.
- a typical example is a menu. In BD, the menu can be defined as a scenario to be played back by user selection, and the playlist can be changed dynamically by the program.
- the program referred to here is an event handler executed by a time event or a user event.
- a time event is an event generated based on time information embedded in a playlist.
- the event sent from the scenario processor 305 to the program processor 302 described with reference to FIG. 7 is an example of a time event.
- When a time event occurs, the program processor 302 executes the event handler associated with the event's identifier (ID).
- the program to be executed can instruct playback of another playlist. For example, the program stops playing the currently playing playlist and causes the specified playlist to play.
- the user event is an event generated by a user's remote control key operation. There are two main types of user events.
- the first is a menu selection event generated by operating the cursor keys (“UP”, “DOWN”, “LEFT”, “RIGHT” keys) or “ENTER” key.
- The event handler corresponding to a menu selection event is valid only for a limited period in the playlist (the validity period of each event handler is set as playlist information).
- When a menu selection event occurs, a valid event handler is searched for; if there is a valid event handler, that event handler is executed. If there is no valid event handler, the menu selection event is ignored.
- the second user event is a menu call event generated by operating the “menu” key.
- When a menu call event occurs, the global event handler is called.
- Global event handlers are always valid event handlers, independent of playlists. By using this function, it is possible to implement DVD menu calls (such as a function that calls audio or subtitles during title playback and plays a title that was interrupted after changing the audio or subtitles).
- A cell, which is the unit constituting a static scenario in a playlist, indicates a playback section of all or part of a VOB (MPEG stream).
- the cell has the playback section in the VOB as information on the start time and end time.
- The VOB management information (VOBI) paired with each VOB has a time map (Time Map, or TM), which is table information of the recording addresses corresponding to data playback times.
- FIG. 9 is a configuration diagram of an MPEG stream (VOB) in the present embodiment.
- a VOB is composed of a plurality of video object units (VOBU).
- a VOBU is a playback unit in a multiplex stream including audio data on the basis of Group Of Pictures (GOP) in an MPEG video stream.
- The video playback time of a VOBU is 0.4 to 1.0 seconds, usually about 0.5 seconds. In other words, in most cases one GOP is stored in one VOBU (in the case of NTSC).
- A VOBU contains video packs (V_PCK) and audio packs (A_PCK). The size of each pack is one sector, which in this embodiment is 2 KB.
- FIG. 10 is a diagram showing the configuration of the pack.
- As shown in FIG. 10, elementary data such as video data and audio data are stored sequentially from the top in the data storage area of a PES packet, called the PES Packet Payload. A PES Packet Header is attached to the payload to form one PES packet.
- The packet header records a stream id (ID) identifying which stream's data is stored in the payload, and time stamps that are time information for decoding and displaying the payload, namely the Decoding Time Stamp (DTS) and the Presentation Time Stamp (PTS).
- Rules for recording these time stamps are defined by MPEG, and they are not necessarily recorded in every packet header. The details of the rules are described in the MPEG system standard (ISO/IEC 13818-1), so the explanation is omitted here.
- A Pack Header is further attached to the packet to form a pack.
- The pack header records a time stamp indicating when the pack passes through the demultiplexer and is input to the decoder buffer of each elementary stream, namely the System Clock Reference (SCR).
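The nesting just described (elementary data in a payload, a PES packet carrying stream id and DTS/PTS, and a pack carrying the SCR) can be modeled as plain records. This is only an illustrative sketch in Python of the fields named in the text, not a bit-accurate parser; the class and field names are assumptions:

```python
from dataclasses import dataclass
from typing import Optional

@dataclass
class PESPacket:
    stream_id: int             # identifies which stream's data is in the payload
    payload: bytes             # elementary data (video/audio), stored from the top
    pts: Optional[int] = None  # Presentation Time Stamp (display time)
    dts: Optional[int] = None  # Decoding Time Stamp; per MPEG rules, not
                               # necessarily recorded in every packet header

@dataclass
class Pack:
    scr: int                   # System Clock Reference: when the pack enters the
                               # decoder buffer of its elementary stream
    packet: PESPacket          # the PES packet carried by this pack
                               # (pack size = one sector = 2 KB in this embodiment)
```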
- The upper part of FIG. 11 is a part of the player configuration diagram described above.
- The VOB on the BD disc, that is, the MPEG stream, is input to the track buffer through the optical pickup, and the PNG image data on the BD disc is input to the image memory through the optical pickup.
- the track buffer is a FIFO buffer, and the VOB data input to the track buffer is sent to the demultiplexer in the input order. At this time, each pack is extracted from the track buffer according to the SCR described above and sent to the video processor or sound processor via the demultiplexer.
- The presentation controller indicates which image to draw. If the image data used for drawing is subtitle image data, it is deleted from the image memory as soon as it is used; if it is menu image data, it remains in the image memory while the menu is being drawn.
- This is because menu drawing depends on the user's operations: when part of the menu is redisplayed or replaced with a different image in response to a user operation, keeping the image data in memory makes it easy to decode the image data for the redisplayed part.
- In ROM media such as CD-ROM and DVD-ROM, AV data forming a series of continuous playback units is generally recorded continuously. As long as the data is recorded continuously, the drive simply reads it sequentially and sends it to the decoder. However, when continuous data is divided and arranged discretely on the disc, the drive must seek between the continuous sections, and since data reading stops during the seek, the data supply may stop. To prevent this, AV data forming a series of continuous playback units is recorded continuously in ROM media, and even on a BD it is desirable to record VOB files in a continuous area. Data that is played back in synchronization with the video recorded in a VOB, such as subtitle data, must likewise be read from the BD disc in some way, as VOB files are.
- a method is used in which the VOB file is divided into several blocks and interleaved with image data.
- the lower part of Fig. 11 is a diagram for explaining the interleaved recording.
- With this interleaved recording, the image data can be stored in the image memory at the necessary timing without using the large-capacity temporary recording memory described above. However, while image data is being read, the reading of VOB data naturally stops.
- FIG. 12 is a diagram for explaining a VOB data continuous supply model using a track buffer.
- VOB data is stored in the track buffer. If a difference (Va > Vb) is provided between the data input rate (Va) to the track buffer and the data output rate (Vb) from the track buffer, the amount of data stored in the track buffer keeps increasing as long as data is read continuously from the BD disc.
- the lower part of FIG. 12 shows the transition of the data amount in the track buffer.
- the horizontal axis shows time, and the vertical axis shows the amount of data stored in the track buffer.
- The time “t1” indicates the time when reading of the data at logical address “a1”, the start point of a continuous recording area of the VOB, is started.
- data is accumulated in the track buffer at the rate (Va-Vb).
- the rate is the difference between the rate of data input to the track buffer and the rate of data output from the track buffer.
- Time “t2” is the time when the data at logical address “a2”, the end point of the continuous recording area, is read. That is, during the period from “t1” to “t2”, the data amount in the track buffer increases at the rate (Va − Vb).
- The data accumulation amount B(t2) at time “t2” can be obtained by the following Equation 1: B(t2) = (Va − Vb) × (t2 − t1).
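Since data accumulates at the constant rate (Va − Vb) over the interval from “t1” to “t2”, the accumulation amount is simply the rate multiplied by the elapsed time. A minimal numeric sketch in Python (the rates and times in the example are made-up values, not figures from the text):

```python
def track_buffer_amount(va: float, vb: float, t1: float, t2: float) -> float:
    """Data accumulated in the track buffer after reading continuously
    from time t1 to t2, with input rate va and output rate vb (va > vb).
    This is Equation 1: B(t2) = (Va - Vb) * (t2 - t1)."""
    return (va - vb) * (t2 - t1)

# Example: 54 Mbps into the buffer, 48 Mbps out, reading for 2 seconds
# leaves 12 megabits accumulated in the track buffer.
```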
- BD navigation data (BD management information)
- FIG. 13 shows the internal structure of the VOB management information file (“YYY. VOBI”).
- the VOB management information includes VOB stream attribute information (Attribute) and a time map (TMA P).
- The stream attributes include a video attribute (Video) and audio attributes (Audio#0 to Audio#m). Particularly for audio streams, since a VOB can have multiple audio streams at the same time, a data field (Number) indicates the number of audio streams.
- the following shows a plurality of fields of the video attribute (Video) and the values that each field can have.
- the time map (TMAP) is a table having information for each VOBU, and has the number of VOBUs (Number) of the VOB and each VOBU information (VOBU #l to VOBU #n). Each VOBU information has a VOBU playback time length (Duration) and a VOBU data size (Size).
- FIG. 14 is a diagram for explaining the details of the VOBU information.
- an MPEG video stream may be compressed at a variable bit rate, and there is no simple correlation between the playback time of each frame and the data size.
- AC3, an audio compression standard, compresses audio data at a fixed bit rate. Therefore, for audio data, the relationship between time and address can be expressed by a linear expression.
- For MPEG video data, each frame has a fixed display time; for example, in NTSC each frame has a display time of 1/29.97 seconds. However, the compressed data size of each frame varies greatly with the picture characteristics and the picture type, that is, whether it is an I/P/B picture. Therefore, for MPEG video data, the relationship between time and address cannot be expressed by a linear expression.
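For a fixed-bit-rate stream such as AC3, the time-to-address relationship is a single linear expression, which is why no time map is needed for audio. A sketch in Python (the bit rate in the example is an illustrative value, not one mandated by the text):

```python
def audio_address(bitrate_bps: int, time_sec: float, start_addr: int = 0) -> int:
    """Byte address of the audio data for a given playback time.
    Valid only because the bit rate is fixed (as with AC3):
    address grows linearly with time at bitrate/8 bytes per second."""
    return start_addr + int(bitrate_bps / 8 * time_sec)
```

For variable-bit-rate MPEG video, no such formula exists, which is exactly what the time map (TMAP) described next compensates for.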
- TMAP time map
- First, the VOBU to which the given time belongs is searched for. That is, the frame counts of the VOBUs in the time map are added up in order, and the VOBU at which the cumulative frame count first reaches or exceeds the frame count obtained by converting the given time into a number of frames is the VOBU to which that time belongs. Next, the data sizes of the VOBUs in the time map are added up, up to the VOBU immediately before that VOBU, and the resulting value gives the address of the pack to be read in order to reproduce the frame including the given time.
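The two-step search just described — find the VOBU by accumulating frame counts, then accumulate the data sizes of the preceding VOBUs to obtain the read address — can be sketched as follows (Python; the VOBU list in the example is an invented illustration, with frames counted from 1):

```python
def find_vobu_address(vobus, target_frame):
    """vobus: list of (frame_count, size_bytes) per VOBU, in playback order.
    Returns (vobu_index, start_address) for the VOBU containing target_frame."""
    frames = 0
    for i, (frame_count, _) in enumerate(vobus):
        frames += frame_count
        if frames >= target_frame:
            # cumulative frame count has reached the target: this is the VOBU;
            # its address is the sum of the sizes of all preceding VOBUs
            address = sum(size for _, size in vobus[:i])
            return i, address
    raise ValueError("time beyond end of VOB")

# e.g. with 15-frame VOBUs, frame 20 falls in the second VOBU (index 1),
# whose data starts right after the first VOBU's bytes.
```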
- the playlist information includes a cell list (CellList) and an event list (EventList).
- the cell list (CellList) is a reproduction cell sequence in the playlist, and the cells are reproduced in the description order of the cell list.
- The cell list (CellList) is composed of the number of cells (Number) and individual cell information (Cell#1 to Cell#n).
- Cell information has a VOB file name (VOBName), a start time (In) and end time (Out) in the VOB, and a subtitle table (SubtitleTable).
- Start time (In) and end time (Out) are each expressed by the frame number in the VOB.
- The start time and end time are converted into playback addresses by using the time map (TMAP) described above.
- the subtitle table is a table having information on subtitles to be reproduced in synchronization with the VOB.
- A VOB can have subtitles in multiple languages, as it can audio, and the subtitle table (SubtitleTable) is composed of the number of languages (Number) followed by a table for each language (Language#1 to Language#k).
- The table for each language (Language#) is composed of the language information (Lang), the number of subtitle entries to be displayed (Number), and the individual subtitle information to be displayed (Speech#1 to Speech#j).
- The subtitle information (Speech#) is composed of the file name (Name) of the corresponding image data, the subtitle display start time (In), the subtitle display end time (Out), and the subtitle display position (Position).
- the event list (EventList) is a table that defines events that occur in the playlist.
- the event list is composed of the number of events (Number) followed by individual events (Event # 1 to Event #m).
- Each event (Event#) is composed of the event type (Type), the event identifier (ID), the event generation time (Time), and the event duration (Duration).
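The playlist structures just described (a cell list played in description order, plus an event list) map directly onto simple records. A sketch of the data model in Python; the field names follow the labels in the text, while the class layout itself is an assumption:

```python
from dataclasses import dataclass, field
from typing import List

@dataclass
class Cell:
    vob_name: str   # VOBName: VOB file name
    in_frame: int   # In: start time, as a frame number in the VOB
    out_frame: int  # Out: end time, as a frame number in the VOB

@dataclass
class Event:
    type: str       # Type: e.g. "TimeEvent" or "UserEvent"
    id: str         # ID: event identifier, paired with an event handler
    time: int       # Time: event generation time
    duration: int   # Duration: period during which the event is valid

@dataclass
class PlayList:
    cells: List[Cell] = field(default_factory=list)   # played in description order
    events: List[Event] = field(default_factory=list) # events defined in this playlist
```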
- FIG. 17 shows the event handler table (“XXX. PROG”), which holds the event handlers (for time events and for user events for menu selection) of an individual playlist.
- The event handler table contains the number of defined event handlers/programs (Number) and the individual event handlers/programs (Program#1 to Program#n).
- Each event handler/program (Program#) contains an event handler start definition (<event_handler> tag) and an event handler identifier (ID) that is paired with the event identifier described above; after these, the program is described between braces “{” and “}” following “function”. The events (Event#1 to Event#m) stored in the event list (EventList) of “XXX. PL” above are identified using the event handler identifiers (ID) of “XXX. PROG”.
- The information on the entire BD disc is composed of a title list (TitleList) and an event table (EventList) for global events.
- The title list (TitleList) is composed of the number of titles (Number) on the disc, followed by the individual title information (Title#1 to Title#n).
- Each title information (Title #) includes a playlist table (PLTable) included in the title and a chapter list (ChapterList) in the title.
- the playlist table (PLTable) includes the number of playlists in the title (Number) and the playlist name (Name), that is, the playlist file name.
- The chapter list (ChapterList) is composed of the number of chapters (Number) included in the title and the individual chapter information (Chapter#1 to Chapter#n).
- Each chapter information (Chapter #) has a table (CellTable) of the cell containing the chapter.
- The cell table (CellTable) is composed of the number of cells (Number) and the entry information of each cell (CellEntry#1 to CellEntry#k).
- The cell entry information (CellEntry#) is composed of the name of the playlist including the cell and the cell number within the playlist.
- the event list includes the number of global events (Number) and information on individual global events. It should be noted here that the first defined global event is called the first event (FirstEvent), and is called first when a BD disc is inserted into the player.
- The event information for a global event has only an event type (Type) and an event identifier (ID).
- FIG. 19 is a diagram showing the program table (“BD. PROG”) of global event handlers.
- the event generation mechanism will be described with reference to FIGS.
- FIG. 20 shows an example of a time event.
- A time event is defined in the event list (EventList) of the playlist information (“XXX. PL”). When an event is defined as a time event, that is, when its event type (Type) is “TimeEvent”, a time event with the identifier “Ex1” is output from the scenario processor to the program processor at the event generation time (“t1”).
- The program processor searches for the event handler with the event identifier “Ex1” and executes the target event handler. For example, in the present embodiment, processing such as drawing two button images is performed.
- FIG. 21 is a diagram showing an example of a user event for performing a menu operation.
- A user event for performing a menu operation is likewise defined in the event list (EventList) of the playlist information (“XXX. PL”). When an event is defined as a user event, that is, when its event type (Type) is “UserEvent”, the user event becomes ready at the event generation time (“t1”). At this point the event itself has not yet been generated; the event remains in the ready state for the period indicated by its duration information (Duration).
- When the user presses a remote control key while the event is ready, a UOP event is first generated by the UOP manager and output to the program processor.
- the program processor outputs a UOP event to the scenario processor.
- the scenario processor checks whether there is a valid user event at the time the UOP event is received. If there is a valid user event, it generates a user event and outputs it to the program processor.
- the program processor searches for an event handler with the event identifier “Evl” and executes the target event handler. For example, in the present embodiment, playback of playlist # 2 is started.
- the generated user event does not include information specifying which remote control key is pressed by the user.
- Information on the selected remote control key is transmitted to the program processor by the UOP event and is recorded and held in the register SPRM(8) of the virtual player. The event handler program can branch by examining the value of this register.
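The behavior described here — a user event that is only "ready" during its Duration window, with the pressed key held in SPRM(8) for the handler to branch on — can be sketched like this (Python; the key codes, event ID, and dictionary standing in for the registers are invented for illustration):

```python
def handle_uop(key_code, now, event_time, duration, sprm):
    """Process a remote-control key press (a UOP event).
    Returns the generated user event ID, or None if no user event is ready.
    sprm: dict standing in for the virtual player's SPRM registers."""
    sprm[8] = key_code                 # the pressed key is recorded in SPRM(8)
    if event_time <= now < event_time + duration:
        return "Ev1"                   # a ready user event exists: generate it
    return None                        # outside the ready window: press is ignored
```

The key press always updates SPRM(8), but a user event is only generated inside the ready window; outside it, the press has no effect, matching the Duration semantics above.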
- FIG. 22 is a diagram showing an example of a global event.
- A global event is defined in the event list (EventList) of the information on the entire BD disc (“BD. INFO”). When an event is defined as a global event, that is, when its event type (Type) is “GlobalEvent”, the event is generated only when the user operates a remote control key.
- a UOP event is first generated by the UOP manager and output to the program processor.
- the program processor outputs a UOP event to the scenario processor, and the scenario processor generates a global event corresponding to the UOP event and sends it to the program processor.
- the program processor searches for an event handler with the event identifier “menu” and executes the target event handler. For example, in the present embodiment, playback of playlist # 3 is started.
- the program processor is a processing module having a virtual player machine inside.
- the virtual player machine has a function corresponding to BD and does not depend on the implementation of the BD player. In other words, the virtual player machine is guaranteed to be able to realize the same function in any BD player.
- the virtual player machine has a programming function and a player variable (register).
- As the programming function, the following functions are defined as BD-specific functions based on Java (registered trademark) Script.
- Link function: stops the current playback and starts playback from the specified playlist, cell, or time.
- PNG drawing function: draws the specified PNG data on the image plane.
- Image plane clear function: clears the specified area of the image plane.
- SPRM: system parameters
- GPRM: general parameters
- FIG. 24 shows a list of system parameters (SPRM).
- SPRM(8): selection key information
- Although the programming function of the virtual player described here is based on Java (registered trademark) Script, it may instead be defined by B-Shell, Perl Script, or the like used in the UNIX (registered trademark) OS. In other words, the programming function is not limited to being defined by Java Script.
- FIG. 25 and FIG. 26 are diagrams showing examples of programs in event handlers.
- FIG. 25 is a diagram showing an example of a menu program having two selection buttons.
- Button 1 is formed by drawing the PNG image "1black.png" in the area starting at the coordinates (10, 200) (its left edge).
- Button 2 is formed by drawing the PNG image "2white.png" in the area starting at the coordinates (330, 200) (its left edge).
- The program on the right side of FIG. 25 is executed using the last time event of this cell.
- There, it is specified that playback starts again from the beginning of the cell.
- FIG. 26 is a diagram showing an example of an event handler for a user event for menu selection.
- FIG. 27 is a diagram showing the basic processing flow up to the start of AV playback.
- When the BD disc is inserted (S101), the BD player reads and analyzes the "BD.INFO" file (S102), and then reads the "BD.PROG" file (S103). Both the "BD.INFO" file and the "BD.PROG" file are stored in the management information recording memory and analyzed by the scenario processor.
- The scenario processor generates the first event according to the first event information in the "BD.INFO" file (S104).
- The generated first event is received by the program processor, and the program processor executes the event handler corresponding to the event (S105).
- The playlist information to be played first is expected to be recorded in the event handler corresponding to the first event. If playback of a playlist is not instructed, the player simply continues to wait for a user event without playing anything (S201).
- When the BD player accepts a remote control operation from the user, the UOP manager generates a UOP event and outputs it to the program processor (S202).
- The program processor determines whether or not the UOP event is caused by a menu key (S203). If it is, the program processor outputs the UOP event to the scenario processor, and the scenario processor generates a user event (S204). The program processor then executes the event handler corresponding to the generated user event (S205).
- FIG. 28 is a diagram showing the processing flow from the start of playlist (PL) playback to the start of VOB playback.
- playback of a playlist is started by the first event handler or the global event handler (S301).
- The scenario processor reads and analyzes the playlist information "XXX.PL" as the information necessary for playback of the playlist to be played (S302), and loads the program information "XXX.PROG" corresponding to the playlist (S303). Subsequently, the scenario processor instructs cell playback based on the cell information registered in the playlist (S304). Cell playback means that a request is sent from the scenario processor to the presentation controller, and the presentation controller starts AV playback (S305).
- When AV playback is started (S401), the presentation controller reads and analyzes the VOB information file ("XXX.VOBI") corresponding to the cell to be played (S402). The presentation controller uses the time map to identify the VOBU at which playback should start and its address, and indicates the read address to the drive controller. The drive controller reads the target VOB data (S403). As a result, the VOB data is sent to the decoder, and its playback is started (S404).
- VOB playback is continued until the playback section of the VOB ends (S405); when it ends, the process proceeds to playback of the next cell (S304). If there is no next cell, playback stops (S406).
- FIG. 29 is a diagram showing an event processing flow after the start of AV playback.
- the BD player is an event-driven player.
- After AV playback starts, event processing for the time event system, the user event system, and the subtitle display system begins, and these event processes are executed in parallel.
- The processing of the S500 series is the time event processing.
- After the step (S502) of checking whether playback of the playlist has ended, the scenario processor checks whether the time event occurrence time has been reached (S503).
- When the time event occurrence time is reached, the scenario processor generates a time event (S504), and the program processor receives the time event and executes the corresponding event handler (S505).
- If the time event occurrence time has not been reached in step S503, and after the event handler is executed in step S505, the process returns to step S502 and the above processing is repeated. When it is confirmed in step S502 that playback of the playlist has ended, the time event processing is forcibly terminated.
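The time-event loop just described (S502 to S505) can be sketched as follows. The playlist and handler objects here are simplified stand-ins, not the specification's interfaces; only the order of the checks follows the flow above.

```python
# Minimal sketch of the time-event loop (steps S502-S505).
# The playlist/handler structures are hypothetical simplifications:
# playback is treated as ended once every time event has fired.

def run_time_event_loop(playlist, now, handlers):
    """Fire each time event whose occurrence time is reached (S503/S504)
    and run its handler (S505), until playlist playback ends (S502)."""
    fired = []
    while not playlist["ended"]:                        # S502
        for ev in playlist["time_events"]:
            if not ev["done"] and now >= ev["time"]:    # S503
                fired.append(handlers[ev["id"]]())      # S504 -> S505
                ev["done"] = True
        now += 1
        if all(e["done"] for e in playlist["time_events"]):
            playlist["ended"] = True                    # simplified end condition
    return fired

playlist = {"ended": False,
            "time_events": [{"id": "Ex1", "time": 3, "done": False}]}
handlers = {"Ex1": lambda: "draw_menu_buttons"}
print(run_time_event_loop(playlist, 0, handlers))  # -> ['draw_menu_buttons']
```

The user-event loop (S600 series) has the same shape, with the UOP acceptance and valid-period checks taking the place of the occurrence-time check.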
- The processing of the S600 series is the user event processing.
- The playlist playback completion confirmation step (S602) is followed by a UOP acceptance confirmation step (S603). If a UOP has been accepted, the UOP manager generates a UOP event (S604), and the program processor that receives the UOP event checks whether the UOP event is a menu call (S605). If it is a menu call, the program processor causes the scenario processor to generate an event (S607), and the program processor executes the event handler (S608).
- If it is determined in step S605 that the UOP event is not a menu call, the UOP event indicates an operation of a cursor key or the "OK" key.
- In this case, the scenario processor determines whether the current time is within the user event valid period (S606). If it is, the scenario processor generates a user event (S607), and the program processor executes the target event handler (S608).
- If no UOP is accepted in step S603, if the current time is not within the user event valid period in step S606, or after the event handler is executed in step S608, the process returns to step S602 and the above processing is repeated. When it is confirmed in step S602 that playback of the playlist has ended, the user event processing is forcibly terminated.
- FIG. 30 is a diagram showing a flow of subtitle processing.
- If the current time is the subtitle drawing start time (S703), the scenario processor instructs the presentation controller to draw the subtitle, and the presentation controller instructs the image processor to draw it (S704). If it is determined in step S703 that the current time is not the subtitle drawing start time, it is checked whether the current time is the subtitle display end time (S705). If it is, the presentation controller instructs the image processor to erase the subtitle, and the image processor erases the drawn subtitle from the image plane (S706).
- After the subtitle drawing step S704, after the subtitle erasing step S706, or when it is determined in the subtitle display end time confirmation step S705 that the current time is not the subtitle display end time, the process returns to step S702 and the above processing is repeated. When it is confirmed in step S702 that playback of the playlist has ended, the processing related to subtitle display is forcibly terminated.
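The two timing checks of the subtitle flow (S703 and S705) reduce to a simple comparison against the drawing start and display end times. The sketch below models only those comparisons; the controller interfaces are hypothetical.

```python
# Sketch of the subtitle timing decision (S703/S705). Only the time
# comparisons follow the flow above; everything else is simplified.

def subtitle_action(now, start, end):
    """Return the action taken by the presentation controller at `now`."""
    if now == start:        # S703: subtitle drawing start time reached
        return "draw"       # S704: instruct image processor to draw
    if now == end:          # S705: subtitle display end time reached
        return "erase"      # S706: erase subtitle from the image plane
    return "none"           # otherwise loop back to S702

timeline = [subtitle_action(t, start=2, end=5) for t in range(7)]
print(timeline)
# -> ['none', 'none', 'draw', 'none', 'none', 'erase', 'none']
```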
- Embodiment 1 relates to the stream structure of audio data on a BD, and its content is basically based on the related embodiment described above. Therefore, the description of Embodiment 1 centers on the portions that extend the related embodiment and the portions that differ from it.
- FIG. 31 is a diagram showing the structure of one access unit (the unit in which video/audio information is encoded for decoding and playback) that does not have a hierarchical structure.
- As shown in FIG. 31, one access unit is composed of a header section (Base Header) and a payload section (Base Payload).
- The Base Header contains Base SYNC, the synchronization signal of the Base frame; AU_SIZE, indicating the data size of this access unit; EXT, indicating whether this access unit is composed only of the Base frame; EXT_ID, indicating what kind of extension information is attached when the access unit is not composed only of the Base frame; and a reserved area for future use.
- The access unit of FIG. 31 does not introduce a hierarchical structure; the entire access unit is encoded by only one encoding method, which means that one access unit can be decoded by only one type of decoding method.
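To make the Base Header fields concrete, here is an illustrative layout using the fields named above (Base SYNC, AU_SIZE, EXT, EXT_ID). The field widths and the sync word value are assumptions for illustration; the patent does not specify them.

```python
# Illustrative Base Header layout. Field widths (4-byte sync, 2-byte
# AU_SIZE, 1-byte EXT, 1-byte EXT_ID) and the sync word value are
# assumptions, not taken from the specification.
import struct

BASE_SYNC = 0x64582025  # assumed 32-bit sync word

def build_base_header(au_size, ext, ext_id):
    # big-endian: sync, AU_SIZE, EXT flag, EXT_ID
    return struct.pack(">IHBB", BASE_SYNC, au_size, ext, ext_id)

def parse_base_header(data):
    sync, au_size, ext, ext_id = struct.unpack(">IHBB", data[:8])
    assert sync == BASE_SYNC, "not a Base frame"
    return {"au_size": au_size, "base_only": ext == 0, "ext_id": ext_id}

hdr = build_base_header(au_size=2032, ext=0, ext_id=0)
print(parse_base_header(hdr))
# -> {'au_size': 2032, 'base_only': True, 'ext_id': 0}
```

A decoder reading such a header can tell from EXT/EXT_ID whether extension frames follow, and from AU_SIZE how far to skip to the next access unit.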
- FIG. 32 is a diagram showing the structure of one access unit in which the Base frame is followed by a Level1-EXT frame encoded with an encoding method different from that of the Base frame, for example one that encodes higher-quality video information or higher-quality audio information.
- The EXT of the Base Header indicates that this access unit is not composed only of the Base frame, and the EXT_ID indicates that Level1 extension layer data, encoded by the other encoding method, follows the Base frame.
- AU_SIZE represents the size of the access unit. By using AU_SIZE, a decoder can be designed to properly decode this access unit while ignoring the Level1-EXT frame.
- FIG. 33 shows an access unit similarly extended to Level2-EXT.
- Level2-EXT data is, for example, data, not included in the data up to Level1-EXT, for obtaining audio with a sampling rate higher than that of the data up to Level1-EXT.
- EXT_ID is set to indicate that both Level1 and Level2 exist.
- FIG. 34 is a diagram for explaining the different operations of the stream reading/supply unit when it outputs data encoded in such a hierarchical structure (for example, a Level2 stream) to decoders supporting the various levels.
- When outputting data to a Base decoder, the stream reading/supply unit removes the Level1-EXT and Level2-EXT frames from the Level2 stream and outputs only the Base frame. At that time, the stream reading/supply unit rewrites the values of AU_SIZE (the size information of the access unit in the Base Header), EXT (indicating whether the access unit is composed only of the Base frame), and EXT_ID (indicating the type of the extension layer data), and then outputs the data.
- When outputting data to a Level1 decoder, the stream reading/supply unit removes the Level2-EXT frame from the Level2 stream, rewrites the AU_SIZE and EXT_ID values, and outputs the data.
- When outputting data to a Level2 decoder, the stream reading/supply unit outputs the Level2 stream as it is.
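The three output modes of the stream reading/supply unit can be sketched as one stripping-and-rewriting function. Access units are modeled as simple dictionaries, and the header rewriting is reduced to recomputing AU_SIZE, EXT, and EXT_ID; this is an illustration of the behavior described above, not the actual header format.

```python
# Sketch of the stream reading/supply unit's level-dependent output.
# Access units are simplified to dicts; AU_SIZE/EXT/EXT_ID rewriting
# is modeled by recomputing the fields from the kept frames.

def supply(access_unit, target_level):
    """Strip extension frames above `target_level`, rewrite the header."""
    kept = [f for f in access_unit["frames"] if f["level"] <= target_level]
    return {
        "frames": kept,
        "AU_SIZE": sum(f["size"] for f in kept),      # rewritten size
        "EXT": 0 if len(kept) == 1 else 1,            # Base-only flag
        "EXT_ID": max(f["level"] for f in kept),      # highest level kept
    }

au = {"frames": [{"level": 0, "size": 2032},    # Base
                 {"level": 1, "size": 1000},    # Level1-EXT
                 {"level": 2, "size": 4000}]}   # Level2-EXT

print(supply(au, 0)["AU_SIZE"])  # -> 2032 (Base decoder: Base frame only)
print(supply(au, 1)["AU_SIZE"])  # -> 3032 (Level1 decoder)
print(supply(au, 2)["AU_SIZE"])  # -> 7032 (Level2 decoder: stream as is)
```

Note that in the structure of FIG. 35 discussed below, the rewriting step becomes unnecessary, which is exactly the advantage that structure provides.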
- FIG. 35 is a diagram showing an access unit structure that is ideal for introducing Level2-EXT when devices compatible with Base and Level1-EXT are already in wide use.
- In the access unit shown in FIG. 35, as in the case where the access unit is composed of Base and Level1-EXT, the information about the data up to Level1-EXT is described in the Base Header (and the Level1 Header).
- The information related to the extension layer data from Level2-EXT onward is described in an area that is not examined by Base/Level1-EXT decoders, such as the reserved area.
- The EXT_ID keeps a value indicating that Level2 does not exist and that only data up to the Level1 frame is used, while a new EXT_ID2, prepared in what was the reserved area, describes that Level2 extension hierarchy data exists.
- Here, a Level2 access unit is, for example, an access unit including Base, Level1-EXT, and Level2-EXT, while a Level1 access unit consists of Base only, or of Base and Level1-EXT.
- With this structure, the stream reading/supply unit only has to extract the Base part and the Level1-EXT part from the Level2 stream. That is, the stream reading/supply unit can output a stream composed of Level1 access units to a Level1 decoder without rewriting any data.
- This structure is also effective when the number of bits allocated to size information such as AU_SIZE is small, and when, once Level2 is introduced, the data size of one access unit becomes too large for the size information to be expressed in AU_SIZE.
- FIG. 36 is a diagram showing a data structure of a data stream when a device supporting up to Level 1 exists and Level 2 is newly adopted.
- FIG. 36 clearly shows that one access unit consists of two parts: Base/Level1-EXT, which can be decoded by existing players and decoders, and Level2-EXT, which cannot be decoded by existing players and decoders.
- The PES packet is composed of a PES header and a PES payload storing the actual data, and the PES header has various fields as shown in the figure.
- The stream_id indicates the type of the elementary stream stored in the payload part of the PES packet. In general, different stream_ids indicate different elementary streams.
- PES_packet_length indicates the data size of the PES packet.
- PES_priority is information for identifying the priority of the PES packet.
- PTS_DTS_flags is information indicating whether the PES payload has a PTS, which is reproduction start time information, and a DTS, which is decoding start time information. If the values of PTS and DTS are the same, the DTS is omitted.
- PES_extension_flag and PES_extension_flag_2 indicate whether or not there is an extension data area in the payload part of the PES packet.
- stream_id_extension is auxiliary identification information for the elementary stream, supplementing stream_id, and can exist only when stream_id is 0xFD (extended_stream_id).
- The part of the access unit including the Base frame (the Base + Level1-EXT part in FIG. 36) and the part not including the Base frame (the Level2-EXT part in FIG. 36) can be distinguished as follows.
- Their packet identifiers (PIDs), the TS packet identification information described later, may be the same while their stream_ids differ; their PTS_DTS_flags may differ; or stream_id_extension may be used.
- If the Base frame is defined as a part that fits within 2032 bytes, or within 2013 bytes for compatibility with DVD-Video, the remainder of one access unit becomes the part not including the Base frame, and the part including the Base frame and the part not including it can thus be separated.
- The stream_id is 0xFD (private stream). Therefore, the stream_id_extension value of the PES packet including the Base frame (for example, 0x70) and the stream_id_extension value of the PES packets not including the Base frame (for example, 0x71) are set to different values. Thereby, a player or an external output unit can extract only the data including the Base frame. In this case, the stream_id_extension values used are recorded in the private stream area of logical addresses 0x40 to 0x7F.
- The first PES packet can store the encoding units up to Level1, which correspond to existing devices (existing AV receivers for which a protocol on the digital interface has been specified and which have the corresponding input terminal).
- The second and subsequent PES packets can store the encoding units from Level2 onward, which correspond to devices that do not yet exist (AV receivers for which no protocol on the digital interface has been specified, or which have no corresponding input terminal).
- The first PES packet and the second and subsequent PES packets can be distinguished by examining the value of stream_id, stream_id_extension, or PTS_DTS_flags.
- By analyzing the PES header, the size information of the PES payload can be extracted very easily. Therefore, if the encoding units up to Level1-EXT, which are highly compatible with existing AV receivers and digital interfaces, are stored together in the first PES packet, the PES payload of the first PES packet can be extracted simply by analyzing the PES header.
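The Base-extraction step can be sketched directly from the stream_id_extension example given above (0x70 for packets that include the Base frame, 0x71 for those that do not). The packet model below is a simplification for illustration.

```python
# Sketch of extracting only the Base-bearing PES packets, using the
# example stream_id_extension values from the text: 0x70 for packets
# including the Base frame, 0x71 for extension-only packets.

SID_EXT_BASE, SID_EXT_ONLY_EXT = 0x70, 0x71

def extract_base_packets(pes_packets):
    """Keep only the first PES packet of each access unit (Base part)."""
    return [p for p in pes_packets
            if p["stream_id_extension"] == SID_EXT_BASE]

stream = [
    {"stream_id_extension": 0x70, "payload": b"base+level1 AU#1"},
    {"stream_id_extension": 0x71, "payload": b"level2 AU#1"},
    {"stream_id_extension": 0x70, "payload": b"base+level1 AU#2"},
]
print(len(extract_base_packets(stream)))  # -> 2
```

This is the operation a player or external output unit performs when feeding only Base-compatible data to an existing digital interface.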
- The stream, which is composed of a plurality of access units and recorded on the BD disc 1001, is input from the BD disc 1001 to the parser 1002 in the BD player 1000.
- For each access unit, the parser 1002 distinguishes between the first PES packet, which includes the Base frame portion, and the second and subsequent PES packets, which include only the Level2-EXT portion.
- The parser 1002 outputs the first PES packet, that is, the Base frame portion, to the decoder 1003 in the BD player 1000, which can process only the Base frame portion.
- The decoder 1003 decodes the first PES packet and outputs the decoded data to the television 1005 via the stereo/analog interface 1004.
- The television 1005 reproduces the data from the BD player 1000 and outputs images and sound based on the data.
- The parser 1002 also outputs the first PES packet, which includes the Base frame portion, via the SPDIF 1006 to the Base decoder 1008 and the Base/Level1-EXT decoder 1009 in the A/V receiver 1007 outside the BD player 1000.
- The Base decoder 1008 and the Base/Level1-EXT decoder 1009 are decoders that can process the Base frame part and, in the latter case, additionally the Level1-EXT frame part; they process the first PES packet from the BD player 1000.
- Furthermore, the parser 1002 sends both the first PES packet including the Base frame part and the second and subsequent PES packets including only the Level2-EXT part, via the advanced digital interface, to the A/V receiver.
- the Level2-EXT decoder 1012 is a decoder that can process all frames from Base to Level2-EXT frames, and processes both PES packets from the BD player 1000.
- In this way, only the part of the access unit that includes the Base frame is transmitted to the existing decoder 1003, the Base decoder 1008, and the Base/Level1-EXT decoder 1009 for processing.
- The whole access unit is output to, and processed by, the Level2-EXT decoder 1012, which can process both the first PES packet including the Base frame part and the second and subsequent PES packets including only the Level2-EXT part.
- the BD player 1000 in FIG. 37 is an example of the data reproducing device of the present invention.
- The parser 1002 is an example of the data sorting device of the present invention.
- In this way, extension frames with low decoding compatibility, such as Level1-EXT and Level2-EXT, can be added. It is important that the order of the data in one access unit be Base, Level1-EXT, Level2-EXT, Level3-EXT, Level4-EXT, and so on, and that no reordering occur when the encoding units are extracted.
- DTS is an audio coding system developed by Digital Theater Systems.
- The data size of the payload of the first PES packet, which includes the Base, should be 2032 bytes or less so that it can be carried over SPDIF (Sony/Philips Digital Interconnect Format; the consumer version is defined in the IEC 60958-3 standard). This means that in the case of the DTS-type1 method, in which 512 samples of audio data sampled at 48 kHz are stored in one frame, data at up to 1524 kbps in terms of bit rate can be stored in the first PES packet.
- Likewise, for compatibility with the data that a DVD-Video player outputs to a DTS-compatible AV receiver, the payload data size of the first PES packet including the Base should be 2013 bytes or less.
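The 1524 kbps figure follows directly from the numbers above: one DTS-type1 access unit carries 512 samples at 48 kHz, so a 2032-byte payload limit caps the bit rate at 2032 × 8 ÷ (512/48000) bit/s. A quick check:

```python
# Worked check of the bit-rate limits quoted above for DTS-type1:
# one access unit = 512 samples at 48 kHz.

SAMPLES, FS = 512, 48000

def max_bitrate(payload_bytes):
    # bits per access unit divided by the access-unit duration (512/48000 s)
    return payload_bytes * 8 * FS / SAMPLES

print(max_bitrate(2032))  # -> 1524000.0 (SPDIF / IEC 60958-3 limit)
print(max_bitrate(2013))  # -> 1509750.0 (DVD-Video-compatible limit)
```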
- In this way, the data of one access unit is divided and managed in accordance with the digital interface regulations.
- Fig. 38 is a diagram showing the structure of an access unit having a hierarchical structure multiplexed with MPEG2-TS.
- MPEG2-TS is a digital stream composed of 188-byte TS packets.
- The PMT (Program Map Table), part of the program configuration information of the MPEG2-TS, has a structure as shown in the figure.
- the PES packet storing the Base + Level1-EXT encoding unit and the PES packet storing the Level2-EXT encoding unit are stored in separate TS packets.
- the MPEG2-TS stores a PMT packet indicating a program stored in the MPEG2-TS.
- The PMT stores the elementary_stream_PID indicating which PID's TS packets carry the various information, such as video information and audio information, belonging to a given program; the stream_type indicating the encoding type of each elementary stream; and one or more descriptors describing additional information about the elementary stream.
- For a hierarchical coding scheme, the descriptor may describe information such as extension hierarchy level information (coding_level); identification information indicating whether an extension hierarchy that is currently unsupported, or supported by very few devices, is used (for example, Level2_existence, indicating whether Level2 is used); and, if the encoded data is audio information, channel arrangement information (channel_assignment) and the sampling frequency (sampling_frequency).
- If the encoded data is video information, resolution information, the frame frequency, and the like may be described in the descriptor in addition to coding_level and Level2_existence.
- The descriptor can also describe the relationship between the levels and the audio attributes (Q value, frequency, channels, and speaker layout). By using the speaker layout information, the decoder can appropriately handle each channel even if the speaker layout of the stream being processed differs from the actual speaker layout.
- Fig. 40 (A) shows the 5.1-channel speaker layout
- Fig. 40 (B) shows the 7.1-channel speaker layout.
- the descriptor can describe the relationship between Level and channel.
- With this, a decoder for each channel configuration can appropriately determine whether or not it can output the sound. For example, if the channel structure is 7.1ch with a 2 + 3.1 + 2 hierarchical structure, a 2ch decoder can output 2-channel sound and a 5.1ch decoder can output 5.1-channel (2 + 3.1 channel) sound. However, if the channel structure is 7.1ch without such a hierarchical structure, the 2ch decoder and the 5.1ch decoder may not be able to output sound because of the amount of processing required.
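The 2 + 3.1 + 2 layering can be sketched as a cumulative channel count per decoded layer. This is a simplified illustration of the hierarchy described above (the "3.1" here just stands for the 3 channels plus LFE added by the middle layer).

```python
# Sketch of the hierarchical 2 + 3.1 + 2 channel structure: each layer
# a decoder can handle adds its channels to the output configuration.

LAYERS = [("Base", 2.0), ("Level1", 3.1), ("Level2", 2.0)]  # 2 + 3.1 + 2

def decodable_channels(max_layers):
    """Channel count obtained by decoding the first `max_layers` layers."""
    return round(sum(ch for _, ch in LAYERS[:max_layers]), 1)

print(decodable_channels(1))  # -> 2.0 (2ch decoder)
print(decodable_channels(2))  # -> 5.1 (5.1ch decoder: 2 + 3.1)
print(decodable_channels(3))  # -> 7.1 (full 7.1ch output)
```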
- DTS data is divided into DTS (equivalent to Base), DTS+ (equivalent to Level1-EXT), and DTS++ (equivalent to Level2-EXT) data.
- Both DTS+ and DTS++ include extension hierarchy data. Therefore, the descriptor may contain information identifying whether the target stream is DTS/DTS+ or DTS++ (equivalent to Level2_existence in FIG. 38).
- Level2_existence may also be used as information indicating whether or not the target stream includes only a portion that can be output to SPDIF in the same format as DVD-Video (the DTS type1 format).
- Level2_existence and coding_level information may also be described in a database file (for example, in the Attribute of the VOBI file in FIG. 13). This information naturally indicates that the extraction process differs when digital data is output, but it can also be used for video and audio attribute display/selection on a BD menu screen and the like. For example, a player that does not support Level2 can determine in advance that a stream contains Level2 data it cannot decode, and can present the Level2 audio to the user as unselectable.
- FIG. 42 is a diagram showing the format of an MPEG2-TS file as recorded on an optical disc such as a BD-ROM.
- A 4-byte Arrival Time Stamp (ATS; TS packet input start time information) is added to each 188-byte TS packet to form one Timed TS packet, and 32 Timed TS packets are grouped together and recorded in 3 sectors (6 KB).
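The grouping works out exactly because 188 + 4 = 192 bytes per Timed TS packet, and 32 × 192 = 6144 bytes fills precisely 3 sectors of 2048 bytes. A quick check of the arithmetic:

```python
# Worked check of the aligned-unit arithmetic above:
# 188-byte TS packet + 4-byte ATS = 192-byte Timed TS packet;
# 32 Timed TS packets fill exactly 3 sectors of 2048 bytes (6 KB).

TS_PACKET, ATS = 188, 4
SECTOR = 2048

timed_ts = TS_PACKET + ATS      # 192 bytes
unit = 32 * timed_ts            # 6144 bytes
print(unit, unit // SECTOR)     # -> 6144 3
```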
- FIG. 43 is a diagram for explaining the details of DTS defined by DVD-Video.
- One access unit, which represents audio information for 512 samples at 48 kHz, may consist of only the core, or of the core and an extension.
- A DTS burst payload storing only the audio data of the PES payload is formed, and an 8-byte preamble (Pa, Pb, Pc, Pd, 2 bytes each) and stuffing data are added to it.
- SPDIF (IEC60958-3) is transmitted in a cycle with 192 frames as one block.
- One frame consists of two subframes, and each subframe is 4 bytes of data carrying 2 bytes of the IEC 61937 data.
- FIG. 44 is a flowchart showing the processing of the demultiplexer 310 (FIG. 7) and the stream reading/supply unit (FIG. 34).
- S801 is a digital output start step for extracting a part of the access unit shown in Fig. 36 and outputting it to the outside in order to support SPDIF.
- S802 is a reproduction end determination step. If YES, the data output ends, and if NO, the process proceeds to PES packet processing S803.
- S805 is a step performed when it is determined in S804 that the PES packet is a non-Base frame part. In S805, the PES packet is discarded.
- S806 is a step performed when it is determined in S804 that the PES packet is a Base frame part.
- In S806, the payload (Base + Level1-EXT) of the PES packet is extracted, and the frame data is output to the decoder or to the existing digital I/F, as described above with reference to the figures.
- In the following, Base/Level1-EXT, which can be decoded by an existing player or decoder, is referred to simply as "Base", and Level2-EXT, which cannot be decoded by an existing player or decoder, simply as "Level2". The following explains how to multiplex them into one access unit.
- T-STD: the TS system target decoder model.
- The input time management apparatus 2000 includes a read buffer (RB), a de-packetizer, an ATS counter, and a 27 MHz clock.
- The RB temporarily stores the "xxxx.vob" file.
- The de-packetizer removes the ATS from each TS packet of "xxxx.vob" and sets the ATS value of the first TS packet into the ATS counter.
- The de-packetizer outputs each TS packet, without its ATS, to the T-STD 3000 at the time indicated by the ATS.
- The 27 MHz clock outputs a 27 MHz clock signal.
- The decoder model 3000 includes a demultiplexer; TB, MB, and EB, which are buffers for video; the decoder Dv; TBa1 and Ba1, which are buffers for the Base of the audio data; the decoder Da1; TBa2 and Ba2, which are buffers for "Base + Level2" of the audio data; the decoder Da2; TBsys and Bsys, which are buffers for system data; and the decoder Dsys.
- the image data is processed in the order of TB, MB, EB, and decoder Dv.
- The Base of the audio data is processed in the order of TBa1, Ba1, and the decoder Da1.
- The "Base + Level2" audio data is processed in the order of TBa2, Ba2, and the decoder Da2.
- System data is processed in the order of TBsys, Bsys, and decoder Dsys.
- Base and Level2 must be multiplexed as a single stream with the same PID so that the buffers fail in neither the Base decoder line (TBa1, Ba1, and the decoder Da1) nor the "Base + Level2" decoder line (TBa2, Ba2, and the decoder Da2).
- The Base (Base#n) of the nth access unit (Access Unit #n) is input to TBa1 and accumulated at the bit rate R from time ATS_b#n, and is at the same time drawn out to Ba1 at the rate Rba1.
- A decoder line that supports only Base can only decode a stream consisting of Base and Level1 in which one access unit is stored in one PES packet; it cannot process high-bit-rate streams that include Level2. In order not to break down TBa1, which can only support such low bit rates, it is necessary to delay the input time of Base#n+1 to TBa1. In other words, Equation 3 below must be satisfied.
- Equation 3 means that the amount of data stored in TBa1 at time ATS_b#n+1, plus the number of bytes (188 × (1 − Rba1/R)) by which the occupancy increases when one TS packet is input, does not exceed the size of TBa1.
- The ATS of Base#n+1 must therefore be set to an ATS_b#n+1 that satisfies Equation 3, or later, and Base#n+1 must be multiplexed into the stream accordingly.
- The unit of the bit rates Rba1 and R is bits/second, and 27000000 is the frequency (27 MHz) of the clock on which the ATS is based.
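The admission condition of Equation 3 can be sketched as a leaky-bucket check: a TS packet of Base#n+1 may enter TBa1 only if the current occupancy plus the net growth from one packet, 188 × (1 − Rba1/R), stays within the buffer size. The buffer size and rates below are illustrative values, not taken from the specification.

```python
# Sketch of the Equation 3 admission check for TBa1. The buffer size
# and the input/output rates are illustrative assumptions.

def can_input_next_base(occupancy, tba1_size, r_in, r_out):
    """Equation 3: occupancy + 188 * (1 - r_out/r_in) <= tba1_size.
    occupancy/tba1_size in bytes; r_in/r_out in the same rate units."""
    growth = 188 * (1 - r_out / r_in)   # net bytes added per TS packet
    return occupancy + growth <= tba1_size

TBA1_SIZE = 512                               # bytes (illustrative)
R_IN = 48_000_000 / 8                         # input rate R, bytes/s
R_OUT = 2_000_000 / 8                         # leak rate Rba1, bytes/s

print(can_input_next_base(300, TBA1_SIZE, R_IN, R_OUT))  # -> True
print(can_input_next_base(400, TBA1_SIZE, R_IN, R_OUT))  # -> False
```

A multiplexer applying this check keeps delaying ATS_b#n+1 until the condition holds, which is exactly the delay requirement stated above.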
- The Nbas and Next values can also be used to calculate information such as the maximum bit rate of each codec.
- Take DTS++ as an example, assuming that the sampling frequency is 48 kHz, one access unit carries 512 samples (DTS-type1), the Core (Base) is a fixed rate of 1 Mbps, and the XLL (Level2) alone is 24 Mbps.
- The XLL data length per access unit is then 24 Mbps × 512/48K = 256 kbits (32,000 bytes).
- The buffer fails not only when the data overflows but also when it underflows.
- Base and Level2 must also be multiplexed so that the data does not underflow. As in the case of preventing overflow, Base and Level2 are multiplexed, based on the size of each buffer, the size of the data input to the buffer, the rate at which data is input to the buffer, and the rate at which data is output from the buffer, so that no underflow occurs.
- In each decoder model, Base and Level2 are multiplexed, taking into account the size of each buffer, the size of the data input to the buffer, the rate at which data is input to the buffer, and the rate at which data is output from the buffer, so that no buffer fails.
- the information recording medium of the present invention is useful as an optical disk or the like on which video and audio data are recorded.
- The data sorting apparatus of the present invention is useful as an apparatus for extracting existing compressed data, or basic compressed data compatible with an existing digital I/F, from data recorded on the information recording medium of the present invention, such as an optical disc.
- The data reproducing apparatus of the present invention is useful as an apparatus for extracting and reproducing the basic compressed data from the information recording medium of the present invention, such as an optical disc.
- the data reproducing apparatus of the present invention reproduces not only data from the information recording medium of the present invention such as an optical disc but also audio data supplied through broadcasting or a network, or audio data on a recording medium such as a hard disk or a semiconductor memory. It is also useful for a playback device that performs the above.
Priority Applications (6)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
EP20050772736 EP1780712B1 (en) | 2004-08-17 | 2005-08-17 | Information recording medium and data reproducing device |
KR1020087012726A KR100890096B1 (ko) | 2004-08-17 | 2005-08-17 | 정보 기록 매체 및 데이터 재생 장치 |
US11/659,036 US7792012B2 (en) | 2004-08-17 | 2005-08-17 | Information recording medium, and data reproduction device |
JP2006531816A JP4551403B2 (ja) | 2004-08-17 | 2005-08-17 | 再生装置 |
KR1020087026920A KR100890095B1 (ko) | 2004-08-17 | 2005-08-17 | 정보 기록 매체 및 데이터 재생 장치 |
CN2005800277422A CN101006507B (zh) | 2004-08-17 | 2005-08-17 | 记录方法 |
Applications Claiming Priority (6)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
JP2004237160 | 2004-08-17 | ||
JP2004-237160 | 2004-08-17 | ||
JP2004-283896 | 2004-09-29 | ||
JP2004-283897 | 2004-09-29 | ||
JP2004283896 | 2004-09-29 | ||
JP2004283897 | 2004-09-29 |
Publications (1)
Publication Number | Publication Date |
---|---|
WO2006019108A1 true WO2006019108A1 (ja) | 2006-02-23 |
Family
ID=35907489
Family Applications (4)
Application Number | Title | Priority Date | Filing Date |
---|---|---|---|
PCT/JP2005/014982 WO2006019104A1 (ja) | 2004-08-17 | 2005-08-17 | 情報記録媒体、データ分別装置、及びデータ再生装置 |
PCT/JP2005/014980 WO2006019102A1 (ja) | 2004-08-17 | 2005-08-17 | 情報記録媒体、データ分別装置、及びデータ再生装置 |
PCT/JP2005/014986 WO2006019108A1 (ja) | 2004-08-17 | 2005-08-17 | 情報記録媒体、及びデータ再生装置 |
PCT/JP2005/014981 WO2006019103A1 (ja) | 2004-08-17 | 2005-08-17 | 情報記録媒体、及び多重化装置 |
Family Applications Before (2)
Application Number | Title | Priority Date | Filing Date |
---|---|---|---|
PCT/JP2005/014982 WO2006019104A1 (ja) | 2004-08-17 | 2005-08-17 | 情報記録媒体、データ分別装置、及びデータ再生装置 |
PCT/JP2005/014980 WO2006019102A1 (ja) | 2004-08-17 | 2005-08-17 | 情報記録媒体、データ分別装置、及びデータ再生装置 |
Family Applications After (1)
Application Number | Title | Priority Date | Filing Date |
---|---|---|---|
PCT/JP2005/014981 WO2006019103A1 (ja) | 2004-08-17 | 2005-08-17 | 情報記録媒体、及び多重化装置 |
Country Status (7)
Country | Link |
---|---|
US (4) | US8249415B2 (ja) |
EP (4) | EP1780714A4 (ja) |
JP (6) | JP4568725B2 (ja) |
KR (10) | KR100876492B1 (ja) |
CN (3) | CN101006507B (ja) |
TW (1) | TWI377564B (ja) |
WO (4) | WO2006019104A1 (ja) |
Families Citing this family (34)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
JP4481991B2 (ja) | 2004-08-17 | 2010-06-16 | パナソニック株式会社 | 情報記録媒体、データ分別装置、データ再生装置及び記録方法 |
TWI377564B (en) * | 2004-08-17 | 2012-11-21 | Panasonic Corp | Information storage medium and multiplexing device |
US8326133B2 (en) * | 2005-03-29 | 2012-12-04 | Koninklijke Philips Electronics N.V. | Method and device for providing multiple video pictures |
JP4670604B2 (ja) * | 2005-11-21 | 2011-04-13 | ブラザー工業株式会社 | 情報配信システム、情報処理装置、情報処理プログラム及び情報処理方法 |
US8139612B2 (en) * | 2006-04-04 | 2012-03-20 | Qualcomm Incorporated | Methods and apparatus for dynamic packet mapping |
US8213548B2 (en) * | 2006-04-04 | 2012-07-03 | Qualcomm Incorporated | Methods and apparatus for dynamic packet reordering |
JP4325657B2 (ja) * | 2006-10-02 | 2009-09-02 | ソニー株式会社 | 光ディスク再生装置、信号処理方法、およびプログラム |
KR101310894B1 (ko) * | 2006-10-17 | 2013-09-25 | 주식회사 넷앤티비 | LASeR 서비스에서 다른 SAF 세션의 스트림 참조방법, 기록매체, 장치 및 그 서비스 제공 장치 |
US8875199B2 (en) * | 2006-11-13 | 2014-10-28 | Cisco Technology, Inc. | Indicating picture usefulness for playback optimization |
US8416859B2 (en) * | 2006-11-13 | 2013-04-09 | Cisco Technology, Inc. | Signalling and extraction in compressed video of pictures belonging to interdependency tiers |
US20080115175A1 (en) * | 2006-11-13 | 2008-05-15 | Rodriguez Arturo A | System and method for signaling characteristics of pictures' interdependencies |
US20090180546A1 (en) * | 2008-01-09 | 2009-07-16 | Rodriguez Arturo A | Assistance for processing pictures in concatenated video streams |
JP5119239B2 (ja) * | 2007-03-26 | 2013-01-16 | パナソニック株式会社 | デジタル放送受信装置 |
JP5057820B2 (ja) * | 2007-03-29 | 2012-10-24 | 株式会社東芝 | デジタルストリームの記録方法、再生方法、記録装置、および再生装置 |
US8804845B2 (en) * | 2007-07-31 | 2014-08-12 | Cisco Technology, Inc. | Non-enhancing media redundancy coding for mitigating transmission impairments |
US8958486B2 (en) * | 2007-07-31 | 2015-02-17 | Cisco Technology, Inc. | Simultaneous processing of media and redundancy streams for mitigating impairments |
DE202008010463U1 (de) * | 2007-08-07 | 2009-04-23 | Plds Taiwan (Philips & Lite-On Digital Solutions Corporation) | Optischer Datenträger sowie eine Vorrichtung zum Aufzeichnen auf einen scheibenförmigen optischen Datenträger |
CN101904170B (zh) * | 2007-10-16 | 2014-01-08 | 思科技术公司 | 用于传达视频流中的串接属性和图片顺序的方法和系统 |
US8718388B2 (en) | 2007-12-11 | 2014-05-06 | Cisco Technology, Inc. | Video processing with tiered interdependencies of pictures |
TWI357263B (en) | 2008-02-22 | 2012-01-21 | Novatek Microelectronics Corp | Method and related device for converting transport |
US8416858B2 (en) * | 2008-02-29 | 2013-04-09 | Cisco Technology, Inc. | Signalling picture encoding schemes and associated picture properties |
US8886022B2 (en) | 2008-06-12 | 2014-11-11 | Cisco Technology, Inc. | Picture interdependencies signals in context of MMCO to assist stream manipulation |
US8705631B2 (en) * | 2008-06-17 | 2014-04-22 | Cisco Technology, Inc. | Time-shifted transport of multi-latticed video for resiliency from burst-error effects |
US8699578B2 (en) | 2008-06-17 | 2014-04-15 | Cisco Technology, Inc. | Methods and systems for processing multi-latticed video streams |
US8971402B2 (en) | 2008-06-17 | 2015-03-03 | Cisco Technology, Inc. | Processing of impaired and incomplete multi-latticed video streams |
WO2009158550A2 (en) * | 2008-06-25 | 2009-12-30 | Cisco Technology, Inc. | Support for blocking trick mode operations |
WO2010056842A1 (en) | 2008-11-12 | 2010-05-20 | Cisco Technology, Inc. | Processing of a video [aar] program having plural processed representations of a [aar] single video signal for reconstruction and output |
US8572036B2 (en) | 2008-12-18 | 2013-10-29 | Datalight, Incorporated | Method and apparatus for fault-tolerant memory management |
US8949883B2 (en) | 2009-05-12 | 2015-02-03 | Cisco Technology, Inc. | Signalling buffer characteristics for splicing operations of video streams |
US8279926B2 (en) | 2009-06-18 | 2012-10-02 | Cisco Technology, Inc. | Dynamic streaming with latticed representations of video |
US20110222837A1 (en) * | 2010-03-11 | 2011-09-15 | Cisco Technology, Inc. | Management of picture referencing in video streams for plural playback modes |
JP6652320B2 (ja) * | 2013-12-16 | 2020-02-19 | パナソニック インテレクチュアル プロパティ コーポレーション オブ アメリカPanasonic Intellectual Property Corporation of America | 送信方法、受信方法、送信装置及び受信装置 |
CN111212251B (zh) * | 2014-09-10 | 2022-05-27 | 松下电器(美国)知识产权公司 | 再现装置以及再现方法 |
JP2016100039A (ja) * | 2014-11-17 | 2016-05-30 | パナソニック インテレクチュアル プロパティ コーポレーション オブ アメリカPanasonic Intellectual Property Corporation of America | 記録媒体、再生方法、および再生装置 |
Citations (3)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
JPH09102932A (ja) * | 1995-08-02 | 1997-04-15 | Sony Corp | データ記録方法及び装置、データ記録媒体、データ再生方法及び装置 |
JP2003502704A (ja) * | 1999-06-21 | 2003-01-21 | デジタル・シアター・システムズ・インコーポレーテッド | デコーダの互換性を失わない確立済み低ビット・レートのオーディオ・コード化システムの音質の改善 |
JP2003518354A (ja) * | 1999-12-21 | 2003-06-03 | コーニンクレッカ フィリップス エレクトロニクス エヌ ヴィ | 伝送媒体を介する第1及び第2のデジタル情報信号の伝送 |
Family Cites Families (48)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
US5603012A (en) | 1992-06-30 | 1997-02-11 | Discovision Associates | Start code detector |
US5623344A (en) * | 1992-09-01 | 1997-04-22 | Hitachi America, Ltd. | Digital video recording device with trick play capability |
US5805762A (en) * | 1993-01-13 | 1998-09-08 | Hitachi America, Ltd. | Video recording device compatible transmitter |
EP0654199B1 (en) * | 1993-06-10 | 1999-05-26 | Sony Corporation | Rational input buffer arrangements for auxiliary information in video and audio signal processing systems |
JPH08339637A (ja) * | 1995-04-11 | 1996-12-24 | Toshiba Corp | 記録媒体とこの記録媒体へのデータの記録装置とその記録方法、その記録媒体からのデータの再生装置とその再生方法 |
CA2173812C (en) | 1995-04-11 | 2000-02-08 | Shinichi Kikuchi | Recording medium, recording apparatus and recording method for recording data into recording medium, and reproducing apparatus and reproduction method for reproducing data from recording medium |
US5956088A (en) * | 1995-11-21 | 1999-09-21 | Imedia Corporation | Method and apparatus for modifying encoded digital video for improved channel utilization |
US6567612B2 (en) * | 1996-04-05 | 2003-05-20 | Pioneer Electronic Corporation | Information record medium, apparatus for recording the same and apparatus for reproducing the same |
JPH09282848A (ja) | 1996-04-05 | 1997-10-31 | Pioneer Electron Corp | 情報記録媒体並びにその記録装置及び再生装置 |
JP3340384B2 (ja) * | 1997-03-25 | 2002-11-05 | 三星電子株式会社 | Dvdオーディオディスク及びこれを再生する装置及び方法 |
US6222983B1 (en) * | 1997-03-25 | 2001-04-24 | Samsung Electronics Co., Ltd. | DVD-audio disk, and apparatus and method for playing the same |
KR100265112B1 (ko) * | 1997-03-31 | 2000-10-02 | 윤종용 | 디브이디 디스크와 디브이디 디스크를 재생하는 장치 및 방법 |
KR100215476B1 (ko) | 1997-06-02 | 1999-08-16 | 윤종용 | 디지털 다기능 디스크(dvd) 및 dvd재생장치 |
US7113523B1 (en) * | 1997-06-11 | 2006-09-26 | Sony Corporation | Data multiplexing device, program distribution system, program transmission system, pay broadcast system, program transmission method, conditional access system, and data reception device |
EP0933776A3 (en) | 1998-01-30 | 2006-05-17 | Victor Company of Japan, Ltd. | Signal encoding apparatus, audio data transmitting method, audio data recording method, audio data decoding method and audio disc |
JP3988006B2 (ja) * | 1998-04-24 | 2007-10-10 | ソニー株式会社 | 情報伝送装置および情報再生装置 |
JP3872896B2 (ja) | 1998-06-01 | 2007-01-24 | 株式会社東芝 | 音声再生装置 |
US6366617B1 (en) * | 1998-10-09 | 2002-04-02 | Matsushita Electric Industrial Co., Ltd. | Programmable filter for removing selected user data from an MPEG-2 bit stream |
KR100657237B1 (ko) | 1998-12-16 | 2006-12-18 | 삼성전자주식회사 | 데이터열간의 연속 재생을 보장하기 위한 부가 정보 생성 방법 |
KR200227364Y1 (ko) | 1999-06-29 | 2001-06-15 | 이계안 | 동력 조향 장치의 오일 탱크 마운팅 구조 |
US6999827B1 (en) * | 1999-12-08 | 2006-02-14 | Creative Technology Ltd | Auto-detection of audio input formats |
US7133449B2 (en) | 2000-09-18 | 2006-11-07 | Broadcom Corporation | Apparatus and method for conserving memory in a fine granularity scalability coding system |
MXPA03003690A (es) * | 2000-10-27 | 2004-05-05 | Chiron Spa | Acidos nucleicos y proteinas de los grupos a y b de estreptococos. |
JP3867516B2 (ja) * | 2001-05-17 | 2007-01-10 | ソニー株式会社 | ディジタル放送受信装置及び方法、情報処理装置及び方法、並びに、情報処理システム |
JP4556356B2 (ja) * | 2001-07-16 | 2010-10-06 | 船井電機株式会社 | 録画装置 |
WO2003010766A1 (en) * | 2001-07-23 | 2003-02-06 | Matsushita Electric Industrial Co., Ltd. | Information recording medium, and apparatus and method for recording information on information recording medium |
JP3862630B2 (ja) | 2001-07-23 | 2006-12-27 | 松下電器産業株式会社 | 情報記録媒体、情報記録媒体に情報を記録する装置及び方法 |
JP2003100014A (ja) * | 2001-09-25 | 2003-04-04 | Nec Corp | 記録再生管理・制御装置及び記録再生管理・制御方法 |
US7649829B2 (en) * | 2001-10-12 | 2010-01-19 | Qualcomm Incorporated | Method and system for reduction of decoding complexity in a communication system |
US20050013583A1 (en) * | 2001-11-20 | 2005-01-20 | Masanori Itoh | Audio/video information recording/reproducing apparatus and method, and recording medium in which information is recorded by using the audio/video information recording/reproducing apparatus and method |
JP3863526B2 (ja) | 2001-11-30 | 2006-12-27 | 松下電器産業株式会社 | ストリーム変換装置及び方法、情報記録装置及び方法、並びに情報記録媒体 |
US7480441B2 (en) * | 2001-12-20 | 2009-01-20 | Thomson Licensing | Method for seamless real-time splitting and concatenating of a data stream |
US7356147B2 (en) * | 2002-04-18 | 2008-04-08 | International Business Machines Corporation | Method, system and program product for attaching a title key to encrypted content for synchronized transmission to a recipient |
US6842831B2 (en) * | 2002-04-25 | 2005-01-11 | Intel Corporation | Low latency buffer control system and method |
KR100458878B1 (ko) * | 2002-05-03 | 2004-12-03 | 학교법인 경희대학교 | Fec 코딩 방식에 기초한 가변길이 패킷 송수신 방법 |
US7581019B1 (en) * | 2002-06-05 | 2009-08-25 | Israel Amir | Active client buffer management method, system, and apparatus |
US20050220441A1 (en) | 2002-07-16 | 2005-10-06 | Comer Mary L | Interleaving of base and enhancement layers for hd-dvd |
JP2005533337A (ja) | 2002-07-16 | 2005-11-04 | トムソン ライセンシング | 拡張層の交互のストリーム識別番号を用いた、hd−dvdのための基本層と拡張層のインターリーブ |
JP3668213B2 (ja) | 2002-08-13 | 2005-07-06 | 株式会社東芝 | Hdコンテンツとsdコンテンツとを記録する光ディスク及び光ディスク装置 |
US7668842B2 (en) | 2002-10-16 | 2010-02-23 | Microsoft Corporation | Playlist structure for large playlists |
BRPI0316861B1 (pt) * | 2002-12-03 | 2018-12-11 | Thomson Licensing Sa | disco de vídeo digital codificado com dados de sinais |
CN1512768A (zh) * | 2002-12-30 | 2004-07-14 | 皇家飞利浦电子股份有限公司 | 一种在hd-dvd系统中用于生成视频目标单元的方法 |
US7702405B2 (en) * | 2004-06-02 | 2010-04-20 | Standard Microsystems Corporation | System and method for transferring non-compliant packetized and streaming data into and from a multimedia device coupled to a network across which compliant data is sent |
TWI377564B (en) | 2004-08-17 | 2012-11-21 | Panasonic Corp | Information storage medium and multiplexing device |
JP4481991B2 (ja) * | 2004-08-17 | 2010-06-16 | パナソニック株式会社 | 情報記録媒体、データ分別装置、データ再生装置及び記録方法 |
US8068722B2 (en) | 2004-10-07 | 2011-11-29 | Panasonic Corporation | Information storage medium, TS packet judgement apparatus, and data reproduction apparatus |
US7423756B2 (en) * | 2007-01-31 | 2008-09-09 | G & A Technical Software, Inc. | Internally-calibrated, two-detector gas filter correlation radiometry (GFCR) system |
US20090204668A1 (en) * | 2008-02-12 | 2009-08-13 | Sydney Furan Huang | System and process for distant pulse diagnosis |
-
2005
- 2005-08-16 TW TW94127970A patent/TWI377564B/zh active
- 2005-08-17 KR KR20077002035A patent/KR100876492B1/ko not_active IP Right Cessation
- 2005-08-17 US US11/659,022 patent/US8249415B2/en not_active Expired - Fee Related
- 2005-08-17 US US11/659,959 patent/US7949930B2/en not_active Expired - Fee Related
- 2005-08-17 US US11/660,394 patent/US8170400B2/en not_active Expired - Fee Related
- 2005-08-17 CN CN2005800277422A patent/CN101006507B/zh not_active Expired - Fee Related
- 2005-08-17 JP JP2006531812A patent/JP4568725B2/ja not_active Expired - Fee Related
- 2005-08-17 WO PCT/JP2005/014982 patent/WO2006019104A1/ja active Application Filing
- 2005-08-17 KR KR1020087012727A patent/KR100869605B1/ko not_active IP Right Cessation
- 2005-08-17 EP EP05772771A patent/EP1780714A4/en not_active Ceased
- 2005-08-17 WO PCT/JP2005/014980 patent/WO2006019102A1/ja active Application Filing
- 2005-08-17 KR KR1020097001484A patent/KR20090021225A/ko active Search and Examination
- 2005-08-17 WO PCT/JP2005/014986 patent/WO2006019108A1/ja active Application Filing
- 2005-08-17 CN CN2010101661234A patent/CN101872635B/zh not_active Expired - Fee Related
- 2005-08-17 CN CN2005800281659A patent/CN101006513B/zh active Active
- 2005-08-17 WO PCT/JP2005/014981 patent/WO2006019103A1/ja active Application Filing
- 2005-08-17 JP JP2006531816A patent/JP4551403B2/ja not_active Expired - Fee Related
- 2005-08-17 KR KR1020087012726A patent/KR100890096B1/ko not_active IP Right Cessation
- 2005-08-17 EP EP20050772736 patent/EP1780712B1/en not_active Expired - Fee Related
- 2005-08-17 KR KR1020107016661A patent/KR101074604B1/ko not_active IP Right Cessation
- 2005-08-17 KR KR20077002032A patent/KR20070034607A/ko not_active Application Discontinuation
- 2005-08-17 EP EP20050772763 patent/EP1780713B1/en not_active Expired - Fee Related
- 2005-08-17 KR KR20077002034A patent/KR100865826B1/ko active IP Right Grant
- 2005-08-17 JP JP2006531813A patent/JP4593570B2/ja not_active Expired - Fee Related
- 2005-08-17 KR KR1020087026920A patent/KR100890095B1/ko not_active IP Right Cessation
- 2005-08-17 US US11/659,036 patent/US7792012B2/en active Active
- 2005-08-17 JP JP2006531811A patent/JP4479968B2/ja active Active
- 2005-08-17 KR KR1020087027807A patent/KR100978723B1/ko active IP Right Grant
- 2005-08-17 EP EP20050772769 patent/EP1791123B1/en active Active
- 2005-08-17 KR KR1020087014651A patent/KR100890097B1/ko not_active IP Right Cessation
-
2010
- 2010-01-14 JP JP2010006298A patent/JP4593679B2/ja not_active Expired - Fee Related
- 2010-04-15 JP JP2010094418A patent/JP4951087B2/ja not_active Expired - Fee Related
Patent Citations (3)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
JPH09102932A (ja) * | 1995-08-02 | 1997-04-15 | Sony Corp | データ記録方法及び装置、データ記録媒体、データ再生方法及び装置 |
JP2003502704A (ja) * | 1999-06-21 | 2003-01-21 | デジタル・シアター・システムズ・インコーポレーテッド | デコーダの互換性を失わない確立済み低ビット・レートのオーディオ・コード化システムの音質の改善 |
JP2003518354A (ja) * | 1999-12-21 | 2003-06-03 | コーニンクレッカ フィリップス エレクトロニクス エヌ ヴィ | 伝送媒体を介する第1及び第2のデジタル情報信号の伝送 |
Non-Patent Citations (1)
Title |
---|
See also references of EP1780712A4 * |
Also Published As
Similar Documents
Publication | Publication Date | Title |
---|---|---|
JP4568725B2 (ja) | 情報記録媒体、多重化装置及び記録方法 | |
JP4481991B2 (ja) | 情報記録媒体、データ分別装置、データ再生装置及び記録方法 | |
JP2008516360A (ja) | 情報記録媒体、tsパケット判定装置、及びデータ再生装置 | |
KR20070032030A (ko) | 정보 기록 매체, 및 다중화 장치 |
Legal Events
Date | Code | Title | Description |
---|---|---|---|
AK | Designated states |
Kind code of ref document: A1 Designated state(s): AE AG AL AM AT AU AZ BA BB BG BR BW BY BZ CA CH CN CO CR CU CZ DE DK DM DZ EC EE EG ES FI GB GD GE GH GM HR HU ID IL IN IS JP KE KG KM KP KR KZ LC LK LR LS LT LU LV MA MD MG MK MN MW MX MZ NA NG NI NO NZ OM PG PH PL PT RO RU SC SD SE SG SK SL SM SY TJ TM TN TR TT TZ UA UG US UZ VC VN YU ZA ZM ZW |
|
AL | Designated countries for regional patents |
Kind code of ref document: A1 Designated state(s): BW GH GM KE LS MW MZ NA SD SL SZ TZ UG ZM ZW AM AZ BY KG KZ MD RU TJ TM AT BE BG CH CY CZ DE DK EE ES FI FR GB GR HU IE IS IT LT LU LV MC NL PL PT RO SE SI SK TR BF BJ CF CG CI CM GA GN GQ GW ML MR NE SN TD TG |
|
121 | Ep: the epo has been informed by wipo that ep was designated in this application | ||
WWE | Wipo information: entry into national phase |
Ref document number: 2006531816 Country of ref document: JP |
|
WWE | Wipo information: entry into national phase |
Ref document number: 2005772736 Country of ref document: EP |
|
WWE | Wipo information: entry into national phase |
Ref document number: 1020077002034 Country of ref document: KR |
|
WWE | Wipo information: entry into national phase |
Ref document number: 11659036 Country of ref document: US |
|
WWE | Wipo information: entry into national phase |
Ref document number: 200580027742.2 Country of ref document: CN |
|
NENP | Non-entry into the national phase |
Ref country code: DE |
|
WWP | Wipo information: published in national office |
Ref document number: 1020077002034 Country of ref document: KR |
|
WWP | Wipo information: published in national office |
Ref document number: 2005772736 Country of ref document: EP |