WO2006028216A1 - Recording medium, playback device, program, and playback method - Google Patents
Recording medium, playback device, program, and playback method
- Publication number
- WO2006028216A1 (PCT/JP2005/016640, JP2005016640W)
- Authority
- WO
- WIPO (PCT)
- Prior art keywords
- playback
- information
- stream
- time
- main
- Prior art date
Classifications
-
- H—ELECTRICITY
- H04—ELECTRIC COMMUNICATION TECHNIQUE
- H04N—PICTORIAL COMMUNICATION, e.g. TELEVISION
- H04N9/00—Details of colour television systems
- H04N9/79—Processing of colour television signals in connection with recording
- H04N9/80—Transformation of the television signal for recording, e.g. modulation, frequency changing; Inverse transformation for playback
- H04N9/82—Transformation of the television signal for recording, e.g. modulation, frequency changing; Inverse transformation for playback the individual colour picture signal components being recorded simultaneously only
- H04N9/8205—Transformation of the television signal for recording, e.g. modulation, frequency changing; Inverse transformation for playback the individual colour picture signal components being recorded simultaneously only involving the multiplexing of an additional signal and the colour video signal
-
- G—PHYSICS
- G11—INFORMATION STORAGE
- G11B—INFORMATION STORAGE BASED ON RELATIVE MOVEMENT BETWEEN RECORD CARRIER AND TRANSDUCER
- G11B20/00—Signal processing not specific to the method of recording or reproducing; Circuits therefor
- G11B20/10—Digital recording or reproducing
-
- G—PHYSICS
- G11—INFORMATION STORAGE
- G11B—INFORMATION STORAGE BASED ON RELATIVE MOVEMENT BETWEEN RECORD CARRIER AND TRANSDUCER
- G11B27/00—Editing; Indexing; Addressing; Timing or synchronising; Monitoring; Measuring tape travel
- G11B27/10—Indexing; Addressing; Timing or synchronising; Measuring tape travel
-
- G—PHYSICS
- G11—INFORMATION STORAGE
- G11B—INFORMATION STORAGE BASED ON RELATIVE MOVEMENT BETWEEN RECORD CARRIER AND TRANSDUCER
- G11B27/00—Editing; Indexing; Addressing; Timing or synchronising; Monitoring; Measuring tape travel
- G11B27/10—Indexing; Addressing; Timing or synchronising; Measuring tape travel
- G11B27/19—Indexing; Addressing; Timing or synchronising; Measuring tape travel by using information detectable on the record carrier
- G11B27/28—Indexing; Addressing; Timing or synchronising; Measuring tape travel by using information detectable on the record carrier by using information signals recorded by the same method as the main recording
- G11B27/32—Indexing; Addressing; Timing or synchronising; Measuring tape travel by using information detectable on the record carrier by using information signals recorded by the same method as the main recording on separate auxiliary tracks of the same or an auxiliary record carrier
- G11B27/327—Table of contents
- G11B27/329—Table of contents on a disc [VTOC]
-
- H—ELECTRICITY
- H04—ELECTRIC COMMUNICATION TECHNIQUE
- H04N—PICTORIAL COMMUNICATION, e.g. TELEVISION
- H04N5/00—Details of television systems
- H04N5/76—Television signal recording
- H04N5/91—Television signal processing therefor
- H04N5/92—Transformation of the television signal for recording, e.g. modulation, frequency changing; Inverse transformation for playback
-
- H—ELECTRICITY
- H04—ELECTRIC COMMUNICATION TECHNIQUE
- H04N—PICTORIAL COMMUNICATION, e.g. TELEVISION
- H04N5/00—Details of television systems
- H04N5/76—Television signal recording
- H04N5/91—Television signal processing therefor
- H04N5/93—Regeneration of the television signal or of selected parts thereof
-
- G—PHYSICS
- G11—INFORMATION STORAGE
- G11B—INFORMATION STORAGE BASED ON RELATIVE MOVEMENT BETWEEN RECORD CARRIER AND TRANSDUCER
- G11B20/00—Signal processing not specific to the method of recording or reproducing; Circuits therefor
- G11B20/00086—Circuits for prevention of unauthorised reproduction or copying, e.g. piracy
- G11B20/0021—Circuits for prevention of unauthorised reproduction or copying, e.g. piracy involving encryption or decryption of contents recorded on or reproduced from a record carrier
-
- G—PHYSICS
- G11—INFORMATION STORAGE
- G11B—INFORMATION STORAGE BASED ON RELATIVE MOVEMENT BETWEEN RECORD CARRIER AND TRANSDUCER
- G11B20/00—Signal processing not specific to the method of recording or reproducing; Circuits therefor
- G11B20/10—Digital recording or reproducing
- G11B20/12—Formatting, e.g. arrangement of data block or words on the record carriers
- G11B2020/1264—Formatting, e.g. arrangement of data block or words on the record carriers wherein the formatting concerns a specific kind of data
- G11B2020/1288—Formatting by padding empty spaces with dummy data, e.g. writing zeroes or random data when de-icing optical discs
-
- G—PHYSICS
- G11—INFORMATION STORAGE
- G11B—INFORMATION STORAGE BASED ON RELATIVE MOVEMENT BETWEEN RECORD CARRIER AND TRANSDUCER
- G11B2220/00—Record carriers by type
- G11B2220/20—Disc-shaped record carriers
- G11B2220/21—Disc-shaped record carriers characterised in that the disc is of read-only, rewritable, or recordable type
- G11B2220/213—Read-only discs
-
- G—PHYSICS
- G11—INFORMATION STORAGE
- G11B—INFORMATION STORAGE BASED ON RELATIVE MOVEMENT BETWEEN RECORD CARRIER AND TRANSDUCER
- G11B2220/00—Record carriers by type
- G11B2220/20—Disc-shaped record carriers
- G11B2220/25—Disc-shaped record carriers characterised in that the disc is based on a specific recording technology
- G11B2220/2537—Optical discs
- G11B2220/2541—Blu-ray discs; Blue laser DVR discs
-
- H—ELECTRICITY
- H04—ELECTRIC COMMUNICATION TECHNIQUE
- H04N—PICTORIAL COMMUNICATION, e.g. TELEVISION
- H04N5/00—Details of television systems
- H04N5/76—Television signal recording
- H04N5/765—Interface circuits between an apparatus for recording and another apparatus
-
- H—ELECTRICITY
- H04—ELECTRIC COMMUNICATION TECHNIQUE
- H04N—PICTORIAL COMMUNICATION, e.g. TELEVISION
- H04N5/00—Details of television systems
- H04N5/76—Television signal recording
- H04N5/765—Interface circuits between an apparatus for recording and another apparatus
- H04N5/775—Interface circuits between an apparatus for recording and another apparatus between a recording apparatus and a television receiver
-
- H—ELECTRICITY
- H04—ELECTRIC COMMUNICATION TECHNIQUE
- H04N—PICTORIAL COMMUNICATION, e.g. TELEVISION
- H04N5/00—Details of television systems
- H04N5/76—Television signal recording
- H04N5/84—Television signal recording using optical recording
- H04N5/85—Television signal recording using optical recording on discs or drums
-
- H—ELECTRICITY
- H04—ELECTRIC COMMUNICATION TECHNIQUE
- H04N—PICTORIAL COMMUNICATION, e.g. TELEVISION
- H04N9/00—Details of colour television systems
- H04N9/79—Processing of colour television signals in connection with recording
- H04N9/80—Transformation of the television signal for recording, e.g. modulation, frequency changing; Inverse transformation for playback
- H04N9/804—Transformation of the television signal for recording, e.g. modulation, frequency changing; Inverse transformation for playback involving pulse code modulation of the colour picture signal components
- H04N9/8042—Transformation of the television signal for recording, e.g. modulation, frequency changing; Inverse transformation for playback involving pulse code modulation of the colour picture signal components involving data reduction
-
- H—ELECTRICITY
- H04—ELECTRIC COMMUNICATION TECHNIQUE
- H04N—PICTORIAL COMMUNICATION, e.g. TELEVISION
- H04N9/00—Details of colour television systems
- H04N9/79—Processing of colour television signals in connection with recording
- H04N9/80—Transformation of the television signal for recording, e.g. modulation, frequency changing; Inverse transformation for playback
- H04N9/804—Transformation of the television signal for recording, e.g. modulation, frequency changing; Inverse transformation for playback involving pulse code modulation of the colour picture signal components
- H04N9/806—Transformation of the television signal for recording, e.g. modulation, frequency changing; Inverse transformation for playback involving pulse code modulation of the colour picture signal components with processing of the sound signal
- H04N9/8063—Transformation of the television signal for recording, e.g. modulation, frequency changing; Inverse transformation for playback involving pulse code modulation of the colour picture signal components with processing of the sound signal using time division multiplex of the PCM audio and PCM video signals
-
- H—ELECTRICITY
- H04—ELECTRIC COMMUNICATION TECHNIQUE
- H04N—PICTORIAL COMMUNICATION, e.g. TELEVISION
- H04N9/00—Details of colour television systems
- H04N9/79—Processing of colour television signals in connection with recording
- H04N9/80—Transformation of the television signal for recording, e.g. modulation, frequency changing; Inverse transformation for playback
- H04N9/82—Transformation of the television signal for recording, e.g. modulation, frequency changing; Inverse transformation for playback the individual colour picture signal components being recorded simultaneously only
- H04N9/8205—Transformation of the television signal for recording, e.g. modulation, frequency changing; Inverse transformation for playback the individual colour picture signal components being recorded simultaneously only involving the multiplexing of an additional signal and the colour video signal
- H04N9/8227—Transformation of the television signal for recording, e.g. modulation, frequency changing; Inverse transformation for playback the individual colour picture signal components being recorded simultaneously only involving the multiplexing of an additional signal and the colour video signal the additional signal being at least another television signal
Definitions
- the present invention belongs to the technical field of synchronous application technology.
- synchronous application technology is a technology that plays back a plurality of digital streams recorded on different recording media with their synchronization defined, so that they appear to the user as a single movie work.
- a main stream refers to a digital stream including high-quality moving images.
- a substream is a digital stream that does not contain high-quality moving images.
- the main stream is recorded on a large-capacity optical disk such as a BD-ROM and supplied to the user, and the substream is supplied to the user via the Internet.
- Patent Document 1 Japanese Patent Laid-Open No. 2002-247526
- special playback refers to functions such as fast-forward, rewind, chapter search, and time search, and is realized on the premise of “random access” to a digital stream. Random access is a technique for converting an arbitrary point in time on a digital stream into a recording position in the digital stream and accessing that position. When special playback is to be executed for a synchronous application as described above, it is necessary to execute random access not only in the main stream but also in the substream.
- Various data objects such as audio, graphics, and standard-definition moving images can be reproduced from the substream.
- a main stream including high-quality moving images has a unit that can be decoded independently, such as GOP (Group Of Picture), and a substream does not necessarily have a unit corresponding to this GOP.
- GOP Group Of Picture
- there is no guarantee that random access on the substream side can be performed at high speed like random access on the main stream side. As a result, playback on the substream side cannot be started quickly, and the playback start on the substream side may be significantly delayed.
- An object of the present invention is to provide a recording medium and a playback device that can prevent a decrease in response when substream random access is performed simultaneously with mainstream random access.
- a recording medium according to the present invention is a recording medium on which playlist information is recorded, and the playlist information is information that defines a playback section for each of a plurality of digital streams and that includes main path information and sub path information.
- the main path information is information that designates one of a plurality of digital streams as a main stream and defines a main playback section for the main stream.
- the sub path information designates another one of the plurality of digital streams as a substream, and defines, for the substream, a sub playback section that corresponds to the main playback section.
- each of the digital streams designated as substreams is recorded in a form associated with an entry map, and the entry map indicates a plurality of entry times on the substream time axis in association with a plurality of entry positions in the substream.
- since such an entry map is provided, the playback device can respond to user operations immediately, and cue playback and double-speed playback can be realized.
- it is desirable that the entry map be either a first type of entry map, which indicates a plurality of entry positions existing at regular time intervals on the time axis or at regular data intervals in the digital stream, or a second type of entry map, which indicates the entry position corresponding to the beginning of a complete data set in association with an entry time, and that the entry map include a flag indicating whether it is of the first type or the second type.
- if the flag indicates the first type, it indicates that entry positions exist at regular time intervals or at regular data intervals.
- in that case, stream analysis over the fixed time interval or the fixed data interval is required in the worst case: a desired access point is reached by analyzing at most the range of one fixed time interval or one fixed data interval.
- if the flag indicates the second type, it indicates that the start of a complete data set is designated as the entry position.
- the playback device can therefore understand that if it reads out the data set from the entry position and uses it for playback, data display can be realized.
- the flag thus allows the playback device to determine, when random access to the main stream is performed, whether stream analysis bounded by the fixed time interval or the fixed data interval is necessary, or whether no stream analysis is necessary at all. This avoids placing an excessive burden on the playback device, and by reducing the burden, the response to user operations can be improved.
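The two entry-map types described above lead to different random-access procedures. The following is a minimal sketch of how a playback device might use the type flag; all names and data structures here are illustrative assumptions, since the patent does not prescribe an implementation. A second-type map yields a usable read position directly, while a first-type map yields a position from which bounded stream analysis must still locate the exact access point.

```python
from bisect import bisect_right

def random_access(entry_times, entry_positions, is_second_type, target_time):
    """Resolve a target time to a read position using an entry map.

    entry_times[i] is the entry time for entry_positions[i], sorted ascending.
    is_second_type corresponds to the flag described above: True means each
    entry position is the start of a complete data set.
    Returns (start_address, needs_stream_analysis).
    """
    i = bisect_right(entry_times, target_time) - 1  # last entry time <= target
    if i < 0:
        raise ValueError("target precedes first entry")
    position = entry_positions[i]
    if is_second_type:
        # Second type: read the complete data set at `position` and play it;
        # no stream analysis is necessary.
        return position, False
    # First type: analyse the stream from `position` onward; at worst, one
    # full time interval or data interval must be scanned to reach the target.
    return position, True

# Example with entries every 1 second (first type):
pos, needs_analysis = random_access([0, 1, 2, 3], [0, 6144, 12288, 18432], False, 2.5)
```

With entries at one-second spacing, a target of 2.5 s resolves to the entry at 2 s, and the flag decides whether the half-second gap must be bridged by stream analysis.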
- FIG. 1 is a diagram showing a form in which a recording medium according to the present invention is used.
- FIG. 2 is a diagram showing an internal configuration of a BD-ROM.
- FIG. 3 is a diagram schematically showing how a file with the extension .m2ts is structured.
- FIG. 4 is a diagram showing the process through which TS packets constituting the MainClip are written to the BD-ROM.
- FIG. 5 is a diagram showing an internal configuration of a video stream used for a movie.
- FIG. 6 (a) is a diagram showing an internal configuration of an IDR picture.
- FIG. 7 is a diagram illustrating a process of converting IDR pictures and Non-IDR I pictures into TS packets.
- FIG. 8 is a diagram showing an internal structure of Clip information.
- FIG. 11 is a diagram showing an internal configuration of a local storage.
- FIG. 12 is a diagram showing an internal configuration of a primary audio stream and a secondary audio stream.
- FIG. 13 shows the internal structure of a PG stream.
- FIG. 14 is a diagram showing an internal configuration of an IG stream.
- FIG. 15 is a diagram showing a data structure of Clip information on the local storage side.
- FIG. 16 is a diagram showing EP_map generated for the Primary audio stream and the Secondary audio stream.
- FIG. 17 is a diagram showing the EP_map set for the PG stream time axis.
- FIG. 18 is a diagram showing an EP_map set for the IG stream time axis.
- FIG. 19 is a diagram illustrating a data structure of PlayList information.
- FIG. 20 is a diagram showing a relationship between AVClip and PlayList information.
- FIG. 21 is a diagram showing an internal structure of PlayListMark information in PlayList information.
- FIG. 22 is a diagram showing designation of a chapter position by PlayListMark information in PlayList information.
- FIG. 23 shows a close-up of the internal structure of Subpath information.
- FIG. 24 is a diagram showing a correspondence relationship.
- FIG. 25 is a diagram collectively showing the EP_map and PlayItem time axes set for the MainClip, and the EP_map and SubPlayItem time axes set for the SubClip serving as the Primary audio stream and the Secondary audio stream.
- FIG. 26 is a diagram showing the EP_map and PlayItem time axes set for the MainClip and the EP_map and SubPlayItem time axes set for the SubClip serving as the PG stream and the IG stream.
- FIG. 27 is a diagram showing the correspondence relationship between SubPath_type of SubPlayItem, application_type, and EP_stream_type in a tabular format.
- FIG. 28 shows a virtual file system generated by the playback device 300.
- FIG. 29 is a diagram showing an internal configuration of a playback apparatus according to the present invention.
- FIG. 30 is a flowchart showing a jump-in reproduction processing procedure.
- FIG. 31 is a diagram schematically showing how a random access position is specified using the EP_map set as shown in FIG.
- FIG. 32 is a flowchart showing a processing procedure for converting coordinates TM on the MainClip and SubClip into addresses.
- FIG. 33 is a diagram illustrating a relationship between variables k and h and random access positions when SubClip is a primary audio stream and a secondary audio stream.
- FIG. 34 is a diagram showing the relationship between variables k, h and random access positions in the case where the SubClip is a PG stream and an IG stream.
- FIG. 35 shows an example of PiP playback.
- FIG. 36 (a) is a diagram showing a comparison between an HD image and an SD image.
- FIG. 37 is a diagram showing recorded contents of a local storage according to the second embodiment.
- FIG. 38 is a diagram showing an internal structure of Clip information recorded in a local storage in the second embodiment.
- FIG. 39 is a diagram showing the EP_map set for the Secondary Video stream in the same notation as FIG.
- FIG. 40 is a diagram showing PlayList information defining a synchronous application that constitutes static PiP playback.
- FIG. 41 is a diagram showing how the synchronization between MainClip as a primary video and SubClip as a secondary video is defined by PlayList information, in the same notation as in FIGS. 25 and 26.
- FIG. 42 (a) to (c) are diagrams showing applications premised on dynamic synchronization.
- FIG. 43 is a diagram showing an internal structure of PlayList information defining PiP playback based on dynamic synchronization.
- FIG. 44 is a diagram showing an internal configuration of a playback apparatus according to the second embodiment.
- FIG. 45 is a flowchart showing a processing procedure for performing PL playback.
- FIG. 46 is a diagram depicting random access to MainClip and random access to SubClip in the same notation as FIG.
- FIG. 47 (a) is a diagram showing playback control when realizing PiP playback by dynamic synchronization.
- FIG. 1 is a diagram showing a form of usage of a recording medium according to the present invention.
- the recording medium according to the present invention is a local storage 200.
- the local storage 200 is a hard disk built in the playback device 300.
- the local storage 200 is used for supplying movie works to a home theater system formed by a playback device 300, a remote controller 400, and a television 500 together with the BD-ROM 100.
- the local storage 200 is a hard disk that is incorporated in the playback device and used as a receptacle for content distributed from a movie distributor's server.
- the playback device 300 is a network-compatible digital home appliance and has a function of playing back the BD-ROM 100. It also has the ability to extend the BD-ROM 100 by downloading content from a movie distributor's server via the Internet and combining that content with the content recorded on the BD-ROM 100.
- the remote controller 400 accepts designation of a chapter to be reproduced and designation of a time to start reproduction.
- Television 500 displays the playback video of playback device 300.
- the recording medium according to the present invention is configured on the premise of such a combination with the BD-ROM 100.
- the above is the form of usage of the recording medium according to the present invention.
- FIG. 2 shows the internal structure of the BD-ROM.
- the BD-ROM is shown in the fourth row of this figure, and the tracks on the BD-ROM are shown in the third row.
- the track in this figure is drawn by stretching the track formed in a spiral shape from the inner periphery to the outer periphery of the BD-ROM in the horizontal direction.
- This track includes a lead-in area, a volume area, and a lead-out area.
- the volume area in this figure has a layer model of physical layer, file system layer, and application layer. If the application layer format (application format) of the BD-ROM is expressed using a directory structure, it becomes like the first row in the figure: the BD-ROM has a BDMV directory under the Root directory.
- under the BDMV directory, there are three subdirectories: a PLAYLIST directory, a CLIPINF directory, and a STREAM directory.
- the STREAM directory is a directory that stores a file group that is a main body of a digital stream, and a file (00001.m2ts) with an extension m2ts exists.
- the PLAYLIST directory contains a file (00001.mpls) with the extension mpls.
- the CLIPINF directory contains a file (00001.clpi) with the extension clpi.
- AVClip, Clip information, and PlayList information which are constituent elements of the BD-ROM, will be described.
- FIG. 3 is a diagram schematically showing how the file with the extension .m2ts is configured.
- files with the extension .m2ts (00001.m2ts, 00002.m2ts, 00003.m2ts) are AVClips.
- an AVClip (4th row) is formed by converting a video stream consisting of a plurality of video frames (pictures pj1, pj2, pj3) and an audio stream consisting of a plurality of audio frames (1st row) into PES packet sequences (2nd row), further converting these into TS packets (3rd row), and multiplexing them.
- an AVClip having a moving image is particularly called “MainClip”, and is distinguished from an AVClip having no moving image.
- FIG. 4 shows the process by which the TS packets that make up the MainClip are written to the BD-ROM.
- the TS packet that composes the MainClip is shown in the first row of this figure.
- the 188-byte TS packet that makes up the MainClip is given a 4-byte TS_extra_header (“EX” in the figure), as shown in the second row, making it 192 bytes long.
- the third level and the fourth level indicate the correspondence between the physical unit of the BD-ROM and the TS packet.
- TS packets with the TS_extra_header are grouped in units of 32 and written into three sectors.
- the 32 EX-attached TS packets stored in three sectors are called an “Aligned Unit”. When writing to a BD-ROM, encryption is performed in units of Aligned Units.
- each sector is provided with an error correction code in units of 32 to constitute an ECC block.
- if the playback device accesses the BD-ROM in units of Aligned Units, it can obtain 32 complete EX-attached TS packets. The above is the process of writing the MainClip to the BD-ROM.
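The byte arithmetic behind the Aligned Unit can be checked directly: 32 source packets of 192 bytes fill exactly three sectors. The 2048-byte sector size is an assumption based on common optical-disc practice; the text itself states only the 32-packet/3-sector grouping.

```python
TS_PACKET = 188            # MPEG-2 TS packet length in bytes
EX_HEADER = 4              # TS_extra_header ("EX") prepended to each packet
SOURCE_PACKET = TS_PACKET + EX_HEADER   # 192 bytes
SECTOR = 2048              # assumed sector size in bytes
PACKETS_PER_ALIGNED_UNIT = 32

aligned_unit = PACKETS_PER_ALIGNED_UNIT * SOURCE_PACKET   # 6144 bytes
assert aligned_unit == 3 * SECTOR   # 32 packets fit exactly in 3 sectors

def aligned_unit_of(source_packet_number):
    """Map a source packet number to its Aligned Unit and starting sector."""
    unit = source_packet_number // PACKETS_PER_ALIGNED_UNIT
    return unit, unit * 3   # Aligned Unit index, first sector of that unit
```

Because the grouping comes out even, a playback device can convert any source packet number to a sector address with integer arithmetic alone.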
- FIG. 5 is a diagram showing an internal configuration of a video stream used for a movie.
- the video stream in FIG. 5 consists of a plurality of pictures arranged in coding order.
- I, P, and B in the figure mean I picture, P picture, and B picture, respectively.
- I pictures are classified into IDR pictures and Non-IDR I pictures.
- Non-IDR I pictures, P pictures, and B pictures are compression-coded based on frame correlation with other pictures.
- a B picture is a picture made up of Bidirectionally Predictive (B) format slice data
- a P picture is a picture made up of Predictive (P) format slice data.
- B pictures are classified into reference B pictures and non-reference B pictures.
- Fig. 6 (a) shows the internal structure of the IDR picture.
- an IDR picture consists of multiple Intra format slice data.
- Figure 6 (b) shows the internal structure of the Non-IDR I picture.
- in contrast to the IDR picture, which consists only of Intra-format slice data, the Non-IDR I picture is composed of Intra slice data, P slice data, and B slice data.
- Figure 6 (c) shows the dependency in the Non-IDR I picture.
- since Non-IDR I pictures can be composed of P and B slice data, they can have dependency relationships with other pictures.
- FIG. 7 is a diagram illustrating a process of converting IDR pictures and Non-IDR I pictures into TS packets.
- the first row in the figure shows IDR pictures and Non-IDR I pictures.
- the second row shows the Access Unit defined in MPEG4-AVC.
- each of AUD, SPS, PPS, SEI, and the Access Unit is information defined in MPEG4-AVC and is described in various documents such as ITU-T Recommendation H.264; for details, see those references. The important point here is that AUD, SPS, PPS, and SEI are supplied to the playback device as a prerequisite for random access.
- the third row shows NAL units. The AUD, SPS, PPS, SEI, and slice data in the second row are each converted into a NAL unit by adding a header.
- NAL unit is a unit specified in the network abstraction layer of MPEG4- AVC.
- a plurality of NAL units obtained by converting one picture are converted into PES packets as shown in the fourth row. Then it is converted to a TS packet and recorded on the BD-ROM.
- the NAL units that make up the IDR picture or Non-IDR I picture located at the head of a GOP must be supplied to the decoder starting from the Access Unit Delimiter.
- the NAL unit including the Access Unit Delimiter is one index for decoding the IDR picture and the Non-IDR I picture.
- a NAL unit including this Access Unit Delimiter is handled as an entry point.
- the playback device interprets the NAL unit including the Access Unit Delimiter as an entry position for playing back the Non-IDR I picture and IDR picture. Therefore, in order to execute random access in the Main Clip, it is very important to know where the Access Unit Delimiter of the IDR picture and Non-IDR I picture exists.
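Locating the Access Unit Delimiter in practice means scanning a byte stream for NAL unit boundaries. The sketch below is illustrative only; it assumes the Annex B byte-stream format of H.264, where NAL units follow a 0x000001 start code and the Access Unit Delimiter has nal_unit_type 9.

```python
def find_aud_offsets(data: bytes):
    """Return byte offsets of Access Unit Delimiter NAL units.

    Assumes H.264 Annex B framing: each NAL unit follows a 00 00 01
    start code; the low 5 bits of the first NAL byte are nal_unit_type,
    and type 9 is the Access Unit Delimiter.
    """
    offsets = []
    i = 0
    while True:
        i = data.find(b"\x00\x00\x01", i)
        if i < 0 or i + 3 >= len(data):
            break
        nal_unit_type = data[i + 3] & 0x1F
        if nal_unit_type == 9:          # Access Unit Delimiter
            offsets.append(i)
        i += 3
    return offsets
```

An entry map spares the playback device exactly this kind of linear scan: the offsets such a scan would find are what the entry positions record ahead of time.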
- the above is the configuration of the MPEG4-AVC format video stream used for movies. <Configuration of BD-ROM #2: Clip information>
- FIG. 8 shows the internal structure of Clip information. As shown on the left side of this figure, Clip information includes Clip info().
- first, Clip info() is explained.
- the leader line ctl in the figure shows a close-up of the structure of Clip info().
- Clip info() includes “clip_stream_type”, which indicates the type of the digital stream, “application_type”, which indicates the type of application that uses this MainClip, and the recording rate. application_type set to “1” indicates that the corresponding MainClip constitutes a movie application.
- Clip info includes Ne entries EP_map_for_one_stream[0] to [Ne-1], and has Ne pieces of attribute information, one for each EP_map_for_one_stream.
- this attribute information includes stream_PID[0] to [Ne-1] of the corresponding elementary stream, EP_stream_type[0] to [Ne-1] indicating the type of the corresponding EP_map_for_one_stream, number_of_EP_High_entries[0] to [Ne-1] indicating the number of EP_Highs in the EP_map_for_one_stream, number_of_EP_Low_entries[0] to [Ne-1] indicating the number of EP_Lows, and EP_map_for_one_stream_PID_start_address[0] to [Ne-1] indicating the start address of the EP_map_for_one_stream.
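The per-stream attribute information can be pictured as a small record keyed by PID. The sketch below uses the field names listed above, but the record structure and the lookup function are illustrative assumptions, not the on-disc layout.

```python
from dataclasses import dataclass

@dataclass
class EPMapForOneStreamAttr:
    stream_PID: int                      # PID of the corresponding elementary stream
    EP_stream_type: int                  # type of the corresponding EP_map_for_one_stream
    number_of_EP_High_entries: int       # count of EP_Highs in the table
    number_of_EP_Low_entries: int        # count of EP_Lows in the table
    EP_map_for_one_stream_PID_start_address: int   # where the table begins

def select_ep_map(attrs, pid):
    """Find the attribute record for the stream with the given PID, if any."""
    for a in attrs:
        if a.stream_PID == pid:
            return a
    return None
```

A playback device would use such a lookup to reach the right EP_map_for_one_stream before resolving a time to an address.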
- the first row shows a plurality of pictures arranged in display order, and the second row shows the time axis for these pictures.
- the fourth row shows the TS packet sequence on the BD-ROM, and the third row shows the EP_map settings.
- FIG. 10 represents PTS_EP_start and SPN_EP_start of Entry Point #1 to Entry Point #7 in FIG. 9 as combinations of EP_Low and EP_High.
- EP_Low is shown on the left side of this figure, and EP_High on the right side.
- PTS_EP_Low of EP_Low(i) to (i+3) indicates the lower bits of t1 to t4.
- SPN_EP_Low of EP_Low(i) to (i+3) indicates the lower bits of n1 to n4.
- FIG. 10 also shows EP_High(0) to (Nc-1) in the EP_map.
- the common high-order bits are described in PTS_EP_High and SPN_EP_High.
- ref_to_EP_Low_id of the EP_High is set to indicate the first one (EP_Low(i)) among the EP_Lows corresponding to t1 to t4 and n1 to n4.
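Splitting each PTS_EP_start and SPN_EP_start into a shared high-order part and per-entry low-order parts keeps the table compact. The sketch below reconstructs full values and resolves a time to an address; the 16-bit width of the low fields and the tuple layouts are assumptions for illustration, since the text does not fix them.

```python
from bisect import bisect_right

LOW_BITS = 16   # assumed width of the EP_Low fields

def rebuild_entries(ep_highs, ep_lows):
    """Combine EP_High and EP_Low records into full (PTS, SPN) entry points.

    Each ep_high is (PTS_EP_High, SPN_EP_High, ref_to_EP_Low_id); the EP_Lows
    from that ref up to the next EP_High's ref share its high-order bits.
    Each ep_low is (PTS_EP_Low, SPN_EP_Low).
    """
    entries = []
    for h, (pts_hi, spn_hi, ref) in enumerate(ep_highs):
        end = ep_highs[h + 1][2] if h + 1 < len(ep_highs) else len(ep_lows)
        for pts_lo, spn_lo in ep_lows[ref:end]:
            entries.append(((pts_hi << LOW_BITS) | pts_lo,
                            (spn_hi << LOW_BITS) | spn_lo))
    return entries

def address_for_time(entries, pts):
    """Return the SPN of the last entry point at or before `pts`, if any."""
    times = [e[0] for e in entries]
    i = bisect_right(times, pts) - 1
    return entries[i][1] if i >= 0 else None
```

Because entry times are sorted, the time-to-address conversion used for random access reduces to a binary search over the reconstructed entries.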
- 00001.mpls is a file that stores PlayList information. Since the same type of file exists in the local storage 200, the PlayList information that exists in the local storage 200 will be explained, and the description of 00001.mpls on the BD-ROM is omitted.
- although the BD-ROM has a large capacity, it is difficult for the BD-ROM alone to play the full role of supplying a movie work to the home theater system shown in FIG. 1. The above is the description of the BD-ROM.
- FIG. 11 is a diagram showing the internal configuration of the local storage 200. As shown in the figure, the recording medium according to the present invention can be produced by improving the application layer.
- the local storage 200 is shown in the fourth level of the figure, and the tracks on the local storage 200 are shown in the third level.
- the track in this figure is drawn by extending the track formed in a spiral shape from the inner periphery to the outer periphery of the local storage 200 in the horizontal direction.
- This track consists of a lead-in area, a volume area, and a lead-out area.
- the volume area in this figure has a layer model of physical layer, file system layer, and application layer.
- AVClip, Clip information, and PlayList information, which are constituent elements of the local storage 200, will be described.
- AVClip (00002.m2ts, 00003.m2ts, 00004.m2ts, 00005.m2ts) on the local storage 200 constitutes a SubClip.
- a SubClip is an AVClip composed of one or more Out-of-MUX streams.
- an Out-of-MUX stream is an elementary stream that is played back during playback of an AVClip including a video stream, but is not multiplexed with that video stream.
- the framework of reading out an Out-of-MUX stream during playback of a video stream and supplying it for decoding is called the “Out-of-MUX stream framework”.
- such Out-of-MUX streams are classified into “Primary audio stream”, “Secondary audio stream”, “Presentation Graphics (PG) stream”, and “Interactive Graphics (IG) stream”.
- it is assumed that 00002.m2ts stores the Primary audio stream, 00003.m2ts stores the Secondary audio stream, 00004.m2ts stores the PG stream, and 00005.m2ts stores the IG stream.
- this storage method is only an example; the four Out-of-MUX streams may instead be multiplexed into one SubClip. The details of the Out-of-MUX streams are described below.
- the “Primary audio stream” is an audio stream that serves as the so-called main audio.
- the “Secondary audio stream” is an audio stream that is a so-called sub audio.
- the audio playback of the Secondary audio stream is mixed with the playback audio of the Primary audio stream and used for output.
- “commentary audio” is a typical example of audio handled as the Secondary audio stream.
- for example, if the main audio serving as the Primary audio stream is the dialogue and BGM of the main feature of the movie work, and the sub audio serving as the Secondary audio stream is the movie director's commentary, the dialogue and BGM of the main feature are output after being mixed with the commentary audio.
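After decoding, the mixing of Secondary over Primary audio described above amounts to sample-wise addition with clipping. The following is a minimal PCM sketch; real devices would apply downmix coefficients and mixing metadata not described here, so the `gain` parameter is a stand-in assumption.

```python
def mix_pcm(primary, secondary, gain=1.0):
    """Mix two decoded 16-bit PCM sample sequences, clipping to range.

    `primary` carries the main feature's dialogue/BGM, `secondary` the
    commentary; `gain` scales the commentary before mixing (assumed here;
    a real device derives this from mixing metadata).
    """
    out = []
    for i in range(max(len(primary), len(secondary))):
        a = primary[i] if i < len(primary) else 0
        b = secondary[i] if i < len(secondary) else 0
        s = int(a + gain * b)
        out.append(max(-32768, min(32767, s)))   # clip to the 16-bit range
    return out
```

The clipping step is what keeps loud dialogue plus loud commentary from wrapping around into audible distortion.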
- the secondary audio stream is recorded only in the local storage 200 and is not recorded on the BD-ROM.
- the Primary audio stream may be placed on the BD-ROM or the local storage 200.
- the primary audio stream encoding codec may be different from the secondary audio stream encoding codec.
- FIG. 12 is a diagram showing an internal configuration of the Primary audio stream and the Secondary audio stream.
- The first row in the figure shows the time axis (SubClip time axis) that is referenced during playback of the SubClip.
- the second row shows the TS packet sequence that makes up the SubClip.
- the third row shows the SubClip.
- the fourth row shows the audio frame sequences that make up the Primary audio stream and Secondary audio stream.
- The SubClip is generated by converting the frame sequences that become the Primary audio stream and Secondary audio stream into PES packets (third row), and further converting this PES packet sequence into a TS packet sequence (second row).
- The PTS present in the header of a PES packet indicates the start timing of the audio frame in that PES packet. Therefore, by referring to this PTS, it becomes clear at which point on the SubClip time axis the audio frame stored in the PES packet is reproduced; accordingly, the header of the PES packet is the target of stream analysis. <Out-of-MUX stream description 2: PG stream>
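The 33-bit PTS carried in a PES packet header can be sketched as below. The 5-byte field layout follows the standard MPEG-2 PES header encoding; the helper names are illustrative, not from the specification.

```python
# Sketch: packing/unpacking the 33-bit PTS in the 5-byte PES header field
# (standard MPEG-2 systems layout; helper names are illustrative).

def encode_pts(pts):
    """Pack a 33-bit PTS into the 5-byte PES header field."""
    return bytes([
        0x21 | (((pts >> 30) & 0x7) << 1),   # '0010' prefix, PTS[32:30], marker bit
        (pts >> 22) & 0xFF,                  # PTS[29:22]
        0x01 | (((pts >> 15) & 0x7F) << 1),  # PTS[21:15], marker bit
        (pts >> 7) & 0xFF,                   # PTS[14:7]
        0x01 | ((pts & 0x7F) << 1),          # PTS[6:0], marker bit
    ])

def decode_pts(b):
    """Recover the 33-bit PTS from the 5-byte field."""
    return (((b[0] >> 1) & 0x7) << 30) | (b[1] << 22) | \
           (((b[2] >> 1) & 0x7F) << 15) | (b[3] << 7) | ((b[4] >> 1) & 0x7F)

pts = decode_pts(encode_pts(90000))
print(pts, pts / 90000.0)   # 90 kHz ticks, i.e. 1 second on the SubClip time axis
```

Stream analysis, as described above, amounts to scanning for PES headers and decoding this field to locate audio frames on the SubClip time axis.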
- A PG stream is an elementary stream that realizes subtitle display as a moving image is played back.
- FIG. 13 shows the internal structure of the PG stream.
- the fourth row shows the PES packets that make up the SubClip, and the third row shows the TS packets that make up the SubClip.
- the second row shows the SubClip time axis, and the first row shows a composite image displayed by decoding and synthesizing the PG stream as the SubClip and the video stream as the MainClip.
- SubClip PES packets are created by adding a PES packet header to each of a series of functional segments: PCS (Presentation Control Segment), PDS (Palette Definition Segment), WDS (Window Definition Segment), ODS (Object Definition Segment), and END (END of Display Set Segment).
- PCS: Presentation Control Segment
- PDS: Palette Definition Segment
- WDS: Window Definition Segment
- ODS: Object Definition Segment
- END: END of Display Set Segment
- Page control includes Cut-In/Out, Fade-In/Out, Color Change, Scroll, and Wipe-In/Out. Since subtitle display is accompanied by page control via the PCS, it is possible to realize display effects such as displaying the next subtitle while gradually erasing the current one.
- END is a functional segment indicating the end of a set of functional segments for one subtitle display.
- the header of the PES packet includes time stamps such as PTS and DTS, and these time stamps indicate the timing of starting decoding of the functional segment and the timing of displaying the graphics based on the functional segment.
- A group of functional segments starting with the PCS and extending to the END is called a “Display Set”.
- the third level shows the TS packets obtained by converting these PES packets.
- the second level shows the time axis (SubClip time axis) that is referenced when playing a SubClip.
- The DTS of the PCS indicates the timing at which the PCS is decoded, and the PTS of the PCS indicates the timing at which graphics are displayed based on the Display Set starting with that PCS.
- a composite image as shown in the first row is displayed.
- Display Sets are of the types “Epoch Start”, “Acquisition Point”, “Normal Case”, and “Epoch Continue”.
- Epoch Start indicates the start of a new Epoch.
- An “Epoch” refers to one period having memory-management continuity on the playback time axis of the AVClip, and to the data group assigned to that period. An Epoch Start DS therefore includes all the functional segments necessary for the next screen composition, and is placed at positions where cueing is likely, such as the start of a chapter in a movie work.
- An Acquisition Point is a Display Set that includes all the functional segments necessary for the next screen composition, but is not at the start of an Epoch. If cueing is performed from an Acquisition Point DS, graphics display can be realized reliably; in other words, an Acquisition Point DS has the role of enabling screen composition from the middle of an Epoch.
- the Display Set as an Acquisition Point is incorporated at a position where it can be a cue point.
- A Normal Case DS includes only the difference from the preceding Display Set. For example, if a Display Set DSv has the same subtitle content as the preceding DSu but a different screen structure, DSv is provided with only a PCS and an END and becomes a Normal Case DS. This removes the need to provide duplicate ODSs, which can contribute to reducing the capacity consumed on the BD-ROM. On the other hand, because a Normal Case DS is only a difference, screen composition cannot be performed from the Normal Case DS alone.
- Epoch Continue indicates that an Epoch continues when playback of one AVClip is followed by playback of another AVClip. This completes the description of the functional segments that make up the PG stream.
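The random-access property of the four Display Set types can be summarized in a small sketch. This is an illustrative model of the rule stated above (only a Normal Case DS cannot serve as a cue point); the function name is an assumption.

```python
# Sketch: which Display Set types can serve as a cue (random-access) point.
# Per the text, a Normal Case DS holds only a difference from the preceding
# DS, so screen composition cannot start from it.

CUEABLE = {"Epoch Start", "Acquisition Point", "Epoch Continue"}

def can_cue_from(display_set_type):
    """True if playback may be cued from a Display Set of this type."""
    return display_set_type in CUEABLE

print(can_cue_from("Acquisition Point"))   # complete functional segments
print(can_cue_from("Normal Case"))         # difference only: cannot cue
```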
- The IG stream is an elementary stream that realizes menu display as video playback progresses.
- FIG. 14 is a diagram showing an internal configuration of the IG stream.
- the fourth row shows the PES packets that make up the SubClip, and the third row shows the TS packets that make up the SubClip.
- The second row shows the SubClip time axis, and the first row shows the composite image that is displayed by decoding and combining the IG stream, which is the Out-of-MUX stream, and the video stream, which is the MainClip.
- the SubClip PES packet structure at the fourth level will be described.
- The SubClip PES packets are created by adding a PES packet header to each of the functional segments ICS (Interactive Composition Segment), PDS (Palette Definition Segment), ODS (Object Definition Segment), and END (END of Display Set Segment).
- ICS: Interactive Composition Segment
- PDS: Palette Definition Segment
- ODS: Object Definition Segment
- END: END of Display Set Segment
- a PDS (Palette Definition Segment) is a functional segment that defines color development when drawing graphics data.
- the ICS is a functional segment that defines interactive control that changes the state of buttons according to user operations.
- END is a functional segment indicating the end of the functional segment set for displaying the menu display.
- The PES packet header includes time stamps such as the PTS and DTS; these time stamps indicate the timing for starting decoding of the functional segment and the timing for displaying graphics based on the functional segment.
- The group of functional segments from the ICS to the END is called a Display Set.
- this Display Set has types such as “Epoch Start”, “Acquisition Point”, “Normal Case”, and “Epoch Continue”.
- the third tier in FIG. 14 shows a TS packet obtained by converting these PES packets.
- the second level shows the time axis (SubClip time axis) that is referenced when playing a SubClip.
- The DTS of the ICS indicates the timing at which the ICS is decoded, and the PTS of the ICS indicates the timing at which graphics are displayed based on the Display Set starting from that ICS.
- the composite image shown in the first row is displayed.
- Display Set is a set of functional segments that realize one display of a menu.
- FIG. 15 shows the data structure of Clip information on the local storage 200 side.
- The data structure of the Clip information on the local storage 200 side is the same as that of the Clip information on the BD-ROM side. However, among these data structures, application_type, the EP_map configuration, and EP_stream_type are set to contents specific to the SubClip.
- application_type in Fig. 15 is explained. If SubClip is one of the primary audio stream, secondary audio stream, PG stream, or IG stream described above, application_type is set to 7.
- Out-of-MUX streams include the types Primary audio stream, Secondary audio stream, PG stream, and IG stream, and these differ in where playback can be started in the middle of the stream.
- The Primary audio stream and Secondary audio stream are composed of multiple audio frames, and playback can basically be started from the head of any of these audio frames.
- A Display Set consisting of complete functional segments is a Display Set other than the so-called “Normal Case”, that is, an “Epoch Start”, “Acquisition Point”, or “Epoch Continue” Display Set. The PCS or ICS located at the top of such a Display Set must be treated as the entry position.
- Because the positions from which decoding can be started differ among the Out-of-MUX stream types, the EP_map has a different structure depending on the corresponding Out-of-MUX stream. The EP_map for an Out-of-MUX stream is called an “Out_of_MUX_EP_map”.
- The EP_map for the Primary and Secondary audio streams has a time interval different from that of the EP_map for moving images: whereas the EP_map for moving images has precise entry points at time intervals of less than 1 second, the entry points for audio are at wide time intervals of 5 seconds.
- FIG. 16 is a diagram showing the EP_map generated for the Primary audio stream and the Secondary audio stream.
- In the EP_map structure shown in this figure, a corresponding entry position exists for each entry time at fixed intervals of 5 seconds.
- The third row in the figure shows the SubClip time axis, and t1 to t6 on this SubClip time axis are entry times. These entry times t1, t2, t3, t4, t5, and t6 exist at regular intervals of 5 seconds.
- The second row of this figure shows the EP_map.
- PTS_EP_start in the EP_map in the second row indicates these entry times.
- the first level shows a TS packet sequence constituting the Primary audio stream and the Secondary audio stream.
- SPN_EP_start in the second row is set so as to indicate n1 to n6. Since SPN_EP_start is set to correspond to PTS_EP_start at each entry point in the EP_map, the entry time every 5 seconds is associated with an entry position.
- The interval from the previous entry point may instead be a data interval of 256 Kbytes; this 256-Kbyte interval corresponds to the Secondary audio stream transfer rate multiplied by the 5-second time interval. Since the EP_map time interval is 5 seconds, the range that requires stream analysis is 5 seconds or less. This completes the explanation of the EP_map set for the Primary audio stream and the Secondary audio stream.
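An EP_map lookup of the kind described above can be sketched as follows. The class and the sample entry values are illustrative assumptions; only the 5-second spacing and the PTS_EP_start/SPN_EP_start pairing come from the text.

```python
# Sketch of an Out_of_MUX_EP_map lookup (names and sample values are
# illustrative): given entry times at fixed 5-second intervals, find the
# entry position (SPN_EP_start) from which at most 5 seconds of stream
# analysis reaches the target playback time.
import bisect

class EpMap:
    def __init__(self, entries):
        # entries: list of (PTS_EP_start in seconds, SPN_EP_start)
        self.times = [t for t, _ in entries]
        self.spns = [s for _, s in entries]

    def lookup(self, target_time):
        """Return (SPN_EP_start, seconds of stream to analyse after it)."""
        i = bisect.bisect_right(self.times, target_time) - 1
        return self.spns[i], target_time - self.times[i]

ep = EpMap([(0, 0), (5, 100), (10, 230), (15, 380)])   # 5-second intervals
print(ep.lookup(12))   # entry at t=10 -> SPN 230, analyse 2 s of stream
```

Because the entry times are 5 seconds apart, the analysis window returned by `lookup` is always under 5 seconds, matching the bound stated above.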
- FIG. 17 is a diagram showing the EP_map set for the PG stream time axis.
- the first row shows the TS packet sequence that makes up the PG stream
- the second row shows the EP_map
- FIG. 18 is a diagram showing the EP_map set for the IG stream time axis.
- the first row shows the TS packet sequence that makes up the IG stream
- the second row shows the EP_map
- The EP_map structure thus changes according to the qualitative differences of the corresponding Out-of-MUX stream.
- EP_stream_type in Clip information on the local storage 200 will be described.
- EP_stream_type indicates, for each Out-of-MUX stream multiplexed in one SubClip, which of the EP_map structures shown in FIGS. 16 to 18 is used.
- EP_stream_type[i] indicates the EP_map structure of the Out-of-MUX stream [i], and is set to one of 3, 4, 6, and 7.
- In the Out_of_MUX_EP_map, EP_stream_type clarifies at what interval the entry times exist, or what positions are designated as entry positions.
- If EP_stream_type is 3 or 4, it indicates that entry positions exist at fixed intervals of 5 seconds.
- In this case the range to be analyzed is at most 5 seconds: the playback device knows that it can reach the desired access point by analyzing at most a 5-second range of the Out-of-MUX stream.
- If EP_stream_type is 6 or 7, it indicates that the start points of Display Sets composed of complete functional segments are designated as entry positions.
- In this case the playback device reads functional segments from the position corresponding to the entry position and supplies them for playback without performing any stream analysis. In either case, subtitle display or menu display at a desired playback time can be realized.
- EP_stream_type thus prompts the playback device to determine whether stream analysis with an upper limit of 5 seconds is required, or whether stream analysis is unnecessary. Therefore, even when random access to the SubClip is required along with random access to the MainClip, no excessive burden is placed on the playback device, and the reduced burden improves the response to user operations.
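The decision the playback device makes from EP_stream_type can be sketched directly from the rules above. The function is illustrative; the value assignments (3/4 for audio, 6/7 for graphics) follow the text.

```python
# Sketch: how a playback device might use EP_stream_type.
# Per the text: 3/4 = audio EP_map with 5-second entries (bounded
# stream analysis); 6/7 = graphics EP_map whose entries are complete
# Display Sets (no stream analysis).

def needs_stream_analysis(ep_stream_type):
    if ep_stream_type in (3, 4):    # Primary / Secondary audio
        return True                 # analyse at most 5 seconds from the entry
    if ep_stream_type in (6, 7):    # PG / IG
        return False                # entry is a complete Display Set
    raise ValueError("unexpected EP_stream_type for a SubClip")

print(needs_stream_analysis(3), needs_stream_analysis(7))
```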
- the file with the extension “mpls” is a file storing PlayList (PL) information.
- PlayList information is information that defines a bundle of two types of playback paths called MainPath and Subpath as a Playlist (PL).
- FIG. 19 shows the data structure of PlayList information.
- PlayList information includes MainPath information (MainPath()) that defines the MainPath, PlayListMark information (PlayListMark()) that defines chapters, and Subpath information (SubPath()) that defines the Subpath; the bundle of MainPath and Subpath is defined as one Playlist (PL).
- MainPath is a playback path defined on the main AVClip.
- Subpath is a playback path defined on Sub Clip.
- MainPath is a playback path defined for the video stream that is the main video.
- The MainPath is defined by a plurality of pieces of PlayItem information ...PlayItem()..., as indicated by the arrow mp1.
- PlayItem information defines one or more logical playback sections that make up the MainPath.
- The structure of the PlayItem information is highlighted in close-up by the lead line hs1.
- The PlayItem information consists of “Clip_Information_file_name” indicating the file name of the Clip information of the AVClip to which the In point and Out point of the playback section belong, “Clip_codec_identifier” indicating the AVClip encoding method, time information “IN_time” indicating the start point of the playback section, and time information “OUT_time” indicating the end point of the playback section.
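The PlayItem fields just listed can be modelled as a small record. The field semantics follow the description; the Python types, the 90 kHz tick unit, and the `duration` helper are illustrative assumptions.

```python
from dataclasses import dataclass

# Illustrative model of the PlayItem information fields named in the text.
# Types, the 90 kHz time unit, and the duration() helper are assumptions.
@dataclass
class PlayItem:
    clip_information_file_name: str   # Clip information of the referenced AVClip
    clip_codec_identifier: str        # AVClip encoding method
    in_time: int                      # start of the playback section (ticks)
    out_time: int                     # end of the playback section (ticks)

    def duration(self):
        """Length of the playback section this PlayItem defines."""
        return self.out_time - self.in_time

pi = PlayItem("00001.clpi", "M2TS", 90000, 450000)
print(pi.duration() / 90000.0)   # playback-section length in seconds
```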
- FIG. 20 is a diagram showing the relationship between MainClip and PlayList information.
- The first row indicates the time axis of the PlayList information.
- Rows 2 to 5 show the video stream referenced via the EP_map (the same as shown in Fig. 5).
- The PlayList information includes two pieces of PlayItem information, #1 and #2, and two playback sections are defined by the In_time and Out_time of these pieces of PlayItem information #1 and #2.
- By arranging these playback sections, a time axis different from the AVClip time axis is defined; this is the PlayItem time axis shown in the first row. Defining PlayItem information thus makes it possible to define a time axis different from that of the AVClip.
- The above is the description of the PlayItem information according to the present embodiment. Next, PlayListMark information will be described.
- FIG. 21 shows the internal structure of PlayListMark information in PlayList information.
- the PlayListMark information is composed of a plurality of PLMark information (#l to #n).
- PLmark information (PLmark()) is information that designates an arbitrary point on the PL time axis as a chapter point.
- The PLmark information includes “ref_to_PlayItem_Id” indicating the PlayItem to be designated as a chapter, and “mark_time_stamp” indicating the chapter position within that PlayItem in time notation.
- FIG. 22 is a diagram showing designation of chapter positions by PLMark information in PlayList information.
- The second to fifth rows in this figure show the EP_map and AVClip shown in FIG.
- the first level in the figure shows PLMark information and the PL time axis.
- Arrows kt1 and kt2 indicate designation by the ref_to_PlayItem_Id of the PLMark information.
- The ref_to_PlayItem_Id of the PLMark information designates each piece of PlayItem information.
- Mark_time_stamp indicates the points in time of Chapter #1 and #2 on the PlayItem time axis. In this way, the PLMark information can define chapter points on the PlayItem time axis.
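Resolving a PLMark to a position on the PL time axis can be sketched as follows. The arithmetic (summing preceding PlayItem durations, then adding the offset of mark_time_stamp past In_time) is an illustrative assumption consistent with the description; the names follow the text.

```python
# Sketch: resolving a PLMark chapter to a point on the PL time axis.
# A PlayItem contributes (Out_time - In_time) to the PL time axis, so a
# chapter in PlayItem k starts at the summed durations of PlayItems 0..k-1
# plus (mark_time_stamp - In_time of PlayItem k). Illustrative assumption.

def chapter_on_pl_axis(play_items, ref_to_play_item_id, mark_time_stamp):
    """play_items: list of (In_time, Out_time) tuples in playback order."""
    offset = sum(out - inn for inn, out in play_items[:ref_to_play_item_id])
    in_time = play_items[ref_to_play_item_id][0]
    return offset + (mark_time_stamp - in_time)

# Two PlayItems of 10 s and 20 s (90 kHz ticks); chapter 5 s into the second:
items = [(0, 900000), (1800000, 3600000)]
print(chapter_on_pl_axis(items, 1, 1800000 + 450000) / 90000.0)
```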
- MainPath is the playback path defined for the main video MainClip
- Subpath is the playback path defined for the SubClip to be synchronized with MainPath.
- FIG. 23 is a diagram showing a close-up of the internal structure of the Subpath information.
- Each piece of Subpath information includes SubPath_type indicating the type of the SubClip, and one or more pieces of SubPlayItem information ...SubPlayItem()....
- As indicated by the arrow hc1 in the figure, the SubPlayItem information consists of “Clip_information_file_name”, “SubPlayItem_In_time”, “SubPlayItem_Out_time”, “sync_PlayItem_id”, and “sync_start_PTS_of_PlayItem”.
- “Clip_information_file_name” is information that uniquely specifies the SubClip corresponding to the SubPlayItem by describing the file name of its Clip information.
- SubPlayItem_In_time is information indicating the start point of the SubPlayItem on the playback time axis of the SubClip.
- SubPlayItem_Out_time is information indicating the end point of the SubPlayItem on the playback time axis of the SubClip.
- “sync_PlayItem_id” is information that uniquely designates the PlayItem, among those constituting the MainPath, with which this SubPlayItem should synchronize. SubPlayItem_In_time exists on the playback time axis of the PlayItem specified by this sync_PlayItem_id.
- sync_start_PTS_of_PlayItem indicates where, on the playback time axis of the PlayItem specified by sync_PlayItem_id, the start point of the SubPlayItem specified by SubPlayItem_In_time exists.
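The synchronization these fields establish can be sketched as a mapping between the two time axes. The linear mapping is an illustrative assumption consistent with the field descriptions; names follow the text, units are 90 kHz ticks by assumption.

```python
# Sketch: mapping a point on the SubClip time axis onto the PlayItem time
# axis. sync_start_PTS_of_PlayItem gives the point, on the time axis of the
# PlayItem named by sync_PlayItem_id, at which SubPlayItem_In_time begins.
# The linear offset arithmetic is an illustrative assumption.

def subclip_time_to_playitem_time(t, subplayitem_in_time, sync_start_pts):
    """Map time t on the SubClip time axis to the PlayItem time axis."""
    return sync_start_pts + (t - subplayitem_in_time)

# Commentary cut from 10 s into the SubClip, synchronized to start 30 s
# into the PlayItem: the SubClip point t = 12 s lands at 32 s.
t = subclip_time_to_playitem_time(12 * 90000, 10 * 90000, 30 * 90000)
print(t / 90000)
```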
- SubPath_type is set to a value between 0 and 255 to indicate what playback path the SubPath defined by the SubPath information is.
- SubPath_type is linked to the contents of the Clip information specified by Clip_information_file_name of the SubPlayItem information, more precisely to the application_type of its ClipInfo. SubPath_type can take a value between 0 and 255, but when the application_type of ClipInfo is set to 7, some value from 5 to 8 is set.
- When SubPath_type is set to 5, the Subpath information indicates that a Primary audio playback path is defined. This Primary audio playback path is defined for addition or substitution.
- When SubPath_type is set to 6, the Subpath information indicates that a Presentation Graphics playback path for addition/replacement is defined. What is added or replaced is a PG stream that can be added to, or replace, the PG stream playable with the PlayItem information.
- When SubPath_type is set to 7, the Subpath information indicates that an Interactive Graphics playback path for addition/replacement is defined. What is added or replaced is an IG stream that can be added to, or replace, the IG stream playable with the PlayItem information.
- When SubPath_type is set to 8, the Subpath information indicates that a Secondary audio playback path is defined. This Secondary audio playback path is defined for addition. What is added is Secondary audio that should be mixed with the playback audio of the Primary audio playable with the PlayItem information.
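The SubPath_type values 5 through 8 described above can be collected into a lookup table. The descriptions paraphrase the text; the dictionary itself and the function name are illustrative.

```python
# The SubPath_type values 5-8 described in the text, as a lookup table
# (descriptions paraphrased; the table/function names are illustrative).
SUBPATH_TYPES = {
    5: "Primary audio playback path (addition/substitution)",
    6: "Presentation Graphics playback path (addition/replacement)",
    7: "Interactive Graphics playback path (addition/replacement)",
    8: "Secondary audio playback path (addition; mixed with Primary audio)",
}

def describe_subpath(subpath_type):
    if subpath_type not in SUBPATH_TYPES:
        raise ValueError("application_type 7 implies SubPath_type 5 to 8")
    return SUBPATH_TYPES[subpath_type]

print(describe_subpath(8))
```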
- FIG. 24 is a diagram showing the correspondence between the SubClip on the local storage 200, the PlayList information on the local storage 200, and the MainClip on the BD-ROM.
- the first row shows the SubClip that exists on the local storage 200.
- SubClips on the local storage 200 are of the types Primary audio stream, Secondary audio stream, PG stream, and IG stream; any of these is used for synchronized playback as a SubPath.
- the second level shows two time axes defined by PlayList information.
- The lower time axis in the second row indicates the PlayItem time axis defined by the PlayItem information, and the upper time axis indicates the SubPlayItem time axis defined by the SubPlayItem.
- Clip_information_file_name in the SubPlayItem information plays the role of SubClip selection: it specifies which of the four SubClips in the first row is selected as the target for playback-section specification.
- SubPlayItem.In_time and SubPlayItem.Out_time play the role of defining the start point and end point of the playback section on the SubClip.
- The arrow Sync_PlayItem_Id plays the role of specifying which PlayItem to synchronize with, and sync_start_PTS_of_PlayItem plays the role of indicating the offset between the origin of the PlayItem time axis and the origin of the SubPlayItem time axis.
- FIG. 25 shows the EP_map and PlayItem time axis set for the MainClip, and the EP_map and SubPlayItem time axis set for the SubClip that is the Primary audio stream or Secondary audio stream.
- The middle row and the lower fourth to lower first rows in the figure show the PlayItem time axis, picture sequence, MainClip time axis, EP_map, and TS packet sequence shown in FIG.
- the upper first to third stages indicate the TS packet sequence, EP_map, and SubClip time axis shown in FIG.
- The upper fourth row shows the SubPlayItem time axis shown in FIG. It can be seen that entry times are set at 1-second intervals for the MainClip and at 5-second intervals for the SubClip.
- FIG. 26 is a diagram showing the EP_map and PlayItem time axis set for the MainClip, and the EP_map and SubPlayItem time axis set for the SubClip that becomes the PG stream or IG stream.
- The middle row and the lower fourth to lower first rows in the figure show the PlayItem time axis, picture sequence, MainClip time axis, EP_map, and TS packet sequence shown in FIG.
- the first to third tiers show the TS packet sequence, EP_map, and SubClip time axis shown in Fig. 16.
- The upper fourth row shows the SubPlayItem time axis shown in FIG.
- It can be seen that for the MainClip a 1-second time interval is specified, whereas for the SubClip the positions where Display Sets other than Normal Case exist are specified as entry positions.
- FIG. 27 is a diagram showing, in table format, the correlation among the values that SubPath_type can take, the values that application_type can take, and the values that EP_stream_type can take.
- When SubPath_type is set to "5", EP_stream_type is set to "3". Since EP_stream_type is set to "3", the playback device can understand that the EP_map corresponding to this EP_stream_type is for Primary audio playback and has a time interval of 5 seconds or a data interval of 256 Kbytes.
- When SubPath_type is set to "7", EP_stream_type is set to "7". Since EP_stream_type is set to "7", the playback device can understand that the EP_map corresponding to this EP_stream_type is for Interactive Graphics playback and that Display Sets consisting of complete functional segments are at the entry positions.
- When SubPath_type is set to "8", EP_stream_type is set to "4". Since EP_stream_type is set to "4", the playback device can understand that the EP_map corresponding to this EP_stream_type is for Secondary audio playback and has a time interval of 5 seconds or a data interval of 256 Kbytes.
- FIG. 28 is a diagram showing the virtual file system generated by the playback device 300. The upper left of this figure shows the recorded contents of the BD-ROM, the lower left shows the recorded contents of the local storage 200, and the right side shows the configuration of the virtual file system.
- the playback apparatus combines the AVClip, Clip information, and PlayList information that exist in the BD-ROM with the AVClip, Clip information, and PlayList information that exist in the local storage 200 to obtain a virtual file system.
- AVClips #2, #3, #4, and #5 (00002.M2TS, 00003.M2TS, 00004.M2TS, 00005.M2TS) on the local storage are added to the STREAM directory on the BD-ROM.
- FIG. 29 is a diagram showing the internal structure of the playback apparatus according to the present invention.
- the reproducing apparatus according to the present invention is industrially produced based on the internal configuration shown in the figure.
- the playback device according to the present invention is mainly composed of two parts, a system LSI and a drive device, and can be industrially produced by mounting these parts on the cabinet and substrate of the device.
- the system LSI is an integrated circuit in which various processing units that function as playback devices are integrated.
- The playback device produced in this way is composed of the components described below, beginning with the BD-ROM drive 1 and the Arrival Time Clock Counter 2.
- The BD-ROM drive 1 performs loading/ejection of the BD-ROM, executes access to the BD-ROM, and reads from the BD-ROM Aligned Units each consisting of 32 TS packets with TP_extra_header attached.
- The Arrival Time Clock Counter 2 generates an Arrival Time Clock based on a 27 MHz crystal oscillator (27 MHz X-tal).
- Arrival Time Clock is a clock signal that defines the time axis that is the reference for ATS assigned to TS packets.
- The Source De-packetizer 3 removes the TP_extra_header from each TS packet constituting an Aligned Unit and outputs only the TS packet to the PID filter 4.
- The output to the PID filter 4 by the Source De-packetizer 3 is made at the moment the time measured by the Arrival Time Clock Counter 2 reaches the ATS indicated in the TP_extra_header. Since the output to the PID filter 4 is made according to the ATS, TS packets are output to the PID filter 4 according to the current time of the Arrival Time Clock even if reading from the BD-ROM proceeds at different speeds such as 1x or 2x.
- The PID Filter 4 judges to which of the video stream, PG stream, IG stream, and Primary audio stream each TS packet belongs, and outputs it accordingly to one of Transport Buffer 5, Transport Buffer 12, Transport Buffer 20, and Transport Buffer 37.
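The PID filter's demultiplexing can be sketched as a routing table. The PID values are hypothetical; only the mapping of stream type to Transport Buffer number follows the text.

```python
# Sketch of the PID filter's demultiplexing: route each TS packet to the
# Transport Buffer of its elementary stream. The PID values below are
# hypothetical; the stream-to-buffer mapping follows the text.

BUFFER_FOR_STREAM = {
    "video": "Transport Buffer 5",
    "PG": "Transport Buffer 12",
    "IG": "Transport Buffer 20",
    "primary_audio": "Transport Buffer 37",
}

PID_TO_STREAM = {0x1011: "video", 0x1200: "PG",
                 0x1400: "IG", 0x1100: "primary_audio"}   # assumed PIDs

def route(ts_packet_pid):
    """Return the destination buffer for a TS packet, or None to drop it."""
    stream = PID_TO_STREAM.get(ts_packet_pid)
    return BUFFER_FOR_STREAM[stream] if stream else None

print(route(0x1011))   # video packets go to Transport Buffer 5
```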
- The Transport Buffer (TB) 5 is a buffer in which TS packets belonging to the video stream are temporarily stored when output from the PID filter 4.
- the Multiplexed Buffer (MB) 6 is a buffer for storing PES packets when outputting a video stream from the Transport Buffer 5 to the Elementary Buffer 7.
- the Coded Picture Buffer (CPB) 7 is a buffer that stores pictures in an encoded state (I picture, B picture, P picture).
- the video decoder 8 obtains a plurality of frame images by decoding individual frame images of the video elementary stream at predetermined decoding times (DTS), and writes them into the Decoded Picture Buffer 10.
- The Decoded Picture Buffer 10 is a buffer into which decoded pictures are written.
- the video plane 11 is a plane for storing uncompressed pictures.
- a plane is a memory area for storing pixel data for one screen in the playback device.
- The resolution of the video plane 11 is 1920 × 1080, and the picture data stored in the video plane 11 is composed of pixel data represented by 16-bit YUV values.
- The Transport Buffer (TB) 12 is a buffer in which TS packets belonging to the PG stream are temporarily stored when output from the PID filter 4.
- the Coded Data Buffer (CDB) 13 is a buffer that stores PES packets constituting the PG stream.
- The Stream Graphics Processor (SGP) 14 decodes ODSs and writes the uncompressed index-color graphics obtained by decoding into the Object Buffer 15 as graphics objects.
- Decoding by the Stream Graphics Processor 14 is performed instantaneously, and the graphics object obtained by decoding is held temporarily in the Stream Graphics Processor 14.
- Although decoding by the Stream Graphics Processor 14 can be done instantaneously, writing from the Stream Graphics Processor 14 to the Object Buffer 15 does not end instantaneously, because in the BD-ROM player model writing to the Object Buffer 15 is done at a transfer rate of 128 Mbps.
- The completion time of writing to the Object Buffer 15 is indicated in the PTS of the END segment, and processing of the next DS waits until the time indicated in that PTS has elapsed.
- Writing of the graphics object obtained by decoding each ODS starts at the DTS time associated with that ODS and ends by the decoding end time indicated in the PTS associated with that ODS.
- The Object Buffer 15 is a buffer in which graphics objects obtained by decoding of the Stream Graphics Processor 14 are placed. The Object Buffer 15 must be set to two or four times the size of the graphics plane 8, because to realize effects such as scrolling it is necessary to store graphics objects two or four times as large as the graphics plane 8.
- The Composition Buffer 16 is a memory in which PCSs and PDSs are arranged. When there are two Display Sets to be processed and the active periods of their PCSs overlap, the Composition Buffer 16 stores the multiple PCSs to be processed.
- The Graphics Controller 17 determines whether the composition_state of the PCS contained in a Display Set is Epoch Start, Acquisition Point, or Normal Case. If it is Epoch Start, the PCS on the Coded Data Buffer 13 is transferred from the Coded Data Buffer 13 to the Composition Buffer 16.
- the Presentation Graphics plane 18 is a memory having an area for one screen, and can store uncompressed graphics for one screen.
- The resolution of this plane is 1920 × 1080, and each pixel of the uncompressed graphics in the Presentation Graphics plane 18 is represented by an 8-bit index color.
- CLUT Color Lookup Table
- the CLUT unit 19 converts the index color in the uncompressed graphics stored in the Presentation Graphics plane 18 into Y, Cr, and Cb values.
- the Transport Buffer (TB) 20 is a buffer in which TS packets belonging to the IG stream are stored.
- the Coded Data Buffer (CDB) 21 is a buffer that stores the PES packets that make up the IG stream.
- the Stream Graphics Processor (SGP) 22 decodes the ODS, and writes the uncompressed graphics obtained by the decoding to the Object Buffer 23.
- the Object Buffer 23 is a buffer in which a number of uncompressed graphics objects obtained by decoding of the Stream Graphics Processor 22 are arranged.
- The rectangular area occupied by each graphics object in the Object Buffer 23 is identified by the object_id of the ODS. Therefore, if a graphics object with the same object_id is supplied while a graphics object already exists on the Object Buffer 23, the area occupied by the existing graphics object on the Object Buffer 23 is overwritten by the graphics object with the same object_id.
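The overwrite-by-identifier behaviour just described can be sketched as a buffer keyed by object_id. The class and sample values are illustrative; only the overwrite rule comes from the text.

```python
# Sketch: an Object Buffer keyed by object_id. Supplying a graphics object
# whose object_id already exists overwrites the existing area (per the
# text); the class and sample payloads are illustrative.

class ObjectBuffer:
    def __init__(self):
        self._areas = {}                       # object_id -> graphics object

    def store(self, object_id, graphics):
        self._areas[object_id] = graphics      # same id: area is overwritten

    def get(self, object_id):
        return self._areas.get(object_id)

buf = ObjectBuffer()
buf.store(1, "button bitmap v1")
buf.store(1, "button bitmap v2")               # overwrites the first object
print(buf.get(1))
```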
- The Composition Buffer 24 is a buffer for storing an interactive_composition that is transported by, and corresponds to, one or more ICSs.
- the stored Interactive_composition is used for decoding by the Graphics controller 25.
- The Graphics Controller 25 determines whether the composition_state of the ICS included in a Display Set is Epoch Start, Acquisition Point, or Normal Case. If it is Epoch Start, the new interactive_composition on the Coded Data Buffer 21 is transferred from the Coded Data Buffer 21 to the Composition Buffer 24.
- The Graphics Controller 25 checks the Page_Version_Number of each piece of page information belonging to the ICS against the Page_Version_Number of each piece of page information in the interactive_composition existing in the Composition Buffer 24. If page information with a higher Page_Version_Number exists in the Coded Data Buffer 21, that page information is transferred from the Coded Data Buffer 21 to the Composition Buffer 24, thereby updating the desired page information in the Composition Buffer 24. It is then determined whether the page corresponding to the updated page information is currently being displayed, and if it is being displayed, the corresponding page is redrawn.
- The Interactive Graphics plane 26 is a plane into which uncompressed graphics obtained by decoding by the Stream Graphics Processor (SGP) 22 are written.
- The resolution of this plane is 1920 × 1080, and each pixel of the uncompressed graphics in the Interactive Graphics plane 26 is represented by an 8-bit index color.
- the CLUT unit 27 converts the index colors in the uncompressed graphics stored in the Interactive Graphics plane 26 into Y, Cr, and Cb values.
- the combining unit 28 combines the uncompressed frame image stored in the video plane 11 and the uncompressed graphics object stored in the Presentation Graphics plane 18. By combining images, it is possible to obtain a combined image in which captions are superimposed on a moving image.
- the compositing unit 29 combines the uncompressed graphics object stored in the Interactive Graphics plane 26 with the composite image output from the compositing unit 28 (the composite of the uncompressed picture data and the contents of the Presentation Graphics plane 18).
- the switch 30 selectively supplies either the TS packet read from the BD-ROM or the TS packet read from the local storage 200 to the Transport Buffer 20.
- the Network Device 31 implements the communication function of the playback device, and establishes a TCP connection, FTP connection, etc., with the web site corresponding to the URL.
- the content downloaded from the website through the connection established by the Network Device 31 is stored in the Local Storage 200.
- the switch 32 selectively supplies either the TS packet read from the BD-ROM or the TS packet read from the local storage 200 to the Transport Buffer 12.
- the source de-packetizer 34 removes the TP_extra_header from each AVClip TS packet read from the local storage 200, and outputs only the TS packet to the PID filter 35.
- the output to the PID filter 35 by the source de-packetizer 34 is performed at the time when the value of the arrival time clock counter 33 reaches the ATS indicated in the TP_extra_header.
- the PID filter 35 outputs, from among the TS packets read from the local storage 200, those constituting the PG stream.
- the switch 36 supplies either the TS packet read from the BD-ROM or the TS packet read from the local storage 200 to the audio decoder 39 side.
- This TS packet constitutes the Primary audio stream.
- the Primary audio stream is supplied to the audio decoder 39 from either the BD-ROM or the local storage 200.
- the Transport Buffer (TB) 37 stores TS packets belonging to the Primary audio stream.
- the Elementary Buffer (EB) 38 is a buffer that stores PES packets constituting the Primary audio stream.
- the audio decoder 39 decodes the Primary audio stream in the PES packet state output from the Elementary Buffer 38 and outputs uncompressed audio data.
- the Transport Buffer (TB) 40 stores TS packets belonging to the Secondary audio stream.
- the Elementary Buffer (EB) 41 is a buffer that stores PES packets constituting the Secondary audio stream.
- the audio decoder 42 decodes the Secondary audio stream in the PES packet state output from the Elementary Buffer 41 and outputs uncompressed audio data.
- the mixer 43 mixes the uncompressed audio data obtained by decoding the Primary audio stream with the uncompressed audio data obtained by decoding the Secondary audio stream, and outputs the synthesized audio.
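The mixing performed by the mixer 43 can be sketched as follows. This is an illustrative model only (sample-wise addition of two PCM streams with clipping to the 16-bit range), not the device's actual implementation; the function name and sample format are assumptions.

```python
def mix_pcm(primary, secondary, sample_max=32767, sample_min=-32768):
    """Mix two uncompressed PCM sample sequences by addition,
    clipping each mixed sample to the signed 16-bit range
    (illustrative model of mixer 43, not the actual circuit)."""
    mixed = []
    for p, s in zip(primary, secondary):
        v = p + s
        # clip to avoid overflow when both streams are loud
        mixed.append(max(sample_min, min(sample_max, v)))
    return mixed

print(mix_pcm([1000, 30000, -20000], [500, 5000, -20000]))
# [1500, 32767, -32768]
```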
- the scenario memory 44 is a memory for storing current PlayList information and current Clip information.
- Current PlayList information refers to information currently being processed among a plurality of PlayList information recorded on a BD-ROM.
- Current clip information refers to the information that is currently being processed among multiple clip information recorded on the BD-ROM.
- the control unit 45 includes an instruction ROM and a CPU, and executes software stored in the instruction ROM to control the entire playback device. The contents of this control change dynamically according to user events that occur in response to user operations and the setting values of each PSR in the PSR set 49.
- the control unit 45 includes functional components such as a main conversion unit 46, a sub conversion unit 47, and a PL playback control unit 48.
- the main conversion unit 46 converts a playback time point on the PlayItem time axis into an address in the MainClip.
- the playback time point at which normal playback or special playback should be started is defined on the PlayItem time axis, and the main conversion unit 46 uses the EP_map in the Clip information corresponding to the MainClip to convert this playback time point into an address in the MainClip.
- the sub conversion unit 47 converts a playback time point on the PlayItem time axis into a playback time point on the SubPlayItem time axis, and converts the resulting playback time point on the SubPlayItem time axis into a SubClip address.
- the conversion from a playback point on the PlayItem time axis to a playback point on the SubPlayItem time axis by the sub conversion unit 47 is performed using Sync_PlayItem_Id and Sync_Start_PTS_of_PlayItem in the SubPlayItem information.
- conversion from a playback time point on the SubPlayItem time axis to a SubClip address consists of (i) the process of finding, from the multiple entry positions indicated in the EP_map, the entry position closest to the random access position, and (ii) the process of performing stream analysis using the obtained entry position as the starting point.
- the former process is performed using the EP_map associated with the SubClip. The latter stream analysis, which is necessary for random access to the primary audio stream and the secondary audio stream, can be omitted for random access to the PG stream and the IG stream. Whether stream analysis can be omitted is determined by referring to EP_stream_type, which indicates what kind of EP_map exists in the Clip information and whether random access is properly ensured.
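The decision described above can be sketched as a small lookup. This is a hedged illustration: the mapping of EP_stream_type values (3/4 for primary/secondary audio, 6 for PG, 7 for IG) is inferred from this section's text, and the function name is hypothetical.

```python
# Assumed mapping of EP_stream_type values, based on this section:
# 3/4 = primary/secondary audio (stream analysis needed),
# 6 = PG stream, 7 = IG stream (stream analysis can be omitted).
STREAM_ANALYSIS_REQUIRED = {3: True, 4: True, 6: False, 7: False}

def needs_stream_analysis(ep_stream_type):
    """Return True if random access must perform stream analysis
    after the EP_map lookup; raise for invalid types
    (cf. the No branch of step S4)."""
    if ep_stream_type not in STREAM_ANALYSIS_REQUIRED:
        raise ValueError("invalid EP_stream_type")
    return STREAM_ANALYSIS_REQUIRED[ep_stream_type]
```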
- the PL playback control unit 48 controls the entire playback device to perform PL playback.
- the PL playback is a control in which the MainClip in the BD-ROM and the SubClip in the local storage are played back in synchronization according to the Playltem information and SubPlayltem information in the PlayList information.
- EP_map is associated with both the MainClip and the SubClip, and high-speed random access to each AVClip is guaranteed. By applying this high-speed random access, "normal playback" and "special playback" are realized in PL playback.
- normal playback in PL playback is playback in which the MainClip and the SubClip are played along the PlayItem time axis and the SubPlayItem time axis, based on the PlayItem information constituting the MainPath information and the SubPlayItem information constituting the SubPath information in the PlayList information.
- special playback in PL playback is playback in which fast forward, rewind, chapter search, and time search are executed on the MainClip and the SubClip, based on the PlayItem information constituting the MainPath information and the SubPlayItem information constituting the SubPath information in the PlayList information.
- the PSR set 49 is a set of non-volatile registers built into the playback device, consisting of 64 Player Status Registers (PSR(1) to PSR(64)) and 4096 General Purpose Registers (GPR). The Player Status Registers indicate the status of the playback device, such as the current playback point. Among PSR(1) to PSR(64), PSR(5) to PSR(8) indicate the current playback point. PSR(5) is set to a value between 1 and 999 to indicate the chapter number to which the current playback point belongs; when set to 0xFFFF, it indicates that the chapter number is invalid in the playback device.
- PSR (6) is set to a value between 0 and 999 to indicate the number of the PlayList (current PlayList) to which the current playback point belongs.
- PSR (7) is set to a value between 0 and 255 to indicate the number of the Play Item (hereinafter referred to as current PI) to which the current playback point belongs.
- PSR(8) is set to a value between 0 and 0xFFFFFF to indicate the current playback point (current PTM) with a time accuracy of 45 kHz.
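The semantics of PSR(5) to PSR(8) described above can be modeled in a few lines. This is an illustrative sketch only; the class, method, and field names are hypothetical, and only the register meanings stated in the text are assumed.

```python
class PSRSet:
    """Minimal model of the current-position semantics of PSR set 49:
    PSR(5) = chapter (0xFFFF means invalid), PSR(6) = PlayList number,
    PSR(7) = PlayItem number, PSR(8) = current PTM at 45 kHz accuracy."""
    def __init__(self):
        self.psr = {n: 0 for n in range(1, 65)}  # 64 Player Status Registers

    def current_position(self):
        return {
            "chapter": None if self.psr[5] == 0xFFFF else self.psr[5],
            "playlist": self.psr[6],
            "play_item": self.psr[7],
            "ptm_45khz": self.psr[8],
        }

p = PSRSet()
p.psr[5] = 0xFFFF       # chapter number invalid
p.psr[8] = 45000        # one second into playback at 45 kHz
print(p.current_position())
```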
- a program for causing a computer to execute the playback procedure shown in FIG. 30 may be created.
- both normal playback and special playback in PL playback are premised on jump-in playback, that is, starting playback from an arbitrary coordinate on the PlayItem time axis, so the playback device must be instructed to perform control for realizing this jump-in playback.
- FIG. 30 is a flowchart showing the control procedure for performing jump-in playback from an arbitrary coordinate on the PlayItem time axis.
- Offset α, representing the coordinate on the PlayItem time axis, is calculated (step S1), and Offset α is converted into a coordinate (In_time + Offset α) on the MainClip time axis (step S2). Once the coordinate on the MainClip time axis is obtained in this way, the coordinate (In_time + Offset α) on the MainClip time axis is converted into the address α using the EP_map of the MainClip (step S3).
- if EP_stream_type is a value other than 3, 4, 6, or 7 (No in step S4), EP_stream_type is regarded as invalid. If EP_stream_type is invalid, it is unknown at what intervals entry positions and entry times exist in the Out-of-MUX stream, so specifying the access position can be expected to take a long time. Random access to the SubClip would then be very time consuming, and if the SubClip were played back in synchronization with the MainClip, playback of the MainClip would be significantly delayed. To avoid this, random access on the SubClip side is abandoned in advance, and only the MainClip is read out and played (step S9). By limiting the target of random access to the MainClip in this way, a significant processing delay during random access can be avoided.
- FIG. 31 schematically shows how the random access position is specified using the EP_map set as shown in FIG. 25.
- the procedure for specifying the random access position shown in FIG. 31 will now be described. Note that FIG. 31 is drawn assuming that the designated jump position happens to be designated as an entry position in both the MainClip and the SubClip. If the specified jump position is the position of Offset α on the PlayItem time axis, then, since the origin of the PlayItem time axis corresponds to the In_time on the MainClip time axis, the jump position for the MainClip is In_time + Offset α; this position is converted into an SPN, and the SPN is converted into a number of sectors.
- the Offset α on the PlayItem time axis corresponds to the position Sync_Start_PTS_of_PlayItem + Offset β on the SubPlayItem time axis. This is because the origin of the SubPlayItem time axis lies at the position separated by Sync_Start_PTS_of_PlayItem from the origin of the PlayItem time axis. Once Offset β is calculated in this way, the coordinate of the jump position on the SubClip time axis can be calculated.
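The two time-axis conversions above reduce to simple arithmetic: the MainClip jump time is In_time + Offset α, and since α = Sync_Start_PTS_of_PlayItem + β, the SubPlayItem-side offset is β = α − Sync_Start_PTS_of_PlayItem. The following sketch (hypothetical function names; 45 kHz clock values assumed for the example) illustrates this.

```python
def mainclip_jump_time(in_time, offset_alpha):
    """Jump position on the MainClip time axis: In_time + Offset alpha."""
    return in_time + offset_alpha

def subclip_offset_beta(offset_alpha, sync_start_pts_of_playitem):
    """The SubPlayItem time-axis origin lies Sync_Start_PTS_of_PlayItem
    after the PlayItem origin, so
    Offset beta = Offset alpha - Sync_Start_PTS_of_PlayItem."""
    return offset_alpha - sync_start_pts_of_playitem

# Example with 45 kHz clock values (2 s In_time, 10 s offset, 5 s sync start):
print(mainclip_jump_time(90000, 450000))        # 540000
print(subclip_offset_beta(450000, 225000))      # 225000
```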
- FIG. 32 is a flowchart showing a processing procedure for converting the coordinates TM on the MainClip and SubClip into addresses.
- the time widths indicated by the PTS_EP_High of each EP_High are added together, and it is determined at which EP_High_id the running total of the time widths exceeds In_time (step S11).
- the time width indicated by PTS_EP_High is a unit of time having PTS_EP_High as its upper bits. If the running total of the time widths exceeds In_time at the k-th EP_High_id, this k is stored (step S12).
- the entry point closest to In_time is specified by the combination of k-1 and h-1 obtained in this way (h being the corresponding index found on the EP_Low side).
- the SPN at the position corresponding to the coordinate TM is calculated from the combination of SPN_EP_High and SPN_EP_Low at the entry point thus obtained, and the SPN is converted into a sector address (step S15).
- SPN is the serial number of a TS packet.
- TS packets are grouped into one Aligned Unit every 32 packets and recorded in 3 sectors, so by dividing the SPN by 32 and multiplying the quotient by 3, the sector address of the Aligned Unit closest to the SPN can be obtained. Since the sector address thus obtained is a relative sector number from the beginning of the AVClip file, the sector corresponding to the entry point can be specified by setting this relative sector number in the file pointer.
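A simplified model of the time-to-address conversion in steps S11 to S15 is sketched below. As an assumption for illustration, the coarse/fine EP_High/EP_Low split is collapsed into one flat, sorted entry list; the SPN-to-sector step uses the relationship stated above (32 TS packets = 1 Aligned Unit = 3 sectors). All names are hypothetical.

```python
import bisect

def spn_to_sector(spn):
    """32 TS packets form one Aligned Unit recorded in 3 sectors,
    so the sector address of the Aligned Unit containing the SPN is
    (SPN // 32) * 3."""
    return (spn // 32) * 3

def nearest_entry(ep_map, target_pts):
    """ep_map: sorted list of (entry_time_pts, spn) pairs.
    Returns the entry at or immediately before target_pts
    (playing the role of the k-1 / h-1 combination in the flowchart)."""
    times = [pts for pts, _ in ep_map]
    i = bisect.bisect_right(times, target_pts) - 1
    return ep_map[max(i, 0)]

ep_map = [(0, 0), (45000, 640), (90000, 1280)]
pts, spn = nearest_entry(ep_map, 60000)
print(pts, spn, spn_to_sector(spn))  # 45000 640 60
```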
- step S16 determines whether the EP_stream_type in the EP_map of the SubClip is 3, 4, 6, or 7. If EP_stream_type is 6 (PG stream) or 7 (IG stream), the head (PCS, ICS) of a Display Set other than Normal Case is considered to exist at the entry point closest to the coordinate TM. Since such a Display Set has all the functional segments necessary for screen composition, the address obtained by the conversion in step S15 is set as the address β, which is the access position (step S17).
- the address of the PES packet to be used is set as the access position, address β (step S20).
- once the address β is specified in this way, reading and playback from the address β are performed, and jump-in playback is realized.
- the processing in this flowchart will be specifically described with reference to FIGS. 33 and 34.
- FIG. 33 is a diagram showing the relationship between the variables k and h and the random access position when the SubClip is a primary audio stream or a secondary audio stream.
- the first row of the figure shows the PTS_EP_High values constituting the EP_map, and the second row shows the PTS_EP_Low values constituting the EP_map.
- the third row shows the TS packet sequence.
- k indicates the minimum PTS_EP_High exceeding the random access position
- h indicates the minimum PTS_EP_Low exceeding the random access position.
- FIG. 34 is a diagram showing the relationship between the variables k and h and the random access position in the case where the SubClip is a PG stream or an IG stream.
- the first row of this figure shows the PTS_EP_High values constituting the EP_map, and the second row shows the PTS_EP_Low values constituting the EP_map.
- the third row shows the TS packet sequence.
- k indicates the minimum PTS_EP_High exceeding the random access position
- h indicates the minimum PTS_EP_Low exceeding the random access position.
- since EP_stream_type is provided in the EP_map corresponding to the SubClip, the playback apparatus can know whether the EP_map of the Out-of-MUX stream designates entry positions at predetermined time intervals or designates the addresses of independently playable units. Because the characteristics of the EP_map on the Out-of-MUX stream side are known, the playback apparatus can immediately determine whether high-speed random access is possible. Even when the MainClip and the SubClip are synchronized, there is no drop in response, so jump-in playback of a playlist consisting of MainPath + SubPath can be achieved with the same level of response as jump-in playback of the MainPath alone.
- the present invention relates to improvements when realizing Picture in Picture (PiP) playback.
- the MainClip that configures a moving image is specified by the MainPath information in the PlayList information
- the SubClip that configures another moving image is specified by the SubPlayltem information in the PlayList information.
- in PiP playback, the former moving image (Primary Video) and the latter moving image (Secondary Video) are displayed on the same screen.
- FIG. 35 is a diagram showing an example of PiP playback.
- Primary Video is a playback image of an HD image
- Secondary Video is an SD image.
- FIG. 36 (a) is a diagram showing a comparison between an HD image and an SD image.
- HD images have a resolution of 1920 x 1080 and, like film material, have a frame interval of 3750 clocks (or 3753 to 3754 clocks).
- SD images have a resolution of 720 x 480 and have a display interval of 1501 clocks, similar to NTSC material, or a frame interval of 1800 clocks, similar to PAL material.
- the resolution of the SD image is about 1/4 of the resolution of the HD image, so when the Primary Video that is an HD image and the Secondary Video that is an SD image are displayed on the same screen, Secondary Video is about 1/4 the size of Primary Video.
- FIG. 36(b) is a diagram showing how Secondary Video is enlarged or reduced. Secondary Video is scaled according to a Scaling Factor. This Scaling Factor is given as a magnification of 1/4x, 1/2x, 1.5x, or 2x. The playback device enlarges or reduces the Secondary Video in the vertical direction according to this Scaling Factor. In the horizontal direction, enlargement or reduction is performed so that the original aspect ratio of the SD image is maintained.
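The scaling described above can be sketched as follows. As a simplifying assumption, applying the same factor to both dimensions is used to model "maintaining the original aspect ratio"; the function name is hypothetical.

```python
def scaled_size(src_w, src_h, factor):
    """Scale Secondary Video vertically by the Scaling Factor, and
    horizontally by the same factor so the original SD aspect ratio
    is maintained (illustrative model only)."""
    out_h = round(src_h * factor)
    out_w = round(src_w * factor)
    return out_w, out_h

# The four Scaling Factors named in the text, applied to a 720x480 SD image:
for f in (0.25, 0.5, 1.5, 2.0):
    print(f, scaled_size(720, 480, f))
```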
- the AVClip recorded on the BD-ROM constitutes the Primary Video described above. Since the Primary Video is large in size, it is desirable to distribute it on a large-capacity medium.
- the SubClip constituting the Secondary Video, and the PlayList information that defines the synchronization between the Primary Video and the Secondary Video, are transmitted to the playback device through a network and recorded in the local storage.
- FIG. 37 is a diagram showing recorded contents of the local storage according to the second embodiment.
- this figure shows the local storage contents in the same notation as FIG. 11. The differences from FIG. 11 are that the SubClip (00002.m2ts) recorded in the local storage is a Secondary Video stream, and that the Clip information (00002.clpi) has an EP_map for this Secondary Video stream.
- FIG. 38 is a diagram showing an internal configuration of Clip information recorded in the local storage in the second embodiment.
- this Clip information is Clip information for the Secondary Video stream. Lead lines cu2, cu3, cu4, and cu5 in the figure show close-ups of the internal structure of the EP_map in this Clip information.
- the EP_map shown by these lead lines has the same configuration as that shown in FIG. 8; the EP_map for Secondary Video uses the beginning of each access unit (GOP) making up the Secondary Video stream as an entry position, associated with an entry time. Even though this is Secondary Video, the EP_map is intended for moving images, so the time interval between entry times is less than 1 second, and EP_stream_type is set to "1: Video Type" as in FIG. 8.
- FIG. 39 is a diagram showing the EP_map set for the Secondary Video stream in the same notation as FIG.
- PiP playback includes playback configured with static synchronization and playback configured with dynamic synchronization.
- a PiP playback application consisting of a Primary Video that is the main feature of a movie and a Secondary Video that is a commentary video is composed using the former, static PiP playback.
- FIG. 40 is a diagram showing PlayList information defining static PiP playback.
- the PlayList information that defines PiP playback by static synchronization can define multiple SubPath information within it (Multi-SubPath), and multiple SubPlayItem information can be defined (Multi-SubPlayItem).
- the SubPlayItem information in this figure has new information elements called PiP_Position and PiP_Size. Each information element is set as follows.
- in Clip_information_file_name of the SubPlayItem information, the file name of the AVClip to be used as the Secondary Video is described.
- Sub_PlayItem_In_time indicates the playback time point that is the start point of the SubPlayItem on the time axis of the SubClip that is the Secondary Video.
- Sub_PlayItem_Out_time indicates the playback time point that is the end point of the SubPlayItem on the time axis of the SubClip that is the Secondary Video.
- Sync_Start_Pts_of_PlayItem is time information indicating, when the PlayItem specified by Sync_PlayItem_Id is played, after how much time from the start of PlayItem playback the playback of the playback section (SubPlayItem) specified by the SubPlayItem information should start.
- PiP_Position indicates the position where the secondary video playback video should be placed using the X and Y coordinates on the screen plane for primary video playback.
- PiP_Size indicates the vertical size and horizontal size of the secondary video playback video.
- FIG. 41 is a diagram showing how the synchronization between the MainClip that is the Primary Video and the SubClip that is the Secondary Video is defined by the PlayList information in the same notation as in FIGS. 25 and 26.
- the SubClip, which is the Secondary Video, is provided with an EP_map, and high-speed random access with a time accuracy of less than 1 second is guaranteed.
- the PlayList information defines the synchronization between the MainClip that is the Primary Video and the SubClip that is the Secondary Video.
- since it is natural for the Secondary Video to be fast-forwarded and rewound as well, an EP_map is set in the SubClip, so that when the Primary Video is fast-forwarded or rewound, the Secondary Video is fast-forwarded or rewound along with it.
- dynamic synchronization means that the point on the PlayItem time axis corresponding to the Primary Video stream (MainClip) at which playback of the SubPlayItem corresponding to the Secondary Video stream (SubClip) starts changes dynamically.
- the application image envisioned by the creator is as follows.
- Figures 42 (a) to 42 (c) are diagrams showing applications based on dynamic synchronization.
- the video content of the Primary Video in this application is as shown in FIG. 42(a).
- each button shown in FIG. 42(a) is a button with a thumbnail image of a moving image (a thumbnail button) and has three states: a normal state, a selected state, and an active state. When any of these thumbnail buttons enters the selected state (FIG. 42(b)), the playback video of the Secondary Video is inserted and played back in place of the selected thumbnail (FIG. 42(c)). This is the application concept envisioned by the creator.
- the starting point of Secondary Video playback thus varies depending on the user's operation at the time of playback; this is what is meant by "dynamic synchronization".
- an operation for setting any thumbnail to the selected state (in this case, a button selection operation) is referred to as a lock operation.
- FIG. 43 shows the internal structure of PlayList information that defines PiP playback based on dynamic synchronization.
- PlayList information that defines PiP playback by dynamic synchronization can define a plurality of SubPath information (Multi-SubPath). However, within each SubPath information, only one SubPlayItem information can be defined (Single-SubPlayItem).
- the SubPlayItem information in this figure has information elements such as PiP_Position and PiP_Size, as in FIG. 40. The settings of Clip_information_file_name, Sub_PlayItem_In_time, Sub_PlayItem_Out_time, Sync_PlayItem_Id, PiP_Position, and PiP_Size in the SubPlayItem information are the same as in FIG. 40. The difference is the setting of Sync_Start_Pts_of_PlayItem.
- Sync_Start_Pts_of_PlayItem is set to an indefinite value. This indefinite value indicates that the point at which the lock operation is performed by the user, on the time axis of the PlayItem specified by Sync_PlayItem_Id, is determined as the synchronization point with that PlayItem.
- when the playback device refers to the PlayList information in FIG. 43 and a thumbnail button changes from the selected state to the active state, the playback device writes the elapsed playback time at which the lock operation was performed into Sync_Start_PTS_of_PlayItem; Secondary Video playback then starts from the point at which the thumbnail became active. Since this procedure should be performed only when a thumbnail button is selected and activated, that is, it is specific to a thumbnail button, it is desirable to describe it using a navigation command that defines the control specific to the thumbnail button.
- FIG. 44 is a diagram showing the internal configuration of the playback apparatus according to the second embodiment. Due to space limitations, components related to the audio decoder are omitted from this figure.
- this diagram is based on the configuration diagram of the playback device shown in FIG. 29, and the same reference numerals are assigned to common components. Among these common components, the Transport Buffer 5, Multiplexed Buffer 6, Coded Picture Buffer 7, Video Decoder 8, Decoded Picture Buffer 10, and Primary Video plane 11 fulfill the role of decoding the Primary Video stream in the second embodiment.
- in addition, the playback apparatus of FIG. 44 includes, as hardware for decoding the Secondary Video stream, a Transport Buffer 51, a Multiplexed Buffer 52, a Coded Picture Buffer 53, a Video Decoder 54, a Decoded Picture Buffer 55, a Secondary Video plane 56, a scaler 57, and a combining unit 58. These newly added components are explained below.
- the Transport Buffer (TB) 51 is a buffer in which TS packets belonging to the Secondary Video stream (SubClip) are accumulated when output from the PID filter 35.
- the Multiplexed Buffer (MB) 52 is a buffer for storing PES packets when outputting the Secondary Video stream from the Transport Buffer 51 to the Coded Picture Buffer 53.
- the Coded Picture Buffer (CPB) 53 is a buffer that stores pictures (I pictures, B pictures, and P pictures) in an encoded state.
- the video decoder 54 decodes each frame image of the Secondary Video stream at its predetermined decoding time (DTS) to obtain a plurality of frame images, and writes them into the Decoded Picture Buffer 55.
- the Decoded Picture Buffer 55 is a buffer in which a decoded picture is written.
- the Secondary Video plane 56 stores the playback video obtained by decoding the Secondary Video.
- the scaler 57 enlarges or reduces the playback video obtained on the Secondary Video plane 56 based on the vertical and horizontal sizes indicated by PiP_Size of the SubPlayItem information.
- the combining unit 58 combines the enlarged or reduced playback video generated by the scaler 57 with the playback video obtained by the video decoder, thereby realizing PiP playback.
- the playback video of the Primary Video and the playback video of the Secondary Video are combined by the combining unit 58 in accordance with the PiP_Position defined in the SubPlayItem information.
- a composite video in which the primary video playback video and the secondary video playback video are combined is played back.
- chroma composition, layer composition, and the like are possible in this combination. It is also possible to remove the background in the Secondary Video, extract only the person portion, and then combine it with the playback video of the Primary Video.
- the PID filter 35 supplies the TS packets constituting the Secondary Video stream to the decoding path from the Transport Buffer 51 to the Secondary Video plane 56.
- the above are the hardware components of the playback device.
- in addition, a Sync setting unit 50 is provided as a functional component.
- the Sync setting unit 50 determines whether Sync_Start_PTS_of_PlayItem in the SubPlayItem information is an indefinite value; if it is an indefinite value, it plays only the playback section specified by the PlayItem information out of the MainClip, and accepts an operation that determines the start point of the synchronization section. This acceptance is made via a remote controller.
- the indefinite value in Sync_Start_PTS_of_PlayItem of the SubPlayItem information is then overwritten with time information indicating the time of the lock operation.
- if the lock operation is realized by an operation of selecting a button appearing on the playback video of the MainClip, the time at which the button selection operation is performed is set as the time of the lock operation.
- once Sync_Start_PTS_of_PlayItem is set in this way, the PL playback control unit 48 performs playback control so as to play the PlayItem set for the Primary Video and the SubPlayItem set for the Secondary Video. As a result, PiP playback by dynamic synchronization is realized.
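The behavior of the Sync setting unit 50 described above can be sketched as follows. This is an illustrative model only: the class and field names are hypothetical, and the sentinel 0xFFFFFFFF standing for the "indefinite value" is an assumption for the sketch.

```python
UNDEFINED = 0xFFFFFFFF  # assumed sentinel for the "indefinite value"

class SyncSettingUnit:
    """Sketch of the Sync setting unit 50: if Sync_Start_PTS_of_PlayItem
    is the indefinite value, the lock-operation time overwrites it."""
    def __init__(self, sub_play_item):
        self.spi = sub_play_item

    def needs_lock(self):
        return self.spi["Sync_Start_PTS_of_PlayItem"] == UNDEFINED

    def on_lock_operation(self, current_ptm):
        # overwrite the indefinite value with the lock-operation time
        if self.needs_lock():
            self.spi["Sync_Start_PTS_of_PlayItem"] = current_ptm

spi = {"Sync_Start_PTS_of_PlayItem": UNDEFINED}
unit = SyncSettingUnit(spi)
unit.on_lock_operation(123450)
print(spi["Sync_Start_PTS_of_PlayItem"])  # 123450
```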
- likewise, by performing control to play the PlayItem set for the Primary Video and the SubPlayItem set for the Secondary Video, the PL playback control unit 48 can realize PiP playback with static synchronization.
- FIG. 45 is a flowchart showing the control procedure for performing jump-in playback from an arbitrary coordinate on the PlayItem time axis. This flowchart is created based on the flowchart of FIG. 30, and is the same as FIG. 30 except that steps S4 to S8 are replaced by steps S31 to S35.
- in step S31, it is determined whether or not the EP_stream_type in the EP_map of the SubClip is Video Type.
- if the EP_stream_type in the EP_map of the SubClip is not Video Type (=1), steps S4 to S8 in FIG. 30 are executed.
- FIG. 46 is a diagram depicting random access to MainClip and random access to SubClip in the same notation as FIG.
- the SubClip, which is the Secondary Video, has entry positions specified at time intervals of less than 1 second, and these are used for random access to the SubClip. Random access to the SubClip is performed at the same processing speed as random access to the MainClip, so jump-in playback of the MainClip and jump-in playback of the SubClip can be synchronized.
- the program according to the present invention can be created as follows. First, the software developer uses a programming language to write a source program that implements each flowchart and the functional components, using class structures, variables, array variables, and external function calls in accordance with the syntax of the programming language.
- the described source program is given to the compiler as a file.
- the compiler translates these source programs to generate an object program.
- Translation by the compiler consists of processes such as syntax analysis, optimization, resource allocation, and code generation.
- in syntax analysis, lexical analysis, parsing, and semantic analysis of the source program are performed, and the source program is converted into an intermediate program.
- in optimization, operations such as basic block formation, control flow analysis, and data flow analysis are performed on the intermediate program.
- in resource allocation, the variables in the intermediate program are allocated to the registers or memory of the target processor in order to adapt to the target processor's instruction set.
- in code generation, each intermediate instruction in the intermediate program is converted into program code to obtain an object program.
- the object program generated here is composed of one or more program codes that cause a computer to execute each step of the flowcharts shown in the embodiments and individual procedures of functional components.
- there are various forms of program code, such as processor native code and JAVA (registered trademark) bytecode.
- the programmer activates the linker for these.
- the linker allocates these object programs and related library programs in the memory space, and combines them into one to generate a load module.
- the load module generated in this way is premised on reading by a computer, and causes the computer to execute the processing procedure shown in each flowchart and the processing procedure of functional components.
- the program according to the present invention can be created through the above processing.
- the load module corresponding to the program is written into the instruction ROM together with the basic input/output program (BIOS) and various middleware (operating systems).
- if the playback device is a model with a built-in hard disk, the basic input/output program (BIOS) is built into the instruction ROM, and various middleware (operating systems) are preinstalled on the hard disk.
- the playback device performs bootstrap from the boot ROM, starts up the operating system, and causes the CPU to execute the program according to the present invention as one application.
- since the hard-disk-model playback apparatus can use the program of the present invention as one application, the program according to the present invention can be transferred alone, lent, or supplied through a network.
- a system LSI is a device in which a bare chip is mounted on a high-density substrate and packaged.
- a system LSI in which multiple bare chips are mounted on a high-density substrate and packaged so that the bare chips have the same external appearance as a single LSI is also included in this category (such a system LSI is called a multichip module).
- system LSI package types include QFP (quad flat package) and PGA (pin grid array).
- QFP is a system LSI with pins attached to the four sides of the package.
- a PGA is a system LSI with many pins attached to the entire bottom surface.
- the pins serve as an input/output interface with the drive device, an input interface for remote control signals, an interface with the TV, and interfaces with the IEEE 1394 connector and the PCI bus. Since the pins of the system LSI play such interface roles, connecting the drive device and the other circuits of the playback device to these pins allows the system LSI to serve as the core of the playback device.
- the bare chips packaged in the system LSI are the instruction ROM, CPU, decoder LSI, and the like that embody the functions of the components shown in the internal configuration diagrams in each embodiment.
- as described earlier under “Use as an embedded program”, the load module, the basic input/output program (BIOS), and various middleware (operating systems) are written into the instruction ROM.
- since the main part of the present invention is the load module corresponding to this program, the system LSI according to the present invention can be produced by packaging, as a bare chip, the instruction ROM storing the load module corresponding to the program.
- in circuit design, the buses connecting circuit elements, ICs, and LSIs, their peripheral circuits, and external interfaces are defined, along with connection lines, power supply lines, ground lines, clock signal lines, and so on. In this process, the circuit diagram is completed while adjusting the operation timing of each component in view of the LSI specifications and making adjustments such as securing the necessary bandwidth for each component.
- Mounting design is the work of creating a board layout that determines where on the board the parts in the circuit diagram created by circuit design (circuit elements, ICs, LSIs) are to be placed, and how the connection lines in the circuit diagram are to be wired on the board.
- the mounting design includes automatic placement and automatic wiring.
- this automatic placement can be realized using a dedicated algorithm known as the “centroid method”.
- this wiring process can be realized using dedicated algorithms known as the “maze method” and the “line search method”.
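As a rough illustration of the “maze method” mentioned above (breadth-first wave expansion in the style of Lee's algorithm), a wire route between two points on the board grid can be found as follows; the grid size and obstacle layout are invented for the example:

```python
from collections import deque

def maze_route(grid, start, goal):
    """Maze-method routing sketch: breadth-first search for the shortest
    wire path on a grid. grid[y][x] == 1 marks a cell blocked by an
    already-placed part or wire; cells are (x, y) tuples."""
    h, w = len(grid), len(grid[0])
    prev = {start: None}
    queue = deque([start])
    while queue:
        x, y = queue.popleft()
        if (x, y) == goal:
            # backtrack from goal to start to recover the path
            path, node = [], goal
            while node is not None:
                path.append(node)
                node = prev[node]
            return path[::-1]
        for dx, dy in ((1, 0), (-1, 0), (0, 1), (0, -1)):
            nx, ny = x + dx, y + dy
            if 0 <= nx < w and 0 <= ny < h and grid[ny][nx] == 0 and (nx, ny) not in prev:
                prev[(nx, ny)] = (x, y)
                queue.append((nx, ny))
    return None  # the two points cannot be routed

grid = [[0, 0, 0],
        [1, 1, 0],
        [0, 0, 0]]
path = maze_route(grid, (0, 0), (0, 2))
print(path)
```

Production routers add cost weighting, rip-up and re-route, and multi-layer handling, but the wave-expansion core is the same.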
- the mounting design result is converted to CAM data and output to equipment such as an NC machine tool.
- NC machine tools perform SoC implementation and SiP implementation based on this CAM data.
- SoC (System on chip) mounting is a technology that burns multiple circuits on a single chip.
- SiP (System in Package) mounting is a technology that combines multiple chips into a single package using resin or the like.
- the integrated circuit generated as described above may be called an IC, LSI, super LSI, or ultra LSI depending on the degree of integration.
- each playback device may be configured as a single chip.
- the integrated circuit is not limited to the SoC and SiP implementations described above, and may be realized by a dedicated circuit or a general-purpose processor. It is also conceivable to use an FPGA (Field Programmable Gate Array) that can be programmed after LSI manufacturing, or a reconfigurable processor in which the connections and settings of circuit cells inside the LSI can be reconfigured.
- if integrated circuit technology that replaces LSI emerges as a result of advances in semiconductor technology or derived technologies, the functional blocks may naturally be integrated using that technology. For example, the application of biotechnology is conceivable.
- the recording medium according to the present invention has been described as a hard disk.
- the recording medium of the present invention is characterized by the EP_map and EP_stream_type recorded on it, not by its physical properties. Any recording medium may be used as long as it records the EP_map and EP_stream_type and is used together with a BD-ROM.
- it may be a semiconductor memory card such as a CompactFlash (registered trademark) card, SmartMedia, Memory Stick, MultiMediaCard, or PCMCIA card.
- it may be (i) a magnetic recording disk such as a flexible disk, SuperDisk, Zip, or Clik!, or (ii) a removable hard disk drive such as ORB, Jaz, SparQ, SyJet, EZFlyer, or Microdrive.
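Since the recording medium above is characterized by the EP_map it carries, a minimal sketch of how a player might consult such a table can be useful: each entry pairs a presentation time (PTS) with a source-packet number (SPN), and random access looks up the last entry point at or before the requested time. The entry values here are invented for illustration; the actual EP_map format is defined by the BD-ROM specification:

```python
import bisect

# hypothetical EP_map: (PTS, SPN) entry points, sorted by PTS
ep_map = [(0, 0), (90000, 1200), (180000, 2500), (270000, 4100)]

def lookup_entry_point(ep_map, pts):
    """Return the SPN of the last entry point whose PTS is <= the given time,
    i.e. the packet from which decoding can safely start."""
    times = [entry[0] for entry in ep_map]
    index = bisect.bisect_right(times, pts) - 1
    if index < 0:
        raise ValueError("time precedes the first entry point")
    return ep_map[index][1]

print(lookup_entry_point(ep_map, 200000))  # resolves via the entry at PTS 180000
```

Because entry points sit at positions where decoding can begin (e.g. at an intra-coded picture), jumping to the returned SPN and decoding forward reaches the requested time without artifacts.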
- the digital stream in each embodiment is an AVClip of the BD-ROM standard, but may be a VOB (VideoObject) of the DVD-Video standard or the DVD-Video Recording standard.
- a VOB is a program stream conforming to the ISO/IEC 13818-1 standard, obtained by multiplexing a video stream and an audio stream.
- the video stream in AVClip may be MPEG4 or WMV format.
- the audio stream may be a Linear-PCM system, a Dolby-AC3 system, an MP3 system, an MPEG-AAC system, or a dts system.
- Fig. 47 (a) is a diagram showing playback control when realizing PiP playback by dynamic synchronization.
- one arbitrary point on the PlayItem time axis is set as the SubPlayItem synchronization point. When the current playback time reaches the time specified by Sync_Start_PTS_of_PlayItem, decoding of the Secondary Video is started and its playback video is composited with the Primary Video playback video.
- Fig. 47 (b) is a diagram showing how PiP playback proceeds when the current playback point crosses the synchronization point by normal playback, crosses it again by rewinding, and then crosses it once more by normal playback, that is, when playback moves back and forth across the synchronization point.
- Fig. 47 (c) is a diagram illustrating PiP playback when the Secondary Video playback section ends later than the Primary Video playback section. In this case, the display of the last picture of the Primary Video may be continued until the Secondary Video playback ends, or the Secondary Video playback may be ended when the Primary Video playback ends.
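The dynamic-synchronization behaviour described for Fig. 47 (a) and (b) can be sketched as a check performed against the current playback time: the Secondary Video is composited only while the current time lies inside its playback window, regardless of whether that window was entered by normal playback or left by rewinding. The function and parameter names are illustrative, not taken from the specification:

```python
def pip_active(current_time, sync_start_pts, secondary_duration):
    """Return True while the Secondary Video should be composited onto the
    Primary Video: from the synchronization point until the Secondary Video
    ends. Holds for movement in either direction along the time axis."""
    return sync_start_pts <= current_time < sync_start_pts + secondary_duration

# crossing the synchronization point forward, rewinding back, crossing again
for t in (4, 5, 9, 4, 6):
    print(t, pip_active(t, sync_start_pts=5, secondary_duration=3))
```

A real player would additionally start the Secondary Video decoder slightly ahead of the synchronization point so that the first composited picture is ready on time.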
- the Primary Video of the MainClip specified by the MainPath information in the PlayList information and the Secondary Video of the SubClip specified by the SubPlayItem information in the PlayList information may be displayed on separate screens. Also, the Primary Video may be composed of SD images and the Secondary Video of HD images.
- in the embodiments, the MainClip serving as the Primary Video is supplied on the BD-ROM and the SubClip serving as the Secondary Video is supplied on the local storage 200, but the SubClip serving as the Secondary Video may instead be recorded on the BD-ROM and supplied to the playback device together with the MainClip serving as the Primary Video.
- the Secondary Video stream and the Primary Video stream may be multiplexed into one AVClip.
- the recording medium and the playback device according to the present invention may be used for personal purposes such as use in a home theater system.
- since the internal configuration of the present invention is disclosed in the above-described embodiments, and mass production based on this internal configuration is clearly possible, the recording medium and the playback device according to the present invention can be produced and used in the industrial product manufacturing field. For this reason, the recording medium and the playback device according to the present invention have industrial applicability.
Landscapes
- Engineering & Computer Science (AREA)
- Signal Processing (AREA)
- Multimedia (AREA)
- Television Signal Processing For Recording (AREA)
- Signal Processing For Digital Recording And Reproducing (AREA)
- Indexing, Searching, Synchronizing, And The Amount Of Synchronization Travel Of Record Carriers (AREA)
- Management Or Editing Of Information On Record Carriers (AREA)
Abstract
Description
Claims
Priority Applications (11)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
KR1020107015450A KR101121351B1 (ko) | 2004-09-10 | 2005-09-09 | 재생장치, 재생방법, 기록방법 |
KR1020107015452A KR101121278B1 (ko) | 2004-09-10 | 2005-09-09 | 재생장치, 프로그램, 재생방법, 기록방법 |
EP05782350A EP1715686B1 (en) | 2004-09-10 | 2005-09-09 | Recording medium, reproduction device, program and reproduction method |
KR1020107015449A KR101121363B1 (ko) | 2004-09-10 | 2005-09-09 | 재생장치, 재생방법, 기록방법 |
JP2006535848A JP4268637B2 (ja) | 2004-09-10 | 2005-09-09 | 記録媒体、再生装置、プログラム、再生方法。 |
KR1020107015453A KR101121330B1 (ko) | 2004-09-10 | 2005-09-09 | 재생장치, 재생방법, 기록방법 |
KR1020067016863A KR101121238B1 (ko) | 2004-09-10 | 2005-09-09 | 기록매체, 재생장치, 재생방법 |
US10/588,578 US7865060B2 (en) | 2004-09-10 | 2005-09-09 | Recording medium reproduction device program, reproduction method |
US12/612,503 US7801415B2 (en) | 2004-09-10 | 2009-11-04 | Recording medium, reproduction device, program, reproduction method and recording method |
US12/950,702 US8457473B2 (en) | 2004-09-10 | 2010-11-19 | Recording medium, reproduction device, program, reproduction method |
US12/950,692 US8433178B2 (en) | 2004-09-10 | 2010-11-19 | Recording medium, reproduction device, program, reproduction method |
Applications Claiming Priority (2)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
JP2004-263628 | 2004-09-10 | ||
JP2004263628 | 2004-09-10 |
Related Child Applications (4)
Application Number | Title | Priority Date | Filing Date |
---|---|---|---|
US10/588,578 A-371-Of-International US7865060B2 (en) | 2004-09-10 | 2005-09-09 | Recording medium reproduction device program, reproduction method |
US12/612,503 Division US7801415B2 (en) | 2004-09-10 | 2009-11-04 | Recording medium, reproduction device, program, reproduction method and recording method |
US12/950,702 Division US8457473B2 (en) | 2004-09-10 | 2010-11-19 | Recording medium, reproduction device, program, reproduction method |
US12/950,692 Division US8433178B2 (en) | 2004-09-10 | 2010-11-19 | Recording medium, reproduction device, program, reproduction method |
Publications (1)
Publication Number | Publication Date |
---|---|
WO2006028216A1 true WO2006028216A1 (ja) | 2006-03-16 |
Family
ID=36036498
Family Applications (1)
Application Number | Title | Priority Date | Filing Date |
---|---|---|---|
PCT/JP2005/016640 WO2006028216A1 (ja) | 2004-09-10 | 2005-09-09 | 記録媒体、再生装置、プログラム、再生方法 |
Country Status (6)
Country | Link |
---|---|
US (5) | US7609947B2 (ja) |
EP (3) | EP2373013B1 (ja) |
JP (7) | JP4268637B2 (ja) |
KR (5) | KR101121351B1 (ja) |
CN (6) | CN101662692B (ja) |
WO (1) | WO2006028216A1 (ja) |
Cited By (7)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
JP2008005289A (ja) * | 2006-06-23 | 2008-01-10 | Sony Corp | 情報処理装置および情報処理方法、プログラム、並びに、プログラム格納媒体 |
JP2008005288A (ja) * | 2006-06-23 | 2008-01-10 | Sony Corp | 情報処理装置および情報処理方法、プログラム、並びに、プログラム格納媒体 |
JP2008022568A (ja) * | 2005-08-25 | 2008-01-31 | Sony Corp | データ生成方法、データ構造、記録装置および方法、並びに、プログラム |
JP2009505312A (ja) * | 2005-08-22 | 2009-02-05 | エルジー エレクトロニクス インコーポレーテッド | 記録媒体、データ再生方法及び再生装置、並びにデータ記録方法及び記録装置 |
US20090116818A1 (en) * | 2007-11-01 | 2009-05-07 | Taiji Sasaki | Recording medium, playback apparatus, recording apparatus, playback method, and recording method |
JP2011234390A (ja) * | 2005-08-25 | 2011-11-17 | Sony Corp | データ生成方法、記録装置および方法、並びに、プログラム |
WO2019043989A1 (ja) * | 2017-08-30 | 2019-03-07 | パナソニックIpマネジメント株式会社 | 記録方法、及び、記録装置 |
Families Citing this family (44)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
CN101778301B (zh) | 2004-01-13 | 2012-09-26 | 松下电器产业株式会社 | 记录介质、重放装置、记录方法、程序和重放方法 |
CN101790067B (zh) | 2004-02-17 | 2013-09-11 | 松下电器产业株式会社 | 记录方法和再现装置 |
EP1789958A4 (en) * | 2004-09-13 | 2009-12-09 | Lg Electronics Inc | METHOD AND DEVICE FOR REPRODUCING DATA RECORDED IN A RECORDING MEDIUM USING A LOCAL STORAGE |
US20060077773A1 (en) * | 2004-09-13 | 2006-04-13 | Seo Kang S | Method and apparatus for reproducing data from recording medium using local storage |
US20060077817A1 (en) * | 2004-09-13 | 2006-04-13 | Seo Kang S | Method and apparatus for reproducing data from recording medium using local storage |
KR20060047549A (ko) * | 2004-10-12 | 2006-05-18 | 엘지전자 주식회사 | 로컬 스토리지를 이용한 기록매체 재생방법 및 재생장치 |
CN101057286B (zh) * | 2004-11-08 | 2010-04-07 | Lg电子株式会社 | 利用本地存储器从记录介质中再现出数据的方法和装置 |
KR20060063601A (ko) * | 2004-12-03 | 2006-06-12 | 엘지전자 주식회사 | 로컬 스토리지에 데이터를 다운로드/업데이트 하는 방법 및장치 |
KR20060081323A (ko) * | 2005-01-07 | 2006-07-12 | 엘지전자 주식회사 | 로컬 스토리지를 이용한 기록매체 재생방법 및 재생장치 |
JP4968506B2 (ja) * | 2005-03-04 | 2012-07-04 | ソニー株式会社 | 再生装置、再生方法、およびプログラム |
JP4081772B2 (ja) * | 2005-08-25 | 2008-04-30 | ソニー株式会社 | 再生装置および再生方法、プログラム、並びにプログラム格納媒体 |
JP2007080357A (ja) * | 2005-09-13 | 2007-03-29 | Toshiba Corp | 情報記憶媒体、情報再生方法、情報再生装置 |
US20070086747A1 (en) * | 2005-10-17 | 2007-04-19 | Samsung Electronics Co., Ltd. | Reproducing apparatus and video data storing method |
JP4683498B2 (ja) * | 2005-11-30 | 2011-05-18 | パイオニア株式会社 | 情報再生装置及び方法、並びにコンピュータプログラム |
US8412774B2 (en) | 2006-04-29 | 2013-04-02 | At&T Intellectual Property I, L.P. | Picture-in-picture video content distribution |
CN101202873B (zh) * | 2006-12-13 | 2012-07-25 | 株式会社日立制作所 | 信息记录再现装置和信息记录再现方法 |
US20080157307A1 (en) * | 2006-12-28 | 2008-07-03 | Semiconductor Manufacturing International (Shanghai) Corporation | Lead frame |
JP4321628B2 (ja) * | 2007-05-31 | 2009-08-26 | ソニー株式会社 | 記憶装置、記憶方法および記憶プログラム、ならびに、データ処理装置、データ処理方法およびデータ処理プログラム |
JP2009027552A (ja) * | 2007-07-20 | 2009-02-05 | Funai Electric Co Ltd | 光ディスク再生装置 |
EP2208200A1 (en) * | 2007-11-07 | 2010-07-21 | Thomson Licensing | Editing apparatus, editing method, and editing program |
US20090327100A1 (en) * | 2008-06-29 | 2009-12-31 | TV1.com Holdings, LLC | Method of Internet Video Access and Management |
CN101911713B (zh) * | 2008-09-30 | 2014-01-08 | 松下电器产业株式会社 | 再现装置、集成电路、再现方法、记录方法、记录介质再现系统 |
US20100178029A1 (en) * | 2009-01-15 | 2010-07-15 | Kabushiki Kaisha Toshiba | Recording/play back apparatus, play back control apparatus and method, and recording/play back program |
ES2439316T3 (es) * | 2009-02-19 | 2014-01-22 | Panasonic Corporation | Medio de grabación y dispositivo de reproducción |
EP2254121A1 (en) * | 2009-05-20 | 2010-11-24 | Sony DADC Austria AG | Method for copy protection |
CN102422355A (zh) | 2009-05-20 | 2012-04-18 | 索尼达德克奥地利股份公司 | 用于拷贝保护的方法 |
US9263085B2 (en) | 2009-05-20 | 2016-02-16 | Sony Dadc Austria Ag | Method for copy protection |
JP4984181B2 (ja) * | 2009-06-22 | 2012-07-25 | ソニー株式会社 | 再生装置および再生方法 |
US8351768B2 (en) * | 2009-07-23 | 2013-01-08 | Microsoft Corporation | Media processing comparison system and techniques |
WO2012081241A1 (ja) | 2010-12-16 | 2012-06-21 | パナソニック株式会社 | 制作装置及びコンテンツ配信システム |
WO2012174301A1 (en) | 2011-06-14 | 2012-12-20 | Related Content Database, Inc. | System and method for presenting content with time based metadata |
US20130191745A1 (en) * | 2012-01-10 | 2013-07-25 | Zane Vella | Interface for displaying supplemental dynamic timeline content |
CN107529706B (zh) | 2011-06-16 | 2020-11-17 | Ge视频压缩有限责任公司 | 解码器、编码器、解码和编码视频的方法及存储介质 |
UA114674C2 (uk) | 2011-07-15 | 2017-07-10 | ДЖ.І. ВІДІЕУ КЕМПРЕШН, ЛЛСі | Ініціалізація контексту в ентропійному кодуванні |
JP5435001B2 (ja) * | 2011-09-28 | 2014-03-05 | 株式会社デンソー | 地図データ配信装置、電子機器及び地図更新システム |
US10051300B1 (en) * | 2012-01-26 | 2018-08-14 | Amazon Technologies, Inc. | Multimedia progress tracker |
EP2680219A1 (en) * | 2012-06-29 | 2014-01-01 | Thomson Licensing | Method for reframing images of a video sequence, and apparatus for reframing images of a video sequence |
US10051311B2 (en) * | 2012-07-06 | 2018-08-14 | Sharp Kabushiki Kaisha | Electronic devices for signaling sub-picture based hypothetical reference decoder parameters |
US9454289B2 (en) | 2013-12-03 | 2016-09-27 | Google Inc. | Dyanmic thumbnail representation for a video playlist |
CN111933189B (zh) * | 2014-09-12 | 2022-01-04 | 松下电器(美国)知识产权公司 | 再现装置以及再现方法 |
JP6389765B2 (ja) * | 2015-01-07 | 2018-09-12 | アイシン・エィ・ダブリュ株式会社 | 表示制御システム、方法およびプログラム |
CN104573061B (zh) * | 2015-01-23 | 2017-09-26 | 南开大学 | 一种支持扩展功能的虚拟文件系统装置和方法 |
JP6953771B2 (ja) * | 2017-04-11 | 2021-10-27 | 船井電機株式会社 | 再生装置 |
CN111343502B (zh) * | 2020-03-30 | 2021-11-09 | 招商局金融科技有限公司 | 视频处理方法、电子装置及计算机可读存储介质 |
Citations (5)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
JPH08191423A (ja) * | 1995-01-10 | 1996-07-23 | Mitsubishi Electric Corp | 光ディスク記録再生方法 |
JPH1175159A (ja) * | 1995-04-11 | 1999-03-16 | Toshiba Corp | 光ディスク再生装置及び再生方法並びに光ディスク記録装置及び記録方法 |
JP2002247526A (ja) | 2001-02-19 | 2002-08-30 | Toshiba Corp | 内外ストリームデータの同期再生装置とストリームデータ配信装置 |
EP1280348A1 (en) | 2000-04-21 | 2003-01-29 | Sony Corporation | Information processing apparatus and method, program, and recorded medium |
JP2003068057A (ja) * | 2001-06-21 | 2003-03-07 | Lg Electronics Inc | マルチチャネルストリームの記録装置及び方法と、それによる記録媒体 |
Family Cites Families (35)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
JPH05290550A (ja) | 1992-04-10 | 1993-11-05 | Nippon Steel Corp | ビデオシステム |
US5596564A (en) * | 1993-10-08 | 1997-01-21 | Matsushita Electric Industrial Co., Ltd. | Information recording medium and apparatus and method for recording and reproducing information |
EP0689206B1 (en) * | 1993-12-10 | 2002-07-24 | Sony Corporation | Data recording medium and data reproduction apparatus |
CA2173812C (en) | 1995-04-11 | 2000-02-08 | Shinichi Kikuchi | Recording medium, recording apparatus and recording method for recording data into recording medium, and reproducing apparatus and reproduction method for reproducing data from recording medium |
US5784519A (en) * | 1995-06-15 | 1998-07-21 | Kabushiki Kaisha Toshiba | Multi-scene recording medium and apparatus for reproducing data therefrom |
JP3326670B2 (ja) * | 1995-08-02 | 2002-09-24 | ソニー株式会社 | データ符号化/復号化方法および装置、および符号化データ記録媒体 |
DE69710279T2 (de) * | 1996-04-05 | 2002-06-27 | Matsushita Electric Ind Co Ltd | Optische multimedia-platte mit mehrkanaligen audiodaten und untergeordneten bilddaten zusammen mit zeitlich veränderlichen bilddaten sowie datenwiedergabeverfahren und -vorrichtung dazu |
JP3790871B2 (ja) * | 1996-09-26 | 2006-06-28 | 株式会社ニコン | 画像再生装置 |
CN1123886C (zh) * | 1996-10-01 | 2003-10-08 | 松下电器产业株式会社 | 信息处理装置和信息处理方法 |
WO1999014754A1 (en) * | 1997-09-17 | 1999-03-25 | Matsushita Electric Industrial Co., Ltd. | Optical disc, recording apparatus, and computer-readable recording medium |
US6363204B1 (en) * | 1997-09-30 | 2002-03-26 | Compaq Computer Corporation | Viewing management for video sources |
JP3772023B2 (ja) | 1998-05-22 | 2006-05-10 | 株式会社東芝 | 画像表示装置、同装置に適用される画像切り替え表示方法 |
US20030161614A1 (en) * | 1997-11-28 | 2003-08-28 | Kabushiki Kaisha Toshiba | Method and apparatus for playing back data recorded on a recoding medium |
US6678006B1 (en) * | 1998-01-07 | 2004-01-13 | Ati Technologies, Inc. | Method and apparatus for video processing that includes sub-picture scaling |
KR100466496B1 (ko) * | 1998-08-07 | 2005-01-13 | 가부시키가이샤 히타치세이사쿠쇼 | 기록매체, 기록장치, 재생장치, 기록방법, 및 컴퓨터가 읽기가능한 기록매체 |
JP3058870B1 (ja) | 1999-02-05 | 2000-07-04 | 株式会社次世代デジタルテレビジョン放送システム研究所 | Afc回路 |
JP2000347638A (ja) | 1999-06-07 | 2000-12-15 | Hitachi Ltd | Osd装置及びこれを用いた符号化ビデオ復号装置並びにこの復号装置を用いたディジタル放送受信装置 |
JP4599740B2 (ja) * | 2000-04-21 | 2010-12-15 | ソニー株式会社 | 情報処理装置および方法、記録媒体、プログラム、並びに記録媒体 |
JP2002262233A (ja) * | 2001-03-01 | 2002-09-13 | Mitsubishi Electric Corp | 行動計測システム |
JP3902420B2 (ja) | 2001-05-30 | 2007-04-04 | 三洋電機株式会社 | 光ディスク再生装置 |
JP2004007118A (ja) * | 2002-05-31 | 2004-01-08 | Toshiba Corp | テレビジョン信号再生装置及び再生方法 |
JP2004032607A (ja) | 2002-06-28 | 2004-01-29 | Sanyo Electric Co Ltd | ディジタル映像再生装置 |
DE60323231D1 (de) | 2002-07-12 | 2008-10-09 | Axon Neuroscience | Verkürzte tau proteine |
EP1383340A1 (en) | 2002-07-18 | 2004-01-21 | Thomson Licensing S.A. | Video apparatus |
CN101504853B (zh) * | 2002-09-25 | 2012-10-31 | 松下电器产业株式会社 | 再现装置和记录方法 |
AU2003276759A1 (en) * | 2002-11-08 | 2004-06-07 | Lg Electronics Inc. | Method and apparatus for recording a multi-component stream and a high-density recording medium having a multi-component stream recorded theron and reproducing method and apparatus of said recording medium |
US7664372B2 (en) * | 2002-11-20 | 2010-02-16 | Lg Electronics Inc. | Recording medium having data structure for managing reproduction of multiple component data recorded thereon and recording and reproducing methods and apparatuses |
CN100466713C (zh) * | 2002-11-28 | 2009-03-04 | 索尼株式会社 | 再现装置和再现方法 |
JP3908724B2 (ja) | 2002-12-09 | 2007-04-25 | 株式会社東芝 | 情報再生装置及び情報再生方法 |
JP3875685B2 (ja) | 2002-12-27 | 2007-01-31 | 株式会社東芝 | 情報再生装置及び情報再生方法 |
JP3731664B2 (ja) | 2003-01-08 | 2006-01-05 | 船井電機株式会社 | 光ディスク再生装置 |
EP1603335B1 (en) | 2003-02-19 | 2008-04-30 | Matsushita Electric Industrial Co., Ltd. | Recording medium, reproduction device, recording method, program, and reproduction method |
CN1700329B (zh) * | 2004-01-29 | 2010-06-16 | 索尼株式会社 | 再现装置、再现方法、再现程序和记录介质 |
JP2005348075A (ja) * | 2004-06-02 | 2005-12-15 | Funai Electric Co Ltd | ディジタルテレビジョン放送信号再生装置。 |
KR20070014945A (ko) * | 2005-07-29 | 2007-02-01 | 엘지전자 주식회사 | 기록매체, 데이터 재생방법 및 재생장치와 데이터 기록방법및 기록장치 |
-
2005
- 2005-08-31 US US11/216,409 patent/US7609947B2/en active Active
- 2005-09-09 KR KR1020107015450A patent/KR101121351B1/ko active IP Right Grant
- 2005-09-09 EP EP11004445A patent/EP2373013B1/en active Active
- 2005-09-09 EP EP11004444A patent/EP2373012B1/en active Active
- 2005-09-09 CN CN2009101728053A patent/CN101662692B/zh active Active
- 2005-09-09 KR KR1020107015449A patent/KR101121363B1/ko active IP Right Grant
- 2005-09-09 US US10/588,578 patent/US7865060B2/en active Active
- 2005-09-09 KR KR1020067016863A patent/KR101121238B1/ko active IP Right Grant
- 2005-09-09 CN CN200910168956A patent/CN101646093A/zh active Pending
- 2005-09-09 WO PCT/JP2005/016640 patent/WO2006028216A1/ja active Application Filing
- 2005-09-09 EP EP05782350A patent/EP1715686B1/en active Active
- 2005-09-09 CN CNB2005800065626A patent/CN100556118C/zh active Active
- 2005-09-09 JP JP2006535848A patent/JP4268637B2/ja active Active
- 2005-09-09 KR KR1020107015453A patent/KR101121330B1/ko active IP Right Grant
- 2005-09-09 CN CN2009101689580A patent/CN101640782B/zh active Active
- 2005-09-09 KR KR1020107015452A patent/KR101121278B1/ko active IP Right Grant
- 2005-09-09 CN CN2009101689576A patent/CN101646094B/zh active Active
- 2005-09-12 CN CNB2005100981855A patent/CN100559858C/zh not_active Expired - Fee Related
-
2008
- 2008-04-28 JP JP2008116781A patent/JP4268660B2/ja active Active
- 2008-04-28 JP JP2008116782A patent/JP4268661B2/ja active Active
- 2008-04-28 JP JP2008116783A patent/JP4268662B2/ja active Active
- 2008-09-26 JP JP2008248524A patent/JP4560111B2/ja not_active Expired - Fee Related
- 2008-09-26 JP JP2008248523A patent/JP4537477B2/ja not_active Expired - Fee Related
- 2008-09-26 JP JP2008248525A patent/JP4560112B2/ja not_active Expired - Fee Related
-
2009
- 2009-11-04 US US12/612,503 patent/US7801415B2/en not_active Expired - Fee Related
-
2010
- 2010-11-19 US US12/950,692 patent/US8433178B2/en active Active
- 2010-11-19 US US12/950,702 patent/US8457473B2/en active Active
Patent Citations (5)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
JPH08191423A (ja) * | 1995-01-10 | 1996-07-23 | Mitsubishi Electric Corp | 光ディスク記録再生方法 |
JPH1175159A (ja) * | 1995-04-11 | 1999-03-16 | Toshiba Corp | 光ディスク再生装置及び再生方法並びに光ディスク記録装置及び記録方法 |
EP1280348A1 (en) | 2000-04-21 | 2003-01-29 | Sony Corporation | Information processing apparatus and method, program, and recorded medium |
JP2002247526A (ja) | 2001-02-19 | 2002-08-30 | Toshiba Corp | 内外ストリームデータの同期再生装置とストリームデータ配信装置 |
JP2003068057A (ja) * | 2001-06-21 | 2003-03-07 | Lg Electronics Inc | マルチチャネルストリームの記録装置及び方法と、それによる記録媒体 |
Non-Patent Citations (1)
Title |
---|
See also references of EP1715686A4 |
Cited By (10)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
JP2009505312A (ja) * | 2005-08-22 | 2009-02-05 | エルジー エレクトロニクス インコーポレーテッド | 記録媒体、データ再生方法及び再生装置、並びにデータ記録方法及び記録装置 |
JP2008022568A (ja) * | 2005-08-25 | 2008-01-31 | Sony Corp | データ生成方法、データ構造、記録装置および方法、並びに、プログラム |
JP2011234390A (ja) * | 2005-08-25 | 2011-11-17 | Sony Corp | データ生成方法、記録装置および方法、並びに、プログラム |
JP2008005289A (ja) * | 2006-06-23 | 2008-01-10 | Sony Corp | 情報処理装置および情報処理方法、プログラム、並びに、プログラム格納媒体 |
JP2008005288A (ja) * | 2006-06-23 | 2008-01-10 | Sony Corp | 情報処理装置および情報処理方法、プログラム、並びに、プログラム格納媒体 |
US20090116818A1 (en) * | 2007-11-01 | 2009-05-07 | Taiji Sasaki | Recording medium, playback apparatus, recording apparatus, playback method, and recording method |
US8326124B2 (en) * | 2007-11-01 | 2012-12-04 | Panasonic Corporation | Recording medium, playback apparatus, recording apparatus, playback method, and recording method for reducing processing load during copyright protection at the TS packet level |
RU2473980C2 (ru) * | 2007-11-01 | 2013-01-27 | Панасоник Корпорэйшн | Носитель записи, устройство воспроизведения, устройство записи, способ воспроизведения и способ записи |
US8942541B2 (en) | 2007-11-01 | 2015-01-27 | Panasonic Intellectual Property Management Co., Ltd. | Recording medium and recording method reducing processing load in realization of TS packet level copyright protection, and playback device and playback method for playback of such recording medium |
WO2019043989A1 (ja) * | 2017-08-30 | 2019-03-07 | パナソニックIpマネジメント株式会社 | 記録方法、及び、記録装置 |
Also Published As
Similar Documents
Publication | Publication Date | Title |
---|---|---|
JP4268660B2 (ja) | 再生装置、プログラム、再生方法、記録方法。 | |
JP4676493B2 (ja) | 記録媒体、再生装置、記録方法 | |
JPWO2007119765A1 (ja) | 記録媒体、再生装置、記録装置、システムlsi、方法、プログラム |
Legal Events
Date | Code | Title | Description |
---|---|---|---|
AK | Designated states |
Kind code of ref document: A1 Designated state(s): AE AG AL AM AT AU AZ BA BB BG BR BW BY BZ CA CH CN CO CR CU CZ DE DK DM DZ EC EE EG ES FI GB GD GE GH GM HR HU ID IL IN IS JP KE KG KM KP KR KZ LC LK LR LS LT LU LV MA MD MG MK MN MW MX MZ NA NG NI NO NZ OM PG PH PL PT RO RU SC SD SE SG SK SL SM SY TJ TM TN TR TT TZ UA UG US UZ VC VN YU ZA ZM ZW |
|
AL | Designated countries for regional patents |
Kind code of ref document: A1 Designated state(s): GM KE LS MW MZ NA SD SL SZ TZ UG ZM ZW AM AZ BY KG KZ MD RU TJ TM AT BE BG CH CY CZ DE DK EE ES FI FR GB GR HU IE IS IT LT LU LV MC NL PL PT RO SE SI SK TR BF BJ CF CG CI CM GA GN GQ GW ML MR NE SN TD TG |
|
WWE | Wipo information: entry into national phase |
Ref document number: 2005782350 Country of ref document: EP |
|
WWE | Wipo information: entry into national phase |
Ref document number: 1020067016863 Country of ref document: KR |
|
121 | Ep: the epo has been informed by wipo that ep was designated in this application | ||
WWE | Wipo information: entry into national phase |
Ref document number: 200580006562.6 Country of ref document: CN |
|
WWP | Wipo information: published in national office |
Ref document number: 2005782350 Country of ref document: EP |
|
WWE | Wipo information: entry into national phase |
Ref document number: 2006535848 Country of ref document: JP |
|
NENP | Non-entry into the national phase |
Ref country code: DE |
|
WWE | Wipo information: entry into national phase |
Ref document number: 10588578 Country of ref document: US |
|
WWP | Wipo information: published in national office |
Ref document number: 10588578 Country of ref document: US |