WO2007023728A1 - Playback device and playback method, program, program storage medium, data structure, and recording medium manufacturing method - Google Patents
- Publication number
- WO2007023728A1 (PCT/JP2006/316177)
- Authority
- WO
- WIPO (PCT)
- Prior art keywords
- stream
- video
- file
- information
- audio
- Prior art date
Links
- 238000000034 method Methods 0.000 title claims abstract description 59
- 238000004519 manufacturing process Methods 0.000 title claims abstract description 13
- 238000012545 processing Methods 0.000 claims description 85
- 230000008569 process Effects 0.000 claims description 35
- 230000002194 synthesizing effect Effects 0.000 claims description 17
- 230000015572 biosynthetic process Effects 0.000 claims description 8
- 238000003786 synthesis reaction Methods 0.000 claims description 8
- 230000005540 biological transmission Effects 0.000 claims description 2
- 230000009467 reduction Effects 0.000 abstract description 3
- 230000002452 interceptive effect Effects 0.000 description 73
- 239000000872 buffer Substances 0.000 description 62
- 238000010586 diagram Methods 0.000 description 54
- 230000001360 synchronised effect Effects 0.000 description 33
- 239000000203 mixture Substances 0.000 description 20
- 238000004891 communication Methods 0.000 description 15
- 238000003860 storage Methods 0.000 description 13
- 230000002441 reversible effect Effects 0.000 description 10
- 230000006870 function Effects 0.000 description 8
- 230000003287 optical effect Effects 0.000 description 8
- 230000007246 mechanism Effects 0.000 description 5
- 239000002184 metal Substances 0.000 description 5
- 230000008859 change Effects 0.000 description 3
- 238000012937 correction Methods 0.000 description 3
- 230000000694 effects Effects 0.000 description 3
- 229920002120 photoresistant polymer Polymers 0.000 description 3
- 238000005070 sampling Methods 0.000 description 3
- 239000011521 glass Substances 0.000 description 2
- 239000000463 material Substances 0.000 description 2
- 238000000465 moulding Methods 0.000 description 2
- 239000004417 polycarbonate Substances 0.000 description 2
- 239000011347 resin Substances 0.000 description 2
- 229920005989 resin Polymers 0.000 description 2
- 230000004044 response Effects 0.000 description 2
- 239000004065 semiconductor Substances 0.000 description 2
- 230000005236 sound signal Effects 0.000 description 2
- NIXOWILDQLNWCW-UHFFFAOYSA-N acrylic acid group Chemical group C(C=C)(=O)O NIXOWILDQLNWCW-UHFFFAOYSA-N 0.000 description 1
- 230000009471 action Effects 0.000 description 1
- 230000003139 buffering effect Effects 0.000 description 1
- 238000006243 chemical reaction Methods 0.000 description 1
- 238000004590 computer program Methods 0.000 description 1
- 230000008094 contradictory effect Effects 0.000 description 1
- 238000013479 data entry Methods 0.000 description 1
- 230000001419 dependent effect Effects 0.000 description 1
- 238000006073 displacement reaction Methods 0.000 description 1
- 238000009826 distribution Methods 0.000 description 1
- 230000009977 dual effect Effects 0.000 description 1
- 239000011888 foil Substances 0.000 description 1
- 238000002347 injection Methods 0.000 description 1
- 239000007924 injection Substances 0.000 description 1
- 230000001678 irradiating effect Effects 0.000 description 1
- 239000004973 liquid crystal related substance Substances 0.000 description 1
- 238000012986 modification Methods 0.000 description 1
- 230000004048 modification Effects 0.000 description 1
- 229920003229 poly(methyl methacrylate) Polymers 0.000 description 1
- 229920000515 polycarbonate Polymers 0.000 description 1
- 239000004926 polymethyl methacrylate Substances 0.000 description 1
- 238000009877 rendering Methods 0.000 description 1
- 238000004528 spin coating Methods 0.000 description 1
- 239000007921 spray Substances 0.000 description 1
- 238000004544 sputter deposition Methods 0.000 description 1
- 238000007740 vapor deposition Methods 0.000 description 1
- 230000000007 visual effect Effects 0.000 description 1
Classifications
-
- H—ELECTRICITY
- H04—ELECTRIC COMMUNICATION TECHNIQUE
- H04N—PICTORIAL COMMUNICATION, e.g. TELEVISION
- H04N5/00—Details of television systems
- H04N5/76—Television signal recording
- H04N5/91—Television signal processing therefor
- H04N5/92—Transformation of the television signal for recording, e.g. modulation, frequency changing; Inverse transformation for playback
-
- G—PHYSICS
- G11—INFORMATION STORAGE
- G11B—INFORMATION STORAGE BASED ON RELATIVE MOVEMENT BETWEEN RECORD CARRIER AND TRANSDUCER
- G11B20/00—Signal processing not specific to the method of recording or reproducing; Circuits therefor
- G11B20/10—Digital recording or reproducing
-
- G—PHYSICS
- G11—INFORMATION STORAGE
- G11B—INFORMATION STORAGE BASED ON RELATIVE MOVEMENT BETWEEN RECORD CARRIER AND TRANSDUCER
- G11B27/00—Editing; Indexing; Addressing; Timing or synchronising; Monitoring; Measuring tape travel
- G11B27/10—Indexing; Addressing; Timing or synchronising; Measuring tape travel
- G11B27/102—Programmed access in sequence to addressed parts of tracks of operating record carriers
- G11B27/105—Programmed access in sequence to addressed parts of tracks of operating record carriers of operating discs
-
- G—PHYSICS
- G11—INFORMATION STORAGE
- G11B—INFORMATION STORAGE BASED ON RELATIVE MOVEMENT BETWEEN RECORD CARRIER AND TRANSDUCER
- G11B27/00—Editing; Indexing; Addressing; Timing or synchronising; Monitoring; Measuring tape travel
- G11B27/10—Indexing; Addressing; Timing or synchronising; Measuring tape travel
- G11B27/19—Indexing; Addressing; Timing or synchronising; Measuring tape travel by using information detectable on the record carrier
- G11B27/28—Indexing; Addressing; Timing or synchronising; Measuring tape travel by using information detectable on the record carrier by using information signals recorded by the same method as the main recording
- G11B27/30—Indexing; Addressing; Timing or synchronising; Measuring tape travel by using information detectable on the record carrier by using information signals recorded by the same method as the main recording on the same track as the main recording
- G11B27/3081—Indexing; Addressing; Timing or synchronising; Measuring tape travel by using information detectable on the record carrier by using information signals recorded by the same method as the main recording on the same track as the main recording used signal is a video-frame or a video-field (P.I.P)
-
- G—PHYSICS
- G11—INFORMATION STORAGE
- G11B—INFORMATION STORAGE BASED ON RELATIVE MOVEMENT BETWEEN RECORD CARRIER AND TRANSDUCER
- G11B27/00—Editing; Indexing; Addressing; Timing or synchronising; Monitoring; Measuring tape travel
- G11B27/10—Indexing; Addressing; Timing or synchronising; Measuring tape travel
- G11B27/19—Indexing; Addressing; Timing or synchronising; Measuring tape travel by using information detectable on the record carrier
- G11B27/28—Indexing; Addressing; Timing or synchronising; Measuring tape travel by using information detectable on the record carrier by using information signals recorded by the same method as the main recording
- G11B27/32—Indexing; Addressing; Timing or synchronising; Measuring tape travel by using information detectable on the record carrier by using information signals recorded by the same method as the main recording on separate auxiliary tracks of the same or an auxiliary record carrier
- G11B27/327—Table of contents
- G11B27/329—Table of contents on a disc [VTOC]
-
- G—PHYSICS
- G11—INFORMATION STORAGE
- G11B—INFORMATION STORAGE BASED ON RELATIVE MOVEMENT BETWEEN RECORD CARRIER AND TRANSDUCER
- G11B27/00—Editing; Indexing; Addressing; Timing or synchronising; Monitoring; Measuring tape travel
- G11B27/10—Indexing; Addressing; Timing or synchronising; Measuring tape travel
- G11B27/34—Indicating arrangements
-
- H—ELECTRICITY
- H04—ELECTRIC COMMUNICATION TECHNIQUE
- H04N—PICTORIAL COMMUNICATION, e.g. TELEVISION
- H04N21/00—Selective content distribution, e.g. interactive television or video on demand [VOD]
- H04N21/40—Client devices specifically adapted for the reception of or interaction with content, e.g. set-top-box [STB]; Operations thereof
- H04N21/41—Structure of client; Structure of client peripherals
- H04N21/426—Internal components of the client ; Characteristics thereof
- H04N21/42646—Internal components of the client ; Characteristics thereof for reading from or writing on a non-volatile solid state storage medium, e.g. DVD, CD-ROM
-
- H—ELECTRICITY
- H04—ELECTRIC COMMUNICATION TECHNIQUE
- H04N—PICTORIAL COMMUNICATION, e.g. TELEVISION
- H04N21/00—Selective content distribution, e.g. interactive television or video on demand [VOD]
- H04N21/40—Client devices specifically adapted for the reception of or interaction with content, e.g. set-top-box [STB]; Operations thereof
- H04N21/43—Processing of content or additional data, e.g. demultiplexing additional data from a digital video stream; Elementary client operations, e.g. monitoring of home network or synchronising decoder's clock; Client middleware
- H04N21/431—Generation of visual interfaces for content selection or interaction; Content or additional data rendering
- H04N21/4312—Generation of visual interfaces for content selection or interaction; Content or additional data rendering involving specific graphical features, e.g. screen layout, special fonts or colors, blinking icons, highlights or animations
- H04N21/4316—Generation of visual interfaces for content selection or interaction; Content or additional data rendering involving specific graphical features, e.g. screen layout, special fonts or colors, blinking icons, highlights or animations for displaying supplemental content in a region of the screen, e.g. an advertisement in a separate window
-
- H—ELECTRICITY
- H04—ELECTRIC COMMUNICATION TECHNIQUE
- H04N—PICTORIAL COMMUNICATION, e.g. TELEVISION
- H04N21/00—Selective content distribution, e.g. interactive television or video on demand [VOD]
- H04N21/40—Client devices specifically adapted for the reception of or interaction with content, e.g. set-top-box [STB]; Operations thereof
- H04N21/43—Processing of content or additional data, e.g. demultiplexing additional data from a digital video stream; Elementary client operations, e.g. monitoring of home network or synchronising decoder's clock; Client middleware
- H04N21/432—Content retrieval operation from a local storage medium, e.g. hard-disk
- H04N21/4325—Content retrieval operation from a local storage medium, e.g. hard-disk by playing back content from the storage medium
-
- H—ELECTRICITY
- H04—ELECTRIC COMMUNICATION TECHNIQUE
- H04N—PICTORIAL COMMUNICATION, e.g. TELEVISION
- H04N21/00—Selective content distribution, e.g. interactive television or video on demand [VOD]
- H04N21/40—Client devices specifically adapted for the reception of or interaction with content, e.g. set-top-box [STB]; Operations thereof
- H04N21/47—End-user applications
-
- H—ELECTRICITY
- H04—ELECTRIC COMMUNICATION TECHNIQUE
- H04N—PICTORIAL COMMUNICATION, e.g. TELEVISION
- H04N5/00—Details of television systems
- H04N5/44—Receiver circuitry for the reception of television signals according to analogue transmission standards
- H04N5/445—Receiver circuitry for the reception of television signals according to analogue transmission standards for displaying additional information
- H04N5/45—Picture in picture, e.g. displaying simultaneously another television channel in a region of the screen
-
- H—ELECTRICITY
- H04—ELECTRIC COMMUNICATION TECHNIQUE
- H04N—PICTORIAL COMMUNICATION, e.g. TELEVISION
- H04N5/00—Details of television systems
- H04N5/76—Television signal recording
- H04N5/84—Television signal recording using optical recording
- H04N5/85—Television signal recording using optical recording on discs or drums
-
- H—ELECTRICITY
- H04—ELECTRIC COMMUNICATION TECHNIQUE
- H04N—PICTORIAL COMMUNICATION, e.g. TELEVISION
- H04N5/00—Details of television systems
- H04N5/76—Television signal recording
- H04N5/91—Television signal processing therefor
- H04N5/93—Regeneration of the television signal or of selected parts thereof
-
- H—ELECTRICITY
- H04—ELECTRIC COMMUNICATION TECHNIQUE
- H04N—PICTORIAL COMMUNICATION, e.g. TELEVISION
- H04N9/00—Details of colour television systems
- H04N9/79—Processing of colour television signals in connection with recording
- H04N9/80—Transformation of the television signal for recording, e.g. modulation, frequency changing; Inverse transformation for playback
- H04N9/82—Transformation of the television signal for recording, e.g. modulation, frequency changing; Inverse transformation for playback the individual colour picture signal components being recorded simultaneously only
- H04N9/8205—Transformation of the television signal for recording, e.g. modulation, frequency changing; Inverse transformation for playback the individual colour picture signal components being recorded simultaneously only involving the multiplexing of an additional signal and the colour video signal
-
- H—ELECTRICITY
- H04—ELECTRIC COMMUNICATION TECHNIQUE
- H04N—PICTORIAL COMMUNICATION, e.g. TELEVISION
- H04N9/00—Details of colour television systems
- H04N9/79—Processing of colour television signals in connection with recording
- H04N9/80—Transformation of the television signal for recording, e.g. modulation, frequency changing; Inverse transformation for playback
- H04N9/82—Transformation of the television signal for recording, e.g. modulation, frequency changing; Inverse transformation for playback the individual colour picture signal components being recorded simultaneously only
- H04N9/8205—Transformation of the television signal for recording, e.g. modulation, frequency changing; Inverse transformation for playback the individual colour picture signal components being recorded simultaneously only involving the multiplexing of an additional signal and the colour video signal
- H04N9/8227—Transformation of the television signal for recording, e.g. modulation, frequency changing; Inverse transformation for playback the individual colour picture signal components being recorded simultaneously only involving the multiplexing of an additional signal and the colour video signal the additional signal being at least another television signal
-
- H—ELECTRICITY
- H04—ELECTRIC COMMUNICATION TECHNIQUE
- H04N—PICTORIAL COMMUNICATION, e.g. TELEVISION
- H04N9/00—Details of colour television systems
- H04N9/79—Processing of colour television signals in connection with recording
- H04N9/87—Regeneration of colour television signals
- H04N9/877—Regeneration of colour television signals by assembling picture element blocks in an intermediate memory
-
- G—PHYSICS
- G11—INFORMATION STORAGE
- G11B—INFORMATION STORAGE BASED ON RELATIVE MOVEMENT BETWEEN RECORD CARRIER AND TRANSDUCER
- G11B2220/00—Record carriers by type
- G11B2220/20—Disc-shaped record carriers
- G11B2220/25—Disc-shaped record carriers characterised in that the disc is based on a specific recording technology
- G11B2220/2537—Optical discs
- G11B2220/2562—DVDs [digital versatile discs]; Digital video discs; MMCDs; HDCDs
-
- H—ELECTRICITY
- H04—ELECTRIC COMMUNICATION TECHNIQUE
- H04N—PICTORIAL COMMUNICATION, e.g. TELEVISION
- H04N5/00—Details of television systems
- H04N5/76—Television signal recording
- H04N5/765—Interface circuits between an apparatus for recording and another apparatus
- H04N5/775—Interface circuits between an apparatus for recording and another apparatus between a recording apparatus and a television receiver
-
- H—ELECTRICITY
- H04—ELECTRIC COMMUNICATION TECHNIQUE
- H04N—PICTORIAL COMMUNICATION, e.g. TELEVISION
- H04N5/00—Details of television systems
- H04N5/76—Television signal recording
- H04N5/78—Television signal recording using magnetic recording
- H04N5/781—Television signal recording using magnetic recording on disks or drums
-
- H—ELECTRICITY
- H04—ELECTRIC COMMUNICATION TECHNIQUE
- H04N—PICTORIAL COMMUNICATION, e.g. TELEVISION
- H04N9/00—Details of colour television systems
- H04N9/79—Processing of colour television signals in connection with recording
- H04N9/80—Transformation of the television signal for recording, e.g. modulation, frequency changing; Inverse transformation for playback
- H04N9/804—Transformation of the television signal for recording, e.g. modulation, frequency changing; Inverse transformation for playback involving pulse code modulation of the colour picture signal components
- H04N9/806—Transformation of the television signal for recording, e.g. modulation, frequency changing; Inverse transformation for playback involving pulse code modulation of the colour picture signal components with processing of the sound signal
- H04N9/8063—Transformation of the television signal for recording, e.g. modulation, frequency changing; Inverse transformation for playback involving pulse code modulation of the colour picture signal components with processing of the sound signal using time division multiplex of the PCM audio and PCM video signals
Definitions
- The present invention relates to a playback device and playback method, a program, a program storage medium, a data structure, and a recording medium manufacturing method, and in particular to a playback device and playback method, a program, a program storage medium, a data structure, and a recording medium manufacturing method suitable for playing back stream data.
- Conventionally, the method of displaying a sub display screen relative to the main display screen has depended on the playback device; it was therefore difficult for content creators or content distributors to specify the display method for each content or for each predetermined playback section of a content.
- The present invention has been made in view of this situation and makes it possible to specify the display method of a sub display screen relative to the main display screen.
- The playback device according to the first aspect of the present invention includes: acquisition means for acquiring playback management information composed of first information, which includes a main playback path indicating positions on the time axis of a main stream file including at least one stream, and second information, which includes a secondary playback path of a substream file different from the main stream file; reading means for reading, based on the playback management information acquired by the acquisition means, the main stream file and the substream file to be played back; and video synthesis means for synthesizing, based on the playback management information, the video of the main stream file read by the reading means with the video of the substream file.
- The playback management information includes information on the display state of the video of the substream file to be combined with the video of the main stream file, and the video synthesis means combines the video of the main stream file with the video of the substream file based on the display-state information included in the playback management information.
- The information related to the display state may include information related to the display position of the video of the substream file to be combined with the video of the main stream file.
- The information related to the display state may include information related to the display size of the video of the substream file to be combined with the video of the main stream file.
- The information related to the display state may include information related to the display color of the video of the substream file to be combined with the video of the main stream file.
- The information related to the display state may include information related to the rotation angle, relative to the original video, of the video of the substream file to be combined with the video of the main stream file.
- The information related to the display state may include information indicating the luminance of the video of the substream file to be combined with the video of the main stream file, and information related to a transparency process to be executed on the video of the substream file, based on that luminance, when it is combined with the video of the main stream file.
- The information indicating the luminance may specify at least one threshold for the transparency process, and the transparency process may be applied to those portions of the substream file's video whose luminance is greater than or less than the threshold.
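The luminance-based transparency process described above can be sketched as a simple luma-key overlay. This is an illustrative interpretation, not the patent's implementation: the function name, the pixel representation as `(luma, pixel)` pairs, and the default thresholds are all assumptions made for the example.

```python
# Hypothetical luma-keying sketch: sub-video pixels whose luminance falls
# within [lower, upper] are treated as fully transparent when overlaid on
# the main video, as the display-state information above describes.

def luma_key_composite(main, sub, offset, lower=0, upper=18):
    """Overlay `sub` onto `main` at (row, col) `offset`.

    `main` is a 2-D list of pixels; `sub` is a 2-D list of (luma, pixel)
    pairs. Sub pixels with luma in [lower, upper] are keyed out, letting
    the main video show through.
    """
    out = [row[:] for row in main]          # copy the main video frame
    r0, c0 = offset
    for r, row in enumerate(sub):
        for c, (luma, pixel) in enumerate(row):
            if lower <= luma <= upper:      # keyed-out: keep main pixel
                continue
            out[r0 + r][c0 + c] = pixel     # opaque sub pixel wins
    return out
```

A dark (luma 0) sub pixel is skipped, while a bright one replaces the corresponding main pixel, which matches the "transparency on a luminance threshold" behavior described in the text.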
- An operation input unit that receives operation input from a user can further be provided.
- The playback management information acquired by the acquisition unit may describe information indicating whether the substream file is to be played back simultaneously with the main stream file only when commanded by the user. When the playback management information contains information indicating that the substream file is played back simultaneously with the main stream file only when commanded by the user, the video of the main stream file and the video of the substream file are combined only when playback of the substream file is commanded through a user operation input received by the operation input means.
- Audio synthesis means for synthesizing the audio played back corresponding to the main stream file with the audio played back corresponding to the substream file can further be provided.
- When the video of the main stream file and the video of the substream file are synthesized by the video synthesis means, the audio synthesis means can synthesize the audio played back corresponding to the main stream file with the audio played back corresponding to the substream file.
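The audio synthesis described above amounts to mixing two decoded sample streams. The following is a minimal sketch under stated assumptions, not the patent's implementation: 16-bit PCM samples as plain integers, and a made-up `sub_gain` parameter for attenuating the substream audio.

```python
# Illustrative mixing of the audio played back for the main stream with the
# audio played back for the substream, as the audio synthesis means would
# do when both videos are shown together.

def mix_audio(main_samples, sub_samples, sub_gain=0.5):
    """Mix two 16-bit PCM sample sequences; the shorter one is zero-padded,
    and the result is clipped to the 16-bit range."""
    n = max(len(main_samples), len(sub_samples))
    mixed = []
    for i in range(n):
        m = main_samples[i] if i < len(main_samples) else 0
        s = sub_samples[i] if i < len(sub_samples) else 0
        v = int(m + sub_gain * s)
        mixed.append(max(-32768, min(32767, v)))   # clip to int16 range
    return mixed
```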
- The playback management information can be written so as to include information indicating that the display-state information is included.
- In the playback method and program according to the first aspect of the present invention, playback management information composed of first information, which includes a main playback path indicating positions on the time axis of a main stream file including at least one stream, and second information, which includes a secondary playback path of a substream file different from the main stream file, is read; the main stream file and the substream file to be played back are read based on the read playback management information; and the video of the main stream file and the video of the substream file are synthesized based on the display-state information included in the playback management information.
- That is, in the first aspect of the present invention, the playback management information is read out, the main stream file and the substream file to be played back are read based on it, and the read video of the main stream file and the video of the substream file are synthesized.
- The data structure of the second aspect of the present invention is a data structure including playback management information, which is information for managing playback of a main stream file including at least one stream. The playback management information includes first information, which includes a main playback path indicating positions on the time axis of the main stream file, second information, which includes a sub playback path of a substream file different from the main stream file, and information on the display state of the video of the substream file to be synthesized with the video of the main stream file.
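The data structure just described can be rendered as plain records. The sketch below is a paraphrase for illustration only: the class and field names (`PlayItem`, `DisplayState`, `luma_key_upper`, and so on) are assumptions and do not reproduce the actual on-disc syntax.

```python
# Hypothetical Python rendering of the playback management information:
# a main playback path, a secondary playback path, and display-state
# information for the substream (picture-in-picture) video.
from dataclasses import dataclass
from typing import List, Optional

@dataclass
class PlayItem:
    """One section of a playback path on a Clip's time axis."""
    clip_name: str
    in_time: int             # IN point (playback start) on the time axis
    out_time: int            # OUT point (playback end)

@dataclass
class DisplayState:
    """How the substream video is shown over the main video."""
    x: int = 0               # display position of the sub screen
    y: int = 0
    scale: float = 1.0       # display size
    rotation_deg: int = 0    # rotation relative to the original video
    luma_key_upper: Optional[int] = None  # luma threshold for transparency

@dataclass
class PlayList:
    """The playback management information."""
    main_path: List[PlayItem]                  # first information
    sub_path: List[PlayItem]                   # second information
    display_state: Optional[DisplayState] = None
```

With such a structure, a content author sets `display_state` once in the playback management information and every conforming player renders the sub screen the same way.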
- The recording medium manufacturing method according to the second aspect of the present invention is a method of manufacturing a recording medium on which data that can be played back by a playback device is recorded. Data having a data structure is generated that includes playback management information, which is information for managing playback of a main stream file including at least one stream; the playback management information includes first information, which includes a main playback path indicating positions on the time axis of the main stream file, second information, which includes a secondary playback path of a substream file different from the main stream file, and information on the display state of the video of the substream file to be synthesized with the video of the main stream file. The generated data is then recorded on the recording medium.
- a network is a mechanism in which at least two devices are connected and information can be transmitted from one device to another device.
- Devices that communicate via a network may be independent devices, or may be internal blocks that constitute one device.
- The communication may be wireless communication, wired communication, or a mixture of the two, that is, wireless communication in one section and wired communication in another. Furthermore, communication from one device to another device may be wired while communication in the opposite direction is wireless.
- the recording device may be an independent device, or may be a block that performs recording processing of the recording / reproducing device.
- According to the first aspect of the present invention, the video of a main stream file and the video of a substream file can be combined; in particular, they can be combined based on the information, included in the playback management information, on the display state of the substream file's video. For example, the size and display position of the sub display screen in picture-in-picture display can be determined appropriately by content producers or content distributors, without depending on the specifications of the playback device.
- According to the second aspect of the present invention, data can be provided that has a data structure including playback management information, which is information for managing playback of a main stream file including at least one stream.
- The playback management information includes first information, which includes the main playback path indicating the position of the main stream file on the time axis, second information, which includes the secondary playback path of a substream file different from the main stream file, and information on the display state of the video of the substream file to be combined with the video of the main stream file.
- As a result, the size and display position of the sub display screen in picture-in-picture display can be determined appropriately by the content producer or content distributor without depending on the specifications of the playback device.
- FIG. 1 is a diagram showing an example of an application format on a recording medium mounted on a playback device to which the present invention is applied.
- FIG. 2 is a diagram for explaining the structure of a main path and a sub path.
- FIG. 3 is a diagram illustrating an example of a main path and a sub path.
- FIG. 4 is a diagram for explaining another example of a main path and a sub path.
- FIG. 5 is a diagram for explaining a main display screen and a sub display screen.
- FIG. 6 is a diagram showing an example of a file system of a reproducible data file.
- FIG. 7 is a diagram showing a data structure of a PlayList file having an extension “.mpls” stored in a PLAYLIST directory.
- FIG. 8 is a diagram illustrating the syntax of PlayListExtensionData().
- FIG. 9 is a diagram for explaining a first example of the syntax of pip_metadata described in data_block.
- FIG. 10 is a diagram for explaining the value of pip_metadata_type in FIG. 9 and its meaning.
- FIG. 11 is a diagram illustrating a second example of the syntax of pip_metadata described in data_block.
- FIG. 12 is a diagram for explaining a third example of the syntax of pip_metadata described in data_block.
- FIG. 13 is a diagram for explaining the value of pip_scale in FIG. 12 and its meaning.
- FIG. 14 is a diagram for explaining the value of pip_timeline_type in FIG. 12 and its meaning.
- FIG. 18 is a diagram illustrating the syntax of PlayList().
- FIG. 19 is a diagram illustrating the syntax of SubPath().
- FIG. 20 is a diagram for explaining SubPath_type.
- FIG. 21 is a diagram illustrating the syntax of SubPlayItem(i).
- FIG. 22 is a diagram showing the syntax of PlayItem().
- FIG. 23 is a diagram illustrating the syntax of STN_table().
- FIG. 24 is a diagram illustrating the syntax of stream_entry().
- FIG. 25 is a diagram illustrating the syntax of stream_attribute().
- FIG. 26 is a diagram for explaining stream_coding_type.
- FIG. 27 is a diagram for explaining video_format.
- FIG. 28 is a diagram for explaining frame_rate.
- FIG. 29 is a diagram for explaining aspect_ratio.
- FIG. 30 is a diagram for explaining audio_presentation_type.
- FIG. 31 is a diagram for explaining sampling_frequency.
- FIG. 32 is a diagram for explaining the Character code.
- FIG. 33 is a diagram illustrating the syntax of STN_table().
- FIG. 34 is a diagram illustrating the syntax of STN_table().
- FIG. 35 is a block diagram illustrating a configuration example of a playback device to which the present invention has been applied.
- FIG. 36 is a flowchart for explaining playback processing in the playback apparatus of FIG. 35.
- FIG. 37 is a flowchart for explaining primary video playback processing.
- FIG. 38 is a flowchart illustrating primary video playback processing.
- FIG. 39 is a flowchart for explaining primary video playback processing.
- FIG. 40 is a flowchart for explaining primary and secondary video playback processing.
- FIG. 41 is a flowchart illustrating primary and secondary video playback processing.
- FIG. 42 is a flowchart for explaining primary and secondary video playback processing.
- FIG. 43 is a diagram for explaining positioning and scaling.
- FIG. 44 is a diagram for explaining luma_keying synthesis.
- FIG. 45 is a diagram for explaining luma_keying composition.
- FIG. 46 is a diagram for explaining the manufacture of a recording medium on which data that can be played back by the playback device is recorded.
- FIG. 47 is a diagram for explaining the manufacture of a recording medium on which data that can be played back by the playback device is recorded.
- FIG. 48 is a diagram showing a configuration of a personal computer.
- Fig. 1 is a diagram showing an example of the application format on a recording medium mounted in a playback device 20 (described later with reference to Fig. 35) to which the present invention is applied.
- the recording medium may be a magnetic disk or a semiconductor memory in addition to the optical disk described later.
- the application format has two layers, Play List and Clip, for managing AV (Audio Visual) streams.
- an AV stream and its Clip information, which is information related to that AV stream, are handled as a pair of objects, and the pair is collectively called a Clip.
- an AV stream is also referred to as an AV stream file.
- Clip information is also referred to as a Clip information file.
- a file used in a computer or the like is handled as a byte string.
- the content of an AV stream file is expanded on the time axis, and access points in a Clip are mainly specified by a PlayList with time stamps.
- PlayList and Clip are layers for AV stream management.
- the Clip Information file is used to find address information to start decoding in the AV stream file from the time stamp.
- the PlayList is a collection of AV stream playback sections.
- One playback section in an AV stream is called a PlayItem, and it is represented by a pair of an IN point (playback start point) and an OUT point (playback end point) of the playback section on the time axis. Therefore, a PlayList is composed of one or more PlayItems as shown in FIG.
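To make the PlayList/PlayItem relationship above concrete, the following is a minimal Python sketch in which a PlayList is a sequence of IN/OUT pairs. All class and field names here are hypothetical mirrors of the patent's terms, not the binary .mpls layout, and the 45 kHz tick unit is an assumption for illustration.

```python
from dataclasses import dataclass, field
from typing import List

@dataclass
class PlayItem:
    clip_information_file_name: str  # Clip that this PlayItem refers to
    in_time: int    # playback start point, e.g. in 45 kHz ticks (assumed unit)
    out_time: int   # playback end point

    def duration(self) -> int:
        return self.out_time - self.in_time

@dataclass
class PlayList:
    play_items: List[PlayItem] = field(default_factory=list)

    def total_duration(self) -> int:
        # A PlayList is composed of one or more PlayItems played in sequence.
        return sum(item.duration() for item in self.play_items)

# Like the first PlayList of FIG. 1: two PlayItems referring to the first
# and second halves of the same Clip.
pl = PlayList([
    PlayItem("00001.clpi", in_time=0, out_time=45000 * 60),
    PlayItem("00001.clpi", in_time=45000 * 60, out_time=45000 * 120),
])
```
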
- the first PlayList from the left is composed of two PlayItems, which refer to the first half and the second half, respectively, of the AV stream included in the left Clip.
- the second PlayList from the left is composed of one PlayItem, which refers to the entire AV stream included in the right Clip.
- the third PlayList from the left is composed of two PlayItems, which refer to a part of the AV stream included in the left Clip and a part of the AV stream included in the right Clip, respectively.
- when the disc navigation program in FIG. 1 specifies the left PlayItem included in the first PlayList from the left as the information indicating the playback position at that time, the first half of the AV stream included in the left Clip, which that PlayItem refers to, is played back.
- the PlayList is used as playback management information for managing playback of AV stream files.
- the disc navigation program has a function of controlling the playback order of the PlayList and interactive playback of the PlayList.
- the disc navigation program also has a function of displaying a menu screen from which the user instructs the execution of various kinds of playback. This disc navigation program is described in a programming language such as Java (registered trademark) and prepared on a recording medium.
- in the PlayList, a playback path created by a sequence of one or more PlayItems (by consecutive PlayItems) is referred to as a Main Path.
- in the PlayList, a playback path created by one or more Sub Paths (by SubPlayItems, which may be consecutive or non-consecutive) in parallel with the Main Path is referred to as a Sub Path. That is, the application format on the recording medium mounted on the playback device 20 (described later with reference to FIG. 35) holds, in the PlayList, a Sub Path to be played back in association with the Main Path.
- FIG. 2 is a diagram illustrating the structure of the main path and the sub path.
- a PlayList can have one main path and one or more subpaths.
- a Main Path is created by a sequence of one or more PlayItems, and a Sub Path is created by a sequence of one or more SubPlayItems.
- the PlayList has one Main Path made up of a sequence of three PlayItems and three Sub Paths.
- each PlayItem constituting the Main Path is assigned an ID (Identification) in order.
- a ClipAV stream file referred to by one PlayItem includes at least video stream data (main image data).
- a ClipAV stream file may or may not include one or more audio streams that are played back at the same timing as (in synchronization with) the video stream (main image data) included in the ClipAV stream file.
- the ClipAV stream file may or may not include one or more bitmap subtitle streams that are played back at the same timing as the video stream included in the ClipAV stream file.
- the ClipAV stream file may or may not include one or more interactive graphics streams that are played back at the same timing as the video stream included in the ClipAV stream file.
- that is, in a ClipAV stream file referred to by one PlayItem, video stream data, zero or more audio streams to be played back in accordance with the video stream, zero or more bitmap subtitle stream data, and zero or more interactive graphics stream data are multiplexed.
- in other words, a ClipAV stream file referred to by one PlayItem includes a plurality of types of streams, such as a video stream, audio streams, bitmap subtitle streams, or interactive graphics streams.
- one SubPlayItem refers to audio stream data or subtitle data of a stream (a different stream) different from the ClipAV stream file referred to by the PlayItem.
- FIG. 3 is a diagram for explaining an example of the main path and the sub path.
- the audio playback path that is played at the same timing as the main path (synchronized with AV) is shown using sub-paths.
- SubPlayItem() includes the following data.
- SubPlayItem() includes Clip_Information_file_name for designating the Clip referred to by the Sub Path in the PlayList.
- SubPlayItem() includes SubPlayItem_IN_time and SubPlayItem_OUT_time for designating the playback section of the Sub Path in the Clip (here, the auxiliary audio stream).
- SubPlayItem() includes sync_PlayItem_id and sync_start_PTS_of_PlayItem for designating the time at which the Sub Path starts playback on the time axis (playback time axis) of the Main Path.
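The role of sync_PlayItem_id and sync_start_PTS_of_PlayItem can be sketched as follows: they place the start of the Sub Path on the Main Path's playback time axis. The arithmetic below is an assumption for illustration; an actual player resolves these times through the Clip information files.

```python
def subpath_start_on_main_axis(playitem_starts, sync_playitem_id, sync_start_pts):
    """Return the Sub Path start time on the Main Path time axis.

    playitem_starts[i] is the (hypothetical) start of PlayItem i on the
    PlayList time axis; sync_start_pts is an offset into that PlayItem.
    """
    return playitem_starts[sync_playitem_id] + sync_start_pts

# A Main Path with two PlayItems starting at 0 and 90000 (45 kHz ticks,
# assumed unit); the Sub Path begins 45000 ticks into PlayItem 1.
starts = [0, 90000]
t = subpath_start_on_main_axis(starts, sync_playitem_id=1, sync_start_pts=45000)
```
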
- the Clip AV stream of the audio referred to by the Sub Path must not include STC discontinuities (system time base discontinuities).
- the audio sample clock of the Clip used for the Sub Path is locked to the audio sample clock of the Main Path.
- SubPlayItem() contains information specifying the Clip referred to by the Sub Path, information specifying the playback section of the Sub Path, and information specifying the time at which the Sub Path starts playback on the time axis of the Main Path.
- since the Clip AV stream used for the Sub Path does not include STC discontinuities, based on the information included in SubPlayItem() (the information specifying the Clip referred to by the sub path, the information specifying the playback section of the sub path, and the information specifying the time at which the sub path starts playback on the time axis of the main path), playback can be performed with reference to an audio stream of a ClipAV stream different from the ClipAV stream (main AV stream) referred to by the main path.
- that is, the PlayItem and the SubPlayItem each manage ClipAV stream files, and the ClipAV stream file (main AV stream) managed by the PlayItem and the ClipAV stream file managed by the SubPlayItem are different files.
- a subtitle stream playback path that is played back at the same timing as the main path can be expressed using a subpath.
- FIG. 4 is a diagram for explaining another example of the main path and the sub path.
- the video and audio playback paths that are played back at the same timing as the main path (synchronized with AV) are expressed using sub-paths.
- the stream file referred to by PlayItem-1 in the main path is the first half of the primary video stream and primary audio stream of Clip-0, together with the PG (Presentation Graphics) stream and IG (Interactive Graphics) stream.
- the main AV stream file referred to by PlayItem-2 is the latter half of the primary video stream and primary audio stream of Clip-0, together with the PG stream and IG stream.
- the stream files referred to by the SubPlayItem in the sub path are the secondary video stream and secondary audio stream of Clip-1.
- such a configuration is used, for example, when the primary video stream, primary audio stream, PG stream, and IG stream referred to by the main path are treated as one movie content (AV content), the secondary video stream and secondary audio stream referred to by the sub path are treated as a commentary on the movie as a bonus track, and the video and audio streams referred to by the sub path are mixed (superimposed) with the video and audio streams referred to by the main path for playback.
- even when the secondary stream is not set in the playlist to be synchronized with the primary stream (to be always played back simultaneously with it), this configuration is used in the same manner: while watching the movie, the user inputs a command to display the bonus track to the playback device (player), and the video and audio of the primary stream referred to by the main path and the video and audio of the secondary stream referred to by the sub path are mixed and played back.
- the SubPlayItem (described later with FIG. 21) called by the SubPath (described later with FIG. 19) includes SubPlayItem_IN_time and SubPlayItem_OUT_time for specifying the playback section of the Sub Path.
- in this case, the audio is synthesized (mixed) and output, and the video is displayed as shown in FIG. 5.
- that is, the secondary video is displayed on the main display screen 1, on which the primary video is displayed, superimposed as a sub display screen 2 of a predetermined size at a predetermined position.
- in FIG. 5, the stream referred to by the main path is the primary stream, and the stream referred to by the sub path is the secondary stream.
- however, both the primary stream and the secondary stream may be referred to by the main path, or both may be referred to by the sub path.
- in this way, the primary video stream is displayed as a main screen, and the secondary video stream is combined and displayed as a child screen within the main screen. This display method is the so-called PinP (Picture-in-Picture) display.
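The picture-in-picture geometry of FIG. 5 can be illustrated with a small sketch: the secondary video is overlaid on the primary video's frame at a given upper-left corner and scale. The names follow the pip_horizontal_position / pip_vertical_position / pip_scale fields described later; treating pip_scale as a plain multiplier is an assumption made here for illustration.

```python
def sub_display_rect(secondary_size, pip_horizontal_position,
                     pip_vertical_position, pip_scale):
    """Return (x, y, width, height) of sub display screen 2 on the
    primary video frame (main display screen 1)."""
    w, h = secondary_size
    return (pip_horizontal_position, pip_vertical_position,
            int(w * pip_scale), int(h * pip_scale))

# A 720x480 secondary video shown at quarter scale at position (1000, 100).
rect = sub_display_rect((720, 480), 1000, 100, 0.25)
```
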
- FIG. 6 is a diagram showing an example of a file system of a data file that can be played back by the playback device 20 (described later with reference to FIG. 35).
- a data file that can be played back by the playback device 20 is supplied on a recording medium such as an optical disk.
- This file system has a directory structure.
- also in a case where a data file that can be played back by the playback device 20 is recorded on and supplied by a recording medium other than an optical disk, or is recorded on a recording medium inside the playback device 20 and played back from it, the data file that can be played back by the playback device 20 preferably has a file system as shown in FIG. 6.
- the Index file is a file in which the above-described index table is described, and includes information regarding a menu for reproducing a data file that can be reproduced by the reproducing device 20.
- based on the Index file, the playback device 20 displays on a display device a playback menu screen containing items such as playing back all the content included in a data file that can be played back by the playback device 20, playing back only a specific chapter, playing back repeatedly, and displaying an initial menu.
- a NavigationObject to be executed when each item is selected can be set in the index table of the Index file, and when the user selects one item from the playback menu screen, the playback device 20 executes the command of the NavigationObject set in the index table of the Index file.
- the NavigationObject file is a file including NavigationObject.
- the NavigationObject includes a command that controls the playback of the PlayList included in the data file that can be played back by the playback device 20.
- the playback device 20 can play back content by selecting and executing one of the NavigationObjects included in the file system.
- the BDMV directory also contains a directory named “BACKUP” (BACKUP directory), a directory named “PLAYLIST” (PLAYLIST directory), a directory named “CLIPINF” (CLIPINF directory), a directory named “STREAM” (STREAM directory), and a directory named “AUXDATA” (AUXDATA directory).
- PlayList files are stored in the PLAYLIST directory. As shown in the figure, each PlayList file is named by adding the extension “.mpls” to a file name consisting of a five-digit number. The PlayList file will be described later with reference to FIG. 7.
- Clip Information files are stored in the CLIPINF directory. As shown in the figure, each Clip Information file is named by adding the extension “.clpi” to a file name consisting of a five-digit number.
- Clip AV stream files and substream files are stored in the STREAM directory. As shown in the figure, each stream file is named by adding the extension “.m2ts” to a file name consisting of a five-digit number.
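The naming rules above (a five-digit number plus a directory-specific extension) can be sketched with a small hypothetical helper; these functions are illustrative only and not part of any real API.

```python
import re

# Extensions per directory, as described in the text.
_EXT = {"PLAYLIST": ".mpls", "CLIPINF": ".clpi", "STREAM": ".m2ts"}

def bdmv_file_name(directory, number):
    """Build a file name such as '00001.mpls' from a number 0..99999."""
    if not 0 <= number <= 99999:
        raise ValueError("file names use a five-digit number")
    return f"{number:05d}{_EXT[directory]}"

def is_valid_bdmv_name(directory, name):
    """Check that a name is five digits plus the directory's extension."""
    return re.fullmatch(r"\d{5}" + re.escape(_EXT[directory]), name) is not None
```
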
- the AUXDATA directory stores files of data that are not included in the Clip AV stream files or substream files but are referenced from them, and files of data used independently of the Clip AV stream files and substream files.
- in the example of FIG. 6, the AUXDATA directory stores a subtitle font file named “11111.otf” and sound effect data named “sound.bdmv”.
- when a data file that can be played back by the playback device 20 is distributed on an optical disk, an author_id, which is an identifier assigned to each title author (the content producer, such as a production company or movie distributor, that is, the supply source of this recording medium) in order to identify that title author, and a disc_id, which is an identifier assigned to identify the type of optical disk produced by the title author indicated by author_id, are recorded as secure electronic data that cannot be rewritten by the user, or recorded physically in pits.
- when a data file that can be played back by the playback device 20 is recorded on a removable recording medium other than an optical disk, downloaded over a network, or recorded on a recording medium or internal storage unit of the playback device, identifiers corresponding to author_id and disc_id are assigned so that they can be distinguished from each other, and the data preferably has a directory structure similar to that shown in FIG. 6.
- the name “Index.bdmv” is the same as in the case described with reference to FIG.
- FIG. 7 shows the data structure of the PlayList file having the extension “.mpls” stored in the PLAYLIST directory.
- type_indicator describes information indicating the type of this file; this field must be encoded as "MPLS" in accordance with ISO 646.
- version_number indicates four characters representing the version number of xxxx.mpls; version_number must be encoded as "0089" in accordance with ISO 646.
- PlayList_start_address indicates the start address of PlayListO in units of the relative number of bytes from the start byte of the PlayList file.
- PlayListMark_start_address indicates the start address of PlayListMarkO in units of the relative number of bytes from the start byte of the PlayList file.
- PlayListExtensionData_start_address indicates the start address of PlayListExtensionData() in units of the relative number of bytes from the first byte of the PlayList file.
- PlayList() stores parameters related to the Main Path and Sub Path of the PlayList. Details of PlayList() will be described later with reference to FIG. 18.
- PlayListMark() stores mark information of the PlayList. Private data can be inserted into PlayListExtensionData().
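To make the header fields above concrete, here is a minimal, hypothetical sketch of reading them from a byte string. The field widths and offsets used (a 4-byte type_indicator, a 4-byte version_number, then three 32-bit big-endian relative addresses) are assumptions for illustration, not the normative file layout.

```python
import struct

def parse_playlist_header(data: bytes):
    """Parse the assumed header layout of a PlayList (.mpls-style) file."""
    type_indicator = data[0:4].decode("ascii")    # must be "MPLS"
    version_number = data[4:8].decode("ascii")
    playlist_start, mark_start, ext_start = struct.unpack_from(">III", data, 8)
    return {
        "type_indicator": type_indicator,
        "version_number": version_number,
        "PlayList_start_address": playlist_start,
        "PlayListMark_start_address": mark_start,
        "PlayListExtensionData_start_address": ext_start,
    }

# A fabricated header: "MPLS", version "0089", addresses 40 / 1000 / 2000.
hdr = parse_playlist_header(b"MPLS0089" + struct.pack(">III", 40, 1000, 2000))
```
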
- FIG. 8 shows the syntax of PlayListExtensionDataO.
- length indicates the number of bytes of PlayListExtensionData() from immediately after this length field to the end of PlayListExtensionData().
- data_blocks_start_address indicates the start address of the first data_block() in units of the relative number of bytes from the first byte of PlayListExtensionData().
- in ID1/ID2, information (an identifier or the like) that can identify the type of information described in the data_block of this PlayListExtensionData() is described.
- PL_ext_data_start_address indicates the number of the data_block where the PlayListExtensionData starts. The top data of PlayListExtensionData must be aligned to the top of data_block.
- PL_ext_data_length indicates the size of PlayListExtensionData in bytes.
- data_block is an area in which PlayListExtensionData is stored, and all data_blocks in PlayListExtensionData() must have the same size.
- in data_block, information indicating the display position and size of the sub display screen 2 on which the secondary video described with reference to FIG. 5 is displayed can be described.
- in data_block, metadata indicating picture-in-picture display settings can be described.
- FIG. 9 shows a first example of the syntax of pip_metadata (metadata indicating picture-in-picture display settings) described in data_block.
- length indicates the number of bytes of pip_metadata() from immediately after this length field to the end of pip_metadata().
- pip_metadata_type indicates the type of pip_metadata.
- when pip_metadata_type indicates the synchronous type, the secondary video stream whose picture-in-picture display is specified by this pip_metadata() is played back in synchronization with the time axis (playback time axis) of the PlayItem of the Main Path, so the secondary stream is always played back and output in synchronization with the primary stream.
- when pip_metadata_type indicates the asynchronous type, the secondary video stream whose picture-in-picture display is specified by this pip_metadata() is played back in synchronization with the time axis of the SubPlayItem of the Sub Path and need not be synchronized with the time axis of the PlayItem of the Main Path, so the secondary stream is played back and output only when display of the secondary stream is commanded by the user's operation input.
- ref_to_PlayItem/SubPath_id indicates either the value of the PlayItem_id of the PlayItem to which pip_metadata is applied or the value of the SubPath_id of the SubPlayItem to which pip_metadata is applied.
- pip_metadata_time_stamp is a time stamp of the PlayItem to which pip_metadata is applied; the video of the secondary video stream is displayed on the sub display screen 2 at the display position and size indicated by pip_metadata() at the timing indicated by pip_metadata_time_stamp.
- pip_metadata_time_stamp must indicate a presentation time between the In_time and Out_time of the PlayItem referred to by ref_to_PlayItem/SubPath_id.
- at that timing, the video of the secondary video stream is displayed on the sub display screen 2 at the display position and size specified by pip_metadata().
- pip_entry_video_PID indicates the value of the PID of the secondary video used for picture-in-picture display.
- pip_horizontal_position indicates the X coordinate of the upper left corner of the sub display screen 2, on which the secondary video is displayed, on the frame of the primary video (main display screen 1 in FIG. 5).
- pip_vertical_position indicates the Y coordinate of the upper left corner of the sub display screen 2, on which the secondary video is displayed, on the frame of the primary video (main display screen 1 in FIG. 5).
- instead of specifying the X and Y coordinates of the upper left corner of the sub display screen 2 with pip_horizontal_position and pip_vertical_position, the sub display screen 2 may be specified by another method, for example by the X and Y coordinates of its lower right corner, or by selecting one of a plurality of predetermined display positions.
- pip_scale is a field in which information indicating the size of the sub display screen 2 on which the secondary video is displayed is described. For example, it describes the ratio of the size of the secondary video to the primary video, the reduction ratio with respect to the size of the original image of the secondary video, or information indicating how many pixels the sub display screen 2 contains in the vertical and horizontal directions.
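The fields of this first pip_metadata() syntax can be collected into a plain record, as sketched below. This is purely illustrative: the Python types are assumptions, and the 0x01/0x02 value assignments for the synchronous/asynchronous types are assumed for the sketch (the text only states that 0x00 and the remaining values are reserved).

```python
from dataclasses import dataclass

@dataclass
class PipMetadataEntry:
    pip_metadata_type: int            # assumed: 0x01 synchronous, 0x02 asynchronous
    ref_to_playitem_or_subpath_id: int
    pip_metadata_time_stamp: int      # when to apply, on the referenced time axis
    pip_entry_video_pid: int          # PID of the secondary video
    pip_horizontal_position: int      # X of sub display screen 2's upper left corner
    pip_vertical_position: int        # Y of sub display screen 2's upper left corner
    pip_scale: float                  # size of sub display screen 2

    @property
    def is_synchronous(self) -> bool:
        # Synchronous entries follow the Main Path PlayItem time axis and are
        # always presented with the primary stream; asynchronous entries are
        # shown only when the user requests the secondary display.
        return self.pip_metadata_type == 0x01

e = PipMetadataEntry(0x01, 0, 45000, 0x1012, 1000, 100, 0.5)
```
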
- FIG. 10 shows the values of pip_metadata_type in FIG. 9 and their meanings.
- pip_metadata_type = 0x00 is a value whose meaning is not set, reserved for future extension.
- when pip_metadata_type indicates the synchronous type, this pip_metadata() is synchronized with the time axis of the PlayItem of the Main Path, so the secondary stream is always played back in synchronization with the primary stream.
- when pip_metadata_type indicates the asynchronous type, this pip_metadata() is synchronized with the time axis of the SubPlayItem of the Sub Path, so the video of the secondary video stream is played back and output to the sub display screen 2 only when display of the secondary stream is commanded by the user's operation input.
- the remaining values have no meaning set and are reserved for future extension.
- FIG. 11 shows a second example of the syntax of pip_metadata described in data_block, different from the case of FIG. 9. The description of data fields having the same names and definitions as in FIG. 9 is omitted as appropriate.
- number_of_pip_entries is a field indicating the number of picture-in-picture applications.
- Synchronous_PIP_metadata_flag indicates the path with which the picture-in-picture application (that is, the secondary video stream) is synchronized. That is, when the value of Synchronous_PIP_metadata_flag is 0, this picture-in-picture application is synchronized with the SubPath, and ref_to_SubPath_id is registered in ref_to_PlayItem/SubPath_id. On the other hand, when the value of Synchronous_PIP_metadata_flag is 1, this picture-in-picture application is synchronized with the MainPath, and ref_to_PlayItem is registered in ref_to_PlayItem/SubPath_id.
- that is, Synchronous_PIP_metadata_flag in FIG. 11 is information defining almost the same content as pip_metadata_type (FIG. 10) in the pip_metadata described with reference to FIG. 9.
- pip_metadata_type indicates the type of the metadata of this picture-in-picture. For example, information on the rotation, color designation, and the like of the image displayed for the secondary video with respect to the image of the original stream can be described.
- that is, pip_metadata_type in FIG. 11 is information whose definition differs from that of pip_metadata_type in the pip_metadata described with reference to FIG. 9.
- FIG. 12 shows a further example of the syntax of pip_metadata described in data_block, different from the cases of FIG. 9 and FIG. 11.
- length is a field in which information indicating the number of bytes of pip_metadata() from immediately after this length field to the end of pip_metadata() is described.
- number_of_metadata_block_entries is a field in which information indicating the number of metadata block entries included in pip_metadata() is described.
- in metadata_block_header[k], information including header information related to the metadata block is described.
- ref_to_PlayItem_id[k] is a field in which information indicating the value of the PlayItem_id of the PlayItem to which pip_metadata is applied is described.
- ref_to_secondary_video_stream_id[k] is a field in which information indicating the id of the secondary video to which pip_metadata is applied, that is, the value of secondary_video_stream_id defined in the STN_table (described later with FIG. 22) of the PlayItem referred to by ref_to_PlayItem_id, is described.
- pip_timeline_type[k] is a field in which information indicating the type of time axis referred to by pip_metadata_time_stamp is described. The values of pip_timeline_type[k] and their meanings will be described later.
- is_luma_key is a flag indicating whether luma keying is applied to the secondary video stream in accordance with the values of lower_limit_luma_key and upper_limit_luma_key.
- lower_limit_luma_key is a field in which information indicating the lower limit of the luminance value of the secondary video for luma keying is described.
- upper_limit_luma_key is a field in which information indicating the upper limit of the luminance value of the secondary video for luma keying is described.
- here, luma keying means cutting out an unnecessary part of an image using the difference in brightness (luminance value) and superimposing the resulting image on the video for composition.
- that is, when the is_luma_key flag is set to 1, the image components whose brightness values fall within the range from the lower limit to the upper limit defined by lower_limit_luma_key and upper_limit_luma_key are set to transparent.
- then the secondary video, from which the image components in the range specified by this brightness information have been removed, is superimposed on the primary video and synthesized.
- note that the transparent range may also be set with only one of the thresholds; for example, only the part whose luminance value is lower than upper_limit_luma_key, or only the part whose luminance value is higher than lower_limit_luma_key, can be set to be transparent.
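The luma keying rule above can be sketched as follows: secondary-video pixels whose luminance falls within [lower_limit_luma_key, upper_limit_luma_key] become transparent, and the remaining pixels overlay the primary video. Flat lists of luminance values stand in for frames here; this is an illustration of the rule, not a real compositing pipeline.

```python
def luma_key_composite(primary, secondary, lower_limit_luma_key, upper_limit_luma_key):
    """primary/secondary: lists of luminance values for corresponding pixels."""
    out = []
    for p, s in zip(primary, secondary):
        if lower_limit_luma_key <= s <= upper_limit_luma_key:
            out.append(p)   # secondary pixel is transparent: primary shows through
        else:
            out.append(s)   # opaque secondary pixel overlays the primary
    return out

# Secondary pixels with luminance 0..20 are keyed out against a flat primary.
frame = luma_key_composite([100, 100, 100], [10, 200, 15], 0, 20)
```
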
- metadata_block_data_start_address[k] is a field in which information indicating the start address of the first metadata_block_data[k]() is described, in units of the relative number of bytes from the first byte of pip_metadata(). Note that the metadata_block_data_start_address[k] entries in pip_metadata() must be registered in ascending order of address value.
- padding words are inserted into padding_word in accordance with the value of metadata_block_data_start_address[k].
- pip_metadata_time_stamp[i] is a field in which information including a time stamp, for example at 45 kHz, indicating the time at which pip_metadata is applied is described. Note that the interval between two consecutive pip_metadata_time_stamps must be at least one second, and that the time axis referred to depends on the value of pip_timeline_type.
- pip_composition_metadata() is a field in which information on the display size and position of the secondary video on the frame of the primary video is described.
- pip_horizontal_position [i] indicates the X coordinate of the upper left corner of the secondary display screen 2 on which the secondary video is displayed on the primary video frame (main display screen 1 in FIG. 5).
- pip_vertical_position [i] indicates the Y coordinate of the upper left corner of the secondary display screen 2 on which the secondary video is displayed on the primary video frame (main display screen 1 in FIG. 5).
- pip_scale[i] is a field in which information setting the scaling of the secondary video with respect to its original size to 1x, 1/2x, 1/4x, 1.5x, or full screen (the secondary video is displayed on the entire main display screen 1 in FIG. 5) is described.
- the pip_composition_metadata() of pip_metadata_time_stamp[i] is valid in the interval from pip_metadata_time_stamp[i] to pip_metadata_time_stamp[i+1].
- the valid interval of the last pip_composition_metadata extends from the last pip_metadata_time_stamp to the presentation end time of the SubPath indicated by ref_to_secondary_video_stream_id[k].
- pip_timeline_type = 0 is a value whose meaning is not set, reserved for future extension.
- pip_timeline_type = 1 means that the picture-in-picture presentation path is of the synchronous type. In this case, the SubPath type indicated by ref_to_secondary_video_stream_id must be 5 or 7 (details of the SubPath types will be described later with FIG. 20).
- in this case, pip_metadata_time_stamp refers to the time axis of the PlayItem of ref_to_PlayItem_id and indicates the playback section of the associated SubPlayItem. In other words, the playback section of the SubPlayItem is projected onto the time axis of the PlayItem referred to by ref_to_PlayItem_id[k].
- pip_metadata_time_stamp[0] indicates the beginning of the playback section of SubPlayItem[0] of the SubPath.
- pip_timeline_type = 2 means that the picture-in-picture presentation path is of the asynchronous type. In this case, the SubPath type indicated by ref_to_secondary_video_stream_id must be 6, and that SubPath must contain only one SubPlayItem.
- pip_metadata_time_stamp refers to the time axis of the SubPath and indicates the playback section of the SubPlayItem of the SubPath indicated by ref_to_secondary_video_stream_id[k].
- pip_metadata_time_stamp[0] indicates the beginning of the playback section of the SubPlayItem.
- pip_timeline_type = 3 means that the picture-in-picture presentation path is of the asynchronous type. In this case, the SubPath type indicated by ref_to_secondary_video_stream_id must be 6.
- pip_metadata_time_stamp refers to the time axis of the PlayItem of ref_to_PlayItem_id and indicates the playback section of the PlayItem of ref_to_PlayItem_id[k]. pip_metadata_time_stamp[0] indicates the beginning of the playback section of the PlayItem of ref_to_PlayItem_id[k].
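The pip_timeline_type rules above reduce to a small dispatch: which time axis pip_metadata_time_stamp refers to, and which SubPath types are permitted. The sketch below condenses the text for illustration; it is not a normative table.

```python
def pip_timestamp_axis(pip_timeline_type):
    """Return (time axis referenced by pip_metadata_time_stamp,
    set of permitted SubPath types), per the rules described above."""
    if pip_timeline_type == 1:   # synchronous presentation path
        return ("PlayItem time axis of ref_to_PlayItem_id", {5, 7})
    if pip_timeline_type == 2:   # asynchronous, timed against the SubPath
        return ("SubPath time axis", {6})
    if pip_timeline_type == 3:   # asynchronous, timed against the PlayItem
        return ("PlayItem time axis of ref_to_PlayItem_id", {6})
    raise ValueError("reserved pip_timeline_type")

axis, allowed_subpath_types = pip_timestamp_axis(2)
```
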
- FIG. 18 is a diagram illustrating the syntax of PlayList().
- length is a 32-bit unsigned integer indicating the number of bytes from immediately after this length field to the end of PlayList(). That is, this field indicates the number of bytes from reserved_for_future_use to the end of the PlayList. A 16-bit reserved_for_future_use follows this length.
- FIG. 19 is a diagram illustrating the syntax of SubPath().
- length is a 32-bit unsigned integer indicating the number of bytes from immediately after this length field to the end of SubPath(). That is, this field indicates the number of bytes from reserved_for_future_use to the end of the SubPath. A 16-bit reserved_for_future_use follows this length.
- SubPath_type is an 8-bit field indicating the application type of the SubPath. SubPath_type is used, for example, to indicate whether the Sub Path is audio, a bitmap subtitle, a text subtitle, or the like. SubPath_type will be described later with reference to FIG. 20.
- is_repeat_SubPath is a 1-bit field that specifies the playback method of the SubPath, and indicates whether playback of the SubPath is repeated during playback of the Main Path or performed only once. It is used, for example, when the playback timing of the main AV stream and that of the Clip specified by the sub path differ (for example, when the main path is a slide show of still images and the audio path of the sub path is used as BGM (background music) of the main path). After is_repeat_SubPath, an 8-bit reserved_for_future_use is prepared.
- FIG. 20 is a diagram for explaining an example of SubPath_type (sub path types). That is, the types of Sub Path are defined, for example, as shown in FIG. 20.
- the description "Out-of-mux type" means a type of sub path in which the TS (transport stream), that is, the Clip, containing the ES (elementary stream) referred to by the sub path is different from the TS (Clip) containing the play item (one or more ESs) referred to by the main path; in other words, a type of sub path in which the ES referred to by the sub path is not multiplexed into the TS (Clip) referred to by the main path.
- SubPath_type = 0, 1 are reserved.
- SubPath_type = 2 indicates that, in the playlist, the audio presentation path referred to by the sub path and the main path referred to by the play item are asynchronous. Hereinafter, this type (kind) of sub path is referred to as a main-path-TS non-multiplexed, asynchronous type path.
- here, a picture-in-picture presentation path means, in the picture-in-picture method described above, a path (a type of such sub path) of one or more of the primary audio stream, secondary video stream, secondary audio stream, and subtitle stream for a given primary video stream (the video stream referred to by the main path).
- SubPath_type = 6 is an Out-of-mux and AV non-synchronized type of Picture-in-Picture presentation path (which contains one or more elementary stream paths).
- that is, SubPath_type = 6 is a main-path-TS non-multiplexed, asynchronous type picture-in-picture presentation path (a path of one or more ESs).
- SubPath_type = 7 is an In-mux and AV synchronized type of Picture-in-Picture presentation path (which contains one or more elementary stream paths).
- here, the description "In-mux type" means a type of sub path in which the TS (Clip) containing the ES referred to by the sub path is the same as the TS (Clip) containing the play item (one or more ESs) referred to by the main path. This type of sub path is referred to as a main-path-TS multiplexed type path.
- that is, SubPath_type = 7 is a main-path-TS multiplexed, synchronous type picture-in-picture presentation path (a path of one or more ESs).
- SubPath_type 8 to 255 is reserved.
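The SubPath_type values defined in this excerpt can be condensed into a small lookup, as sketched below. The descriptions are shortened from the text; values not described here are simply reported as such, and this is illustrative rather than a normative table.

```python
# Values the text marks as reserved (0, 1 and 8..255).
RESERVED = {0, 1} | set(range(8, 256))

SUBPATH_TYPE_DESC = {
    2: "Out-of-mux, AV non-synchronized audio presentation path",
    6: "Out-of-mux, AV non-synchronized Picture-in-Picture presentation path",
    7: "In-mux, AV synchronized Picture-in-Picture presentation path",
}

def describe_subpath_type(t):
    if t in RESERVED:
        return "reserved"
    return SUBPATH_TYPE_DESC.get(t, "not described in this excerpt")
```
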
- FIG. 21 is a diagram illustrating the syntax of SubPlayItem(i).
- length is a 16-bit unsigned integer indicating the number of bytes from immediately after this length field to the end of SubPlayItem().
- SubPlayItem() includes Clip_Information_file_name[0] for designating the Clip.
- when the is_multi_Clip_entries flag is set, the syntax for the case where the SubPlayItem refers to multiple Clips is referenced. SubPlayItem() also includes SubPlayItem_IN_time and SubPlayItem_OUT_time for specifying the playback section of the Sub Path in the Clip.
- furthermore, sync_PlayItem_id and sync_start_PTS_of_PlayItem are included to specify the time at which the Sub Path starts playback on the time axis of the Main Path.
- as described above, sync_PlayItem_id and sync_start_PTS_of_PlayItem are used in the cases of FIG. 3 and FIG. 4 (when the playback timing of the main AV stream and that of the file indicated by the sub path are the same), and are not used when the playback timing of the main AV stream and that of the file indicated by the sub path differ (for example, when a still image referred to by the main path and audio referred to by the sub path are not synchronized, as in the BGM of a slide show composed of still images).
- SubPlayItem_IN_time, SubPlayItem_OUT_time, sync_PlayItem_id, and sync_start_PTS_of_PlayItem are used in common for the Clips referenced by the SubPlayItem.
- num_of_Clip_entries indicates the number of Clips, and Clip_Information_file_name[SubClip_entry_id] specifies the Clips other than Clip_Information_file_name[0]. That is, Clip_Information_file_name[1], Clip_Information_file_name[2], and so on are specified.
- SubPlayItem() also includes Clip_codec_identifier[SubClip_entry_id], which specifies the codec method of each Clip, ref_to_STC_id[SubClip_entry_id], which is information on STC discontinuities (system time-base discontinuities), and reserved_for_future_use.
- SubPlayItem_IN_time, SubPlayItem_OUT_time, sync_PlayItem_id, and sync_start_PTS_of_PlayItem are shared among the Clips referenced by the SubPlayItem, and the Text based subtitle of the selected SubClip_entry_id is played back based on these SubPlayItem_IN_time, SubPlayItem_OUT_time, sync_PlayItem_id, and sync_start_PTS_of_PlayItem values.
- The value of SubClip_entry_id is allocated from 1 in the order in which the Clip_Information_file_name[SubClip_entry_id] entries appear. The SubClip_entry_id of Clip_Information_file_name[0] is 0.
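As an illustrative model of the SubPlayItem fields and the SubClip_entry_id numbering just described, a sketch might look like the following. The dataclass, field names, and the assumption of 45 kHz time units are for clarity only; this is not a parser of the actual binary syntax:

```python
from dataclasses import dataclass
from typing import List

@dataclass
class SubPlayItem:
    # Clip_Information_file_name[0] plus any additional multi-Clip entries.
    clip_information_file_names: List[str]
    sub_play_item_in_time: int      # start of playback section (time units assumed)
    sub_play_item_out_time: int     # end of playback section
    sync_play_item_id: int = 0
    sync_start_pts_of_play_item: int = 0

    @property
    def is_multi_clip_entries(self) -> bool:
        # The flag is set when the SubPlayItem refers to more than one Clip.
        return len(self.clip_information_file_names) > 1

    def subclip_entry_id(self, name: str) -> int:
        # Clip_Information_file_name[0] has SubClip_entry_id 0; the remaining
        # entries are allocated ids from 1 in order of appearance.
        return self.clip_information_file_names.index(name)
```

The list index directly encodes the allocation rule: position 0 maps to id 0, later positions to ids 1, 2, and so on.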
- Fig. 22 is a diagram illustrating the syntax of PlayItem().
- Clip_Information_file_name[0] is a field for designating the Clip referred to by the PlayItem.
- The main AV stream is referred to by Clip_Information_file_name[0].
- PlayItem() also contains Clip_codec_identifier[0], which specifies the codec method of the Clip, reserved_for_future_use, is_multi_angle, connection_condition, and ref_to_STC_id[0], which is information on STC discontinuities (system time-base discontinuities).
- IN_time and OUT_time for specifying the playback section of the PlayItem in the Clip are also included.
- That is, the playback range of the main Clip AV stream file is represented by IN_time and OUT_time.
- Furthermore, UO_mask_table(), PlayItem_random_access_mode, and still_mode are included.
- The case where is_multi_angle indicates multiple angles is not directly related to the present invention, so its description is omitted.
- STN_table() in PlayItem() provides a mechanism that, when the target PlayItem and one or more SubPaths to be played back in association with it are prepared, allows the user's audio-switching and subtitle-switching operations to select among the Clip referenced by the PlayItem and the Clips referenced by the one or more SubPaths. STN_table() also provides a mechanism for selecting mixing playback of two audio streams.
- FIG. 23 is a diagram illustrating a first example of the syntax of STN_table().
- STN_table() is set as an attribute of PlayItem.
- length is a 16-bit unsigned integer indicating the number of bytes from immediately after this length field to the end of STN_table(). A 16-bit reserved_for_future_use field follows length.
- number_of_video_stream_entries indicates the number of streams that are entered (registered) in STN_table() and given video_stream_id.
- video_stream_id is information for identifying a video stream, and video_stream_number is the video stream number that can be seen by the user and used for video switching.
- number_of_audio_stream_entries indicates the number of streams of the first audio streams that are entered in STN_table() and given audio_stream_id.
- audio_stream_id is information for identifying an audio stream,
- and audio_stream_number is the audio stream number that can be seen by the user and used for audio switching.
- number_of_audio_stream2_entries indicates the number of streams of the second audio streams that are entered in STN_table() and given audio_stream_id2.
- audio_stream_id2 is information for identifying an audio stream,
- and audio_stream_number2 is the audio stream number that can be seen by the user and used for audio switching.
- Here, the audio streams of number_of_audio_stream_entries entered in STN_table() are audio streams decoded by the 1st audio decoder 75-1 of the playback device 20 shown in FIG. 35, described later,
- and the audio streams of number_of_audio_stream2_entries entered in STN_table() are audio streams decoded by the 2nd audio decoder 75-2 of the playback device 20 of FIG. 35. In this way, in STN_table() of FIG. 23, and of FIGS. 33 and 34 described later, audio streams to be decoded by two audio decoders can be entered.
- In the following, the audio streams of number_of_audio_stream_entries decoded by the 1st audio decoder 75-1 of the playback device 20 in Fig. 35, that is, the primary audio streams, are referred to as audio stream #1,
- and the audio streams of number_of_audio_stream2_entries decoded by the 2nd audio decoder 75-2 of the playback device 20 in Fig. 35, that is, the secondary audio streams, are referred to as audio stream #2.
- Audio stream #1 is assumed to be an audio stream that has priority over audio stream #2.
- number_of_PG_txtST_stream_entries indicates the number of streams that are entered in STN_table() and given PG_txtST_stream_id.
- Here, PG (Presentation Graphics) streams and txtST (text subtitle) files are entered.
- PG_txtST_stream_id is information for identifying a subtitle stream,
- and PG_txtST_stream_number is the subtitle stream number (text subtitle stream number) that can be seen by the user and used for subtitle switching.
- number_of_IG_stream_entries indicates the number of streams that are entered in STN_table() and given IG_stream_id.
- Here, interactive graphics streams are entered.
- IG_stream_id is information for identifying an interactive graphics stream,
- and IG_stream_number is the graphics stream number that can be seen by the user and used for switching graphics.
- length is an 8-bit unsigned integer indicating the number of bytes from immediately after this length field to the end of stream_entry().
- type is an 8-bit field indicating the type of information necessary to uniquely identify the stream to which the above stream number is given.
- In the for loop of the video stream ID (video_stream_id), video_stream_id is given, starting from 0, to each video elementary stream identified by each stream_entry() in order.
- The video stream number (video_stream_number) may be used instead of the video stream ID (video_stream_id).
- In this case, video_stream_number is given from 1 instead of 0. That is, video_stream_number is obtained by adding 1 to the value of video_stream_id.
- The video stream number is defined from 1 because it is the video stream number that can be seen by the user when switching video.
- Similarly, in the for loop of the audio stream ID (audio_stream_id), audio_stream_id is given, starting from 0, to each audio elementary stream identified by each stream_entry() in order.
- As with the video stream, an audio stream number (audio_stream_number) may be used instead of the audio stream ID.
- In this case, audio_stream_number is given from 1 instead of 0. That is, audio_stream_number is obtained by adding 1 to the value of audio_stream_id.
- The audio stream number is defined from 1 because it is the audio stream number that can be seen by the user and used for audio switching.
- Likewise, in the for loop of audio stream ID2 (audio_stream_id2), audio_stream_id2 is given, starting from 0, to each audio elementary stream identified by each stream_entry() in order.
- Audio stream number 2 (audio_stream_number2) may be used instead of audio stream ID2 (audio_stream_id2).
- In this case, audio_stream_number2 is given from 1 instead of 0. That is, audio_stream_number2 is obtained by adding 1 to the value of audio_stream_id2.
- Audio stream number 2 is defined from 1 because it is the audio stream number 2 that can be seen by the user and used for audio switching.
- In the for loop of the subtitle stream ID (PG_txtST_stream_id), PG_txtST_stream_id is given, starting from 0, to each bitmap subtitle elementary stream or text subtitle elementary stream identified by each stream_entry() in order. As with the video stream, a subtitle stream number (PG_txtST_stream_number) may be used instead of the subtitle stream ID (PG_txtST_stream_id). In this case, PG_txtST_stream_number is given from 1 instead of 0. That is, PG_txtST_stream_number is obtained by adding 1 to the value of PG_txtST_stream_id.
- The subtitle stream number is defined from 1 because it is the subtitle stream number (text subtitle stream number) that can be seen by the user and used for subtitle switching.
- In the for loop of the graphics stream ID (IG_stream_id), IG_stream_id is given, starting from 0, to each interactive graphics elementary stream identified by each stream_entry() in order.
- A graphics stream number (IG_stream_number) may be used instead of the graphics stream ID (IG_stream_id).
- In this case, IG_stream_number is given from 1 instead of 0. That is, IG_stream_number is obtained by adding 1 to the value of IG_stream_id.
- The graphics stream number is defined from 1 because it is the graphics stream number that can be seen by the user and used for switching graphics.
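The ID-to-number convention repeated above — each user-visible stream number starts at 1, while the corresponding ID starts at 0 — can be captured in a one-line helper. The function name is invented for illustration:

```python
def stream_number_from_id(stream_id: int) -> int:
    """User-visible stream numbers (video_stream_number, audio_stream_number,
    audio_stream_number2, PG_txtST_stream_number, IG_stream_number) are
    obtained by adding 1 to the corresponding zero-based stream ID."""
    if stream_id < 0:
        raise ValueError("stream IDs are assigned from 0")
    return stream_id + 1
```

The same helper applies to all five ID families, since the text defines the identical offset rule for each of them.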
- stream_attribute() in the for loop of the video stream ID (video_stream_id) gives the stream attribute information of the one video elementary stream specified by each stream_entry(). That is, in this stream_attribute(), the stream attribute information of the one video elementary stream specified by each stream_entry() is described.
- Similarly, stream_attribute() in the for loop of the subtitle stream ID (PG_txtST_stream_id) gives the stream attribute information of the one bitmap subtitle elementary stream or text subtitle elementary stream specified by each stream_entry(). That is, in this stream_attribute(), the stream attribute information of the one bitmap subtitle elementary stream or text subtitle elementary stream specified by each stream_entry() is described.
- Likewise, stream_attribute() in the for loop of the graphics stream ID (IG_stream_id) gives the stream attribute information of the one interactive graphics elementary stream specified by each stream_entry(). That is, in this stream_attribute(), the stream attribute information of the one interactive graphics elementary stream specified by each stream_entry() is described.
- length is a 16-bit unsigned integer indicating the number of bytes from immediately after this length field to the end of stream_attribute().
- stream_coding_type indicates the encoding type of the elementary stream, as shown in FIG. 26.
- The encoding types of elementary streams described are MPEG-2 video stream, HDMV LPCM audio, Dolby AC-3 audio, dts audio, Presentation graphics stream, Interactive graphics stream, and Text subtitle stream.
- video_format indicates the video format of the video elementary stream, as shown in FIG. 27.
- 480i, 576i, and so on are described as the video format of the video elementary stream.
- frame_rate indicates the frame rate of the video elementary stream, as shown in FIG. 28.
- 24000/1001, 24, and so on are described as the frame rate of the video elementary stream.
- aspect_ratio indicates the aspect ratio information of the video elementary stream, as shown in FIG. 29.
- A 4:3 display aspect ratio and a 16:9 display aspect ratio are described as the aspect ratio information of the video elementary stream.
- audio_presentation_type indicates the presentation type information of the audio elementary stream, as shown in FIG. 30. Single mono channel, dual mono channel, stereo (2-channel), and multi-channel are described as the presentation types of the audio elementary stream.
- sampling_frequency indicates the sampling frequency of the audio elementary stream, as shown in FIG. 31. 48 kHz and 96 kHz are described as the sampling frequency of the audio elementary stream.
- audio_language_code indicates the language code (Japanese, Korean, Chinese, etc.) of the audio elementary stream.
- PG_language_code indicates the language code (Japanese, Korean, Chinese, etc.) of the bitmap subtitle elementary stream.
- IG_language_code indicates the language code (Japanese, Korean, Chinese, etc.) of the interactive graphics elementary stream.
- textST_language_code indicates the language code of the text subtitle elementary stream (eg, Japanese, Korean, Chinese).
- character_code indicates the character code of the text subtitle elementary stream, as shown in Fig. 32.
- Character codes for text subtitle elementary streams include Unicode V1.1 (ISO 10646-1), Shift JIS (Japanese), KSC 5601-1987 including KSC 5653 for Roman characters (Korean), GB 18030-2000 (Chinese), GB2312 (Chinese), and BIG5 (Chinese).
- When stream_coding_type in Fig. 25 indicates MPEG-2 video stream (Fig. 26), stream_attribute() contains the video format (Fig. 27), frame rate (Fig. 28), and aspect ratio information (Fig. 29) of the video elementary stream.
- When stream_coding_type in Fig. 25 indicates HDMV LPCM audio, Dolby AC-3 audio, or dts audio (Fig. 26), stream_attribute() contains the presentation type information (Fig. 30), sampling frequency (Fig. 31), and language code of the audio elementary stream.
- When stream_coding_type in Fig. 25 indicates Presentation graphics stream (Fig. 26), stream_attribute() contains the language code of the bitmap subtitle elementary stream.
- When stream_coding_type in Fig. 25 indicates Interactive graphics stream (Fig. 26), stream_attribute() contains the language code of the interactive graphics elementary stream.
- When stream_coding_type in Fig. 25 indicates Text subtitle stream (Fig. 26), stream_attribute() contains the character code (Fig. 32) and language code of the text subtitle elementary stream.
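The dependence of stream_attribute() contents on stream_coding_type can be sketched as a lookup. The grouping and field labels follow the description above; the function itself and the string spellings are illustrative assumptions:

```python
# Which attribute fields stream_attribute() carries for each
# stream_coding_type, per the figures referenced in the text.
VIDEO_ATTRS = {"video_format", "frame_rate", "aspect_ratio"}
AUDIO_ATTRS = {"audio_presentation_type", "sampling_frequency", "audio_language_code"}

def attribute_fields(stream_coding_type: str) -> set:
    if stream_coding_type == "MPEG-2 video stream":
        return set(VIDEO_ATTRS)
    if stream_coding_type in {"HDMV LPCM audio", "Dolby AC-3 audio", "dts audio"}:
        return set(AUDIO_ATTRS)
    if stream_coding_type == "Presentation graphics stream":
        return {"PG_language_code"}
    if stream_coding_type == "Interactive graphics stream":
        return {"IG_language_code"}
    if stream_coding_type == "Text subtitle stream":
        return {"character_code", "textST_language_code"}
    raise ValueError(f"unknown stream_coding_type: {stream_coding_type!r}")
```

A playback device implementing the capability check described next could consult such a table to know which attributes to examine for each coding type.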
- By examining this attribute information, the playback device can check whether it has the function of playing back a given elementary stream. The playback device can also select the elementary stream corresponding to the initial information of its language setting by examining this attribute information.
- For example, suppose the playback device has only a playback function for bitmap subtitle elementary streams and lacks a playback function for text subtitle elementary streams.
- In this case, the playback device sequentially selects only bitmap subtitle elementary streams from the for loop of the subtitle stream ID (PG_txtST_stream_id) and plays them back.
- Also, for example, if the initial information of the language setting of the playback device is Japanese, the playback device sequentially selects only audio elementary streams whose language code is Japanese from the for loop of the audio stream ID and plays them back.
- Further, for example, when playing back an AV stream referred to by the main path (for example, a movie) consisting of a video stream and an audio stream, suppose the user instructs the playback device to switch the audio and specifies (selects), as the audio to be played back, audio stream #2 (for example, comments by the director or performers) in addition to audio stream #1 (the audio output in a normal movie).
- In that case, the playback device mixes (superimposes) audio stream #1 and audio stream #2 and plays them back together with the video stream.
- As can be understood by referring to STN_table() in FIG. 23, audio stream #1 and audio stream #2 may both be audio streams included in the Clip referenced by the main path. Alternatively, one of audio stream #1 and audio stream #2 may be an audio stream included in the Clip referenced by the main path, and the other may be an audio stream included in a Clip referenced by a sub path. In this way, it is also possible to select two audio streams from among multiple audio streams superimposed on the main AV stream referenced by the main path and mix them for playback.
- In this way, STN_table() in PlayItem() provides a mechanism whereby, when this PlayItem and one or more SubPaths to be played back in association with it are prepared, user operations such as audio switching and subtitle switching can select among the Clip referenced by this PlayItem and the Clips referenced by the one or more SubPaths. Interactive operations can therefore be performed on streams and data files different from the main AV stream to be played back.
- Also, since each SubPath refers to a SubPlayItem, playback can be performed by referring to a Clip AV stream file different from the Clip AV stream file referenced by the Main Path. The configuration thus has extensibility.
- Furthermore, STN_table() in PlayItem() provides a mechanism whereby audio stream #1, decoded by the 1st audio decoder 75-1 of the playback device 20 shown in Fig. 35 described later, and audio stream #2, decoded by the 2nd audio decoder 75-2, can be mixed and played back. For example, when a PlayItem and one or more SubPaths to be played back in association with it are prepared, the audio stream of the Clip referenced by the PlayItem is set as audio stream #1, the audio stream of the Clip referenced by a SubPath is set as audio stream #2, and a mechanism is provided for mixing these for playback.
- Alternatively, two audio streams included in the Clip (main Clip) referenced by the PlayItem may be set as audio stream #1 and audio stream #2, and these can likewise be mixed and played back.
- This makes it possible to play back an audio stream different from the main audio stream being played back (for example, a director's comment stream) superimposed on it.
- That is, the two audio streams #1 and #2 superimposed on the main AV stream can be superimposed (mixed) and reproduced.
- FIGS. 33 and 34 are diagrams illustrating a second example of the syntax of STN_table().
- FIGS. 33 and 34 show an example of the syntax of STN_table() for the case where a combination of a secondary video stream, a primary audio stream, a secondary audio stream, and a subtitle stream to be played back in combination with the primary video stream is defined. In the figures, parts that are the same as in FIG. 23 are not described again.
- In the example of FIGS. 33 and 34, the combinations of the secondary video stream, primary audio stream, secondary audio stream, and subtitle stream that are played back simultaneously in combination with the primary video stream are defined as follows. First, one or more secondary video streams to be played back simultaneously in combination with the primary video stream are defined. Then, for each of the one or more secondary video streams, the audio streams (a primary audio stream and a secondary audio stream) and the subtitle stream to be played back simultaneously with it are defined.
- number_of_video_stream2_entries indicates the number of streams that are entered (registered) in STN_table() and given video_stream_id2.
- video_stream_id2 is information for identifying a secondary video stream,
- and video_stream_number2 is the secondary video stream number that can be seen by the user and used for video switching.
- In the for loop of video stream ID2 (video_stream_id2), video_stream_id2 is given, starting from 0, to each video elementary stream (the video elementary stream that becomes the secondary video stream) identified by each stream_entry() in order.
- Here, as many audio_stream_id and audio_stream_id2 values as number_of_Audio_combinations_for_video2 are given.
- number_of_Audio_combinations_for_video2 and the subsequent for statement constitute information that defines the combinations of audio streams to be played back simultaneously with the secondary video stream, that is, the combinations of the primary audio stream specified by audio_stream_id and the secondary audio stream specified by audio_stream_id2.
- The number of audio stream pairs (pairs of a primary audio stream and a secondary audio stream) that can be combined with the secondary video stream specified by video_stream_id2 is number_of_Audio_combinations_for_video2.
- The audio_stream_id specifying the primary audio stream and the audio_stream_id2 specifying the secondary audio stream of each such pair are defined in the for statement after number_of_Audio_combinations_for_video2.
- Similarly, the number of subtitle streams that can be combined with the secondary video stream specified by video_stream_id2 is number_of_Subtitle_combinations_for_video2, and the PG_textST_stream_id specifying each subtitle stream that can be combined with the secondary video stream specified by video_stream_id2
- is defined in the for statement after number_of_Subtitle_combinations_for_video2.
- As in the case of Fig. 23, each number may be used instead of each ID; for example, an audio stream number (audio_stream_number) may be used instead of audio_stream_id, and audio stream number 2 (audio_stream_number2) may be used instead of audio_stream_id2. The same applies to video streams and subtitle streams.
- By using video_stream_id2 in this way, a combination of a primary audio stream, a secondary audio stream, and a subtitle stream that are played back simultaneously with the secondary video stream can be defined. That is, it is possible to define a combination of a secondary video stream, a primary audio stream, a secondary audio stream, and a subtitle stream that are played back simultaneously with the primary video stream.
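A hypothetical model of this combination information — for each secondary video stream, the permitted (primary audio, secondary audio) pairs and the permitted subtitle streams — might look like the following sketch. The class and field names are descriptive inventions, not normative syntax:

```python
from dataclasses import dataclass, field
from typing import List, Tuple

@dataclass
class SecondaryVideoCombination:
    video_stream_id2: int
    # (audio_stream_id, audio_stream_id2) pairs; the list length corresponds
    # to number_of_Audio_combinations_for_video2 in the text.
    audio_combinations: List[Tuple[int, int]] = field(default_factory=list)
    # PG_textST_stream_id values; the list length corresponds to
    # number_of_Subtitle_combinations_for_video2.
    subtitle_combinations: List[int] = field(default_factory=list)

    def is_allowed(self, primary_audio_id: int, secondary_audio_id: int) -> bool:
        """Check whether a primary/secondary audio pair may be played back
        simultaneously with this secondary video stream."""
        return (primary_audio_id, secondary_audio_id) in self.audio_combinations
```

A player could consult such a record when the user requests picture-in-picture playback, rejecting audio pairs that the table does not permit for the selected secondary video.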
- In STN_table() in PlayItem(), when this PlayItem and one or more SubPaths to be played back in association with it are prepared and the SubPath_type of such a SubPath is set to 5 to 7, that is, to a picture-in-picture presentation path, one or more of the secondary video stream, secondary audio stream, and presentation graphics stream can be played back in combination with the primary video stream that is mainly played back, as described with reference to FIG. 5. That is, picture-in-picture display can be performed.
- Because the display settings of picture-in-picture are described in pip_metadata, as explained with reference to FIG. 9, FIG. 11, or FIG. 12, the size and display position of the secondary display screen 2 described with reference to FIG. 5 can be set arbitrarily by the content creator or the content provider, without depending on the playback device.
- Moreover, the picture-in-picture display settings are described in pip_metadata, not in the data itself of the video stream (secondary video stream) displayed in picture-in-picture. In other words, when the size or display position of the secondary display screen 2 described with reference to FIG. 5 is to be changed, it is not necessary to modify the video stream (secondary video stream) displayed in picture-in-picture;
- it is sufficient to correct only the description of pip_metadata.
- Also, because the display settings of picture-in-picture are described in pip_metadata, as explained with reference to FIG. 9, FIG. 11, or FIG. 12,
- the size of the video displayed on the secondary display screen 2 described with reference to FIG. 5 can be set arbitrarily by the content creator or the content provider, regardless of the size of the image that is the source of the secondary video stream.
- FIG. 35 is a block diagram illustrating a configuration example of the playback device 20 to which the present invention is applied.
- This playback device 20 is the playback device 20 that plays back a PlayList having the main path and sub path described above.
- The playback device 20 is provided with a storage drive 31, a switch 32, an AV decoder unit 33, and a controller 34.
- The controller 34 reads the PlayList file via the storage drive 31 and, based on the information of the PlayList file, reads the AV stream and AV data from a recording medium such as an HDD, Blu-ray Disc, or DVD.
- The user can use the user interface to instruct the controller 34 to switch audio and subtitles.
- The controller 34 is also supplied with the initial information of the language setting of the playback device 20 from a storage unit (not shown) or the like.
- The PlayList file includes STN_table() in addition to the Main Path and Sub Path information.
- The controller 34 reads, from the recording medium or the like via the storage drive 31, the main Clip AV stream file (hereinafter referred to as the main Clip) referred to by a PlayItem included in the PlayList file, the sub Clip AV stream file (hereinafter referred to as the sub Clip) referred to by a SubPlayItem, and the text subtitle data referred to by the SubPlayItem.
- Here, the main Clip referred to by the PlayItem and the sub Clip referred to by the SubPlayItem may be recorded on different recording media.
- For example, the main Clip may be recorded on a recording medium while the corresponding sub Clip is supplied via a network (not shown) and stored on an HDD.
- The controller 34 selects an elementary stream corresponding to the playback function of itself (the playback device 20) and controls playback of it, or selects only the elementary stream corresponding to the initial information of the language setting of the playback device 20 and controls playback of it.
- The controller 34 refers to the information (or identifier) described in ID1/ID2 of PlayListExtensionData() of the PlayList file. When it detects that information related to picture-in-picture display (pip_metadata) is stored in the data_block of this PlayListExtensionData(), the controller 34 refers to the pip_metadata described with reference to Fig. 9 or Fig. 11, obtains the display settings of the secondary video displayed on the secondary display screen 2 described with reference to Fig. 5,
- and controls the synthesis of the primary video stream and the secondary video stream by the video plane generating unit 92 of the AV decoder unit 33.
- The AV decoder unit 33 includes buffers 51 to 54, a PID filter 55, a PID filter 56, switches 57 to 59, a PID filter 60, a background decoder 71, a 1st video decoder 72-1, a 2nd video decoder 72-2, a presentation graphics decoder 73, an interactive graphics decoder 74, a 1st audio decoder 75-1, a 2nd audio decoder 75-2, a Text-ST composition 76, a switch 77, a background plane generating unit 91, a video plane generating unit 92, a presentation graphics plane generating unit 93, an interactive graphics plane generating unit 94, a buffer 95, a video data processing unit 96, a mixing processing unit 97, and a mixing processing unit 98.
- The 1st video decoder 72-1 decodes the primary video stream,
- and the 2nd video decoder 72-2 decodes the secondary video stream.
- The 1st audio decoder 75-1 decodes audio stream #1 (the primary audio stream),
- and the 2nd audio decoder 75-2 decodes audio stream #2 (the secondary audio stream).
- That is, the 1st video decoder 72-1 decodes the video streams given video_stream_id in STN_table() of Fig. 23 and Figs. 33 and 34, and the 2nd video decoder 72-2 decodes the video streams given video_stream_id2.
- In this way, the playback device 20 has two video decoders (the 1st video decoder 72-1 and the 2nd video decoder 72-2) for decoding two video streams,
- and two audio decoders (the 1st audio decoder 75-1 and the 2nd audio decoder 75-2) for decoding two audio streams.
- In the following, when the 1st video decoder 72-1 and the 2nd video decoder 72-2 are not individually distinguished, they are referred to as the video decoder 72, and when the 1st audio decoder 75-1 and the 2nd audio decoder 75-2 are not individually distinguished, they are referred to as the audio decoder 75.
- the file data read by the controller 34 is demodulated by a demodulation and ECC decoding unit (not shown), and error correction is performed on the demodulated multiplexed stream.
- The switch 32 selects the demodulated and error-corrected data by stream type based on control from the controller 34 and supplies it to the corresponding one of the buffers 51 to 54. Specifically, based on control from the controller 34, the switch 32 is switched to supply the background image data to the buffer 51, the main Clip data to the buffer 52, the sub Clip data to the buffer 53, and the Text-ST data to the buffer 54.
- The buffer 51 buffers the background image data, the buffer 52 buffers the data of the main Clip, the buffer 53 buffers the data of the sub Clip, and the buffer 54 buffers the Text-ST data.
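The switch 32 routing just described can be sketched as follows. The dictionary-based routing and the buffer and stream-type names are illustrative assumptions, not part of the actual device:

```python
# Each read buffer is modeled as a simple list of data chunks.
def make_buffers():
    return {"buffer51": [], "buffer52": [], "buffer53": [], "buffer54": []}

ROUTE = {
    "background": "buffer51",  # background image data
    "main_clip": "buffer52",   # main Clip data
    "sub_clip": "buffer53",    # sub Clip data
    "text_st": "buffer54",     # Text-ST data
}

def switch32(stream_type: str, data: bytes, buffers: dict) -> None:
    """Steer one chunk of demodulated, error-corrected data to the read
    buffer that corresponds to its stream type, as the controller directs."""
    buffers[ROUTE[stream_type]].append(data)
```

Usage might look like `switch32("main_clip", chunk, buffers)`, with the controller deciding the stream type of each chunk.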
- The main Clip is a stream (for example, a transport stream) in which video and one or more streams among audio, bitmap subtitle (Presentation Graphics stream), and interactive graphics are multiplexed.
- The sub Clip is a stream in which one or more streams among video, bitmap subtitle, interactive graphics, and audio are multiplexed.
- The data of the text subtitle data file may or may not be in the form of a multiplexed stream such as a transport stream.
- When reading the main Clip, the sub Clip, and the text subtitle data, the files may be read alternately in a time-division manner, or the sub Clip and the text subtitle data may all be preloaded into the buffer (buffer 53 or buffer 54) before the main Clip is read.
- The playback device 20 reads the data of these files from the recording medium via the storage drive 31 and plays back video, bitmap subtitles, interactive graphics, and audio.
- The stream data read from the buffer 52, which is the main Clip read buffer, is output to the subsequent PID (packet ID) filter 55 at a predetermined timing.
- The PID filter 55 distributes the input main Clip to the decoders of the subsequent elementary streams according to PID (packet ID) and outputs the result. That is, the PID filter 55 supplies the video stream to the PID filter 60, which in turn supplies it to either the 1st video decoder 72-1 or the 2nd video decoder 72-2;
- the presentation graphics stream to the switch 57, which is the supply source to the presentation graphics decoder 73;
- the interactive graphics stream to the switch 58, which is the supply source to the interactive graphics decoder 74;
- and the audio stream
- to the switch 59, which is the supply source to the 1st audio decoder 75-1 and the 2nd audio decoder 75-2.
- Here, the presentation graphics stream is, for example, bitmap subtitle data,
- and the text subtitle data is, for example, text-format subtitle data.
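The fan-out performed by the PID filter 55 (and likewise by the PID filter 56, described next) can be sketched as a routing table. This is a simplification: real demultiplexing operates on TS packets and their PIDs, and the stream-class labels here are assumptions:

```python
# Downstream target for each class of elementary stream separated from a Clip.
ROUTING = {
    "video": "PID filter 60",              # then on to the 1st/2nd video decoder
    "presentation_graphics": "switch 57",  # feeds presentation graphics decoder 73
    "interactive_graphics": "switch 58",   # feeds interactive graphics decoder 74
    "audio": "switch 59",                  # feeds 1st/2nd audio decoder
}

def pid_filter(stream_class: str) -> str:
    """Return the component that receives this class of elementary stream."""
    try:
        return ROUTING[stream_class]
    except KeyError:
        raise ValueError(f"no route for stream class {stream_class!r}")
```

The same table serves both PID filters, which differ only in whether their input comes from the main Clip buffer or the sub Clip buffer.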
- The stream data read from the buffer 53, which is the sub Clip read buffer, is output to the subsequent PID (packet ID) filter 56 at a predetermined timing.
- The PID filter 56 distributes the input sub Clip to the decoders of the subsequent elementary streams according to PID (packet ID) and outputs the result. That is, the PID filter 56 supplies the video stream to the PID filter 60, which in turn supplies it to either the 1st video decoder 72-1 or the 2nd video decoder 72-2;
- the presentation graphics stream to the switch 57, which is the supply source to the presentation graphics decoder 73; the interactive graphics stream to the switch 58, which is the supply source to the interactive graphics decoder 74; and the audio stream to the switch 59, which is the supply source to the 1st audio decoder 75-1 and the 2nd audio decoder 75-2.
- The switch 57 selects one of the presentation graphics stream included in the main Clip supplied from the PID filter 55 and the presentation graphics stream included in the sub Clip, and supplies the selected presentation graphics stream to the subsequent presentation graphics decoder 73.
- The presentation graphics decoder 73 decodes the presentation graphics stream and supplies the decoded presentation graphics stream data to the switch 77, which is the supply source to the presentation graphics plane generating unit 93.
- The switch 58 selects one of the interactive graphics stream included in the main Clip supplied from the PID filter 55 and the interactive graphics stream included in the sub Clip, and supplies the selected interactive graphics stream to the subsequent interactive graphics decoder 74. That is, the interactive graphics stream input to the interactive graphics decoder 74 at a given time is a stream separated from either the main Clip or the sub Clip.
- The interactive graphics decoder 74 decodes the interactive graphics stream and supplies the decoded interactive graphics stream data to the interactive graphics plane generating unit 94.
- Further, the switch 59 selects one of the audio streams included in the main Clip supplied from the PID filter 55 and the audio streams included in the sub Clip, and supplies the selected audio stream to the subsequent 1st audio decoder 75-1 or 2nd audio decoder 75-2.
- Here, the audio stream input to the 1st audio decoder 75-1 at a given time is a stream separated from either the main Clip or the sub Clip.
- Likewise, the audio stream input to the 2nd audio decoder 75-2 at a given time is a stream separated from either the main Clip or the sub Clip. For example, when audio stream #1 and audio stream #2 are both included in the main Clip, the PID filter 55 filters audio stream #1 and audio stream #2 based on the PIDs of the audio streams and supplies them to the switch 59.
- The switch 59 then switches so as to supply the audio stream #1 supplied from the PID filter 55 to the 1st audio decoder 75-1 and the audio stream #2 supplied from the PID filter 55 to the 2nd audio decoder 75-2.
- The PID filter 60 determines, based on the control of the controller 34, whether a video stream included in the main Clip supplied from the PID filter 55 or included in the sub Clip supplied from the PID filter 56 is a primary video stream or a secondary video stream, supplies the primary video stream to the 1st video decoder 72-1, and supplies the secondary video stream to the 2nd video decoder 72-2.
- The video streams distributed by the PID filter 60 are thus supplied to the subsequent 1st video decoder 72-1 or 2nd video decoder 72-2.
- The 1st video decoder 72-1 and the 2nd video decoder 72-2 decode the supplied video streams and output the decoded video data to the video plane generating unit 92.
- when video data is supplied from both the 1st video decoder 72-1 and the 2nd video decoder 72-2, the video plane generating unit 92, based on the control of the controller 34 referring to pip_metadata, combines the supplied video data to generate a video plane composed of the main display screen 1 and the sub display screen 2 as described with reference to FIG. 5, and supplies this to the video data processing unit 96. When video data is supplied only from the 1st video decoder 72-1, the video plane generating unit 92 generates a video plane using the supplied video data and supplies this to the video data processing unit 96. Note that combining two pieces of video data is also referred to, for example, as mixing or superimposing.
- the 1st audio decoder 75-1 decodes the audio stream and supplies the decoded audio stream data to the mixing processing unit 101.
- the 2nd audio decoder 75-2 decodes the audio stream and supplies the decoded audio stream data to the mixing processing unit 101.
- that is, audio stream #1 decoded by the 1st audio decoder 75-1 and audio stream #2 decoded by the 2nd audio decoder 75-2 are supplied to the mixing processing unit 101.
- the mixing processing unit 101 mixes (superimposes) the audio data from the 1st audio decoder 75-1 and the audio data from the 2nd audio decoder 75-2, and outputs them to the mixing processing unit 97 in the subsequent stage.
- mixing (superimposing) the audio data output from the 1st audio decoder 75-1 and the audio data output from the 2nd audio decoder 75-2 is also referred to as synthesis; here, synthesis means mixing two pieces of audio data.
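- as a simple illustration, the mixing performed by the mixing processing unit 101 can be sketched as sample-wise addition of two decoded PCM streams; the function below is a hypothetical helper for illustration only, not the device's actual implementation, which operates on decoded audio frames:

```python
def mix_audio(samples_1st, samples_2nd):
    """Mix (superimpose) two decoded PCM sample sequences, sample by sample."""
    mixed = []
    for a, b in zip(samples_1st, samples_2nd):
        s = a + b
        # Clip to the signed 16-bit range to avoid overflow on superposition.
        mixed.append(max(-32768, min(32767, s)))
    return mixed

# Audio stream #1 (1st audio decoder) mixed with audio stream #2 (2nd audio decoder).
print(mix_audio([1000, -2000, 30000], [500, -500, 10000]))  # [1500, -2500, 32767]
```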
- the sound data selected by the switch 32 is supplied to the buffer 95 and buffered.
- the buffer 95 supplies the sound data to the mixing processing unit 97 at a predetermined timing.
- the sound data is, for example, sound effect data generated by a menu selection or the like.
- the mixing processing unit 97 mixes (superimposes or synthesizes) the audio data mixed by the mixing processing unit 101 (that is, the audio data output from the 1st audio decoder 75-1 and the audio data output from the 2nd audio decoder 75-2) with the sound data supplied from the buffer 95, and outputs the result as an audio signal.
- the data read from the buffer 54, which is a text subtitle read buffer, is output to the text subtitle composition (decoder) 76 at the subsequent stage at a predetermined timing.
- the text subtitle composition 76 decodes the Text-ST data and supplies it to the switch 77.
- the switch 77 selects either the presentation graphics stream decoded by the presentation graphics decoder 73 or Text-ST (text subtitle data), and supplies the selected data to the presentation graphics plane generating unit 93. In other words, the subtitle image supplied to the presentation graphics plane generating unit 93 at any one time is the output of either the presentation graphics decoder 73 or the text subtitle (Text-ST) composition 76.
- the presentation graphics stream input to the presentation graphics decoder 73 at any one time is a stream separated from either the main clip or the sub clip (selected by the switch 57). Therefore, the subtitle image output to the presentation graphics plane 93 at any one time is a presentation graphics stream from the main clip, a presentation graphics stream from the sub clip, or the decoded output of text subtitle data.
- based on the background image data supplied from the background decoder 71, the background plane generating unit 91 generates a background plane that serves, for example, as a wallpaper image when a video image is displayed reduced in size, and supplies this to the video data processing unit 96.
- the presentation graphics plane generating unit 93 generates, for example, a presentation graphics plane that is a rendering image based on the data (presentation graphics stream or text subtitle data) selected by the switch 77 and supplied. This is supplied to the video data processing unit 96.
- the interactive graphics plane generating unit 94 generates an interactive graphics plane based on the data of the interactive graphics stream supplied from the interactive graphics decoder 74, and supplies this to the video data processing unit 96.
- the video data processing unit 96 synthesizes the background plane from the background plane generating unit 91, the video plane from the video plane generating unit 92, the presentation graphics plane from the presentation graphics plane generating unit 93, and the interactive graphics plane from the interactive graphics plane generating unit 94, and outputs the result as a video signal.
- the mixing processing unit 97 mixes (synthesizes or superimposes) the audio data from the mixing processing unit 101 (that is, the audio data decoded by the 1st audio decoder 75-1 and the audio data decoded by the 2nd audio decoder 75-2) and the sound data from the buffer 95, and outputs the result as an audio signal.
- the switches 57 to 59 and the switch 77 are switched based on a selection from the user via the user interface, or based on which file contains the target data. For example, when the audio stream is included only in the sub Clip AV stream file, the switch 59 is switched to the sub side.
- in step S1, the controller 34 reads a PlayList file recorded on a recording medium or an HDD (Hard Disk Drive), not shown, via the storage drive 31. For example, the PlayList file (XXXX.mpls) described with reference to FIG. 7 is read.
- in step S2, the controller 34 determines, based on the read PlayList file, whether or not a secondary video exists in the stream to be played back.
- specifically, the controller 34 executes, for example, the secondary video loop of STN_table() (the video_stream_id2 for-loop) described with reference to FIG. 33 and FIG. 34 in a PlayItem of the read PlayList file, and thereby determines whether or not a secondary video exists in the stream to be played back.
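- the check in step S2 can be sketched as follows; the dict layout and the field name number_of_video_stream2_entries are assumptions for illustration, since the real STN_table() is a binary syntax structure whose secondary video entries are walked by the video_stream_id2 for-loop:

```python
def has_secondary_video(stn_table):
    # Return True if the PlayItem's STN_table registers at least one
    # secondary video stream (i.e. the video_stream_id2 loop is non-empty).
    return stn_table.get("number_of_video_stream2_entries", 0) > 0

# A PlayItem with one secondary video entry, and one with none:
assert has_secondary_video({"number_of_video_stream2_entries": 1}) is True
assert has_secondary_video({}) is False
```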
- if it is determined in step S2 that there is no secondary video, then in step S3 the primary video playback process described later with reference to FIGS. 37 to 39 is executed.
- in step S4, the controller 34 determines whether or not playback of the stream corresponding to the read PlayList file has finished. If it is determined in step S4 that playback of the stream has not finished, the process returns to step S3 and the subsequent processing is repeated; if it is determined that playback of the stream has finished, the process ends.
- if it is determined in step S2 that a secondary video exists, the controller 34 reads pip_metadata in step S5.
- here, pip_metadata related to the PlayItem may already be stored in memory; alternatively, since pip_metadata is part of the PlayList as described with reference to FIGS. 7 to 9, pip_metadata can be read at the same time that the PlayList file is read in step S1.
- alternatively, the controller 34 can, for example, identify the pip_metadata of interest based on the current PlayItem_id and the secondary_video_stream_id determined by STN_table(), and read the identified pip_metadata from the PlayList file.
- in step S6, the controller 34 determines, based on the description of the read pip_metadata, whether or not the secondary stream is synchronized with the primary stream (that is, synchronized with the time axis of the PlayItem of the main path).
- specifically, the controller 34 can determine whether or not the secondary stream is synchronized with the primary stream based on the value of pip_metadata_type described above. When the data structure of pip_metadata is the one described with reference to FIG. 11, the controller 34 can make this determination by referring to Synchronous_PIP_metadata_flag.
- alternatively, the controller 34 can determine whether or not the secondary stream is synchronized with the primary stream by using SubPath_type.
- if it is determined in step S6 that the secondary stream is synchronized with the primary stream, the process proceeds to step S7, and the controller 34 determines whether or not the display time of the primary stream has reached SubPlayItem_IN_time, described above, which indicates the beginning of the playback section of the secondary stream.
- if it is determined in step S7 that the display time of the primary stream is not SubPlayItem_IN_time, the primary video playback process described later with reference to FIGS. 37 to 39 is executed in step S8.
- if it is determined in step S7 that the display time of the primary stream is SubPlayItem_IN_time, then in step S9 the primary and secondary video playback process described later with reference to FIGS. 40 to 42 is executed.
- in step S10, the controller 34 determines whether or not the display time of the primary stream has reached SubPlayItem_OUT_time, which indicates the end of the playback section of the secondary stream. If it is determined in step S10 that SubPlayItem_OUT_time has not been reached, the process returns to step S9 and the subsequent processing is repeated. If it is determined that SubPlayItem_OUT_time has been reached, then in step S11 the controller 34 determines whether or not playback of the stream corresponding to the read PlayList file has finished.
- if it is determined in step S11 that playback of the stream has not finished, the process returns to step S7 and the subsequent processing is repeated; if it is determined that playback of the stream has finished, the process ends.
- if it is determined in step S6 that the secondary stream is not synchronized with the primary stream (that is, it is synchronized with the time axis of the SubPlayItem of the sub path), only the primary video stream is displayed until display of the secondary video stream is instructed, so the primary video playback process described later with reference to FIGS. 37 to 39 is executed in step S12.
- in step S13, the controller 34 determines whether or not an instruction to display the secondary video stream has been received from the user. If it is determined that no such instruction has been received, the process proceeds to step S14.
- in step S14, the controller 34 determines whether or not playback of the stream corresponding to the read PlayList file has finished. If it is determined in step S14 that playback of the stream has not finished, the process returns to step S12 and the subsequent processing is repeated; if it is determined that playback of the stream has finished, the process ends.
- on the other hand, if it is determined in step S13 that an instruction to display the secondary video stream has been received, the primary and secondary video playback process described later with reference to FIGS. 40 to 42 is executed in step S15.
- in step S16, the controller 34 determines whether or not a command to end display of the secondary video stream has been received from the user. If it is determined in step S16 that a command to end display of the secondary stream has been received, the process returns to step S12 and the subsequent processing is repeated.
- if no such command has been received, then in step S17 the controller 34 determines whether or not playback of the stream corresponding to the read PlayList file has finished. If it is determined in step S17 that playback of the stream has not finished, the process returns to step S15 and the subsequent processing is repeated; if it is determined that playback of the stream has finished, the process ends.
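- the branching of steps S2 to S17 can be summarized in a small sketch; the function and argument names below are ours, not the patent's, and it only decides which playback process runs at a given instant:

```python
def playback_mode(secondary_exists, synchronized, in_window, user_display_on):
    """Decide which playback process (FIG. 36) runs at the current instant.

    secondary_exists: STN_table registers a secondary video (step S2).
    synchronized:     pip_metadata says the secondary stream follows the
                      main-path PlayItem time axis (step S6).
    in_window:        current time is within [SubPlayItem_IN_time,
                      SubPlayItem_OUT_time) (steps S7/S10).
    user_display_on:  the user has requested secondary display (steps S13/S16).
    """
    if not secondary_exists:
        return "primary"                                          # step S3
    if synchronized:
        return "primary+secondary" if in_window else "primary"    # steps S9/S8
    return "primary+secondary" if user_display_on else "primary"  # steps S15/S12

assert playback_mode(False, False, False, False) == "primary"
assert playback_mode(True, True, True, False) == "primary+secondary"
assert playback_mode(True, False, False, True) == "primary+secondary"
```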
- next, the primary video playback process executed in step S3, step S8, or step S12 of FIG. 36 will be described.
- in step S41, the controller 34 reads the main clip, sub clip, and text subtitle data (Text-ST data). Specifically, the controller 34 reads the main clip based on the PlayItem described with reference to FIG. 22, which is included in the PlayList described above. Further, the controller 34 reads the sub clip and the text subtitle data based on the SubPlayItem described with reference to FIG. 21, which is referred to by the SubPath included in the PlayList.
- in step S42, the controller 34 controls the switch 32 so that the read data (main clip, sub clip, and text subtitle data) are supplied to the corresponding buffers 51 to 54. Specifically, the controller 34 controls the switch 32 so that the background image data is supplied to the buffer 51, the main clip data to the buffer 52, the sub clip data to the buffer 53, and the Text-ST data to the buffer 54.
- in step S43, the switch 32 is switched based on the control from the controller 34, whereby the background image data is supplied to the buffer 51, the main clip data is supplied to the buffer 52, the sub clip data is supplied to the buffer 53, and the text subtitle data is supplied to the buffer 54.
- in step S44, each of the buffers 51 to 54 buffers the supplied data. Specifically, the buffer 51 buffers the background image data, the buffer 52 buffers the main clip data, the buffer 53 buffers the sub clip data, and the buffer 54 buffers the Text-ST data.
- in step S45, the buffer 51 outputs the background image data to the background decoder 71.
- in step S46, the buffer 52 outputs the stream data of the main clip to the PID filter 55.
- in step S47, the PID filter 55 distributes the TS packets to the decoders of the respective elementary streams based on the PID attached to each TS packet constituting the main Clip AV stream file. Specifically, the PID filter 55 supplies the video stream to the 1st video decoder 72-1 via the PID filter 60, and supplies the presentation graphics stream to the switch 57 serving as the supply source to the presentation graphics decoder 73.
- the interactive graphics stream is supplied to the switch 58 serving as the supply source to the interactive graphics decoder 74, and the audio stream is supplied to the switch 59 serving as the supply source to the 1st audio decoder 75-1.
- video streams, presentation graphics streams, interactive graphics streams, and audio streams are assigned different PIDs.
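- the distribution performed by the PID filter 55 can be sketched as a lookup on the 13-bit PID in the transport stream packet header; the PID values and the routing table below are hypothetical, since in practice they come from the clip's program information:

```python
# Hypothetical PID assignments for illustration only.
PID_TO_DECODER = {
    0x1011: "1st video decoder 72-1",        # video stream (via PID filter 60)
    0x1200: "presentation graphics decoder", # via switch 57
    0x1400: "interactive graphics decoder",  # via switch 58
    0x1100: "1st audio decoder 75-1",        # via switch 59
}

def route_ts_packet(packet):
    """Route a 188-byte TS packet to a decoder path by the PID in its header."""
    # The 13-bit PID is the low 5 bits of byte 1 plus all 8 bits of byte 2.
    pid = ((packet[1] & 0x1F) << 8) | packet[2]
    return PID_TO_DECODER.get(pid, "discard")

# A sync byte 0x47 followed by PID 0x1100 routes to the audio path:
packet = bytes([0x47, 0x11, 0x00, 0x10]) + bytes(184)
assert route_ts_packet(packet) == "1st audio decoder 75-1"
```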
- in step S48, the buffer 53 outputs the stream data of the sub clip to the PID filter 56.
- in step S49, the PID filter 56 distributes the TS packets to the decoders of the respective elementary streams based on the PID. Specifically, the PID filter 56 supplies the supplied video stream to the 1st video decoder 72-1 via the PID filter 60, supplies the presentation graphics stream to the switch 57 serving as the supply source to the presentation graphics decoder 73, supplies the interactive graphics stream to the switch 58 serving as the supply source to the interactive graphics decoder 74, and supplies the audio stream to the switch 59 serving as the supply source to the 1st audio decoder 75-1 and the 2nd audio decoder 75-2.
- in step S50, the switches 57 to 59 and the PID filter 60 following the PID filter 55 and the PID filter 56 select either the main clip or the sub clip based on control from the controller 34 via the user interface. Specifically, the switch 57 selects the presentation graphics stream of the main clip or of the sub clip supplied from the PID filter 55, and supplies it to the presentation graphics decoder 73 at the subsequent stage. The switch 58 selects the interactive graphics stream of the main clip or of the sub clip supplied from the PID filter 55, and supplies it to the interactive graphics decoder 74 at the subsequent stage.
- further, the switch 59 selects the audio stream of the main clip supplied from the PID filter 55 or of the sub clip supplied from the PID filter 56 (in this case, audio stream #1, since this is before the audio is switched), and supplies it to the 1st audio decoder 75-1.
- although the switch 59 can also supply the audio stream of the main clip or of the sub clip to the 2nd audio decoder 75-2, the playback process before the sound is switched is described here, so a description of that case is omitted.
- in step S51, the buffer 54 outputs the text subtitle data to the text subtitle composition 76.
- in step S52, the background decoder 71 decodes the background image data and outputs it to the background plane generating unit 91.
- in step S53, the 1st video decoder 72-1 decodes the video stream (that is, the supplied primary video stream), and outputs this to the video plane generating unit 92.
- in step S54, the presentation graphics decoder 73 decodes the presentation graphics stream selected and supplied by the switch 57, and outputs this to the switch 77 at the subsequent stage.
- in step S55, the interactive graphics decoder 74 decodes the interactive graphics stream selected and supplied by the switch 58, and outputs this to the interactive graphics plane generating unit 94 at the subsequent stage.
- in step S56, the 1st audio decoder 75-1 decodes the audio stream (audio stream #1) selected and supplied by the switch 59, and outputs it to the mixing processing unit 101 at the subsequent stage.
- in this primary video playback process, audio data is not output from the 2nd audio decoder 75-2, so the mixing processing unit 101 supplies the audio data output from the 1st audio decoder 75-1 to the mixing processing unit 97 at the subsequent stage as it is.
- in step S57, the Text-ST composition 76 decodes the text subtitle data and outputs it to the switch 77 at the subsequent stage.
- in step S58, the switch 77 selects either the data from the presentation graphics decoder 73 or the data from the Text-ST composition 76. Specifically, the switch 77 selects one of the presentation graphics stream decoded by the presentation graphics decoder 73 and Text-ST (text subtitle data), and supplies the selected data to the presentation graphics plane generating unit 93.
- in step S59, the background plane generating unit 91 generates a background plane based on the background image data supplied from the background decoder 71.
- in step S60, the video plane generating unit 92 generates a video plane based on the video data supplied from the 1st video decoder 72-1.
- in step S61, the presentation graphics plane generating unit 93 generates a presentation graphics plane based on the data from the presentation graphics decoder 73 or the data from the Text-ST composition 76 selected and supplied by the switch 77 in the process of step S58.
- in step S62, the interactive graphics plane generating unit 94 generates an interactive graphics plane based on the data of the interactive graphics stream supplied from the interactive graphics decoder 74.
- in step S63, the buffer 95 buffers the sound data selected and supplied in the process of step S43, and supplies it to the mixing processing unit 97 at a predetermined timing.
- in step S64, the video data processing unit 96 synthesizes the data of the planes and outputs the result. Specifically, the data from the background plane generating unit 91, the video plane generating unit 92, the presentation graphics plane generating unit 93, and the interactive graphics plane generating unit 94 are combined and output as video data.
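- the plane synthesis in step S64 can be sketched as back-to-front alpha compositing of the four planes; the pixel layout and alpha handling below are simplified assumptions, since the actual unit composites full-resolution planes:

```python
def compose_planes(background, video, presentation, interactive):
    """Overlay the planes back-to-front, as the video data processing unit 96 does.

    Each plane is a list of (r, g, b, a) pixels with a in [0.0, 1.0];
    all planes share one resolution. Back-to-front order: background,
    video, presentation graphics, interactive graphics.
    """
    def over(top, bottom):
        out = []
        for (tr, tg, tb, ta), (br, bg2, bb, ba) in zip(top, bottom):
            out.append((tr * ta + br * (1 - ta),
                        tg * ta + bg2 * (1 - ta),
                        tb * ta + bb * (1 - ta),
                        1.0))
        return out

    frame = background
    for plane in (video, presentation, interactive):
        frame = over(plane, frame)
    return frame

# An opaque video pixel hides the background; a transparent IG pixel does not.
bg = [(0, 0, 255, 1.0)]
vid = [(255, 0, 0, 1.0)]
clear = [(0, 0, 0, 0.0)]
assert compose_planes(bg, vid, clear, clear)[0][:3] == (255, 0, 0)
```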
- in step S65, the mixing processing unit 97 mixes (synthesizes) the audio data (the audio data output from the mixing processing unit 101) and the sound data, and outputs the result. Then the process returns from step S3 in FIG. 36 and proceeds to step S4, returns from step S8 in FIG. 36 and proceeds to step S7, or returns from step S12 in FIG. 36 and proceeds to step S13.
- in this way, the main clip, sub clip, and text subtitle data are referred to by the main path and sub path included in the PlayList and reproduced.
- at this time, the video to be displayed is only the video of the primary video stream, and the sub display screen 2 described with reference to FIG. 5 is not displayed. Since a main path and a sub path are provided in the PlayList, and the sub path can specify a clip different from the Clip AV stream file specified by the main path, the data of a sub clip different from the main clip pointed to by the PlayItem of the main path can be played back together with (at the same time as) the main clip data.
- step S45 and step S46 may be performed in reverse order or in parallel.
- step S47 and step S49 may be performed in reverse order or in parallel.
- steps S52 to S57 may be executed in reverse order or in parallel.
- steps S59 to S62 may be performed in reverse order or in parallel.
- the processing of step S64 and the processing of step S65 may be performed in reverse order or in parallel. That is, in FIG. 35, the processing of the buffers 51 to 54 in the same vertical layer, the processing of the switches 57 to 59, the processing of the decoders 71 to 76, the processing of the plane generating units 91 to 94, and the processing of the video data processing unit 96 and the mixing processing unit 97 may each be executed in parallel, and their order does not matter.
- in steps S101 to S106, basically the same processing as in steps S41 to S46 described with reference to FIGS. 37 and 38 is executed.
- that is, the controller 34 reads the main clip, sub clip, and text subtitle data (Text-ST data), and controls the switch 32 so that the read data (main clip, sub clip, and text subtitle data) are supplied to the corresponding buffers 51 to 54. Based on the control from the controller 34, the switch 32 supplies the background image data to the buffer 51, the main clip data to the buffer 52, the sub clip data to the buffer 53, and the text subtitle data to the buffer 54, where they are buffered.
- the buffer 51 outputs the background image data to the background decoder 71.
- the buffer 52 outputs the stream data of the main clip to the PID filter 55.
- in step S107, the PID filter 55 distributes the TS packets to the decoders of the respective elementary streams based on the PID attached to each TS packet constituting the main Clip AV stream file.
- specifically, the PID filter 55 supplies the video stream to the PID filter 60, supplies the presentation graphics stream to the switch 57 serving as the supply source to the presentation graphics decoder 73, supplies the interactive graphics stream to the switch 58 serving as the supply source to the interactive graphics decoder 74, and supplies the audio stream to the switch 59 serving as the supply source to the 1st audio decoder 75-1.
- video streams, presentation graphics streams, interactive graphics streams, and audio streams are assigned different PIDs.
- the PID filter 60 supplies the primary video stream to the 1st video decoder 72-1 and supplies the secondary video stream to the 2nd video decoder 72-2.
- in step S108, the buffer 53 outputs the stream data of the sub clip to the PID filter 56.
- in step S109, the PID filter 56 distributes the TS packets to the decoders of the respective elementary streams based on the PID. Specifically, the PID filter 56 supplies the supplied video stream to the PID filter 60, supplies the presentation graphics stream to the switch 57 serving as the supply source to the presentation graphics decoder 73, supplies the interactive graphics stream to the switch 58 serving as the supply source to the interactive graphics decoder 74, and supplies the audio stream to the switch 59 serving as the supply source to the 1st audio decoder 75-1 and the 2nd audio decoder 75-2. Under the control of the controller 34, the PID filter 60 supplies the primary video stream to the 1st video decoder 72-1 and supplies the secondary video stream to the 2nd video decoder 72-2.
- in steps S110 to S112, basically the same processing as in steps S50 to S52 described with reference to FIGS. 37 and 38 is executed.
- that is, the switches 57 to 59 and the PID filter 60 select either the main clip or the sub clip based on the control from the controller 34.
- the buffer 54 outputs the text subtitle data to the text subtitle composition 76.
- the background decoder 71 decodes the background image data and outputs it to the background plane generating unit 91.
- in step S113, the 1st video decoder 72-1 decodes the supplied primary video stream and outputs it to the video plane generating unit 92.
- in step S114, the 2nd video decoder 72-2 decodes the supplied secondary video stream and outputs it to the video plane generating unit 92.
- in step S115, the presentation graphics decoder 73 decodes the presentation graphics stream selected and supplied by the switch 57, and outputs this to the switch 77 at the subsequent stage.
- in step S116, the interactive graphics decoder 74 decodes the interactive graphics stream selected and supplied by the switch 58, and outputs this to the interactive graphics plane generating unit 94 at the subsequent stage.
- in step S117, the 1st audio decoder 75-1 decodes the primary audio stream selected and supplied by the switch 59, and outputs this to the mixing processing unit 101 at the subsequent stage.
- in step S118, the 2nd audio decoder 75-2 decodes the secondary audio stream selected and supplied by the switch 59, and outputs this to the mixing processing unit 101 at the subsequent stage.
- in step S119, the Text-ST composition 76 decodes the text subtitle data to be displayed, whether primary or secondary, and outputs it to the switch 77 at the subsequent stage.
- in step S120, the switch 77 selects either the data from the presentation graphics decoder 73 or the data from the Text-ST composition 76. Specifically, the switch 77 selects one of the presentation graphics stream decoded by the presentation graphics decoder 73 and Text-ST (text subtitle data), and supplies the selected data to the presentation graphics plane generating unit 93.
- in step S121, the background plane generating unit 91 generates a background plane based on the background image data supplied from the background decoder 71.
- in step S122, the video plane generating unit 92, based on the control of the controller 34 referring to pip_metadata, combines the video data supplied from the 1st video decoder 72-1 and the 2nd video decoder 72-2 to generate a video plane composed of the main display screen 1 and the sub display screen 2 as described with reference to FIG. 5, and supplies this to the video data processing unit 96.
- specifically, based on the control of the controller 34 referring to pip_metadata described with reference to FIG. 9, FIG. 11, or FIG. 12, the video plane generating unit 92 combines the scaled (size-adjusted) secondary video stream with the primary video stream to generate a video plane composed of the main display screen 1 and the sub display screen 2 as described with reference to FIG. 5, and supplies this to the video data processing unit 96.
- here, pip_horizontal_position indicates, for example, the X coordinate of the upper left corner of the sub display screen 2 on which the secondary video is displayed on the main display screen 1 of FIG. 5, and pip_vertical_position indicates, for example, the Y coordinate of the upper left corner of the sub display screen 2 on which the secondary video is displayed on the main display screen 1 of FIG. 5.
- pip_scale indicates information on the size of the sub display screen 2 on which the secondary video is displayed.
- specifically, the secondary video scaled to a predetermined size based on pip_scale is placed on the primary video plane so that the upper left corner of the scaled secondary video is at the position pip_horizontal_position in the X-axis direction and pip_vertical_position in the Y-axis direction from the upper left corner of the primary video.
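- the placement can be sketched as follows; treating pip_scale as a plain multiplier is a simplifying assumption, since in pip_metadata it is an enumerated scaling code:

```python
def place_secondary(primary_size, secondary_size, pip_scale,
                    pip_horizontal_position, pip_vertical_position):
    """Compute the sub display screen rectangle on the primary video plane.

    Returns (x, y, width, height), with (x, y) the upper left corner of the
    scaled secondary video measured from the primary video's upper left corner.
    """
    w = int(secondary_size[0] * pip_scale)
    h = int(secondary_size[1] * pip_scale)
    x = pip_horizontal_position
    y = pip_vertical_position
    # The scaled secondary video must stay inside the primary plane.
    assert x + w <= primary_size[0] and y + h <= primary_size[1]
    return (x, y, w, h)

# A 1920x1080 primary with a half-scaled 720x480 secondary at (64, 48):
assert place_secondary((1920, 1080), (720, 480), 0.5, 64, 48) == (64, 48, 360, 240)
```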
- further, the video plane generating unit 92 performs luma_keying synthesis of the primary video stream and the secondary video stream, generates a video plane, and supplies this to the video data processing unit 96.
- here, luma_keying composition, as explained for pip_metadata in FIG. 12, is a method of superimposing an image on a video after cutting out unnecessary portions of the image using the difference in the luminance (brightness) value component. Next, details of luma_keying composition will be described with reference to FIGS. 44 and 45.
- FIG. 44 is a diagram showing the primary video and the secondary video before luma_keying composition.
- in luma_keying composition, the portions of the secondary video whose luminance values fall within the range from the lower limit defined by lower_limit_luma_key to the upper limit defined by upper_limit_luma_key are removed, and the resulting secondary video is combined with the primary video.
- in the secondary video on the left, the region other than the parallelogram and the circle has luminance values within the range from the lower limit to the upper limit, and is therefore removed; the secondary video processed in this way is superimposed on the primary video on the right.
- FIG. 45 is a diagram showing the primary video and the secondary video after luma_keying composition.
- as shown, in the primary video and secondary video after luma_keying composition, the secondary video is transparent except for its parallelogram and circle, and only the parallelogram and circle areas are combined with the primary video.
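- luma_keying composition can be sketched per pixel as follows; the pixel tuples and helper names are ours, and only the keying rule (transparency for luminance inside the range from lower_limit_luma_key to upper_limit_luma_key) comes from the text above:

```python
def luma_key(secondary_pixels, lower_limit_luma_key, upper_limit_luma_key):
    """Mark secondary-video pixels transparent when their luminance falls
    inside [lower_limit_luma_key, upper_limit_luma_key].

    Pixels are (y, cb, cr) tuples; returns (pixel, opaque) pairs.
    """
    out = []
    for y, cb, cr in secondary_pixels:
        opaque = not (lower_limit_luma_key <= y <= upper_limit_luma_key)
        out.append(((y, cb, cr), opaque))
    return out

def luma_keying_compose(primary, secondary, lo, hi):
    """Superimpose the keyed secondary video on the primary video."""
    keyed = luma_key(secondary, lo, hi)
    return [s if opaque else p for p, (s, opaque) in zip(primary, keyed)]

# Background luma 16 is keyed out; the bright pixel (luma 200) survives.
primary = [(100, 128, 128), (100, 128, 128)]
secondary = [(16, 128, 128), (200, 90, 110)]
assert luma_keying_compose(primary, secondary, 0, 64) == [(100, 128, 128), (200, 90, 110)]
```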
- in step S123, the presentation graphics plane generating unit 93 generates a presentation graphics plane based on the data from the presentation graphics decoder 73 or the data from the Text-ST composition 76 selected and supplied by the switch 77 in the process of step S120.
- in step S124, the interactive graphics plane generating unit 94 generates an interactive graphics plane based on the data of the interactive graphics stream supplied from the interactive graphics decoder 74.
- in step S125, the buffer 95 buffers the sound data selected and supplied in the process of step S103, and supplies it to the mixing processing unit 97 at a predetermined timing.
- in step S126, the video data processing unit 96 synthesizes the data of the planes and outputs the result. Specifically, the data from the background plane generating unit 91, the video plane generating unit 92, the presentation graphics plane generating unit 93, and the interactive graphics plane generating unit 94 are combined and output as video data.
- in step S127, the mixing processing unit 101 synthesizes the primary audio data output from the 1st audio decoder 75-1 and the secondary audio data output from the 2nd audio decoder 75-2, and supplies the result to the mixing processing unit 97.
- in step S128, the mixing processing unit 97 mixes (synthesizes) the synthesized audio data output from the mixing processing unit 101 and the sound data, and outputs the result. Then the process returns from step S9 in FIG. 36 and proceeds to step S10, or returns from step S15 in FIG. 36 and proceeds to step S16.
- in this way, the main clip, sub clip, and text subtitle data are referred to by the main path and sub path included in the PlayList and reproduced.
- in this case, a main path and a sub path are provided in the PlayList, a clip different from the Clip AV stream file specified by the main path can be specified by the sub path, and the display image of the secondary video stream can be superimposed on the display image of the primary video stream.
- at this time, since the display size and display position of the secondary video stream can be set, the display position and display size of the sub display screen do not have to depend on the playback device.
- for example, the secondary video stream can be displayed at a position and size that does not interfere with the display of the primary video stream; it can be displayed large at a prominent position when the content of the secondary video stream is important, or displayed small at the edge of the main display screen 1 when the content is not important.
- in this way, the size and display position of the display image of the secondary video stream can be determined as appropriate by the content producer or the content distributor.
- the processing of step S105 and step S106 may be executed in reverse order or in parallel. Further, the processing of step S107 and the processing of step S109 may be performed in reverse order or in parallel.
- steps S112 to S119 may be performed in reverse order or in parallel.
- steps S121 to S124 may be performed in reverse order or in parallel.
- further, the process of step S126 and the processes of step S127 and step S128 may be performed in reverse order or in parallel. That is, in FIG. 35, the processing of the buffers 51 to 54 in the same vertical layer, the processing of the switches 57 to 59, the processing of the decoders 71 to 76, the processing of the plane generating units 91 to 94, and the processing of the video data processing unit 96 and the mixing processing unit 97 may each be executed in parallel, and their order is not limited.
- As described above, the playback device 20 acquires a PlayList as playback management information, which includes a Main Path that is a main playback path indicating the position of a main ClipAV stream file containing at least one stream, and a Sub Path that is a playback path of a sub ClipAV stream file played back along a path different from that of the main ClipAV stream file. The playback device 20 then accepts selection of a stream to be played back, based on STN_table() included in the PlayList for selecting streams to be played back.
- STN_table() is information for selecting a specific type of stream (for example, a primary video stream) of the main ClipAV stream file together with other stream files that are played back in synchronization with, or asynchronously with, the playback timing of that stream. Therefore, based on this STN_table(), selection of the stream to be played back can be accepted.
- In addition, the Main Path and the Sub Path can refer to the same Clip, and further Sub Paths can be added, so the streams have extensibility.
- Moreover, since a plurality of files can be referred to by one Sub Path (for example, FIG. 4), the user can select from among a plurality of different streams.
- Since stream_attribute() in FIG. 25, which is stream attribute information, is provided in STN_table(), the playback apparatus 20 can determine whether or not it can play back the selected stream. Furthermore, by referring to stream_attribute(), only streams that the apparatus is capable of playing back can be selected and played back.
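One plausible way to picture how STN_table() and stream_attribute() drive stream selection is a table of entries, each carrying a coding type, filtered against the codecs the player supports. All field names, codec strings, and the table contents below are hypothetical stand-ins, not the actual syntax of the tables:

```python
from dataclasses import dataclass

@dataclass
class StreamEntry:
    stream_type: str   # e.g. "primary_video", "secondary_video", "audio"
    coding_type: str   # codec recorded in the stream's attribute info
    path: str          # referred to via "main" path or "sub" path

# Hypothetical STN_table contents: streams selectable for one PlayItem.
stn_table = [
    StreamEntry("primary_video", "mpeg2", "main"),
    StreamEntry("secondary_video", "avc", "sub"),
    StreamEntry("audio", "lpcm", "main"),
    StreamEntry("audio", "dts", "sub"),
]

PLAYER_CODECS = {"mpeg2", "avc", "lpcm"}  # what this player can decode

def selectable(table, stream_type):
    """Offer only streams whose attributes say the player can decode them."""
    return [e for e in table
            if e.stream_type == stream_type and e.coding_type in PLAYER_CODECS]

print([e.coding_type for e in selectable(stn_table, "audio")])  # ['lpcm']
```

The filtering step corresponds to the passage above: by consulting the attribute information before offering a stream, the player never presents a stream it lacks the function to play.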
- Furthermore, a video plane generating unit 92 for synthesizing (mixing) the primary video data and the secondary video data decoded by the two video decoders is provided, and a mixing processor 101 for synthesizing the audio data decoded by the two audio decoders is provided. This makes it possible to combine two streams of the same type and play them back simultaneously.
- PinP: picture-in-picture
- In picture-in-picture display, the size and display position of the display image of the secondary video stream can be set by pip_metadata. Therefore, compared with the case where the display position and display size of the sub display screen are determined in advance by the playback device, the secondary video stream can be displayed, depending on its content or on the timing at which it is displayed, at a position and size that does not disturb the display of the primary video stream; alternatively, it can be displayed at a prominent position if, for example, the content is important, or displayed small at the edge of the main display screen 1 if the content is not important.
- the size and display position of the display image of the secondary video stream can be determined as appropriate by the content producer or the content distributor.
- Information indicating the display settings of this picture-in-picture can be described in pip_metadata.
- Specifically, information indicating the position and size of the sub display screen 2 on which the secondary video stream is displayed can be described, as well as, for example, information on the rotation and color designation of the displayed video relative to the original video of the secondary video stream.
- The display information, including the size and display position of the display image of the secondary video stream, is not described in the secondary video stream itself but in the PlayList that controls the playback of the secondary video stream.
- Therefore, by merely updating the pip_metadata description in the PlayList, without changing the settings of the playback device or the secondary video stream itself, the same secondary video stream can very easily be displayed by different display methods (for example, the same secondary video stream is displayed large in a first content and small in a second content). That is, without changing the data of the secondary video stream itself, by describing in pip_metadata of the PlayList of each of the first content and the second content the information indicating the display format desired by the content creator or the content distributor, the same secondary video stream can be displayed in the respective different display methods so desired.
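The point above, one unchanged secondary Clip presented differently because each PlayList carries its own metadata, can be sketched as follows. The field names (`x`, `y`, `scale`), the file name, and the 1920x1080 source size are illustrative assumptions, not the actual pip_metadata syntax:

```python
# Two PlayLists reference the same secondary video Clip but describe
# different picture-in-picture presentations in their own metadata.
secondary_clip = "subclip_00001.m2ts"   # unchanged between contents

playlist_first = {
    "clip": secondary_clip,
    "pip_metadata": {"x": 0, "y": 0, "scale": 1.0},       # large, prominent
}
playlist_second = {
    "clip": secondary_clip,
    "pip_metadata": {"x": 1600, "y": 900, "scale": 0.25},  # small, at the edge
}

def display_rect(playlist, src_w=1920, src_h=1080):
    """Resolve the metadata into an on-screen rectangle (x, y, w, h)."""
    m = playlist["pip_metadata"]
    return (m["x"], m["y"], int(src_w * m["scale"]), int(src_h * m["scale"]))

print(display_rect(playlist_first))   # (0, 0, 1920, 1080)
print(display_rect(playlist_second))  # (1600, 900, 480, 270)
```

Only the small metadata dictionaries differ between the two contents; the stream data they point at is identical, which is exactly the authoring economy the passage describes.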
- SubPath_type indicating the type of the Sub Path (the kind of sub Clip, such as audio or text subtitle, and whether or not it is played back in synchronization with the Main Path)
- Clip_Information_file_name in FIG. 21 indicating the name of the Sub Clip referenced by the Sub Path
- SubPlayItem_IN_time in FIG. 21 indicating the In point of the Clip referenced by the Sub Path
- SubPlayItem_OUT_time in FIG. 21 indicating the Out point.
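The four fields listed above can be pictured as a small data structure. The class layout, the `subpath_type` value, and the 45 kHz tick interpretation of the In/Out times are illustrative assumptions layered on the field names taken from the text:

```python
from dataclasses import dataclass, field

@dataclass
class SubPlayItem:
    clip_information_file_name: str  # Clip_Information_file_name
    in_time: int                     # SubPlayItem_IN_time (clock ticks, say)
    out_time: int                    # SubPlayItem_OUT_time

@dataclass
class SubPath:
    subpath_type: int                      # SubPath_type (kind + sync flag)
    items: list = field(default_factory=list)

def playback_span(item: SubPlayItem) -> int:
    """Length of the referenced section between the In and Out points."""
    return item.out_time - item.in_time

sp = SubPath(subpath_type=5,
             items=[SubPlayItem("subclip_00001.clpi", 900_000, 2_700_000)])
print(playback_span(sp.items[0]))  # 1800000
```

Keeping In/Out points in the Sub Path rather than in the Clip itself is what lets one Clip be reused across several playback intervals, as the surrounding description notes.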
- The data read by the storage drive 31 in FIG. 35 may be data recorded on a recording medium such as a DVD (Digital Versatile Disc), may be data recorded on a hard disk, may be data downloaded via a network (not shown), or may be a combination of these. For example, playback may be performed based on a PlayList and a sub Clip that have been downloaded and recorded on the hard disk and a main ClipAV stream file recorded on a DVD.
- Alternatively, based on a PlayList recorded on the hard disk, the main Clip and the sub Clip may be read out and played back from the hard disk and the DVD, respectively.
- The position and size of the secondary video can also be changed by a function of the playback device 20 (player), but in that case the result may not be the size intended by the author.
- In the method described here, by contrast, the size, position, and so on are managed by the PlayList, which is management information. Therefore, even if it is desired to change the position after the ROM disc has been obtained, the position of the secondary video can be changed as the author intended merely by modifying the PlayList, which is comparatively small in size, without changing the actual Clip.
- Next, a case where the recording medium 21 is a disc-shaped recording medium will be described as an example. First, a master made of glass, for example, is prepared, and a recording material made of a photoresist or the like is applied onto it. A recording master is thereby produced.
- In a software production processing unit, video data encoded by an encoding device (video encoder) in a format that can be played back by the playback device 20 is stored in a temporary buffer. Audio data encoded by an audio encoder is stored in a temporary buffer, and data other than streams (for example, Indexes, PlayList, PlayItem, and the like) encoded by a data encoder is likewise stored in a temporary buffer. The video data, audio data, and non-stream data stored in the respective buffers are multiplexed together with a synchronization signal by a multiplexer (MPX), and an error correction code is added by an ECC circuit.
- MPX: multiplexer
- ECC: error correction code
- Then, predetermined modulation is applied by a modulation circuit (MOD), and the result is recorded on, for example, a magnetic tape in accordance with a predetermined format, whereby the software to be recorded on the recording medium 21 playable by the playback device 20 is produced.
- MOD: modulation circuit
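The multiplexing and error-protection stages above can be sketched as two small functions. The round-robin interleaving and the one-byte XOR parity are toy stand-ins: the real MPX also carries a synchronization signal, and the real ECC stage uses far stronger codes:

```python
def multiplex(video, audio, other):
    """Round-robin interleave packets from each buffer (a stand-in for
    what the MPX does together with the synchronization signal)."""
    out = []
    for i in range(max(len(video), len(audio), len(other))):
        for buf in (video, audio, other):
            if i < len(buf):
                out.append(buf[i])
    return out

def add_parity(packets):
    """Append a one-byte XOR parity per packet -- a toy stand-in for
    the ECC stage."""
    result = []
    for p in packets:
        parity = 0
        for byte in p:
            parity ^= byte
        result.append(p + bytes([parity]))
    return result

stream = multiplex([b"V1", b"V2"], [b"A1"], [b"IDX"])
print(stream)            # [b'V1', b'A1', b'IDX', b'V2']
protected = add_parity(stream)
print(protected[0])      # b'V1g'
```

The interleaving keeps video, audio, and navigation data close together in the recorded stream so the player can refill all of its buffers from one sequential read.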
- This software is edited (premastered) as necessary, and a signal of the format to be recorded on the optical disc is generated. Then, as shown in FIG. 46, a laser beam is modulated in accordance with this recording signal, and the laser beam is irradiated onto the photoresist on the master. The photoresist on the master is thereby exposed in accordance with the recording signal.
- the master is developed, and pits appear on the master.
- The master prepared in this way is then subjected to a treatment such as electroforming, and a metal master to which the pits on the glass master have been transferred is produced.
- a metal stamper is produced from this metal master and is used as a molding die.
- A material such as PMMA (acrylic) or PC (polycarbonate) is then injected into the molding die by, for example, injection molding, and solidified.
- Alternatively, 2P (ultraviolet-curable resin) or the like is applied onto the metal stamper and then cured by irradiation with ultraviolet rays. The pits on the metal stamper can thereby be transferred onto a replica made of resin.
- A reflective film is formed on the replica thus produced by vapor deposition or sputtering.
- Alternatively, a reflective film is formed on the produced replica by spin coating.
- Thereafter, the inner and outer diameters of the disc are machined, and necessary treatments such as laminating two discs together are performed.
- Further, a label is attached, a hub is attached, and the disc is inserted into a cartridge. In this way, the recording medium 21 on which data playable by the playback device 20 is recorded is completed.
- A CPU (Central Processing Unit) 501 executes various processes in accordance with a program stored in a ROM (Read Only Memory) 502 or a program loaded from a storage unit 508 into a RAM (Random Access Memory) 503.
- the RAM 503 appropriately stores data necessary for the CPU 501 to execute various processes.
- the CPU 501, ROM 502, and RAM 503 are connected to each other via the internal bus 504.
- An input / output interface 505 is also connected to the internal bus 504.
- Connected to the input/output interface 505 are an input unit 506 including a keyboard and a mouse; an output unit 507 including a speaker and a display such as a CRT (Cathode Ray Tube) or an LCD (Liquid Crystal Display); a storage unit 508 including a hard disk; and a communication unit 509 including a modem and a terminal adapter. The communication unit 509 performs communication processing via various networks including telephone lines and CATV.
- A drive 510 is also connected to the input/output interface 505 as necessary, and removable media 521 such as a magnetic disk, an optical disk, a magneto-optical disk, or a semiconductor memory is mounted on it as appropriate. A computer program read from the removable media 521 is installed in the storage unit 508 as necessary.
- This program storage medium is composed of package media consisting of the removable media 521 on which the program is recorded and which is distributed, separately from the computer itself, in order to provide the program to the user, or of the ROM 502 or the hard disk included in the storage unit 508, which stores the program and is provided to the user in a state of being preinstalled in the apparatus main body.
Landscapes
- Engineering & Computer Science (AREA)
- Signal Processing (AREA)
- Multimedia (AREA)
- Databases & Information Systems (AREA)
- Business, Economics & Management (AREA)
- Marketing (AREA)
- Signal Processing For Digital Recording And Reproducing (AREA)
- Television Signal Processing For Recording (AREA)
- Indexing, Searching, Synchronizing, And The Amount Of Synchronization Travel Of Record Carriers (AREA)
- Management Or Editing Of Information On Record Carriers (AREA)
- Studio Circuits (AREA)
Priority Applications (6)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
ES06782790.7T ES2517418T3 (es) | 2005-08-25 | 2006-08-17 | Dispositivo de reproducción, método de reproducción, programa, soporte de almacenamiento de programa, estructura de datos y método de fabricación de un soporte de registro |
EP06782790.7A EP1919210B1 (en) | 2005-08-25 | 2006-08-17 | Reproduction device, reproduction method, program, program storage medium, data structure, and recording medium fabrication method |
CN2006800015621A CN101091385B (zh) | 2005-08-25 | 2006-08-17 | 播放设备和播放方法 |
US11/791,585 US8340496B2 (en) | 2005-08-25 | 2006-08-17 | Playback apparatus, playback method, program, program storage medium, data structure, and recording-medium manufacturing method |
KR1020077014458A KR101320793B1 (ko) | 2005-08-25 | 2006-08-17 | 재생 장치 및 재생 방법 및 프로그램 저장 매체 |
KR1020127033439A KR101244148B1 (ko) | 2005-08-25 | 2006-08-17 | 데이터 생성 방법, 기록 장치, 기록 방법, 저장 매체, 재생 장치 및 재생 방법 |
Applications Claiming Priority (6)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
JP2005244612 | 2005-08-25 | ||
JP2005-244612 | 2005-08-25 | ||
JP2005-264041 | 2005-09-12 | ||
JP2005264041 | 2005-09-12 | ||
JP2005-311477 | 2005-10-26 | ||
JP2005311477A JP4081772B2 (ja) | 2005-08-25 | 2005-10-26 | 再生装置および再生方法、プログラム、並びにプログラム格納媒体 |
Publications (1)
Publication Number | Publication Date |
---|---|
WO2007023728A1 true WO2007023728A1 (ja) | 2007-03-01 |
Family
ID=37771476
Family Applications (1)
Application Number | Title | Priority Date | Filing Date |
---|---|---|---|
PCT/JP2006/316177 WO2007023728A1 (ja) | 2005-08-25 | 2006-08-17 | 再生装置および再生方法、プログラム、プログラム格納媒体、データ構造、並びに、記録媒体の製造方法 |
Country Status (8)
Country | Link |
---|---|
US (1) | US8340496B2 (ja) |
EP (1) | EP1919210B1 (ja) |
JP (1) | JP4081772B2 (ja) |
KR (2) | KR101320793B1 (ja) |
CN (4) | CN102855902A (ja) |
ES (1) | ES2517418T3 (ja) |
TW (2) | TWI479892B (ja) |
WO (1) | WO2007023728A1 (ja) |
Families Citing this family (38)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
EP2562757B1 (en) * | 2004-02-16 | 2018-05-09 | Sony Corporation | Reproduction device, reproduction method, and program |
AU2006277225B2 (en) | 2005-08-09 | 2011-06-09 | Panasonic Corporation | Recording medium, playback apparatus, method and program |
JP4870493B2 (ja) * | 2005-08-09 | 2012-02-08 | パナソニック株式会社 | 再生装置、記録方法、再生方法、システムlsi、プログラム |
JP2009177527A (ja) * | 2008-01-24 | 2009-08-06 | Panasonic Corp | 画像記録装置、画像再生装置、記録媒体、画像記録方法及びプログラム |
JP2009177531A (ja) * | 2008-01-24 | 2009-08-06 | Panasonic Corp | 画像記録装置、画像再生装置、記録媒体、画像記録方法及びプログラム |
JP2009177619A (ja) | 2008-01-25 | 2009-08-06 | Panasonic Corp | 画像記録装置、画像再生装置、記録媒体、画像記録方法及びプログラム |
JP4544332B2 (ja) * | 2008-04-07 | 2010-09-15 | ソニー株式会社 | 画像処理装置、画像処理方法、プログラム |
WO2010013382A1 (ja) * | 2008-07-31 | 2010-02-04 | 三菱電機株式会社 | 映像符号化装置、映像符号化方法、映像再生装置、映像再生方法、映像記録媒体、及び映像データストリーム |
JP2010135011A (ja) * | 2008-12-05 | 2010-06-17 | Funai Electric Co Ltd | デジタル放送録画再生装置 |
KR101639053B1 (ko) * | 2009-02-17 | 2016-07-13 | 코닌클리케 필립스 엔.브이. | 3d 이미지 및 그래픽 데이터의 조합 |
TWI400949B (zh) * | 2010-04-06 | 2013-07-01 | Hon Hai Prec Ind Co Ltd | 媒體資料播放裝置及其重播方法 |
JP2012064135A (ja) | 2010-09-17 | 2012-03-29 | Sony Corp | 情報処理装置、および情報処理方法、並びにプログラム |
US20130100248A1 (en) * | 2011-05-11 | 2013-04-25 | Shinya Kadono | Video transmitting apparatus and video transmitting method |
WO2012174301A1 (en) | 2011-06-14 | 2012-12-20 | Related Content Database, Inc. | System and method for presenting content with time based metadata |
JP2015510652A (ja) * | 2012-01-09 | 2015-04-09 | トムソン ライセンシングThomson Licensing | メディアコンテンツを処理するためのシステム及び方法 |
JP5627617B2 (ja) * | 2012-02-22 | 2014-11-19 | 株式会社東芝 | 画像処理装置及び画像表示システム |
US9716904B2 (en) * | 2012-05-17 | 2017-07-25 | Ericsson Ab | Video content presentation override control systems, methods and devices |
KR101249279B1 (ko) * | 2012-07-03 | 2013-04-02 | 알서포트 주식회사 | 동영상 생성 방법 및 장치 |
US10356484B2 (en) | 2013-03-15 | 2019-07-16 | Samsung Electronics Co., Ltd. | Data transmitting apparatus, data receiving apparatus, data transceiving system, method for transmitting data, and method for receiving data |
US9723245B2 (en) | 2013-03-15 | 2017-08-01 | Samsung Electronics Co., Ltd. | Data transmitting apparatus, data receiving apparatus, data transceiving system, method for transmitting data, and method for receiving data |
EP3097696A4 (en) * | 2014-01-21 | 2017-09-20 | Lg Electronics Inc. | Broadcast transmission device and operating method thereof, and broadcast reception device and operating method thereof |
US20150253974A1 (en) | 2014-03-07 | 2015-09-10 | Sony Corporation | Control of large screen display using wireless portable computer interfacing with display controller |
CN103888840B (zh) * | 2014-03-27 | 2017-03-29 | 电子科技大学 | 一种视频移动终端实时拖动与缩放的方法及装置 |
US9697630B2 (en) * | 2014-10-01 | 2017-07-04 | Sony Corporation | Sign language window using picture-in-picture |
US20160098180A1 (en) * | 2014-10-01 | 2016-04-07 | Sony Corporation | Presentation of enlarged content on companion display device |
US10097785B2 (en) * | 2014-10-01 | 2018-10-09 | Sony Corporation | Selective sign language location |
US10204433B2 (en) | 2014-10-01 | 2019-02-12 | Sony Corporation | Selective enablement of sign language display |
JP2016081553A (ja) * | 2014-10-17 | 2016-05-16 | パナソニック インテレクチュアル プロパティ コーポレーション オブ アメリカPanasonic Intellectual Property Corporation of America | 記録媒体、再生方法、および再生装置 |
KR102202576B1 (ko) * | 2014-12-12 | 2021-01-13 | 삼성전자주식회사 | 음향 출력을 제어하는 디바이스 및 그 방법 |
US20160307603A1 (en) * | 2015-04-15 | 2016-10-20 | Sony Corporation | Information processing device, information recording medium, information processing method, and program |
CN107547917B (zh) * | 2016-06-27 | 2020-07-10 | 中兴通讯股份有限公司 | 频道的播放和处理方法及装置,频道的处理系统 |
JP6934052B2 (ja) * | 2017-06-28 | 2021-09-08 | 株式会社ソニー・インタラクティブエンタテインメント | 表示制御装置、表示制御方法及びプログラム |
CN111182213B (zh) * | 2019-12-31 | 2021-09-24 | 维沃移动通信有限公司 | 视频制作方法、电子设备及介质 |
CN111522497B (zh) * | 2020-04-16 | 2022-09-13 | 深圳市颍创科技有限公司 | Pip模式中触摸控制显示设备子画面大小和位置的方法 |
EP4007228A1 (en) * | 2020-11-27 | 2022-06-01 | Telefonica Digital España, S.L.U. | System and method for live media streaming |
US20230007210A1 (en) * | 2021-06-30 | 2023-01-05 | Lemon Inc. | Signaling the Purpose of Preselection |
US20230018718A1 (en) * | 2021-06-30 | 2023-01-19 | Lemon Inc. | Signaling Replacement of Video Data Units in a Picture-in-Picture Region |
CN114911407A (zh) * | 2022-06-20 | 2022-08-16 | 上海联影医疗科技股份有限公司 | 医学诊疗系统的图像播放控制装置 |
Citations (6)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
JP2000022928A (ja) * | 1998-06-30 | 2000-01-21 | Omron Corp | 合成画像作成装置および記録媒体 |
JP2003244613A (ja) * | 2002-02-15 | 2003-08-29 | Toshiba Corp | 映像データ記録再生装置及び映像データ記録再生方法 |
WO2004034398A1 (en) * | 2002-10-11 | 2004-04-22 | Thomson Licensing S.A. | Method and apparatus for synchronizing data streams containing audio, video and/or other data |
JP2004362719A (ja) * | 2003-06-06 | 2004-12-24 | Sharp Corp | データ記録方法、データ記録装置、およびデータ記録媒体 |
JP2005020242A (ja) * | 2003-06-25 | 2005-01-20 | Toshiba Corp | 再生装置 |
WO2005074270A1 (ja) * | 2004-01-30 | 2005-08-11 | Matsushita Electric Industrial Co., Ltd. | 記録媒体、再生装置、プログラム、再生方法 |
Family Cites Families (26)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
JP3372037B2 (ja) | 1995-08-21 | 2003-01-27 | 松下電器産業株式会社 | 記録媒体の記録方法、再生装置、再生方法 |
CN1205825A (zh) * | 1995-12-29 | 1999-01-20 | 汤姆森消费电子有限公司 | 具有用于显示可变尺寸辅助图像的设备的电视系统 |
US6088064A (en) * | 1996-12-19 | 2000-07-11 | Thomson Licensing S.A. | Method and apparatus for positioning auxiliary information proximate an auxiliary image in a multi-image display |
US5963215A (en) * | 1997-03-26 | 1999-10-05 | Intel Corporation | Three-dimensional browsing of multiple video sources |
MY117040A (en) * | 1997-09-17 | 2004-04-30 | Matsushita Electric Ind Co Ltd | Optical disc, recording apparatus, and computer-readable recording medium. |
CN1147145C (zh) * | 1997-12-01 | 2004-04-21 | 星视电视广播公司 | 在弹出区域中具有广告消息的电子节目表系统 |
KR100277994B1 (ko) * | 1998-12-31 | 2001-01-15 | 구자홍 | 경계 영역 표시 장치 |
JP4503858B2 (ja) | 1999-04-14 | 2010-07-14 | ライト チャンス インコーポレイテッド | 遷移ストリームの生成/処理方法 |
JP4328989B2 (ja) * | 1999-11-24 | 2009-09-09 | ソニー株式会社 | 再生装置、再生方法、並びに記録媒体 |
JP2001251266A (ja) | 2000-03-07 | 2001-09-14 | Sony Corp | データ送出装置及びトランスポートストリーム生成装置並びにそのデータストリーム生成方法 |
CN1239021C (zh) * | 2000-04-21 | 2006-01-25 | 索尼公司 | 信息处理设备及方法、程序和记录介质 |
KR100676328B1 (ko) * | 2000-06-28 | 2007-01-30 | 삼성전자주식회사 | 피아이피 기능을 갖는 디지탈 영상표시기기의 피봇장치 |
US6697123B2 (en) * | 2001-03-30 | 2004-02-24 | Koninklijke Philips Electronics N.V. | Adaptive picture-in-picture |
CN100414537C (zh) * | 2002-02-07 | 2008-08-27 | 三星电子株式会社 | 包含显示模式信息的信息存储介质、再现装置及其方法 |
JP3741668B2 (ja) | 2002-03-26 | 2006-02-01 | 池上通信機株式会社 | データ多重化装置 |
JP4441884B2 (ja) * | 2002-11-11 | 2010-03-31 | ソニー株式会社 | 情報処理装置および方法、プログラム格納媒体、記録媒体、並びにプログラム |
KR100873437B1 (ko) * | 2002-11-28 | 2008-12-11 | 삼성전자주식회사 | Pip화면을 이용한 듀얼모드 신호처리장치 |
US20040189828A1 (en) * | 2003-03-25 | 2004-09-30 | Dewees Bradley A. | Method and apparatus for enhancing a paintball video |
US8065614B2 (en) | 2003-04-09 | 2011-11-22 | Ati Technologies, Inc. | System for displaying video and method thereof |
JP4228767B2 (ja) * | 2003-04-25 | 2009-02-25 | ソニー株式会社 | 再生装置、再生方法、再生プログラムおよび記録媒体 |
KR101130368B1 (ko) | 2003-06-02 | 2012-03-27 | 디즈니엔터프라이지즈,인크. | 소비자용 비디오 플레이어를 위한 프로그램된 윈도우 제어 시스템 및 방법 |
WO2005024824A1 (en) * | 2003-09-04 | 2005-03-17 | Koninklijke Philips Electronics N.V. | Record carrier carrying a video signal and at least one additional information signal |
KR101003957B1 (ko) * | 2003-11-20 | 2010-12-30 | 엘지전자 주식회사 | 광디스크 장치에서의 유저 마크 기록방법 |
US20050179817A1 (en) * | 2004-01-14 | 2005-08-18 | Matsushita Electric Industrial Co., Ltd. | Video signal display unit |
US7609947B2 (en) * | 2004-09-10 | 2009-10-27 | Panasonic Corporation | Method and apparatus for coordinating playback from multiple video sources |
US20060200842A1 (en) * | 2005-03-01 | 2006-09-07 | Microsoft Corporation | Picture-in-picture (PIP) alerts |
- 2005
- 2005-10-26 JP JP2005311477A patent/JP4081772B2/ja active Active
- 2006
- 2006-08-10 TW TW099136799A patent/TWI479892B/zh not_active IP Right Cessation
- 2006-08-10 TW TW095129432A patent/TW200715853A/zh unknown
- 2006-08-17 WO PCT/JP2006/316177 patent/WO2007023728A1/ja active Application Filing
- 2006-08-17 CN CN201210033302XA patent/CN102855902A/zh active Pending
- 2006-08-17 KR KR1020077014458A patent/KR101320793B1/ko not_active IP Right Cessation
- 2006-08-17 US US11/791,585 patent/US8340496B2/en not_active Expired - Fee Related
- 2006-08-17 EP EP06782790.7A patent/EP1919210B1/en not_active Not-in-force
- 2006-08-17 CN CN201210033301.5A patent/CN102855901B/zh not_active Expired - Fee Related
- 2006-08-17 CN CN2006800015621A patent/CN101091385B/zh active Active
- 2006-08-17 ES ES06782790.7T patent/ES2517418T3/es active Active
- 2006-08-17 KR KR1020127033439A patent/KR101244148B1/ko not_active IP Right Cessation
- 2006-08-17 CN CN2012100334910A patent/CN102572454B/zh active Active
Patent Citations (6)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
JP2000022928A (ja) * | 1998-06-30 | 2000-01-21 | Omron Corp | 合成画像作成装置および記録媒体 |
JP2003244613A (ja) * | 2002-02-15 | 2003-08-29 | Toshiba Corp | 映像データ記録再生装置及び映像データ記録再生方法 |
WO2004034398A1 (en) * | 2002-10-11 | 2004-04-22 | Thomson Licensing S.A. | Method and apparatus for synchronizing data streams containing audio, video and/or other data |
JP2004362719A (ja) * | 2003-06-06 | 2004-12-24 | Sharp Corp | データ記録方法、データ記録装置、およびデータ記録媒体 |
JP2005020242A (ja) * | 2003-06-25 | 2005-01-20 | Toshiba Corp | 再生装置 |
WO2005074270A1 (ja) * | 2004-01-30 | 2005-08-11 | Matsushita Electric Industrial Co., Ltd. | 記録媒体、再生装置、プログラム、再生方法 |
Non-Patent Citations (1)
Title |
---|
See also references of EP1919210A4 * |
Also Published As
Publication number | Publication date |
---|---|
CN102572454A (zh) | 2012-07-11 |
CN101091385A (zh) | 2007-12-19 |
ES2517418T3 (es) | 2014-11-03 |
EP1919210A1 (en) | 2008-05-07 |
KR101320793B1 (ko) | 2013-10-23 |
EP1919210A4 (en) | 2011-09-07 |
TWI346506B (ja) | 2011-08-01 |
CN102855901B (zh) | 2014-10-29 |
JP2007104615A (ja) | 2007-04-19 |
CN102855902A (zh) | 2013-01-02 |
CN102855901A (zh) | 2013-01-02 |
CN102572454B (zh) | 2013-09-11 |
EP1919210B1 (en) | 2014-10-01 |
US8340496B2 (en) | 2012-12-25 |
US20080267588A1 (en) | 2008-10-30 |
CN101091385B (zh) | 2012-10-10 |
KR101244148B1 (ko) | 2013-03-15 |
KR20130007672A (ko) | 2013-01-18 |
KR20080040617A (ko) | 2008-05-08 |
JP4081772B2 (ja) | 2008-04-30 |
TW200715853A (en) | 2007-04-16 |
TWI479892B (zh) | 2015-04-01 |
TW201108739A (en) | 2011-03-01 |
Similar Documents
Publication | Publication Date | Title |
---|---|---|
JP4081772B2 (ja) | 再生装置および再生方法、プログラム、並びにプログラム格納媒体 | |
JP4770601B2 (ja) | 情報処理装置および情報処理方法、プログラム、並びに、プログラム格納媒体 | |
JP4923751B2 (ja) | 再生装置、並びに記録媒体及びその製造方法 | |
US9076495B2 (en) | Reproducing apparatus, reproducing method, computer program, program storage medium, data structure, recording medium, recording device, and manufacturing method of recording medium | |
JP4822081B2 (ja) | 再生装置、再生方法、および記録媒体 | |
JP4849343B2 (ja) | データ生成方法、記録装置および方法、並びに、プログラム | |
JP4720676B2 (ja) | 情報処理装置および情報処理方法、データ構造、記録媒体の製造方法、プログラム、並びに、プログラム格納媒体 | |
JP4900708B2 (ja) | 再生装置および再生方法、プログラム、並びにプログラム格納媒体 | |
JP5201428B2 (ja) | データ生成方法、記録装置および方法、並びに、プログラム | |
JP2008193604A (ja) | 再生装置および方法、並びにプログラム | |
JP4821456B2 (ja) | 情報処理装置および情報処理方法、プログラム、データ構造、並びに記録媒体 | |
JP5234144B2 (ja) | 再生装置、並びに記録媒体及びその製造方法 | |
JP2008052836A (ja) | 情報処理装置および情報処理方法、プログラム、並びに、プログラム格納媒体 | |
JP2012075187A (ja) | 再生装置、再生方法、および記録方法 |
Legal Events
Date | Code | Title | Description |
---|---|---|---|
WWE | Wipo information: entry into national phase |
Ref document number: 4215/DELNP/2007 Country of ref document: IN |
|
121 | Ep: the epo has been informed by wipo that ep was designated in this application | ||
WWE | Wipo information: entry into national phase |
Ref document number: 200680001562.1 Country of ref document: CN Ref document number: 2006782790 Country of ref document: EP Ref document number: 1020077014458 Country of ref document: KR |
|
NENP | Non-entry into the national phase |
Ref country code: DE |
|
WWE | Wipo information: entry into national phase |
Ref document number: 11791585 Country of ref document: US |
|
WWE | Wipo information: entry into national phase |
Ref document number: 1020127033439 Country of ref document: KR |