WO2004095837A1 - Playback device and program - Google Patents
Playback device and program
- Publication number
- WO2004095837A1 (PCT/JP2004/005778)
- Authority
- WO
- WIPO (PCT)
- Prior art keywords
- resolution
- display
- graphics
- data
- ratio
- Prior art date
Classifications
-
- G—PHYSICS
- G11—INFORMATION STORAGE
- G11B—INFORMATION STORAGE BASED ON RELATIVE MOVEMENT BETWEEN RECORD CARRIER AND TRANSDUCER
- G11B27/00—Editing; Indexing; Addressing; Timing or synchronising; Monitoring; Measuring tape travel
- G11B27/10—Indexing; Addressing; Timing or synchronising; Measuring tape travel
- G11B27/11—Indexing; Addressing; Timing or synchronising; Measuring tape travel by using information not detectable on the record carrier
-
- H—ELECTRICITY
- H04—ELECTRIC COMMUNICATION TECHNIQUE
- H04N—PICTORIAL COMMUNICATION, e.g. TELEVISION
- H04N21/00—Selective content distribution, e.g. interactive television or video on demand [VOD]
- H04N21/40—Client devices specifically adapted for the reception of or interaction with content, e.g. set-top-box [STB]; Operations thereof
- H04N21/41—Structure of client; Structure of client peripherals
- H04N21/418—External card to be used in combination with the client device, e.g. for conditional access
- H04N21/4184—External card to be used in combination with the client device, e.g. for conditional access providing storage capabilities, e.g. memory stick
-
- H—ELECTRICITY
- H04—ELECTRIC COMMUNICATION TECHNIQUE
- H04N—PICTORIAL COMMUNICATION, e.g. TELEVISION
- H04N21/00—Selective content distribution, e.g. interactive television or video on demand [VOD]
- H04N21/40—Client devices specifically adapted for the reception of or interaction with content, e.g. set-top-box [STB]; Operations thereof
- H04N21/41—Structure of client; Structure of client peripherals
- H04N21/426—Internal components of the client ; Characteristics thereof
- H04N21/42646—Internal components of the client ; Characteristics thereof for reading from or writing on a non-volatile solid state storage medium, e.g. DVD, CD-ROM
-
- H—ELECTRICITY
- H04—ELECTRIC COMMUNICATION TECHNIQUE
- H04N—PICTORIAL COMMUNICATION, e.g. TELEVISION
- H04N21/00—Selective content distribution, e.g. interactive television or video on demand [VOD]
- H04N21/40—Client devices specifically adapted for the reception of or interaction with content, e.g. set-top-box [STB]; Operations thereof
- H04N21/43—Processing of content or additional data, e.g. demultiplexing additional data from a digital video stream; Elementary client operations, e.g. monitoring of home network or synchronising decoder's clock; Client middleware
- H04N21/432—Content retrieval operation from a local storage medium, e.g. hard-disk
- H04N21/4325—Content retrieval operation from a local storage medium, e.g. hard-disk by playing back content from the storage medium
-
- H—ELECTRICITY
- H04—ELECTRIC COMMUNICATION TECHNIQUE
- H04N—PICTORIAL COMMUNICATION, e.g. TELEVISION
- H04N21/00—Selective content distribution, e.g. interactive television or video on demand [VOD]
- H04N21/40—Client devices specifically adapted for the reception of or interaction with content, e.g. set-top-box [STB]; Operations thereof
- H04N21/43—Processing of content or additional data, e.g. demultiplexing additional data from a digital video stream; Elementary client operations, e.g. monitoring of home network or synchronising decoder's clock; Client middleware
- H04N21/435—Processing of additional data, e.g. decrypting of additional data, reconstructing software from modules extracted from the transport stream
- H04N21/4355—Processing of additional data, e.g. decrypting of additional data, reconstructing software from modules extracted from the transport stream involving reformatting operations of additional data, e.g. HTML pages on a television screen
-
- H—ELECTRICITY
- H04—ELECTRIC COMMUNICATION TECHNIQUE
- H04N—PICTORIAL COMMUNICATION, e.g. TELEVISION
- H04N21/00—Selective content distribution, e.g. interactive television or video on demand [VOD]
- H04N21/40—Client devices specifically adapted for the reception of or interaction with content, e.g. set-top-box [STB]; Operations thereof
- H04N21/43—Processing of content or additional data, e.g. demultiplexing additional data from a digital video stream; Elementary client operations, e.g. monitoring of home network or synchronising decoder's clock; Client middleware
- H04N21/44—Processing of video elementary streams, e.g. splicing a video clip retrieved from local storage with an incoming video stream or rendering scenes according to encoded video stream scene graphs
- H04N21/4402—Processing of video elementary streams, e.g. splicing a video clip retrieved from local storage with an incoming video stream or rendering scenes according to encoded video stream scene graphs involving reformatting operations of video signals for household redistribution, storage or real-time display
- H04N21/440218—Processing of video elementary streams, e.g. splicing a video clip retrieved from local storage with an incoming video stream or rendering scenes according to encoded video stream scene graphs involving reformatting operations of video signals for household redistribution, storage or real-time display by transcoding between formats or standards, e.g. from MPEG-2 to MPEG-4
-
- H—ELECTRICITY
- H04—ELECTRIC COMMUNICATION TECHNIQUE
- H04N—PICTORIAL COMMUNICATION, e.g. TELEVISION
- H04N21/00—Selective content distribution, e.g. interactive television or video on demand [VOD]
- H04N21/40—Client devices specifically adapted for the reception of or interaction with content, e.g. set-top-box [STB]; Operations thereof
- H04N21/45—Management operations performed by the client for facilitating the reception of or the interaction with the content or administrating data related to the end-user or to the client device itself, e.g. learning user preferences for recommending movies, resolving scheduling conflicts
- H04N21/462—Content or additional data management, e.g. creating a master electronic program guide from data received from the Internet and a Head-end, controlling the complexity of a video stream by scaling the resolution or bit-rate based on the client capabilities
- H04N21/4621—Controlling the complexity of the content stream or additional data, e.g. lowering the resolution or bit-rate of the video stream for a mobile client with a small screen
-
- H—ELECTRICITY
- H04—ELECTRIC COMMUNICATION TECHNIQUE
- H04N—PICTORIAL COMMUNICATION, e.g. TELEVISION
- H04N21/00—Selective content distribution, e.g. interactive television or video on demand [VOD]
- H04N21/40—Client devices specifically adapted for the reception of or interaction with content, e.g. set-top-box [STB]; Operations thereof
- H04N21/47—End-user applications
- H04N21/485—End-user interface for client configuration
- H04N21/4858—End-user interface for client configuration for modifying screen layout parameters, e.g. fonts, size of the windows
-
- H—ELECTRICITY
- H04—ELECTRIC COMMUNICATION TECHNIQUE
- H04N—PICTORIAL COMMUNICATION, e.g. TELEVISION
- H04N5/00—Details of television systems
- H04N5/44—Receiver circuitry for the reception of television signals according to analogue transmission standards
- H04N5/445—Receiver circuitry for the reception of television signals according to analogue transmission standards for displaying additional information
-
- H—ELECTRICITY
- H04—ELECTRIC COMMUNICATION TECHNIQUE
- H04N—PICTORIAL COMMUNICATION, e.g. TELEVISION
- H04N9/00—Details of colour television systems
- H04N9/79—Processing of colour television signals in connection with recording
- H04N9/80—Transformation of the television signal for recording, e.g. modulation, frequency changing; Inverse transformation for playback
- H04N9/804—Transformation of the television signal for recording, e.g. modulation, frequency changing; Inverse transformation for playback involving pulse code modulation of the colour picture signal components
- H04N9/8042—Transformation of the television signal for recording, e.g. modulation, frequency changing; Inverse transformation for playback involving pulse code modulation of the colour picture signal components involving data reduction
-
- G—PHYSICS
- G11—INFORMATION STORAGE
- G11B—INFORMATION STORAGE BASED ON RELATIVE MOVEMENT BETWEEN RECORD CARRIER AND TRANSDUCER
- G11B2220/00—Record carriers by type
- G11B2220/20—Disc-shaped record carriers
- G11B2220/25—Disc-shaped record carriers characterised in that the disc is based on a specific recording technology
- G11B2220/2537—Optical discs
- G11B2220/2541—Blu-ray discs; Blue laser DVR discs
-
- H—ELECTRICITY
- H04—ELECTRIC COMMUNICATION TECHNIQUE
- H04N—PICTORIAL COMMUNICATION, e.g. TELEVISION
- H04N5/00—Details of television systems
- H04N5/44—Receiver circuitry for the reception of television signals according to analogue transmission standards
- H04N5/46—Receiver circuitry for the reception of television signals according to analogue transmission standards for receiving on more than one standard at will
-
- H—ELECTRICITY
- H04—ELECTRIC COMMUNICATION TECHNIQUE
- H04N—PICTORIAL COMMUNICATION, e.g. TELEVISION
- H04N5/00—Details of television systems
- H04N5/76—Television signal recording
- H04N5/765—Interface circuits between an apparatus for recording and another apparatus
- H04N5/775—Interface circuits between an apparatus for recording and another apparatus between a recording apparatus and a television receiver
-
- H—ELECTRICITY
- H04—ELECTRIC COMMUNICATION TECHNIQUE
- H04N—PICTORIAL COMMUNICATION, e.g. TELEVISION
- H04N5/00—Details of television systems
- H04N5/76—Television signal recording
- H04N5/84—Television signal recording using optical recording
- H04N5/85—Television signal recording using optical recording on discs or drums
-
- H—ELECTRICITY
- H04—ELECTRIC COMMUNICATION TECHNIQUE
- H04N—PICTORIAL COMMUNICATION, e.g. TELEVISION
- H04N5/00—Details of television systems
- H04N5/76—Television signal recording
- H04N5/907—Television signal recording using static stores, e.g. storage tubes or semiconductor memories
-
- H—ELECTRICITY
- H04—ELECTRIC COMMUNICATION TECHNIQUE
- H04N—PICTORIAL COMMUNICATION, e.g. TELEVISION
- H04N7/00—Television systems
- H04N7/01—Conversion of standards, e.g. involving analogue television standards or digital television standards processed at pixel level
- H04N7/0117—Conversion of standards, e.g. involving analogue television standards or digital television standards processed at pixel level involving conversion of the spatial resolution of the incoming video signal
- H04N7/0122—Conversion of standards, e.g. involving analogue television standards or digital television standards processed at pixel level involving conversion of the spatial resolution of the incoming video signal the input and the output signals having different aspect ratios
Definitions
- The present invention relates to a playback device that plays back content composed of moving image data and auxiliary data, and to an improvement in displaying the auxiliary data in synchronization with the moving image data.
- There are two types of content supplied on large-capacity discs such as BD-ROMs, depending on the supported resolution.
- One is high-quality content with a resolution of 1920×1080, and the other is standard-quality content with a resolution of 720×480.
- High-resolution content is suitable for playback on a display device called an HDTV (High Definition Television).
- Content with standard resolution is suitable for playback on a display device called an SDTV (Standard Definition Television).
- If the resolution of the content is 1920×1080 and the display device is of the HDTV type, the moving images and subtitles that make up the content are displayed at their original resolution. By playing back at the original resolution, the user can enjoy movie content with high image quality comparable to a movie theater.
- Where the auxiliary data is subtitles, one has to produce both a digital stream consisting of an HDTV-compatible video stream and HDTV-compatible subtitle graphics, and a digital stream consisting of an SDTV-compatible video stream and SDTV-compatible subtitle graphics, and record them on a recording medium.
- Moreover, subtitles need to be produced for each country or region where the movie content is to be distributed. Creating graphics for each of SDTV and HDTV and multiplexing them with the video stream requires many man-hours.
- An object of the present invention is to provide a playback device capable of displaying subtitles equivalent to those produced, even if production of either the SDTV subtitle graphics or the HDTV subtitle graphics is omitted.
- To achieve this object, a playback device according to the present invention includes: a detection unit that detects the resolution for expressing frame images in moving image data; a first display unit that, when the ratio of the resolution of the display device connected to the playback device to the detected resolution is 1, displays the auxiliary data multiplexed with the moving image data and recorded on the recording medium on the display device in synchronization with the moving image data;
- and a second display unit that, when the resolution ratio is not 1, displays the auxiliary data supplied from the server device on the display device in synchronization with the moving image data.
- the second display means displays the subtitle data supplied from the server device.
- Since subtitles are displayed according to the ratio of the resolution of the display device to the resolution of the content, the optimal display can be performed for each combination of content and display device, even when that combination changes dynamically.
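The selection between the multiplexed subtitles and the server-supplied subtitles described above can be sketched as follows. This is a minimal illustration; the function and return values are hypothetical, not from the patent text.

```python
# Illustrative sketch: choose the subtitle source from the ratio of the
# display device's resolution to the content's resolution, as described
# in the claims. The names here are invented for illustration.

def select_subtitle_source(content_res, display_res):
    """content_res / display_res are (width, height) tuples.

    Returns "multiplexed" when the ratio is 1 (first display unit) and
    "server" otherwise (second display unit).
    """
    ratio_w = display_res[0] / content_res[0]
    ratio_h = display_res[1] / content_res[1]
    if ratio_w == 1 and ratio_h == 1:
        # Ratio 1: use the bitmap subtitles multiplexed into the AVClip.
        return "multiplexed"
    # Ratio != 1: fall back to subtitle content supplied by the server.
    return "server"

print(select_subtitle_source((1920, 1080), (1920, 1080)))  # multiplexed
print(select_subtitle_source((1920, 1080), (720, 480)))    # server
```

In this sketch the decision is recomputed per content/display pairing, which is what allows the "optimal display for a dynamically changing combination" described above.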
- Although the above-mentioned playback device receives the supply of the auxiliary data from the server device, this is an optional feature of the present invention, and the playback device according to the present invention does not require this technical matter as essential. This is because, as long as the auxiliary data can be supplied from a recording source different from the recording medium on which the moving image data is recorded, the above-mentioned object can be achieved without receiving the supply from the server device.
BRIEF DESCRIPTION OF THE FIGURES
- FIG. 1 is a diagram showing a mode of use of the playback device according to the present invention.
- FIG. 2 is a diagram showing a configuration of the BD-ROM.
- FIG. 3 is a diagram schematically showing how the AV Clip is configured.
- FIG. 4 (a) is a diagram showing a configuration of a presentation graphics stream.
- FIG. 4(b) is a diagram showing the internal configuration of the PES packet.
- FIG. 5 is a diagram showing a logical structure composed of various types of functional segments.
- FIG. 6 is a diagram showing a relationship between the subtitle display position and the Epoch.
- FIG. 7(a) is a diagram showing how a graphics object is defined by an ODS.
- FIG. 7(b) shows the data structure of the PDS.
- FIG. 8(a) is a diagram showing the data structure of the WDS.
- FIG. 8(b) shows the data structure of the PCS.
- FIG. 9 is a diagram showing a description example for implementing subtitle display.
- FIG. 10 is a diagram illustrating a description example of a PCS in DS1.
- FIG. 11 is a diagram illustrating a description example of a PCS in DS2.
- FIG. 12 is a diagram illustrating a description example of a PCS in DS3.
- FIG. 13 is a diagram showing a comparison between the contents of movie content and the contents of subtitle content.
- FIG. 14 is a diagram showing an example of a text type subtitle content.
- FIG. 15 is a diagram showing the internal configuration of the playback device according to the present invention.
- FIG. 16 is a diagram showing an internal configuration of the graphics decoder 9.
- FIG. 17 is a flowchart showing the processing procedure of the graphics controller 37.
- FIG. 18 is a flowchart showing the processing procedure of the movie content playback processing.
- FIG. 19 is a flowchart showing the processing procedure of the text subtitle display processing.
- FIGS. 20(a) to 20(c) are diagrams showing the enlargement process based on the resolution ratio.
- FIGS. 21 (a) to 21 (c) are explanatory diagrams of a conversion process of an HTML document by the control unit 29 according to the second embodiment.
- FIGS. 22 (a) to 22 (c) are explanatory diagrams showing a processing procedure of line spacing adjustment.
- FIG. 1 is a diagram showing a mode of use of a playback device according to the present invention.
- The playback device 200 according to the present invention, together with the television 300 and the remote controller 400, forms a home theater system.
- The BD-ROM 100 serves to supply movie content to the home theater system.
- Movie content is composed of AVClip, which is a digital stream, and Clip information, which is its management information.
- AVClip is entity data that constitutes video, audio, and subtitles of movie content.
- Subtitles in movie content are bitmap-type subtitles and consist of an elementary stream called a graphics stream.
- The Clip information includes resolution information indicating the resolution for expressing a frame image in the moving image data.
- The resolution indicated in the resolution information is typically a numerical value such as 1920×1080 (1080i), 720×480 (480i, 480p), 1440×1080, 1280×720, or 540×480.
- The suffix "i" means the interlace method, and the suffix "p" means the progressive method.
- The playback device 200 loads the BD-ROM 100 and plays the movie content recorded on the BD-ROM 100.
- The display device 300 is connected to the playback device via HDMI (High-Definition Multimedia Interface), and the playback device can acquire resolution information from the display device 300 via the HDMI. Since this resolution information indicates the resolution of the display device 300, the playback device can know whether the display resolution of the display device 300 is high resolution or standard resolution.
- Remote controller 400 is a portable device that receives an operation from a user.
- the server device 500 holds subtitle content in various languages, and supplies the subtitle content to the playback device in response to a request from the playback device.
- FIG. 2 is a diagram showing a configuration of the BD-ROM.
- the fourth row of the figure shows the BD-ROM, and the third row shows the tracks on the BD-ROM.
- The track in this figure is drawn by extending the spiral track from the inner circumference to the outer circumference of the BD-ROM in the horizontal direction.
- This track includes a lead-in area, a volume area, and a lead-out area.
- The volume area in this figure has a layer model of a physical layer, a file system layer, and an application layer. Expressing the application layer format (application format) of the BD-ROM using a directory structure gives the first row of the figure.
- the BD-ROM has a BDMV directory under the ROOT directory, and files such as XXX.M2TS and XXX.CLPI under the BDMV directory.
- The file XXX.M2TS corresponds to the AVClip,
- and the file XXX.CLPI corresponds to the Clip information.
- the recording medium according to the present invention is produced by creating an application format as shown in FIG.
- Of the components of the movie content (the AVClip and the Clip information), the AVClip will be described first.
- FIG. 3 is a diagram schematically showing how the AVClip is configured.
- As shown in the figure, the AVClip (4th stage) is formed as follows. A video stream consisting of multiple video frames (pictures pj1, pj2, pj3) and an audio stream consisting of multiple audio frames (1st stage) are converted to a PES packet sequence (2nd stage) and further converted to TS packets (3rd stage). Likewise, a presentation graphics stream for subtitles and an interactive graphics stream for interactive purposes (7th stage) are converted to a PES packet sequence (6th stage) and further converted to TS packets (5th stage). These are then multiplexed to form the AVClip.
- FIG. 4A is a diagram showing a configuration of the presentation graphics stream.
- the first level shows a sequence of TS packets constituting the AVClip.
- the second row shows the PES packet sequence that makes up the graphics stream.
- The PES packet sequence in the second stage is constructed by extracting the payloads from the TS packets having a predetermined PID among the TS packets in the first stage and concatenating them.
- the third row shows the configuration of the graphics stream.
- The graphics stream consists of functional segments called PCS (Presentation Composition Segment), WDS (Window Definition Segment), PDS (Palette Definition Segment), ODS (Object Definition Segment), and END (END of Display Set Segment). Of these functional segments, the PCS is called a screen composition segment, and the WDS, PDS, ODS, and END are called definition segments. The correspondence between PES packets and functional segments is one-to-one or one-to-many.
- FIG. 4(b) is a diagram showing a PES packet obtained by converting functional segments.
- The PES packet consists of a packet header and a payload, and this payload corresponds to the functional segment entity.
- In the packet header, there are a DTS and a PTS corresponding to this functional segment.
- the DTS and PTS existing in the header of the PES packet storing the functional segment are treated as the DTS and PTS of the functional segment.
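A minimal sketch of the rule above, under simplified assumptions (the class is illustrative, not the actual PES packet syntax): a functional segment inherits the DTS and PTS of the PES packet that stores it.

```python
# Illustrative model: a PES packet carries one functional segment in its
# payload, and the segment's DTS/PTS are taken from the packet header.
from dataclasses import dataclass


@dataclass
class PESPacket:
    dts: int        # decode time stamp from the packet header
    pts: int        # presentation time stamp from the packet header
    payload: bytes  # the functional segment entity


def segment_timestamps(packet):
    # The stored functional segment is treated as having the DTS and PTS
    # of the PES packet that contains it.
    return packet.dts, packet.pts


pkt = PESPacket(dts=90000, pts=93600, payload=b"segment bytes")
print(segment_timestamps(pkt))  # (90000, 93600)
```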
- FIG. 5 is a diagram showing a logical structure composed of various types of functional segments.
- the function segment is shown in the third row
- the Display Set is shown in the second row
- the Epoch is shown in the first row.
- The second-stage Display Set (abbreviated as DS) refers to a set of multiple functional segments, among those making up the graphics stream, that compose the graphics for one screen.
- the dashed line in the figure indicates which DS the functional segment in the third row belongs to.
- It can be understood that the series of functional segments PCS, WDS, PDS, ODS, END constitutes one DS.
- the playback device can compose graphics for one screen by reading the multiple function segments that make up this DS from the BD-ROM.
- The first-stage Epoch refers to one period that has continuity of memory management on the playback time axis of the AVClip, and to the data group assigned to this period.
- the memory assumed here is a graphics plane for storing graphics for one screen, and an object buffer for storing decompressed graphics data.
- The continuity of memory management means that no flushing of the graphics plane and the object buffer occurs during the period of this Epoch, and that erasing and redrawing of graphics are performed only within a certain rectangular area of the graphics plane. (Flushing here means clearing all the stored contents of the plane and the buffer.)
- the vertical and horizontal size and position of this rectangular area are fixed throughout the Epoch period.
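The memory-management constraint above can be sketched as follows. This is an illustrative model, not the patent's implementation: the graphics plane is reduced to a dict of painted pixels, and drawing is permitted only inside the Epoch's fixed window rectangle.

```python
# Sketch of the Epoch rule: during one Epoch the plane is never flushed,
# and erase/redraw is allowed only inside the fixed window rectangle.

class GraphicsPlane:
    def __init__(self, window):
        # window = (x, y, width, height); fixed for the whole Epoch
        self.window = window
        self.pixels = {}

    def _in_window(self, x, y):
        wx, wy, ww, wh = self.window
        return wx <= x < wx + ww and wy <= y < wy + wh

    def draw(self, x, y, color):
        if not self._in_window(x, y):
            raise ValueError("drawing outside the Epoch's window is not allowed")
        self.pixels[(x, y)] = color


plane = GraphicsPlane(window=(0, 960, 1920, 120))  # bottom-margin window
plane.draw(100, 1000, 0x01)      # inside the subtitle window: allowed
try:
    plane.draw(100, 100, 0x01)   # outside the window: rejected
except ValueError as e:
    print(e)
```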
- FIG. 6 is a diagram showing a relationship between a subtitle display position and Epoch.
- In Epoch1, the bottom margin of the screen becomes the subtitle drawing area (window1).
- In Epoch2, the upper margin of the screen becomes the subtitle drawing area (window2).
- In each Epoch, the continuity of memory management in the buffer and the plane is guaranteed, so the above subtitle display in the margins is performed seamlessly.
- A series of DSs of the types Epoch Start, Acquisition Point, and Normal Case constitutes the first-stage Epoch.
- "Epoch Start", "Acquisition Point", and "Normal Case" are types of DS.
- The order of Acquisition Point and Normal Case in this figure is only an example; either may come first.
- "Epoch Start" is a DS that has a display effect of "new display" and indicates the start of a new Epoch. Therefore, an Epoch Start includes all the functional segments required for the next screen composition.
- The Epoch Start is placed at a location known to be a cue point, such as a chapter in a movie.
- "Acquisition Point" is a Display Set that has a display effect called "display refresh" and has exactly the same contents as the preceding Epoch Start.
- Although the Acquisition Point DS is not at the start of an Epoch, it includes all the functional segments required for the next screen composition, so if cueing is performed from the Acquisition Point DS, graphics display can be realized reliably.
- "Normal Case" is a DS that has a display effect of "display update" and includes only the differences from the previous screen composition.
- For example, if the subtitle of a certain DSv has the same content as the preceding DSu but a different screen composition, a DSv containing only a PCS is provided and made a Normal Case DS.
- Since a Normal Case DS contains only differences, the screen cannot be composed by the Normal Case alone.
- "Object Definition Segment" (ODS) is a functional segment that defines a graphics object, which is bitmap-type graphics. This graphics object is described below. The AVClip recorded on the BD-ROM, whose selling point is high-definition image quality, also sets the resolution of graphics objects to the high-definition size of 1920×1080 pixels. With a resolution of 1920×1080, the BD-ROM can vividly reproduce subtitles in the style of theatrical screenings, that is, rich handwritten subtitles. For pixel colors, the bit length of the index value per pixel (an index into entries of red color difference component (Cr value), blue color difference component (Cb value), luminance component (Y value), and transparency (T value)) is 8 bits. This makes it possible to select any 256 colors from the 16,777,216 full colors and set them as pixel colors. Subtitles by graphics objects are rendered by placing text on a transparent background.
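The indexed-color model described above can be illustrated with a small sketch. The palette entries here are invented values for illustration; in an actual stream the entries come from the PDS.

```python
# Sketch of indexed color: each graphics-object pixel is an 8-bit index
# into a palette of up to 256 entries, each holding (Y, Cr, Cb, T).
# The entry values below are made up for illustration.

palette = {
    0x00: (16, 128, 128, 0),     # fully transparent background
    0x01: (235, 128, 128, 255),  # opaque white for subtitle text
}


def resolve_pixel(index):
    """Map a pixel's 8-bit index to its (Y, Cr, Cb, T) palette entry."""
    return palette[index]


# An 8-bit index can address 2**8 = 256 entries, each chosen from the
# 16,777,216 (2**24) full colors.
print(2 ** 8, 2 ** 24)           # 256 16777216
print(resolve_pixel(0x01))       # (235, 128, 128, 255)
```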
- A graphics object is defined by an ODS with a data structure as shown in FIG. 7(a).
- The ODS consists of "segment_type", indicating that it is itself an ODS; "segment_length", indicating the data length of the ODS; "object_id", which uniquely identifies the graphics object corresponding to this ODS within the Epoch; "object_version_number", indicating the version of the ODS within the Epoch; "last_in_sequence_flag"; and "object_data_fragment", consecutive byte-length data carrying part or all of the graphics object. The above is the description of the ODS.
- Next, the PDS (Palette Definition Segment) will be described.
- FIG. 7(b) shows the data structure of the PDS. As shown in FIG. 7(b), the PDS consists of "segment_type", indicating that it is itself a PDS; "segment_length", indicating the data length of the PDS; "palette_id", which uniquely identifies the palette included in this PDS; "palette_version_number", indicating the version of the PDS within the Epoch; and "palette_entry", which is information about each entry. "palette_entry" indicates the red color difference component (Cr value), blue color difference component (Cb value), luminance component (Y value), and transparency (T value) of each entry.
- "Window Definition Segment" (WDS) is a functional segment for defining a rectangular area in the graphics plane. Regarding the Epoch, it has already been stated that continuity in memory management arises only when clearing and redrawing are performed within a certain rectangular area in the graphics plane. This rectangular area in the graphics plane is called a "window" and is defined in the WDS.
- FIG. 8A is a diagram showing the data structure of WDS.
- The WDS is expressed using "window_id", which uniquely identifies a window in the graphics plane; "window_horizontal_position", which indicates the horizontal position of the upper-left pixel in the graphics plane; "window_vertical_position", which indicates the vertical position of the upper-left pixel in the graphics plane; "window_width", which indicates the horizontal width of the window in the graphics plane; and "window_height", which indicates the vertical width in the graphics plane.
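As a sketch, the WDS fields listed above can be held in a plain record. The `contains` helper is an illustrative addition, not a field of the WDS.

```python
# Sketch of the WDS fields as a record; field names follow the text above.
from dataclasses import dataclass


@dataclass
class WDS:
    window_id: int
    window_horizontal_position: int  # x of the window's upper-left pixel
    window_vertical_position: int    # y of the window's upper-left pixel
    window_width: int
    window_height: int

    def contains(self, x, y):
        """True if (x, y) lies inside this window of the graphics plane."""
        return (self.window_horizontal_position <= x
                < self.window_horizontal_position + self.window_width
                and self.window_vertical_position <= y
                < self.window_vertical_position + self.window_height)


# A hypothetical subtitle window along the bottom margin of a 1920x1080 plane.
w = WDS(1, 0, 960, 1920, 120)
print(w.contains(100, 1000))  # True
print(w.contains(100, 100))   # False
```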
- PCS is a functional segment that composes interactive screens.
- The PCS has the data structure shown in FIG. 8(b). As shown in this figure, the PCS consists of "segment_type", "segment_length", "composition_number", "composition_state", "palette_update_flag", "palette_id", and "composition_object(1)-(n)".
- "composition_number" identifies the graphics update in the Display Set using a number from 0 to 15.
- "composition_state" indicates whether the Display Set starting from this PCS is a Normal Case, an Acquisition Point, or an Epoch Start.
- Composition_Object(1) to (n) are pieces of information indicating how to control each window in the Display Set to which this PCS belongs.
- The dashed line wd1 in Fig. 8(b) is a close-up of the internal structure of an arbitrary Composition_Object(i).
- Composition_Object(i) consists of “object_id”, “window_id”, “object_cropped_flag”, “object_horizontal_position”, “object_vertical_position”, and crop information (“object_cropping_horizontal_position”, “object_cropping_vertical_position”, “object_cropping_width”, “object_cropping_height”).
- Object_id indicates the identifier of the ODS existing in the window corresponding to Composition_Object(i).
- Window_id indicates the window to which the graphics object is assigned in this PCS. Up to two graphics objects can be assigned to one window.
- “Object_cropped_flag” is a flag for switching between displaying the graphics object cropped in the object buffer and hiding it. When set to “1”, the cropped graphics object is displayed; when set to “0”, the graphics object is hidden.
- “Object_horizontal_position” indicates the horizontal position of the upper-left pixel of the graphics object in the graphics plane.
- “Object_vertical_position” indicates the vertical position of the upper-left pixel of the graphics object in the graphics plane.
- “Object_cropping_horizontal_position” indicates the horizontal position of the upper-left pixel of the crop rectangle in the graphics plane.
- The crop rectangle is a frame for cutting out a part of the graphics object.
- “Object_cropping_vertical_position” indicates the vertical position of the upper-left pixel of the crop rectangle in the graphics plane.
- “Object_cropping_width” indicates the width of the crop rectangle in the graphics plane.
- “Object_cropping_height” indicates the height of the crop rectangle in the graphics plane.
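Taken together, the crop information and position fields describe a crop-then-place operation: cut the crop rectangle out of the object buffer, then write it at (object_horizontal_position, object_vertical_position) in the graphics plane. A minimal illustrative sketch, not a decoder implementation; parameter names mirror the fields above:

```python
def crop_and_place(obj, plane,
                   object_cropping_horizontal_position,
                   object_cropping_vertical_position,
                   object_cropping_width,
                   object_cropping_height,
                   object_horizontal_position,
                   object_vertical_position):
    """obj and plane are 2-D lists of pixel values, indexed [row][column].

    Copies the crop rectangle of obj into plane at the object position.
    """
    for dy in range(object_cropping_height):
        for dx in range(object_cropping_width):
            src_y = object_cropping_vertical_position + dy
            src_x = object_cropping_horizontal_position + dx
            plane[object_vertical_position + dy][object_horizontal_position + dx] \
                = obj[src_y][src_x]
    return plane
```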
- FIG. 9 shows a description example for realizing such a subtitle display.
- Epoch in this figure has DS1 (Epoch Start), DS2 (Normal Case) and DS3 (Normal Case).
- DS1 has a WDS that defines a window serving as the subtitle display frame, an ODS representing the dialogue “You really were a kid”, and a first PCS.
- DS2 has a second PCS, and DS3 has a third PCS.
- FIG. 10 is a diagram illustrating a description example of a PCS in DS1.
- The window_horizontal_position and window_vertical_position of the WDS indicate the upper-left coordinate LP1 of the window in the graphics plane, and window_width and window_height indicate the width and height of the window display frame.
- The object_cropping_horizontal_position and object_cropping_vertical_position of the crop information in Fig. 10 indicate the crop-area reference ST1 in a coordinate system whose origin is the upper-left coordinate of the graphics object in the object buffer.
- The area extending from that reference by object_cropping_width and object_cropping_height (the thick frame in the figure) is the crop area.
- The cropped graphics object is placed in the dashed-line range cp1, with object_horizontal_position and object_vertical_position as the reference point (upper left) in the coordinate system of the graphics plane. In this way, “really” is written into the window in the graphics plane.
- The subtitle “really” is thus combined with the moving image and displayed.
- FIG. 11 is a diagram illustrating a description example of a PCS in DS2.
- the description of WDS in this figure is the same as that in FIG.
- the description of the crop information is different from that in FIG.
- Here, object_cropping_horizontal_position and object_cropping_vertical_position indicate the upper-left coordinate of the subtitle “I was sorry” in the object buffer, and object_cropping_height and object_cropping_width indicate its height and width. In this way, “I was sorry” is written into the window in the graphics plane.
- The subtitle “I was sorry” is thus combined with the moving image and displayed.
- FIG. 12 is a diagram illustrating a description example of a PCS in DS3.
- the description of WDS in this figure is the same as that in FIG.
- the description of the crop information is different from that in FIG.
- Here, object_cropping_horizontal_position and object_cropping_vertical_position indicate the upper-left coordinate of the subtitle “you” in the object buffer, and object_cropping_width and object_cropping_height indicate its width and height. This writes “you” into the window in the graphics plane.
- The caption “you” is thus displayed in combination with the moving image.
- In this way, the effect of displaying a series of subtitles in step with the video can be realized.
- Display effects such as Fade In/Out, Wipe In/Out, and Scroll can also be realized by the description of the PCS. This is the advantage of the PCS.
- DTS and PTS are attached to the functional segments described above, namely to the ODS and the PCS.
- The DTS of an ODS indicates, with a time accuracy of 90 KHz, the time at which decoding of the ODS should start, and the PTS of the ODS indicates the time at which decoding ends.
- The DTS of the PCS indicates the time at which the PCS should be loaded into a buffer on the playback device.
- The PTS of the PCS indicates the timing of updating the screen using the PCS.
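As stated above, these time stamps run on a 90 KHz clock. A small conversion helper makes the arithmetic concrete (illustrative only; the function names are not from the patent):

```python
CLOCK_HZ = 90_000  # time accuracy of the DTS/PTS values, per the text

def seconds_to_ticks(seconds: float) -> int:
    """Convert a playback time in seconds to 90 KHz clock ticks."""
    return round(seconds * CLOCK_HZ)

def ticks_to_seconds(ticks: int) -> float:
    """Convert 90 KHz clock ticks back to seconds."""
    return ticks / CLOCK_HZ
```

For example, a subtitle scheduled one minute into playback carries a time stamp of 5,400,000 ticks.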
- the graphics stream forming the subtitle of the bitmap type has control information for displaying the subtitle and a time stamp indicating processing timing on the reproduction time axis. Therefore, the playback device can realize subtitle display by processing only the graphics stream.
- the above is the description of AVClip. Next, the Clip information will be described.
- Clip information (XXX.CLPI) is management information for each AVClip.
- Clip information (XXX.CLPI) consists of “attribute information” for video streams and audio streams, and “EP_map” which is a reference table at the time of cueing.
- The attribute information consists of attribute information about the video stream (Video attribute information), the number of pieces of attribute information (Number), and attribute information about each of the audio streams multiplexed on the AVClip (Audio attribute information #1 to #m).
- The video attribute information indicates the compression method used for the video stream (Coding), the resolution of the individual picture data composing the video stream (Resolution), the aspect ratio (Aspect), and the frame rate (Framerate).
- The attribute information of each audio stream (Audio attribute information #1 to #m) indicates the compression method of that audio stream (Coding), its channel number (Ch.), the language it supports (Lang), and its sampling frequency.
- The Resolution in the Clip information described above thus indicates the resolution of the video stream of the corresponding AVClip.
- Bitmap-type subtitle content consists of a graphics stream alone, unlike the AVClip shown above, which is made up of multiple elementary streams. Like the graphics stream on the BD-ROM, the graphics stream composing the subtitle content is made up of the functional segments PCS, WDS, PDS, ODS, and END, and PTS and DTS are attached to each functional segment. With these time stamps, the subtitle content is displayed in synchronization with the video stream on the BD-ROM side.
- FIG. 13 is a diagram showing a comparison between the contents of movie content and the contents of subtitle content.
- the upper part of Fig. 13 shows the video stream and graphics stream on the movie content side, and the lower part shows the graphics stream on the subtitle content side.
- The GOPs and Display Sets at the top of this figure are played back at 1 minute, 1 minute 40 seconds, and 2 minutes after the start of playback of the AV stream, respectively.
- the Display Set at the bottom of the figure is also played back 1 minute, 1 minute 40 seconds, and 2 minutes after the start of the playback of the AV stream, respectively.
- The reproduction timing is adjusted by setting desired values in the PTS and DTS attached to the PCS, WDS, PDS, and ODS belonging to each Display Set.
- In this way, each Display Set is synchronized with the video with high time accuracy.
- This bitmap-type subtitle content is prepared for various resolutions. By downloading, in response to a request from the playback device, the subtitle content suited to the combination of display device and content from the server device 500, subtitles can be displayed at an appropriate resolution.
- Text-type caption content consists of text data together with the information necessary for caption display. Text has a smaller amount of data than bitmap subtitles and can be transferred in a short time even over a line with a relatively low transfer rate. For this reason, when the line speed is limited, it is desirable to use text subtitles.
- Figure 14 shows an example of a text-type subtitle content.
- The text-type subtitle content associates with the text data a “chapter number” indicating the chapter in which the subtitle appears, a “start time code” indicating the subtitle display start time, an “end time code” indicating the subtitle display end time, the “display color” of the subtitle, the “size” of the subtitle, and the “display position” of the subtitle.
- the “size” is determined by assuming that it is displayed on either SDTV or HDTV.
- The format used to render this text subtitle is an outline font (also called a vector font), in which characters are represented by contours and endpoints.
- The contour can therefore be enlarged smoothly.
- The present playback device enlarges or reduces the outline font according to the resolution of the display device, and displays the subtitles based on the start time code and the end time code.
- the term “enlargement” in this specification means that the original data is represented by a larger number of pixels.
- reduction in this specification refers to expressing original data with fewer pixels.
- FIG. 15 is a diagram showing the internal configuration of the playback device according to the present invention.
- the playback device according to the present invention is industrially produced based on the interior shown in the drawing.
- the playback device according to the present invention mainly includes two parts, a system LSI and a drive device, and can be industrially manufactured by mounting these parts on a cabinet and a substrate of the device.
- a system LSI is an integrated circuit that integrates various processing units that perform the functions of a playback device.
- The playback device produced in this way is composed of BD-ROM drive 1, read buffer 2, demultiplexer 3, video decoder 4, video plane 5, Background Still plane 6, composition unit 7, switch 8, P-Graphics decoder 9, Presentation Graphics plane 10, composition unit 11, font generator 12, I-Graphics decoder 13, switch 14, Enhanced Interactive Graphics plane 15, composition unit 16, HDD 17, read buffer 18, demultiplexer 19, audio decoder 20, switch 21, switch 22, static scenario memory 23, communication unit 24, switch 25, CLUT unit 26, CLUT unit 27, switch 28, and control unit 29.
- The BD-ROM drive 1 performs loading and ejection of the BD-ROM and executes access to the BD-ROM.
- The read buffer 2 is a FIFO memory in which TS packets read from the BD-ROM are stored in a first-in first-out manner.
- The demultiplexer (De-MUX) 3 extracts TS packets from the read buffer 2 and converts them into PES packets. Of the PES packets obtained by the conversion, a desired one is output to one of the video decoder 4, the audio decoder 20, the P-Graphics decoder 9, and the I-Graphics decoder 13.
- The video decoder 4 decodes the PES packets output from the demultiplexer 3 to obtain uncompressed pictures, and writes them to the video plane 5.
- Video plane 5 is a plane for storing uncompressed pictures.
- A plane is a memory area for storing one screen of pixel data in the playback device. If a plurality of planes are provided in the playback device and their stored contents are added pixel by pixel before video output, the video can be output with those contents composited.
- the resolution in the video plane 5 is 1920 x 1080, and the picture data stored in the video plane 5 is composed of pixel data represented by a 16-bit YUV value.
- Background still plane 6 is a plane that stores a still image to be used as a background image.
- the resolution in this plane is 1920 x 1080
- the picture data stored in this Background still plane 6 is composed of 16-bit YUV pixel data.
- the combining unit 7 combines the uncompressed picture data stored in the video plane 5 with the still image stored in the Background still plane 6.
- the switch 8 is a switch for switching between outputting the uncompressed picture data in the video plane 5 as it is or outputting it in combination with the stored contents of the Background still plane 6.
- The P-Graphics decoder 9 decodes the graphics stream read from the BD-ROM or the HDD 17, and writes raster graphics to the Presentation Graphics plane 10. Subtitles appear on the screen as a result of decoding the graphics stream.
- the Presentation Graphics plane 10 is a memory having an area for one screen, and can store one screen of raster graphics.
- The resolution in this plane is 1920 x 1080, and each pixel of the raster graphics in the Presentation Graphics plane 10 is represented by an 8-bit index color.
- Through the CLUT (Color Lookup Table), the raster graphics stored in the Presentation Graphics plane 10 are converted into actual colors and provided for display.
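The CLUT step can be pictured as a plain table lookup: each 8-bit index in the plane is replaced by the (Y, Cr, Cb, T) value of the corresponding palette entry from the PDS. A minimal sketch; the palette values below are made up for illustration:

```python
def apply_clut(index_pixels, palette):
    """index_pixels: 2-D list of 8-bit index colors (rows of columns).

    palette: mapping from index to a (Y, Cr, Cb, T) tuple, as carried by
    the palette_entry fields of the PDS.
    """
    return [[palette[i] for i in row] for row in index_pixels]
```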
- The composition unit 11 composites the stored contents of the Presentation Graphics plane 10 with either (i) the uncompressed picture data or (ii) the picture data obtained by compositing the uncompressed picture data with the stored contents of the Background Still plane 6.
- the font generator 12 has an outline font, and performs character drawing by expanding the text code acquired by the control unit 29 using the outline font. This expansion is performed on the Enhanced Interactive Graphics plane 15.
- The I-Graphics decoder 13 decodes the interactive graphics stream read from the BD-ROM or the HDD 17, and writes raster graphics to the Enhanced Interactive Graphics plane 15. By decoding the interactive graphics stream, the buttons that make up the interactive screen appear on the screen.
- The switch 14 selectively feeds one of the font sequence generated by the font generator 12, the drawing content drawn directly by the control unit 29, and the buttons generated by the I-Graphics decoder 13 into the Enhanced Interactive Graphics plane 15.
- the Enhanced Interactive Graphics plane 15 is a display plane that can support a resolution of 1920 x 1080 and a resolution of 960 x 540.
- The composition unit 16 composites the stored contents of the Enhanced Interactive Graphics plane 15 with any of (i) the uncompressed picture data, (ii) the picture data composited with the stored contents of the Background Still plane 6, and (iii) the picture data composited with the stored contents of both the Background Still plane 6 and the Presentation Graphics plane 10.
- the HDD 17 is an internal medium for storing subtitle content obtained by downloading from a server device.
- the read buffer 18 is a FIFO memory, and stores TS packets read from the HDD 17 in a first-in first-out manner.
- The demultiplexer (De-MUX) 19 extracts TS packets from the read buffer 18 and converts them into PES packets. Of the PES packets obtained by the conversion, a desired one is output to either the audio decoder 20 or the P-Graphics decoder 9.
- The audio decoder 20 decodes the PES packets output from the demultiplexer 19 and outputs uncompressed audio data.
- The switch 21 switches the input source to the audio decoder 20. With this switch, the input to the audio decoder 20 is switched between the BD-ROM side and the HDD side.
- The switch 22 switches the input source to the P-Graphics decoder 9. By means of the switch 22, the presentation graphics stream read from the HDD 17 and the presentation graphics stream read from the BD-ROM can be selectively input to the P-Graphics decoder 9.
- The static scenario memory 23 is a memory for storing the current Clip information.
- The current Clip information is the one currently being processed among the plurality of Clip information entries recorded on the BD-ROM.
- the communication unit 24 accesses the server device 500 in response to an instruction from the control unit 29, and downloads subtitle content from the server device 500.
- The switch 25 selectively routes various data read from the BD-ROM or the HDD 17 to one of the read buffer 2, the read buffer 18, the static scenario memory 23, and the dynamic scenario memory 24.
- The CLUT unit 26 converts the index colors in the raster graphics stored in the Presentation Graphics plane 10 based on the Y, Cr, and Cb values indicated in the PDS.
- The CLUT unit 27 converts the index colors in the raster graphics stored in the Enhanced Interactive Graphics plane 15 based on the Y, Cr, and Cb values indicated in the PDS included in the presentation graphics stream.
- The switch 28 switches the CLUT unit 27 between performing this conversion and passing its input through unchanged.
- The control unit 29 acquires resolution information indicating the resolution of the display device via HDMI. It then compares this with the Resolution in the Clip information to calculate a resolution ratio. If the resolution ratio is 1.0, the graphics stream multiplexed on the AVClip is displayed as it is; if the resolution ratio is not 1, the subtitle content recorded on the HDD 17 is displayed.
- If the subtitle content is text type, the control unit 29 gives the text and the font type to the font generator 12 (Text & Font type), whereupon a font string is generated and placed on the Enhanced Interactive Graphics plane 15. Once the Enhanced Interactive Graphics plane 15 has been drawn in this way, the control unit 29 orders the stored contents of the video plane 5 to be enlarged or reduced, after which the stored contents of the Enhanced Interactive Graphics plane 15 are composited by the composition unit 16 (Display layout control).
- The P-Graphics decoder 9 includes a Coded Data Buffer 33, a peripheral circuit 33a, a Stream Graphics Processor 34, an Object Buffer 35, a Composition Buffer 36, and a Graphics Controller 37.
- Coded Data Buffer 33 is a buffer in which functional segments are stored together with DTS and PTS.
- The peripheral circuit 33a is wired logic that realizes transfer between the Coded Data Buffer 33 and the Stream Graphics Processor 34, and between the Coded Data Buffer 33 and the Composition Buffer 36. In this transfer processing, if the current time point is the time indicated in the DTS of an ODS, the ODS is transferred from the Coded Data Buffer 33 to the Stream Graphics Processor 34. If the current time is the time indicated in the DTS of a PCS or PDS, the PCS or PDS is transferred to the Composition Buffer 36.
- The Stream Graphics Processor 34 decodes the ODS and writes the uncompressed bitmap composed of the index colors obtained by the decoding to the Object Buffer 35 as a graphics object.
- In the Object Buffer 35, the graphics objects obtained by the decoding of the Stream Graphics Processor 34 are placed.
- Composition Buffer 36 is a memory in which PCS and PDS are arranged.
- the Graphics Controller 37 decodes the PCS placed in the Composition Buffer 36 and performs control based on the PCS.
- the execution timing of this control is based on the value of the PTS added to the PCS.
- the above is the description of the internal configuration of the P-Graphics decoder 9.
- Next, the Graphics Controller 37 is described. The Graphics Controller 37 performs the processing procedure shown in the flowchart of FIG. 17.
- Step S1 is the main routine of the flowchart, and the procedure waits for the event specified in step S1 to be established.
- Step S1 judges whether or not the current reproduction time point is the time indicated by the DTS of the PCS. If so, the processing of steps S2 to S13 is performed.
- Step S2 determines whether or not the composition_state of the PCS is Epoch Start. If it is Epoch Start, the entire graphics plane 8 is cleared in step S6. Otherwise, in step S7, the window indicated by the window_horizontal_position, window_vertical_position, window_width, and window_height of the WDS is cleared.
- Step S8 is executed after the clearing in step S6 or step S7, and determines whether or not the PTS time of an arbitrary ODSx has already passed. When the entire graphics plane 8 is cleared, the clearing takes a long time, so decoding of a certain ODS (ODSx) may already have been completed; step S8 verifies that possibility. If no such ODS exists, the procedure returns to the main routine. If the decoding time of some ODS has passed, steps S9 to S11 are executed. Step S9 determines whether or not object_cropped_flag indicates “0”; if it indicates “0”, the graphics object is hidden (step S10).
- If it does not indicate “0”, the graphics object cropped according to object_cropping_horizontal_position, object_cropping_vertical_position, object_cropping_width, and object_cropping_height is written to the position indicated by object_horizontal_position and object_vertical_position in the window of the graphics plane 8 (step S11). By the above processing, one or more graphics objects are drawn in the window.
- Step S12 determines whether or not the PTS time of another ODSy has elapsed. If, while ODSx was being written to the graphics plane 8, the decoding of another ODS has already been completed, this ODSy is made the new ODSx (step S13), and the procedure returns to step S9. As a result, the processing of steps S9 to S11 is repeated for the other ODS.
- FIG. 18 is a flowchart showing a processing procedure of the movie content reproduction processing.
- In step S21, the resolution in the Clip information is referred to, and in step S22, the resolution of the connected display device is obtained via HDMI.
- In step S23, the resolution ratio between the movie content and the display device is calculated.
- In step S24, the video stream multiplexed on the AVClip is supplied to the video decoder 4 to start the reproduction of the moving image.
- In step S25, it is determined whether or not the resolution ratio is 1. If it is 1, the switch 22 and the switch 25 are switched in step S26 so that the graphics stream multiplexed on the AVClip is input to the P-Graphics decoder 9 to display subtitles.
- In step S27, it is determined whether or not subtitle content exists on the HDD. If it exists, the process skips step S28 and proceeds to step S29; if it does not, the subtitle content is downloaded from the server device to the HDD in step S28.
- In step S29, it is determined whether the subtitle content is text type or bitmap type. If it is bitmap type, subtitle display is executed by switching the switch 22 and the switch 25 in step S30 and inputting the subtitle content on the HDD to the P-Graphics decoder 9.
- If it is text type, display processing of text subtitles is performed in step S31.
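The branching of steps S21 to S31 can be summarized in a few lines. This is a hedged sketch of the flowchart's control flow only, with placeholder strings standing in for the actual switch and decoder operations:

```python
def movie_playback_flow(clip_resolution, display_resolution,
                        hdd_has_subtitles, subtitle_type):
    """Return the list of actions taken, following steps S21-S31.

    clip_resolution / display_resolution: (width, height) tuples.
    """
    steps = []
    ratio = (display_resolution[0] / clip_resolution[0],
             display_resolution[1] / clip_resolution[1])         # S21-S23
    steps.append("start video decoding")                          # S24
    if ratio == (1.0, 1.0):                                       # S25
        steps.append("display multiplexed graphics stream")       # S26
        return steps
    if not hdd_has_subtitles:                                     # S27
        steps.append("download subtitle content to HDD")          # S28
    if subtitle_type == "bitmap":                                 # S29
        steps.append("feed HDD subtitle content to P-Graphics decoder")  # S30
    else:
        steps.append("text subtitle display processing")          # S31
    return steps
```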
- FIG. 19 is a flowchart showing the processing procedure of the text subtitle display processing.
- The processing of steps S33 to S37 of this flowchart is equivalent to steps S1 to S13 in FIG. 17 in that subtitles are displayed as the playback of the video stream progresses. The difference is that while the reproduction of the graphics stream multiplexed on the AVClip can be left to the Graphics Controller 37, the reproduction of the text data must be performed by the control unit 29 itself.
- FIG. 19 shows a processing procedure for the control unit 29 itself to reproduce text data.
- Steps S33 to S35 constitute a loop process that determines whether any of the events defined in steps S33 to S35 has occurred.
- In step S33, it is determined whether or not a start time code in the subtitle content matches the current reproduction time point. If one exists, the process proceeds to step S36 with this time code as the start time code i.
- Step S36 is a process of expanding and displaying the characters in the text data corresponding to the start time code i using the out-line font.
- In step S34, it is determined whether or not the current reproduction time point corresponds to the end time code i. If so, the character displayed in step S36 is erased in step S37.
- Step S35 is for judging whether or not the reproduction of the movie content has ended, and if so, the process of this flowchart ends.
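The loop of steps S33 to S37 amounts to matching the current playback time against the start and end time codes. A minimal sketch with the rendering stubbed out as callbacks; the names are illustrative, not from the patent:

```python
def text_subtitle_loop(entries, timeline, render, erase):
    """entries: list of (start, end, text) tuples; timeline: successive
    playback time codes. render/erase are callbacks standing in for
    expanding the outline font and clearing the display (S36/S37)."""
    active = None
    for now in timeline:
        for start, end, text in entries:
            if now == start:                 # S33 -> S36: show the subtitle
                render(text)
                active = (start, end, text)
        if active and now == active[1]:      # S34 -> S37: erase at end time
            erase(active[2])
            active = None
```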
- The expansion in step S36, that is, the process of enlarging the outline font, is based on the resolution ratio.
- The size of the caption in the caption content is determined for either SDTV or HDTV. This is because, as with subtitle graphics, movie producers wish to reduce production costs by producing subtitles for only one of SDTV and HDTV rather than for both. If the “size” of the subtitle content is SDTV-compatible, then even though the display device 300 is an HDTV, the HDTV display device 300 must display SDTV-compatible subtitles.
- FIG. 20 (a) is a diagram showing a pixel shape in each of SDTV and HDTV.
- In an SDTV display device, the shape of one pixel is a horizontally long rectangle.
- In an HDTV display device, the shape of one pixel is a square. Therefore, if a font that assumes the resolution of SDTV is expanded as it is, the characters composing the subtitles appear vertically elongated and unsightly, as shown in Fig. 20(b). For this reason, when expanding, the vertical and horizontal magnifications are set to different values.
- Suppose that subtitle content compatible with SDTV is displayed on an HDTV display device,
- where the resolution of the display device is 1920 x 1080 and the resolution of the movie content is 720 x 480.
- In this case, the outline font sized for SDTV is enlarged 2.67 times in width and 2.25 times in height, as shown in Fig. 20(c).
- In this way, subtitles can be displayed at a resolution corresponding to that of the display device.
- Conversely, when HDTV-compatible subtitle content is displayed on an SDTV display device, the outline font is reduced to 0.375 times in width and 0.444 times in height. If the text data is expanded using the font thus reduced and displayed according to the start time code and end time code, subtitles can again be displayed at a resolution corresponding to that of the display device. Since an outline can be expanded to any number of pixels, it is not necessary to equip the playback device with both SDTV-compatible and HDTV-compatible fonts. If the playback device is equipped with an outline font for the set of characters used in one language system, it can display the subtitles properly on the display device.
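The magnifications quoted above follow directly from the two resolutions, with the horizontal and vertical ratios computed separately because the pixel shapes differ. A small sketch of the computation (the function name is illustrative):

```python
def font_scale_factors(content_res, display_res):
    """Return (horizontal, vertical) magnification for the outline font.

    content_res / display_res: (width, height) tuples. The two axes are
    scaled independently because an SDTV pixel is a horizontally long
    rectangle while an HDTV pixel is square.
    """
    return (display_res[0] / content_res[0],
            display_res[1] / content_res[1])

# SD content (720 x 480) on an HD display (1920 x 1080): ~(2.67, 2.25)
up = font_scale_factors((720, 480), (1920, 1080))
# HD content on an SD display: the reciprocal reduction, (0.375, ~0.444)
down = font_scale_factors((1920, 1080), (720, 480))
```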
- If the subtitle content obtained from the server device is used, subtitles matching the resolution of the display device can be displayed without enlarging or reducing the presentation graphics stream multiplexed on the AVClip. Since no bitmap needs to be enlarged or reduced, the subtitles can be displayed beautifully even if the subtitles multiplexed on the AVClip are drawn as bitmaps.
- Such substitution of subtitle data is performed when the ratio between the resolution on the content side and the resolution on the display device side is not 1, and is not performed when both resolutions match.
- the communication cost associated with downloading caption data from the server device can be minimized.
- In the embodiment described above, the size of the subtitle is adjusted by enlarging or reducing the outline font according to the resolution ratio.
- The second embodiment realizes subtitle display using a bitmap font. Since bitmap fonts require a smaller processing load for expansion than outline fonts, they are suitable for displaying subtitles on CPUs with limited performance.
- In the second embodiment, the text data in the subtitle content shown in Fig. 14 is replaced with an HTML document described in HTML. The control unit 29 then realizes subtitle display by interpreting the HTML document and performing display processing.
- The control unit 29 according to the second embodiment performs a conversion process on the HTML document described in HTML so that it matches the resolution of the display device.
- The conversion processing by the control unit 29 according to the second embodiment will be described with reference to FIG. 21.
- In FIG. 21, the upper half shows the HTML document before resolution conversion, and the lower half shows the HTML document after resolution conversion.
- The fonts a browser can display come in several point sizes, which can be indicated by a font size from 1 to 7. The example in Fig. 21 means that the font with the smallest number of points is selected for display in the HTML document before conversion.
- the disadvantage of this enlargement method is that when displaying subtitles in two lines, the subtitle display area changes. That is, as described above, in the SDTV type display device, the shape of one pixel is a horizontally long rectangle. In contrast, the shape of one pixel in an HDTV type display device is a square. Therefore, if the subtitles in two lines are to be HD-compatible, the characters that make up each line will change from the rectangular shape shown in Fig. 21 (b) to the square shape shown in Fig. 21 (c). The subtitles expand upward, occupying more area in the layout of the entire screen.
- FIGS. 22(a) to 22(c) are explanatory diagrams showing the processing procedure of line-spacing adjustment. It is assumed that subtitles are displayed in two lines as shown in Fig. 22(a). If this subtitle is enlarged vertically by 2.25 times as shown in Fig. 20, or if a larger font is used as shown in Fig. 21, the line spacing spreads too much, as shown in Fig. 22(b).
- Therefore, as shown in Fig. 22(c), the font is enlarged to 267% and the line spacing is set to 84%.
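The 84% figure can be derived from the two resolution ratios: if the font is enlarged uniformly by the horizontal factor (2.67), the line spacing must be compressed by the ratio of vertical to horizontal magnification, 2.25 / 2.67, which is approximately 0.84. A one-line check of that arithmetic:

```python
def line_spacing_factor(h_ratio, v_ratio):
    """Spacing compression needed when the font is scaled uniformly by
    h_ratio but the vertical growth should only be v_ratio."""
    return v_ratio / h_ratio

# SD (720 x 480) content shown on an HD (1920 x 1080) display:
factor = line_spacing_factor(1920 / 720, 1080 / 480)  # ~0.84
```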
- The third embodiment describes page content composed of “document + still image”.
- Such content is a document obtained by embedding a still image in a document described in a markup language, and is often found on web pages. Such contents are also used as menu images in BD-ROM. In such content, the still image is displayed in a reduced size so as to fit within a predetermined frame in the document.
- Img src ". ./Picture/xxx.j pg" If this is to be displayed on an HDTV, it must be enlarged and displayed based on the horizontal and vertical resolution ratios obtained in the first embodiment. To realize the enlarged display, the above description of HTML can be rewritten as follows. Good,
- A JFIF-format still image consists of multiple functional segments, such as the “application type0 segment” and the “start of frame type0 segment”; the latter carries “Image_Width” and “Image_Height”, and the JFIF still image data follows.
- When the still image is enlarged as described above, it is enlarged horizontally and vertically according to these ratios.
- That is, instead of enlarging the page content in a state where the still image is embedded, the ratio of the resolution of the display device to the resolution on the content side is obtained, and the still image itself is enlarged by that ratio.
- the above description does not show all the modes of carrying out the present invention.
- The embodiment of the present invention can also be implemented in the following forms of implementation (A), (B), (C), (D), etc.
- These are expanded or generalized descriptions of the embodiments and their modifications. The extent of the expansion or generalization is based on the state of the art in the technical field of the present invention at the time of filing.
- Caption data has been described as an example of auxiliary data, but the present invention is not limited to this. Menus, buttons, icons, and banners may also be used, provided they are played back together with the video.
- the display target may be subtitle graphics selected according to the display settings on the device side.
- Graphics streams for various display modes, such as wide-vision, pan-scan, and letterbox, are available.
- The device selects and displays one of them according to the settings of the television connected to it.
- Because the subtitle graphics displayed in this way are given a display effect based on the PCS, their appearance is improved.
- A display effect using characters, like those rendered in the body of the moving image, can thus be realized for captions displayed according to the device's display settings, which is of great practical value.
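The per-display-mode selection described above can be sketched as follows (the stream names and the `select_subtitle_stream` helper are hypothetical illustrations, not names from the specification):

```python
# Sketch: a player holding one subtitle graphics stream per display mode
# and picking the one matching the connected television's setting.
# The mode names and stream file names are illustrative assumptions.

STREAMS = {
    "wide-vision": "subtitle_wide.pes",
    "pan-scan":    "subtitle_panscan.pes",
    "letterbox":   "subtitle_letterbox.pes",
}

def select_subtitle_stream(tv_display_mode: str) -> str:
    """Return the graphics stream matching the TV connected to the player."""
    try:
        return STREAMS[tv_display_mode]
    except KeyError:
        raise ValueError(f"unsupported display mode: {tv_display_mode!r}")

print(select_subtitle_stream("letterbox"))  # subtitle_letterbox.pes
```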
- subtitles are displayed horizontally on the upper and lower sides of the screen.
- subtitles may be displayed on the right and left sides of the screen. In this way, Japanese subtitles can be displayed vertically.
- The AVClip may implement karaoke; in this case, the subtitles may change in step with the progress of the song.
- the display effect of changing the color of the image may be realized.
- The playback device receives the supply of subtitle data from the server device, but may receive it from a source other than the server device. For example, a recording medium other than the BD-ROM may be purchased separately, and the caption data supplied by installing it on the HDD. Alternatively, connecting a semiconductor memory on which the caption data is recorded may supply the caption data.
- The recording medium and playback device according to the present invention can realize subtitle display suited to the combination of display device and content, so that movie products with higher added value can be supplied to the market, stimulating the movie and consumer-equipment markets. The playback device according to the present invention therefore has high applicability in the movie industry and the consumer electronics industry.
Landscapes
- Engineering & Computer Science (AREA)
- Multimedia (AREA)
- Signal Processing (AREA)
- Databases & Information Systems (AREA)
- Human Computer Interaction (AREA)
- Television Signal Processing For Recording (AREA)
- Signal Processing For Digital Recording And Reproducing (AREA)
- Two-Way Televisions, Distribution Of Moving Picture Or The Like (AREA)
- Indexing, Searching, Synchronizing, And The Amount Of Synchronization Travel Of Record Carriers (AREA)
- Studio Circuits (AREA)
Abstract
Description
Claims
Priority Applications (3)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
EP04728900A EP1628477A4 (en) | 2003-04-22 | 2004-04-22 | REPRODUCTIVE DEVICE AND PROGRAM |
US10/549,608 US20060204092A1 (en) | 2003-04-22 | 2004-04-22 | Reproduction device and program |
JP2005505782A JPWO2004095837A1 (ja) | 2003-04-22 | 2004-04-22 | 再生装置、プログラム。 |
Applications Claiming Priority (2)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
US10/420,426 | 2003-04-22 | ||
US10/420,426 US20040213542A1 (en) | 2003-04-22 | 2003-04-22 | Apparatus and method to reproduce multimedia content for a multitude of resolution displays |
Publications (1)
Publication Number | Publication Date |
---|---|
WO2004095837A1 true WO2004095837A1 (ja) | 2004-11-04 |
Family
ID=33298506
Family Applications (1)
Application Number | Title | Priority Date | Filing Date |
---|---|---|---|
PCT/JP2004/005778 WO2004095837A1 (ja) | 2003-04-22 | 2004-04-22 | 再生装置、プログラム。 |
Country Status (5)
Country | Link |
---|---|
US (2) | US20040213542A1 (ja) |
EP (1) | EP1628477A4 (ja) |
JP (1) | JPWO2004095837A1 (ja) |
CN (1) | CN1778111A (ja) |
WO (1) | WO2004095837A1 (ja) |
Cited By (9)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
JP2006287364A (ja) * | 2005-03-31 | 2006-10-19 | Toshiba Corp | 信号出力装置及び信号出力方法 |
JP2006295518A (ja) * | 2005-04-11 | 2006-10-26 | Sony Corp | 情報処理装置、および情報処理方法、並びにコンピュータ・プログラム |
JP2009081610A (ja) * | 2007-09-26 | 2009-04-16 | Nippon Telegr & Teleph Corp <Ntt> | デジタルシネマ再生装置,デジタルシネマ再生方法およびデジタルシネマ再生プログラム |
WO2010146847A1 (ja) * | 2009-06-17 | 2010-12-23 | パナソニック株式会社 | 3d映像を再生するための情報記録媒体、及び再生装置 |
JP2011151851A (ja) * | 2011-04-18 | 2011-08-04 | Sony Corp | 再生装置、再生方法および再生プログラム |
WO2012017603A1 (ja) * | 2010-08-06 | 2012-02-09 | パナソニック株式会社 | 再生装置、集積回路、再生方法、プログラム |
US8638861B2 (en) | 2006-02-22 | 2014-01-28 | Sony Corporation | Reproducing apparatus, reproducing method and reproducing program |
US8928669B2 (en) | 2009-07-17 | 2015-01-06 | Seiko Epson Corporation | OSD display control program product, OSD display control method, and OSD display device |
JPWO2016038791A1 (ja) * | 2014-09-10 | 2017-06-22 | パナソニック インテレクチュアル プロパティ コーポレーション オブ アメリカPanasonic Intellectual Property Corporation of America | 記録媒体、再生装置および再生方法 |
Families Citing this family (44)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
US8737810B2 (en) * | 2002-11-15 | 2014-05-27 | Thomson Licensing | Method and apparatus for cropping of subtitle elements |
US9171577B1 (en) | 2003-04-25 | 2015-10-27 | Gopro, Inc. | Encoding and decoding selectively retrievable representations of video content |
CN101702757B (zh) * | 2003-04-28 | 2013-07-31 | 松下电器产业株式会社 | 记录介质和方法、再现装置和方法、程序和集成电路 |
EP1618562A4 (en) * | 2003-04-29 | 2011-03-16 | Lg Electronics Inc | RECORDING MEDIUM HAVING A DATA STRUCTURE FOR MANAGING REPRODUCTION OF GRAPHIC DATA, METHODS AND APPARATUSES FOR RECORDING AND REPRODUCING THEM |
KR101029132B1 (ko) * | 2003-05-05 | 2011-04-13 | 톰슨 라이센싱 | 프로그램을 레코딩하기에 충분한 공간이 존재하는지 여부를 나타내는 방법 및 장치 |
KR20040099058A (ko) * | 2003-05-17 | 2004-11-26 | 삼성전자주식회사 | 서브타이틀 처리 방법, 그 재생 장치 및 그 정보저장매체 |
CN1849821B (zh) * | 2003-07-11 | 2012-06-20 | 松下电器产业株式会社 | 记录方法、再现装置和方法 |
EP1511004A3 (en) * | 2003-08-19 | 2010-01-27 | Sony Corporation | Memory controller, memory control method, rate conversion apparatus, rate conversion method, image-signal-processing apparatus, image-signal-processing method, and program for executing these methods |
KR100565058B1 (ko) * | 2003-08-22 | 2006-03-30 | 삼성전자주식회사 | 최적의 디스플레이 환경을 설정하는 dvd 플레이어 및그 동작 방법 |
US8233779B2 (en) * | 2004-07-09 | 2012-07-31 | Panasonic Corporation | Recording medium, recording method, reproduction apparatus and method, and computer-readable program |
WO2006051037A1 (en) | 2004-11-09 | 2006-05-18 | Thomson Licensing | Bonding contents on separate storage media |
JP2006179973A (ja) * | 2004-12-20 | 2006-07-06 | Toshiba Corp | 電子機器及びその制御方法 |
US9329827B2 (en) * | 2004-12-29 | 2016-05-03 | Funmobility, Inc. | Cropping of images for display on variably sized display devices |
KR100615676B1 (ko) * | 2005-01-11 | 2006-08-25 | 삼성전자주식회사 | 콘텐츠 재생장치 및 그의 gui화면 디스플레이방법 |
JP5124912B2 (ja) * | 2005-06-23 | 2013-01-23 | ソニー株式会社 | 電子広告システム及び電子広告方法 |
TW200826584A (en) * | 2005-12-21 | 2008-06-16 | Koninkl Philips Electronics Nv | A method and apparatus for sharing data content between a transmitter and a receiver |
JP2008067223A (ja) * | 2006-09-08 | 2008-03-21 | Toshiba Corp | データ放送コンテンツ再生装置及びデータ放送コンテンツ再生方法 |
JP4715734B2 (ja) * | 2006-12-05 | 2011-07-06 | 船井電機株式会社 | 光ディスク装置 |
US8625663B2 (en) * | 2007-02-20 | 2014-01-07 | Pixar | Home-video digital-master package |
US9098868B1 (en) | 2007-03-20 | 2015-08-04 | Qurio Holdings, Inc. | Coordinating advertisements at multiple playback devices |
US8756103B1 (en) * | 2007-03-28 | 2014-06-17 | Qurio Holdings, Inc. | System and method of implementing alternative redemption options for a consumer-centric advertising system |
US8560387B2 (en) | 2007-06-07 | 2013-10-15 | Qurio Holdings, Inc. | Systems and methods of providing collaborative consumer-controlled advertising environments |
US9881323B1 (en) * | 2007-06-22 | 2018-01-30 | Twc Patent Trust Llt | Providing hard-to-block advertisements for display on a webpage |
US20110019088A1 (en) * | 2008-04-17 | 2011-01-27 | Daisuke Kase | Digital television signal processor and method of displaying subtitle |
CN101668132A (zh) * | 2008-09-02 | 2010-03-10 | 华为技术有限公司 | 一种字幕匹配处理的方法和系统 |
WO2010058546A1 (ja) * | 2008-11-18 | 2010-05-27 | パナソニック株式会社 | 立体視再生を行う再生装置、再生方法、プログラム |
US8335425B2 (en) * | 2008-11-18 | 2012-12-18 | Panasonic Corporation | Playback apparatus, playback method, and program for performing stereoscopic playback |
JP2010226705A (ja) * | 2009-02-27 | 2010-10-07 | Sanyo Electric Co Ltd | 撮像システム |
EP2230839A1 (en) * | 2009-03-17 | 2010-09-22 | Koninklijke Philips Electronics N.V. | Presentation of video content |
US8019903B2 (en) * | 2009-03-27 | 2011-09-13 | Microsoft Corporation | Removable accessory for a computing device |
WO2010137261A1 (ja) * | 2009-05-25 | 2010-12-02 | パナソニック株式会社 | 記録媒体、再生装置、集積回路、再生方法、プログラム |
US20110013888A1 (en) * | 2009-06-18 | 2011-01-20 | Taiji Sasaki | Information recording medium and playback device for playing back 3d images |
CN102474603B (zh) * | 2009-07-04 | 2015-04-22 | 杜比实验室特许公司 | 帧兼容三维传输中全分辨率图形、菜单和字幕的支持 |
CN101996206B (zh) * | 2009-08-11 | 2013-07-03 | 阿里巴巴集团控股有限公司 | 一种呈现网页页面的方法、装置及系统 |
CN102014258B (zh) * | 2009-09-07 | 2013-01-16 | 艾比尔国际多媒体有限公司 | 多媒体字幕显示系统与方法 |
KR20110032678A (ko) * | 2009-09-23 | 2011-03-30 | 삼성전자주식회사 | 디스플레이장치, 시스템 및 그 해상도 제어방법 |
CN102194504B (zh) * | 2010-03-15 | 2015-04-08 | 腾讯科技(深圳)有限公司 | 媒体文件播放方法、播放器和用于媒体文件播放的服务器 |
JP6068329B2 (ja) | 2010-04-01 | 2017-01-25 | トムソン ライセンシングThomson Licensing | 立体表示用のサブタイトルを生成する方法及びシステム |
US20120134529A1 (en) * | 2010-11-28 | 2012-05-31 | Pedro Javier Vazquez | Method and apparatus for applying of a watermark to a video during download |
US8537195B2 (en) * | 2011-02-09 | 2013-09-17 | Polycom, Inc. | Automatic video layouts for multi-stream multi-site telepresence conferencing system |
JP2015050655A (ja) * | 2013-09-02 | 2015-03-16 | ソニー株式会社 | 情報表示装置及び情報表示方法、並びにコンピューター・プログラム |
JP6512458B2 (ja) * | 2014-06-30 | 2019-05-15 | パナソニックIpマネジメント株式会社 | データ再生方法及び再生装置 |
US10595099B2 (en) * | 2015-04-05 | 2020-03-17 | Lg Electronics Inc. | Method and device for transmitting and receiving broadcast signal for broadcast service on basis of XML subtitle |
CA2991102A1 (en) * | 2015-07-09 | 2017-01-12 | Sony Corporation | Reception apparatus, reception method, transmission apparatus, and transmission method |
Citations (5)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
JP2001045436A (ja) * | 1999-07-27 | 2001-02-16 | Nec Corp | デジタル放送受信機及びデータ伝送装置 |
JP2002016884A (ja) * | 2000-06-29 | 2002-01-18 | Matsushita Electric Ind Co Ltd | 映像信号再生装置 |
JP2002027429A (ja) * | 2000-05-18 | 2002-01-25 | Deutsche Thomson Brandt Gmbh | オーディオ翻訳データをオンデマンドで与える方法及びその受信器 |
JP2002152691A (ja) * | 2000-11-16 | 2002-05-24 | Pioneer Electronic Corp | 情報再生装置及び情報表示方法 |
JP2002247526A (ja) * | 2001-02-19 | 2002-08-30 | Toshiba Corp | 内外ストリームデータの同期再生装置とストリームデータ配信装置 |
Family Cites Families (22)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
US5307171A (en) * | 1989-07-24 | 1994-04-26 | Hitachi, Ltd. | Video tape recorder/player |
US6771888B1 (en) * | 1993-10-29 | 2004-08-03 | Christopher J. Cookson | Data structure for allowing play of a video program in multiple aspect ratios |
JPH09182109A (ja) * | 1995-12-21 | 1997-07-11 | Sony Corp | 複合映像機器 |
JP3345019B2 (ja) * | 1996-03-29 | 2002-11-18 | 松下電器産業株式会社 | インタラクティブな再生進行の性能を向上させた記録媒体の記録方法、再生装置および再生方法 |
EP0838948B1 (en) * | 1996-05-09 | 2000-03-15 | Matsushita Electric Industrial Co., Ltd. | Multimedia optical disk, reproducing device, and reproducing method capable of superposing sub-video upon main video in well-balanced state irrespective of position of main video on screen |
JP3742167B2 (ja) * | 1996-12-18 | 2006-02-01 | 株式会社東芝 | 画像表示制御装置 |
JP4346114B2 (ja) * | 1997-03-12 | 2009-10-21 | パナソニック株式会社 | 複数の標準的な出力信号を提供するmpegデコーダ |
US6141457A (en) * | 1997-09-12 | 2000-10-31 | Samsung Electronics Co., Ltd. | Method and apparatus for processing a high definition image to provide a relatively lower definition image using both discrete cosine transforms and wavelet transforms |
JPH11252518A (ja) * | 1997-10-29 | 1999-09-17 | Matsushita Electric Ind Co Ltd | 字幕用副映像ユニット作成装置および記憶媒体 |
JP2000023061A (ja) * | 1998-07-02 | 2000-01-21 | Sony Corp | テレビジョン受信機 |
US6798420B1 (en) * | 1998-11-09 | 2004-09-28 | Broadcom Corporation | Video and graphics system with a single-port RAM |
KR100631499B1 (ko) * | 2000-01-24 | 2006-10-09 | 엘지전자 주식회사 | 디지털 티브이의 캡션 표시 방법 |
US6633725B2 (en) * | 2000-05-05 | 2003-10-14 | Microsoft Corporation | Layered coding of image data using separate data storage tracks on a storage medium |
JP3670934B2 (ja) * | 2000-06-01 | 2005-07-13 | 三洋電機株式会社 | デジタルテレビ放送受信機における文字データの表示方法 |
US6850571B2 (en) * | 2001-04-23 | 2005-02-01 | Webtv Networks, Inc. | Systems and methods for MPEG subsample decoding |
KR100456024B1 (ko) * | 2002-02-28 | 2004-11-08 | 한국전자통신연구원 | 디브이디 플레이어의 자막정보 재생 장치 및 방법 |
JP4008745B2 (ja) * | 2002-04-25 | 2007-11-14 | アルパイン株式会社 | Dvdビデオ再生装置及びサブピクチャストリームの再生制御方法 |
KR100910975B1 (ko) * | 2002-05-14 | 2009-08-05 | 엘지전자 주식회사 | 인터넷을 이용한 대화형 광디스크 재생방법 |
WO2004030356A1 (ja) * | 2002-09-25 | 2004-04-08 | Matsushita Electric Industrial Co., Ltd. | 再生装置、光ディスク、記録媒体、プログラム、再生方法 |
PL376934A1 (pl) * | 2002-11-27 | 2006-01-09 | Samsung Electronics Co., Ltd. | Urządzenie i sposób odtwarzania zawartości interakcyjnej przez sterowanie fontem zgodnie z przekształceniem współczynnika kształtu obrazu |
US7106383B2 (en) * | 2003-06-09 | 2006-09-12 | Matsushita Electric Industrial Co., Ltd. | Method, system, and apparatus for configuring a signal processing device for use with a display device |
JP2005100585A (ja) * | 2003-09-05 | 2005-04-14 | Toshiba Corp | 情報記憶媒体、情報再生装置、情報再生方法 |
-
2003
- 2003-04-22 US US10/420,426 patent/US20040213542A1/en not_active Abandoned
-
2004
- 2004-04-22 WO PCT/JP2004/005778 patent/WO2004095837A1/ja active Application Filing
- 2004-04-22 US US10/549,608 patent/US20060204092A1/en not_active Abandoned
- 2004-04-22 EP EP04728900A patent/EP1628477A4/en not_active Withdrawn
- 2004-04-22 JP JP2005505782A patent/JPWO2004095837A1/ja active Pending
- 2004-04-22 CN CNA2004800109776A patent/CN1778111A/zh active Pending
Patent Citations (5)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
JP2001045436A (ja) * | 1999-07-27 | 2001-02-16 | Nec Corp | デジタル放送受信機及びデータ伝送装置 |
JP2002027429A (ja) * | 2000-05-18 | 2002-01-25 | Deutsche Thomson Brandt Gmbh | オーディオ翻訳データをオンデマンドで与える方法及びその受信器 |
JP2002016884A (ja) * | 2000-06-29 | 2002-01-18 | Matsushita Electric Ind Co Ltd | 映像信号再生装置 |
JP2002152691A (ja) * | 2000-11-16 | 2002-05-24 | Pioneer Electronic Corp | 情報再生装置及び情報表示方法 |
JP2002247526A (ja) * | 2001-02-19 | 2002-08-30 | Toshiba Corp | 内外ストリームデータの同期再生装置とストリームデータ配信装置 |
Non-Patent Citations (1)
Title |
---|
See also references of EP1628477A4 * |
Cited By (18)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
JP2006287364A (ja) * | 2005-03-31 | 2006-10-19 | Toshiba Corp | 信号出力装置及び信号出力方法 |
JP2006295518A (ja) * | 2005-04-11 | 2006-10-26 | Sony Corp | 情報処理装置、および情報処理方法、並びにコンピュータ・プログラム |
US8638861B2 (en) | 2006-02-22 | 2014-01-28 | Sony Corporation | Reproducing apparatus, reproducing method and reproducing program |
JP2009081610A (ja) * | 2007-09-26 | 2009-04-16 | Nippon Telegr & Teleph Corp <Ntt> | デジタルシネマ再生装置,デジタルシネマ再生方法およびデジタルシネマ再生プログラム |
JP4647645B2 (ja) * | 2007-09-26 | 2011-03-09 | 日本電信電話株式会社 | デジタルシネマ再生装置,デジタルシネマ再生方法およびデジタルシネマ再生プログラム |
US8121460B2 (en) | 2009-06-17 | 2012-02-21 | Panasonic Corporation | Information recording medium and playback device for playing back 3D images |
JP4733785B2 (ja) * | 2009-06-17 | 2011-07-27 | パナソニック株式会社 | 3d映像を再生するための情報記録媒体、記録媒体の記録方法、再生装置、及び記録媒体再生システム |
WO2010146847A1 (ja) * | 2009-06-17 | 2010-12-23 | パナソニック株式会社 | 3d映像を再生するための情報記録媒体、及び再生装置 |
US8928669B2 (en) | 2009-07-17 | 2015-01-06 | Seiko Epson Corporation | OSD display control program product, OSD display control method, and OSD display device |
WO2012017603A1 (ja) * | 2010-08-06 | 2012-02-09 | パナソニック株式会社 | 再生装置、集積回路、再生方法、プログラム |
CN102598686A (zh) * | 2010-08-06 | 2012-07-18 | 松下电器产业株式会社 | 再现装置、集成电路、再现方法、程序 |
US8737811B2 (en) | 2010-08-06 | 2014-05-27 | Panasonic Corporation | Playback device, integrated circuit, playback method, and program |
JP5728649B2 (ja) * | 2010-08-06 | 2015-06-03 | パナソニックIpマネジメント株式会社 | 再生装置、集積回路、再生方法、プログラム |
JP2011151851A (ja) * | 2011-04-18 | 2011-08-04 | Sony Corp | 再生装置、再生方法および再生プログラム |
JPWO2016038791A1 (ja) * | 2014-09-10 | 2017-06-22 | パナソニック インテレクチュアル プロパティ コーポレーション オブ アメリカPanasonic Intellectual Property Corporation of America | 記録媒体、再生装置および再生方法 |
JP2017182873A (ja) * | 2014-09-10 | 2017-10-05 | パナソニック インテレクチュアル プロパティ コーポレーション オブ アメリカPanasonic Intellectual Property Corporation of America | 記録媒体 |
JP2017182874A (ja) * | 2014-09-10 | 2017-10-05 | パナソニック インテレクチュアル プロパティ コーポレーション オブ アメリカPanasonic Intellectual Property Corporation of America | 再生装置 |
JP2017184263A (ja) * | 2014-09-10 | 2017-10-05 | パナソニック インテレクチュアル プロパティ コーポレーション オブ アメリカPanasonic Intellectual Property Corporation of America | 再生装置 |
Also Published As
Publication number | Publication date |
---|---|
EP1628477A1 (en) | 2006-02-22 |
CN1778111A (zh) | 2006-05-24 |
JPWO2004095837A1 (ja) | 2006-07-13 |
EP1628477A4 (en) | 2010-06-02 |
US20040213542A1 (en) | 2004-10-28 |
US20060204092A1 (en) | 2006-09-14 |
Similar Documents
Publication | Publication Date | Title |
---|---|---|
WO2004095837A1 (ja) | 再生装置、プログラム。 | |
KR100984412B1 (ko) | 기록매체, 재생장치, 기록방법, 재생방법 | |
US8498515B2 (en) | Recording medium and recording and reproducing method and apparatuses | |
US8350870B2 (en) | Recording medium, reproduction apparatus, recording method, reproducing method, program, and integrated circuit | |
US20050207736A1 (en) | Recording medium and method and apparatus for decoding text subtitle streams | |
KR100882079B1 (ko) | 디스플레이 세트의 병렬처리를 수행할 수 있는 재생장치, 기록매체, 기록방법, 재생방법, 및 컴퓨터 판독가능한 기록매체 | |
KR102558213B1 (ko) | 재생 장치, 재생 방법, 프로그램, 및 기록 매체 | |
JPH10304308A (ja) | サブピクチャデータ作成方法および装置、ならびにサブピクチャデータ作成用プログラムを記録したコンピュータ読み取り可能な記録媒体 | |
WO2006018786A1 (en) | Method of storing and transferring image signals |
Legal Events
Date | Code | Title | Description |
---|---|---|---|
AK | Designated states |
Kind code of ref document: A1 Designated state(s): AE AG AL AM AT AU AZ BA BB BG BR BW BY BZ CA CH CN CO CR CU CZ DE DK DM DZ EC EE EG ES FI GB GD GE GH GM HR HU ID IL IN IS JP KE KG KP KR KZ LC LK LR LS LT LU LV MA MD MG MK MN MW MX MZ NA NI NO NZ OM PG PH PL PT RO RU SC SD SE SG SK SL SY TJ TM TN TR TT TZ UA UG US UZ VC VN YU ZA ZM ZW |
|
AL | Designated countries for regional patents |
Kind code of ref document: A1 Designated state(s): BW GH GM KE LS MW MZ SD SL SZ TZ UG ZM ZW AM AZ BY KG KZ MD RU TJ TM AT BE BG CH CY CZ DE DK EE ES FI FR GB GR HU IE IT LU MC NL PL PT RO SE SI SK TR BF BJ CF CG CI CM GA GN GQ GW ML MR NE SN TD TG |
|
121 | Ep: the epo has been informed by wipo that ep was designated in this application | ||
WWE | Wipo information: entry into national phase |
Ref document number: 2005505782 Country of ref document: JP |
|
WWE | Wipo information: entry into national phase |
Ref document number: 20048109776 Country of ref document: CN |
|
WWE | Wipo information: entry into national phase |
Ref document number: 2004728900 Country of ref document: EP |
|
WWP | Wipo information: published in national office |
Ref document number: 2004728900 Country of ref document: EP |
|
WWE | Wipo information: entry into national phase |
Ref document number: 10549608 Country of ref document: US |
|
WWP | Wipo information: published in national office |
Ref document number: 10549608 Country of ref document: US |