WO2004098193A2 - Recording medium, reproduction apparatus, recording method, reproducing method, program, and integrated circuit for recording a video stream and graphics with window information over graphics display - Google Patents

Recording medium, reproduction apparatus, recording method, reproducing method, program, and integrated circuit for recording a video stream and graphics with window information over graphics display

Info

Publication number
WO2004098193A2
Authority
WO
WIPO (PCT)
Prior art keywords
graphics
window
stream
plane
reproduction apparatus
Prior art date
Application number
PCT/JP2004/006074
Other languages
French (fr)
Other versions
WO2004098193A3 (en)
Inventor
Joseph Mccrossan
Tomoyuki Okada
Tomoki Ogawa
Original Assignee
Matsushita Electric Industrial Co., Ltd.
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Priority to CA 2523597 priority Critical patent/CA2523597C/en
Application filed by Matsushita Electric Industrial Co., Ltd. filed Critical Matsushita Electric Industrial Co., Ltd.
Priority to EP20040729754 priority patent/EP1620855B1/en
Priority to JP2005518547A priority patent/JP3803366B1/en
Priority to AU2004234798A priority patent/AU2004234798B2/en
Priority to KR1020057020409A priority patent/KR100760118B1/en
Priority to CNB2004800115550A priority patent/CN100521752C/en
Priority to US10/554,627 priority patent/US7505050B2/en
Priority to ES04729754T priority patent/ES2383893T3/en
Priority to AT04729754T priority patent/ATE557392T1/en
Publication of WO2004098193A2 publication Critical patent/WO2004098193A2/en
Publication of WO2004098193A3 publication Critical patent/WO2004098193A3/en
Priority to US12/341,265 priority patent/US8350870B2/en
Priority to US12/626,498 priority patent/US20100067876A1/en


Classifications

    • GPHYSICS
    • G11INFORMATION STORAGE
    • G11BINFORMATION STORAGE BASED ON RELATIVE MOVEMENT BETWEEN RECORD CARRIER AND TRANSDUCER
    • G11B27/00Editing; Indexing; Addressing; Timing or synchronising; Monitoring; Measuring tape travel
    • G11B27/10Indexing; Addressing; Timing or synchronising; Measuring tape travel
    • GPHYSICS
    • G09EDUCATION; CRYPTOGRAPHY; DISPLAY; ADVERTISING; SEALS
    • G09GARRANGEMENTS OR CIRCUITS FOR CONTROL OF INDICATING DEVICES USING STATIC MEANS TO PRESENT VARIABLE INFORMATION
    • G09G5/00Control arrangements or circuits for visual indicators common to cathode-ray tube indicators and other visual indicators
    • G09G5/36Control arrangements or circuits for visual indicators common to cathode-ray tube indicators and other visual indicators characterised by the display of a graphic pattern, e.g. using an all-points-addressable [APA] memory
    • G09G5/363Graphics controllers
    • GPHYSICS
    • G11INFORMATION STORAGE
    • G11BINFORMATION STORAGE BASED ON RELATIVE MOVEMENT BETWEEN RECORD CARRIER AND TRANSDUCER
    • G11B20/00Signal processing not specific to the method of recording or reproducing; Circuits therefor
    • G11B20/10Digital recording or reproducing
    • GPHYSICS
    • G11INFORMATION STORAGE
    • G11BINFORMATION STORAGE BASED ON RELATIVE MOVEMENT BETWEEN RECORD CARRIER AND TRANSDUCER
    • G11B20/00Signal processing not specific to the method of recording or reproducing; Circuits therefor
    • G11B20/10Digital recording or reproducing
    • G11B20/10527Audio or video recording; Data buffering arrangements
    • GPHYSICS
    • G11INFORMATION STORAGE
    • G11BINFORMATION STORAGE BASED ON RELATIVE MOVEMENT BETWEEN RECORD CARRIER AND TRANSDUCER
    • G11B20/00Signal processing not specific to the method of recording or reproducing; Circuits therefor
    • G11B20/10Digital recording or reproducing
    • G11B20/12Formatting, e.g. arrangement of data block or words on the record carriers
    • GPHYSICS
    • G11INFORMATION STORAGE
    • G11BINFORMATION STORAGE BASED ON RELATIVE MOVEMENT BETWEEN RECORD CARRIER AND TRANSDUCER
    • G11B20/00Signal processing not specific to the method of recording or reproducing; Circuits therefor
    • G11B20/10Digital recording or reproducing
    • G11B20/14Digital recording or reproducing using self-clocking codes
    • GPHYSICS
    • G11INFORMATION STORAGE
    • G11BINFORMATION STORAGE BASED ON RELATIVE MOVEMENT BETWEEN RECORD CARRIER AND TRANSDUCER
    • G11B27/00Editing; Indexing; Addressing; Timing or synchronising; Monitoring; Measuring tape travel
    • G11B27/02Editing, e.g. varying the order of information signals recorded on, or reproduced from, record carriers
    • G11B27/031Electronic editing of digitised analogue information signals, e.g. audio or video signals
    • G11B27/034Electronic editing of digitised analogue information signals, e.g. audio or video signals on discs
    • GPHYSICS
    • G11INFORMATION STORAGE
    • G11BINFORMATION STORAGE BASED ON RELATIVE MOVEMENT BETWEEN RECORD CARRIER AND TRANSDUCER
    • G11B27/00Editing; Indexing; Addressing; Timing or synchronising; Monitoring; Measuring tape travel
    • G11B27/10Indexing; Addressing; Timing or synchronising; Measuring tape travel
    • G11B27/102Programmed access in sequence to addressed parts of tracks of operating record carriers
    • G11B27/105Programmed access in sequence to addressed parts of tracks of operating record carriers of operating discs
    • GPHYSICS
    • G11INFORMATION STORAGE
    • G11BINFORMATION STORAGE BASED ON RELATIVE MOVEMENT BETWEEN RECORD CARRIER AND TRANSDUCER
    • G11B27/00Editing; Indexing; Addressing; Timing or synchronising; Monitoring; Measuring tape travel
    • G11B27/10Indexing; Addressing; Timing or synchronising; Measuring tape travel
    • G11B27/19Indexing; Addressing; Timing or synchronising; Measuring tape travel by using information detectable on the record carrier
    • G11B27/28Indexing; Addressing; Timing or synchronising; Measuring tape travel by using information detectable on the record carrier by using information signals recorded by the same method as the main recording
    • G11B27/32Indexing; Addressing; Timing or synchronising; Measuring tape travel by using information detectable on the record carrier by using information signals recorded by the same method as the main recording on separate auxiliary tracks of the same or an auxiliary record carrier
    • G11B27/327Table of contents
    • G11B27/329Table of contents on a disc [VTOC]
    • GPHYSICS
    • G11INFORMATION STORAGE
    • G11BINFORMATION STORAGE BASED ON RELATIVE MOVEMENT BETWEEN RECORD CARRIER AND TRANSDUCER
    • G11B27/00Editing; Indexing; Addressing; Timing or synchronising; Monitoring; Measuring tape travel
    • G11B27/10Indexing; Addressing; Timing or synchronising; Measuring tape travel
    • G11B27/34Indicating arrangements 
    • HELECTRICITY
    • H04ELECTRIC COMMUNICATION TECHNIQUE
    • H04NPICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N21/00Selective content distribution, e.g. interactive television or video on demand [VOD]
    • H04N21/20Servers specifically adapted for the distribution of content, e.g. VOD servers; Operations thereof
    • H04N21/23Processing of content or additional data; Elementary server operations; Server middleware
    • H04N21/235Processing of additional data, e.g. scrambling of additional data or processing content descriptors
    • HELECTRICITY
    • H04ELECTRIC COMMUNICATION TECHNIQUE
    • H04NPICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N21/00Selective content distribution, e.g. interactive television or video on demand [VOD]
    • H04N21/20Servers specifically adapted for the distribution of content, e.g. VOD servers; Operations thereof
    • H04N21/23Processing of content or additional data; Elementary server operations; Server middleware
    • H04N21/236Assembling of a multiplex stream, e.g. transport stream, by combining a video stream with other content or additional data, e.g. inserting a URL [Uniform Resource Locator] into a video stream, multiplexing software data into a video stream; Remultiplexing of multiplex streams; Insertion of stuffing bits into the multiplex stream, e.g. to obtain a constant bit-rate; Assembling of a packetised elementary stream
    • H04N21/23614Multiplexing of additional data and video streams
    • HELECTRICITY
    • H04ELECTRIC COMMUNICATION TECHNIQUE
    • H04NPICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N21/00Selective content distribution, e.g. interactive television or video on demand [VOD]
    • H04N21/40Client devices specifically adapted for the reception of or interaction with content, e.g. set-top-box [STB]; Operations thereof
    • H04N21/41Structure of client; Structure of client peripherals
    • H04N21/4104Peripherals receiving signals from specially adapted client devices
    • H04N21/4112Peripherals receiving signals from specially adapted client devices having fewer capabilities than the client, e.g. thin client having less processing power or no tuning capabilities
    • HELECTRICITY
    • H04ELECTRIC COMMUNICATION TECHNIQUE
    • H04NPICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N21/00Selective content distribution, e.g. interactive television or video on demand [VOD]
    • H04N21/40Client devices specifically adapted for the reception of or interaction with content, e.g. set-top-box [STB]; Operations thereof
    • H04N21/41Structure of client; Structure of client peripherals
    • H04N21/426Internal components of the client ; Characteristics thereof
    • H04N21/42646Internal components of the client ; Characteristics thereof for reading from or writing on a non-volatile solid state storage medium, e.g. DVD, CD-ROM
    • HELECTRICITY
    • H04ELECTRIC COMMUNICATION TECHNIQUE
    • H04NPICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N21/00Selective content distribution, e.g. interactive television or video on demand [VOD]
    • H04N21/40Client devices specifically adapted for the reception of or interaction with content, e.g. set-top-box [STB]; Operations thereof
    • H04N21/43Processing of content or additional data, e.g. demultiplexing additional data from a digital video stream; Elementary client operations, e.g. monitoring of home network or synchronising decoder's clock; Client middleware
    • H04N21/4302Content synchronisation processes, e.g. decoder synchronisation
    • H04N21/4307Synchronising the rendering of multiple content streams or additional data on devices, e.g. synchronisation of audio on a mobile phone with the video output on the TV screen
    • H04N21/43074Synchronising the rendering of multiple content streams or additional data on devices, e.g. synchronisation of audio on a mobile phone with the video output on the TV screen of additional data with content streams on the same device, e.g. of EPG data or interactive icon with a TV program
    • HELECTRICITY
    • H04ELECTRIC COMMUNICATION TECHNIQUE
    • H04NPICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N21/00Selective content distribution, e.g. interactive television or video on demand [VOD]
    • H04N21/40Client devices specifically adapted for the reception of or interaction with content, e.g. set-top-box [STB]; Operations thereof
    • H04N21/43Processing of content or additional data, e.g. demultiplexing additional data from a digital video stream; Elementary client operations, e.g. monitoring of home network or synchronising decoder's clock; Client middleware
    • H04N21/431Generation of visual interfaces for content selection or interaction; Content or additional data rendering
    • H04N21/4312Generation of visual interfaces for content selection or interaction; Content or additional data rendering involving specific graphical features, e.g. screen layout, special fonts or colors, blinking icons, highlights or animations
    • HELECTRICITY
    • H04ELECTRIC COMMUNICATION TECHNIQUE
    • H04NPICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N21/00Selective content distribution, e.g. interactive television or video on demand [VOD]
    • H04N21/40Client devices specifically adapted for the reception of or interaction with content, e.g. set-top-box [STB]; Operations thereof
    • H04N21/43Processing of content or additional data, e.g. demultiplexing additional data from a digital video stream; Elementary client operations, e.g. monitoring of home network or synchronising decoder's clock; Client middleware
    • H04N21/431Generation of visual interfaces for content selection or interaction; Content or additional data rendering
    • H04N21/4312Generation of visual interfaces for content selection or interaction; Content or additional data rendering involving specific graphical features, e.g. screen layout, special fonts or colors, blinking icons, highlights or animations
    • H04N21/4314Generation of visual interfaces for content selection or interaction; Content or additional data rendering involving specific graphical features, e.g. screen layout, special fonts or colors, blinking icons, highlights or animations for fitting data in a restricted space on the screen, e.g. EPG data in a rectangular grid
    • HELECTRICITY
    • H04ELECTRIC COMMUNICATION TECHNIQUE
    • H04NPICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N21/00Selective content distribution, e.g. interactive television or video on demand [VOD]
    • H04N21/40Client devices specifically adapted for the reception of or interaction with content, e.g. set-top-box [STB]; Operations thereof
    • H04N21/43Processing of content or additional data, e.g. demultiplexing additional data from a digital video stream; Elementary client operations, e.g. monitoring of home network or synchronising decoder's clock; Client middleware
    • H04N21/434Disassembling of a multiplex stream, e.g. demultiplexing audio and video streams, extraction of additional data from a video stream; Remultiplexing of multiplex streams; Extraction or processing of SI; Disassembling of packetised elementary stream
    • HELECTRICITY
    • H04ELECTRIC COMMUNICATION TECHNIQUE
    • H04NPICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N21/00Selective content distribution, e.g. interactive television or video on demand [VOD]
    • H04N21/40Client devices specifically adapted for the reception of or interaction with content, e.g. set-top-box [STB]; Operations thereof
    • H04N21/43Processing of content or additional data, e.g. demultiplexing additional data from a digital video stream; Elementary client operations, e.g. monitoring of home network or synchronising decoder's clock; Client middleware
    • H04N21/435Processing of additional data, e.g. decrypting of additional data, reconstructing software from modules extracted from the transport stream
    • HELECTRICITY
    • H04ELECTRIC COMMUNICATION TECHNIQUE
    • H04NPICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N21/00Selective content distribution, e.g. interactive television or video on demand [VOD]
    • H04N21/40Client devices specifically adapted for the reception of or interaction with content, e.g. set-top-box [STB]; Operations thereof
    • H04N21/43Processing of content or additional data, e.g. demultiplexing additional data from a digital video stream; Elementary client operations, e.g. monitoring of home network or synchronising decoder's clock; Client middleware
    • H04N21/442Monitoring of processes or resources, e.g. detecting the failure of a recording device, monitoring the downstream bandwidth, the number of times a movie has been viewed, the storage space available from the internal hard disk
    • HELECTRICITY
    • H04ELECTRIC COMMUNICATION TECHNIQUE
    • H04NPICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N21/00Selective content distribution, e.g. interactive television or video on demand [VOD]
    • H04N21/40Client devices specifically adapted for the reception of or interaction with content, e.g. set-top-box [STB]; Operations thereof
    • H04N21/47End-user applications
    • H04N21/488Data services, e.g. news ticker
    • H04N21/4884Data services, e.g. news ticker for displaying subtitles
    • HELECTRICITY
    • H04ELECTRIC COMMUNICATION TECHNIQUE
    • H04NPICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N21/00Selective content distribution, e.g. interactive television or video on demand [VOD]
    • H04N21/80Generation or processing of content or additional data by content creator independently of the distribution process; Content per se
    • H04N21/81Monomedia components thereof
    • H04N21/8146Monomedia components thereof involving graphical data, e.g. 3D object, 2D graphics
    • HELECTRICITY
    • H04ELECTRIC COMMUNICATION TECHNIQUE
    • H04NPICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N7/00Television systems
    • H04N7/16Analogue secrecy systems; Analogue subscription systems
    • H04N7/162Authorising the user terminal, e.g. by paying; Registering the use of a subscription channel, e.g. billing
    • H04N7/163Authorising the user terminal, e.g. by paying; Registering the use of a subscription channel, e.g. billing by receiver means only
    • HELECTRICITY
    • H04ELECTRIC COMMUNICATION TECHNIQUE
    • H04NPICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N9/00Details of colour television systems
    • H04N9/79Processing of colour television signals in connection with recording
    • H04N9/80Transformation of the television signal for recording, e.g. modulation, frequency changing; Inverse transformation for playback
    • H04N9/82Transformation of the television signal for recording, e.g. modulation, frequency changing; Inverse transformation for playback the individual colour picture signal components being recorded simultaneously only
    • H04N9/8205Transformation of the television signal for recording, e.g. modulation, frequency changing; Inverse transformation for playback the individual colour picture signal components being recorded simultaneously only involving the multiplexing of an additional signal and the colour video signal
    • GPHYSICS
    • G09EDUCATION; CRYPTOGRAPHY; DISPLAY; ADVERTISING; SEALS
    • G09GARRANGEMENTS OR CIRCUITS FOR CONTROL OF INDICATING DEVICES USING STATIC MEANS TO PRESENT VARIABLE INFORMATION
    • G09G2340/00Aspects of display data processing
    • G09G2340/12Overlay of images, i.e. displayed pixel being the result of switching between the corresponding input pixels
    • G09G2340/125Overlay of images, i.e. displayed pixel being the result of switching between the corresponding input pixels wherein one of the images is motion video
    • GPHYSICS
    • G11INFORMATION STORAGE
    • G11BINFORMATION STORAGE BASED ON RELATIVE MOVEMENT BETWEEN RECORD CARRIER AND TRANSDUCER
    • G11B20/00Signal processing not specific to the method of recording or reproducing; Circuits therefor
    • G11B20/10Digital recording or reproducing
    • G11B20/12Formatting, e.g. arrangement of data block or words on the record carriers
    • G11B2020/1264Formatting, e.g. arrangement of data block or words on the record carriers wherein the formatting concerns a specific kind of data
    • G11B2020/1288Formatting by padding empty spaces with dummy data, e.g. writing zeroes or random data when de-icing optical discs
    • GPHYSICS
    • G11INFORMATION STORAGE
    • G11BINFORMATION STORAGE BASED ON RELATIVE MOVEMENT BETWEEN RECORD CARRIER AND TRANSDUCER
    • G11B2220/00Record carriers by type
    • G11B2220/20Disc-shaped record carriers
    • GPHYSICS
    • G11INFORMATION STORAGE
    • G11BINFORMATION STORAGE BASED ON RELATIVE MOVEMENT BETWEEN RECORD CARRIER AND TRANSDUCER
    • G11B2220/00Record carriers by type
    • G11B2220/20Disc-shaped record carriers
    • G11B2220/21Disc-shaped record carriers characterised in that the disc is of read-only, rewritable, or recordable type
    • G11B2220/213Read-only discs
    • GPHYSICS
    • G11INFORMATION STORAGE
    • G11BINFORMATION STORAGE BASED ON RELATIVE MOVEMENT BETWEEN RECORD CARRIER AND TRANSDUCER
    • G11B2220/00Record carriers by type
    • G11B2220/20Disc-shaped record carriers
    • G11B2220/25Disc-shaped record carriers characterised in that the disc is based on a specific recording technology
    • G11B2220/2537Optical discs
    • G11B2220/2541Blu-ray discs; Blue laser DVR discs
    • HELECTRICITY
    • H04ELECTRIC COMMUNICATION TECHNIQUE
    • H04NPICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N21/00Selective content distribution, e.g. interactive television or video on demand [VOD]
    • H04N21/40Client devices specifically adapted for the reception of or interaction with content, e.g. set-top-box [STB]; Operations thereof
    • H04N21/47End-user applications
    • H04N21/488Data services, e.g. news ticker
    • HELECTRICITY
    • H04ELECTRIC COMMUNICATION TECHNIQUE
    • H04NPICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N5/00Details of television systems
    • H04N5/44Receiver circuitry for the reception of television signals according to analogue transmission standards
    • H04N5/445Receiver circuitry for the reception of television signals according to analogue transmission standards for displaying additional information
    • H04N5/44504Circuit details of the additional information generator, e.g. details of the character or graphics signal generator, overlay mixing circuits
    • HELECTRICITY
    • H04ELECTRIC COMMUNICATION TECHNIQUE
    • H04NPICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N5/00Details of television systems
    • H04N5/76Television signal recording
    • H04N5/765Interface circuits between an apparatus for recording and another apparatus
    • H04N5/775Interface circuits between an apparatus for recording and another apparatus between a recording apparatus and a television receiver
    • HELECTRICITY
    • H04ELECTRIC COMMUNICATION TECHNIQUE
    • H04NPICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N5/00Details of television systems
    • H04N5/76Television signal recording
    • H04N5/78Television signal recording using magnetic recording
    • H04N5/782Television signal recording using magnetic recording on tape
    • H04N5/783Adaptations for reproducing at a rate different from the recording rate
    • HELECTRICITY
    • H04ELECTRIC COMMUNICATION TECHNIQUE
    • H04NPICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N5/00Details of television systems
    • H04N5/76Television signal recording
    • H04N5/84Television signal recording using optical recording
    • H04N5/85Television signal recording using optical recording on discs or drums
    • HELECTRICITY
    • H04ELECTRIC COMMUNICATION TECHNIQUE
    • H04NPICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N9/00Details of colour television systems
    • H04N9/79Processing of colour television signals in connection with recording
    • H04N9/80Transformation of the television signal for recording, e.g. modulation, frequency changing; Inverse transformation for playback
    • H04N9/804Transformation of the television signal for recording, e.g. modulation, frequency changing; Inverse transformation for playback involving pulse code modulation of the colour picture signal components
    • H04N9/8042Transformation of the television signal for recording, e.g. modulation, frequency changing; Inverse transformation for playback involving pulse code modulation of the colour picture signal components involving data reduction
    • HELECTRICITY
    • H04ELECTRIC COMMUNICATION TECHNIQUE
    • H04NPICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N9/00Details of colour television systems
    • H04N9/79Processing of colour television signals in connection with recording
    • H04N9/80Transformation of the television signal for recording, e.g. modulation, frequency changing; Inverse transformation for playback
    • H04N9/804Transformation of the television signal for recording, e.g. modulation, frequency changing; Inverse transformation for playback involving pulse code modulation of the colour picture signal components
    • H04N9/806Transformation of the television signal for recording, e.g. modulation, frequency changing; Inverse transformation for playback involving pulse code modulation of the colour picture signal components with processing of the sound signal
    • H04N9/8063Transformation of the television signal for recording, e.g. modulation, frequency changing; Inverse transformation for playback involving pulse code modulation of the colour picture signal components with processing of the sound signal using time division multiplex of the PCM audio and PCM video signals
    • HELECTRICITY
    • H04ELECTRIC COMMUNICATION TECHNIQUE
    • H04NPICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N9/00Details of colour television systems
    • H04N9/79Processing of colour television signals in connection with recording
    • H04N9/80Transformation of the television signal for recording, e.g. modulation, frequency changing; Inverse transformation for playback
    • H04N9/82Transformation of the television signal for recording, e.g. modulation, frequency changing; Inverse transformation for playback the individual colour picture signal components being recorded simultaneously only
    • H04N9/8205Transformation of the television signal for recording, e.g. modulation, frequency changing; Inverse transformation for playback the individual colour picture signal components being recorded simultaneously only involving the multiplexing of an additional signal and the colour video signal
    • H04N9/8227Transformation of the television signal for recording, e.g. modulation, frequency changing; Inverse transformation for playback the individual colour picture signal components being recorded simultaneously only involving the multiplexing of an additional signal and the colour video signal the additional signal being at least another television signal

Definitions

  • The present invention relates to a recording medium such as a BD-ROM and to a reproduction apparatus, and more specifically to a technique for subtitling by reproducing a digital stream constituted by multiplexing a video stream and a graphics stream.
  • Subtitling realized by rendering graphics streams is an important technique for allowing people in different linguistic areas to appreciate a film produced in a language other than their native language.
  • An example of a conventional subtitling technique is the memory allocation scheme for a Pixel Buffer based on the ETSI EN 300 743 standard set forth by the European Telecommunications Standards Institute (ETSI).
  • The Pixel Buffer is a memory for temporarily storing decompressed graphics; a reproduction apparatus writes the graphics in the Pixel Buffer to a display memory called a Graphics Plane, and the graphics are thereby displayed.
  • a definition of a Region is included in the Pixel Buffer, and a part of the decompressed graphics that corresponds to the Region is written to the Graphics Plane.
  • The subtitle "Goodbye..." is displayed on the screen gradually, i.e., first "Go", next "Good", then "Goodbye", and finally the whole subtitle "Goodbye..." is displayed.
  • The ETSI EN 300 743 standard, however, gives no consideration to guaranteeing the sync between a graphics display and a picture display when the burden of writing to the Graphics Plane is high.
  • The graphics written to the Graphics Plane are not compressed, and accordingly the burden of writing to the Graphics Plane increases as the resolution of the graphics becomes higher.
  • The size of the graphics to be written to the Graphics Plane is up to 2 Mbytes when rendering the graphics at a resolution of 1920x1080, which is a proposed standard resolution for a BD-ROM, and a higher bandwidth for the graphics data transfer from the Pixel Buffer to the Graphics Plane is necessary in order to render graphics as large as 2 Mbytes synchronously with the picture display.
  • An object of the present invention is to provide a recording medium with which graphics may be updated synchronously with a picture display even when the amount of data to be written to the Graphics Plane is large.
  • An example of the recording medium according to the present invention is a recording medium used for storing data, said recording medium comprising: a digital stream constituted by multiplexing a video stream and a graphics stream, wherein said video stream represents a moving picture made of a plurality of pictures, and the graphics stream includes: graphics data representing graphics to be combined with the pictures; and window information that specifies a window for rendering the graphics therein, the window information indicating a width, a height and a position of the window on a plane, the plane being a plane memory of a reproduction apparatus that combines the graphics with the pictures.
  • By specifying a part of the plane corresponding to each picture as the window for rendering the graphics, it is not necessary for the reproduction apparatus to render the graphics over the entire plane; it is sufficient to render the graphics only within a window of limited size. Because it is not necessary to render graphics outside the window on the plane, the load on the software in the reproduction apparatus may be reduced. Further, by setting the size of the window so as to ensure a synchronized display of the graphics and the picture, it becomes possible for a producer who performs authoring to guarantee the synchronized display on any kind of reproduction apparatus, even when the graphics are updated in the worst case.
  • Moreover, by setting the position and size of the window through the window information, it is possible to adjust the position and size of the window at authoring so that the subtitles stay out of the way of the pictures on the screen. The visibility of the graphics is therefore maintained even when the picture on the screen changes as time passes, and thus the quality of the film is maintained.
  • The worst case in updating the graphics means a case in which the graphics are updated by the least efficient operation, i.e. clearing and re-drawing the entire window.
  • the above recording medium is such that the width and height of the window are set so that a size of the window is 1/x of the plane, the plane corresponding to a size of each picture and x being a real number based on a ratio between a window update rate and a picture display rate.
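  • As a rough, non-normative illustration of the 1/x rule above, the following sketch computes the largest window allowed for a given plane size; the plane dimensions, the rates, and the exact relation x = (picture display rate) / (window update rate) used here are illustrative assumptions, not values stated in this publication.

```python
# Sketch: the "window size <= plane size / x" constraint described above.
# The concrete numbers and the exact form of x are illustrative assumptions.

def max_window_pixels(plane_width: int, plane_height: int,
                      picture_display_rate: float,
                      window_update_rate: float) -> int:
    """Largest number of pixels the window may cover while the window can
    still be re-drawn at window_update_rate (window <= plane / x)."""
    x = picture_display_rate / window_update_rate
    return int(plane_width * plane_height / x)

if __name__ == "__main__":
    # Hypothetical example: a 1920x1080 plane with pictures displayed at
    # 29.97 fps and a window fully re-drawable roughly 7.5 times per second
    # gives x = 4, i.e. the window may cover up to 1/4 of the plane.
    print(max_window_pixels(1920, 1080, 29.97, 29.97 / 4))
```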
  • FIG.1 illustrates an example of use of a recording medium according to the present invention.
  • FIG.2 illustrates a structure of a BD-ROM.
  • FIG.3 is a diagram schematically illustrating a structure of an AVClip.
  • FIG.4A illustrates a structure of a presentation graphics stream.
  • FIG.4B illustrates a PES packet obtained after functional segments are converted.
  • FIG.5 illustrates a logical structure that is made of various kinds of functional segments.
  • FIG.6 illustrates a relation between a display position of a subtitle and an Epoch.
  • FIG.7A illustrates syntax to define a Graphics Object in an Object Definition Segment (ODS).
  • FIG.7B illustrates syntax of a Palette Definition Segment (PDS).
  • FIG.8A illustrates syntax of a Window Definition Segment (WDS).
  • FIG.8B illustrates syntax of a Presentation Composition Segment (PCS).
  • FIG.9 illustrates an example of a description of a Display Set for subtitling.
  • FIG.10 illustrates an example of a description of the WDS and PCS in a DS1.
  • FIG.11 illustrates an example of a description of the PCS in a DS2.
  • FIG.12 illustrates an example of a description of the PCS in a DS3.
  • FIG.13 is an example of a description of a Display Set when Cut-In/Out is performed, illustrated along a timeline.
  • FIG.14 is an example of a description of a Display Set when Fade-In/Out is performed, illustrated along a timeline.
  • FIG.15 is an example of a description of a Display Set when Scrolling is performed, illustrated along a timeline.
  • FIG.16 is an example of a description of a Display Set when Wipe-In/Out is performed, illustrated along a timeline.
  • FIG.17 is a diagram comparing two cases: a window has four Graphics Objects, and a window has two Graphics Objects.
  • FIG.18 illustrates an example of an algorithm for calculating a decode duration.
  • FIG.19 is a flowchart of the algorithm of FIG.18.
  • FIGs.20A and B are flowcharts of the algorithm of FIG.18.
  • FIG.21A illustrates a case in which each window has an Object Definition Segment.
  • FIGs.21B and C are timing charts showing orders among numerals referred to in FIG.18.
  • FIG.22A illustrates a case in which each window has two Object Definition Segments.
  • FIGs.22B and C are timing charts showing orders among numerals referred to in FIG.18.
  • FIG.23A describes a case in which each of two Windows includes an ODS.
  • FIG.23B illustrates a case in which a decode period (2) is longer than a total of a clearing period (1) and a write period (3).
  • FIG.23C illustrates a case in which a total of the clearing period (1) and the write period (3) is longer than the decode period (2).
  • FIG.24 illustrates shifts in time of update described in an example in the present specification.
  • FIG.25A illustrates four Display Sets that are described so as to perform the above explained update.
  • FIG.25B is a timing chart showing settings of DTS and PTS of functional segments included in the four Display Sets.
  • FIG.26 illustrates an internal structure of a reproduction apparatus according to the present invention.
  • FIG.27 illustrates sizes of the write rates Rx, Rc, and Rd, the Graphics Plane 8, the Coded Data Buffer 13, the Object Buffer 15, and the Composition Buffer 16.
  • FIG.28 is a timing chart illustrating a pipeline processing by the reproduction apparatus.
  • FIG.29 illustrates a timing chart in a pipeline processing of a case in which decoding of the ODS ends before clearing of the Graphics Plane is completed.
  • FIG.30 is a flowchart illustrating a process of a loading operation of a functional segment.
  • FIG.31 shows an example of multiplexing.
  • FIG.32 illustrates a manner in which a DS10 is loaded to the Coded Data Buffer 13.
  • FIG.33 illustrates loading of a DS1, the DS10, and a DS20 in a normal reproduction.
  • FIG.34 illustrates loading of the DS1, DS10, and DS20 in the normal reproduction as shown in FIG.33.
  • FIG.35 illustrates a flowchart showing a process performed by the Graphical Controller 17.
  • FIG.36 illustrates a flowchart showing the process performed by the Graphical Controller 17.
  • FIG.37 illustrates a flowchart showing the process performed by the Graphical Controller 17.
  • FIG.38 illustrates a pipeline process of the reproduction apparatus based on the PTS of the PDS.
  • FIG.39 is a diagram describing the significance of the END in the pipeline process of the reproduction apparatus.
  • FIG.40 illustrates an internal structure of the reproduction apparatus according to a second embodiment.
  • FIG.41 schematically illustrates an operation of reading out from and writing to the Graphics Planes that constitute a double buffer.
  • FIG.42 is a flowchart illustrating the manufacturing process of the BD-ROM according to a third embodiment.
  • FIG.1 illustrates an example of use of the recording medium.
  • BD-ROM 100 is the recording medium according to the present invention.
  • The BD-ROM 100 is used for providing data of movie works to a Home Theatre System structured by a reproduction apparatus 200, a television 300, and a remote controller 400.
  • the recording medium according to the present invention is manufactured by an improvement in an application layer of a BD-ROM.
  • FIG.2 illustrates a structure of the BD-ROM.
  • The BD-ROM is shown at the bottom of the drawing, and a track on the BD-ROM is shown above it.
  • The track is actually in a spiral shape on the disc, but is shown as a straight line in the drawing.
  • The track includes a lead-in area, a volume area, and a lead-out area.
  • The volume area in this drawing has a physical layer, a file system layer, and an application layer.
  • an application format of the BD-ROM is illustrated using a directory structure.
  • The BD-ROM has a directory BDMV under the root directory, and the BDMV directory contains a file for storing an AVClip with an extension M2TS (XXX.M2TS) and a file for storing administrative information for the AVClip with an extension CLPI (XXX.CLPI).
  • The AVClip (XXX.M2TS) is a digital stream in the MPEG-TS format (TS stands for Transport Stream) obtained by multiplexing a video stream, at least one audio stream, and a presentation graphics stream.
  • the video stream represents pictures of the film
  • the audio stream represents sound of the film
  • the presentation graphics stream represents subtitles of the film.
  • FIG.3 is a diagram schematically illustrating a structure of the AVClip.
  • The AVClip (XXX.M2TS) is structured in the following manner.
  • The video stream and the audio stream (top row of the drawing) are converted into a line of PES packets (second row of the drawing), and then into a line of TS packets.
  • FIG.4A illustrates a structure of the presentation graphics stream.
  • a top row indicates the TS packet line to be multiplexed to the AVClip .
  • A second row from the top indicates the PES packet line that constitutes the graphics stream.
  • The PES packet line is structured by retrieving payloads out of TS packets having a predetermined PID, and connecting the retrieved payloads.
  • A third row from the top indicates the structure of the graphics stream.
  • The graphics stream is made of functional segments named a Presentation Composition Segment (PCS), a Window Definition Segment (WDS), a Palette Definition Segment (PDS), an Object Definition Segment (ODS), and an END of Display Set Segment (END).
  • the PCS is called a screen composition segment
  • the WDS, PDS, ODS, and END are called definition segments.
  • The PES packets and the functional segments correspond one to one, or one to plurality. In other words, one functional segment is recorded in the BD-ROM either after being converted into one PES packet, or after being divided into fragments and converted into more than one PES packet.
  • FIG.4B illustrates the PES packet obtained by converting the functional segments.
  • The PES packet is made of a packet header and the payload, and the payload is the substantial body of a functional segment.
  • the packet header includes a DTS and a PTS corresponding to the functional segment.
  • the DTS and PTS included in the packet header are hereinafter referred to as the DTS and PTS of the functional segment.
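  • The following sketch is a simplified, structural view of the "one functional segment becomes one or more PES packets carrying its DTS and PTS" relation described above; it does not reproduce the MPEG-2 PES byte layout, and the 64 KB fragment limit is an illustrative assumption.

```python
# Structural sketch of functional-segment-to-PES packetization as described
# above. Field names and the fragment limit are illustrative; the real PES
# header bytes are not modelled.
from dataclasses import dataclass
from typing import List, Optional


@dataclass
class PESPacket:
    pts: int            # presentation time stamp of the functional segment
    dts: Optional[int]  # decoding time stamp (not every segment carries one)
    payload: bytes      # all of the functional segment, or one fragment of it


def packetize_segment(segment: bytes, pts: int, dts: Optional[int] = None,
                      max_payload: int = 0xFFFF) -> List[PESPacket]:
    """Wrap one functional segment into a single PES packet, or split it into
    fragments when it exceeds max_payload bytes (for simplicity, every
    fragment repeats the same time stamps here)."""
    return [PESPacket(pts, dts, segment[i:i + max_payload])
            for i in range(0, len(segment), max_payload)]
```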
  • FIG.5 illustrates the logical structure that is made of the various kinds of functional segments.
  • a top row illustrates Epochs
  • a middle row illustrates Display Sets (DS)
  • Each of the DS shown in the middle row is a group of functional segments that compose graphics for one screen, among all of the plural functional segments that constitute the graphics stream.
  • Broken lines in the drawing indicate the DS to which the functional segments in the bottom row belong, and show that a series of the functional segments of the PCS, WDS, PDS, ODS, and END constitutes one DS.
  • the reproduction apparatus is able to generate graphics for one screen by reading the functional segments that constitute the DS.
  • the Epochs shown in the top row indicate time periods, and memory management is consecutive timewise along a timeline of the AVClip reproduction in one Epoch.
  • One Epoch also represents a group of data that is assigned to the same period of time.
  • The memories referred to here are the Graphics Plane, which stores the graphics for one screen, and the Object Buffer, which stores decompressed graphics data.
  • The consecutiveness of the memory management means that a flash of the Graphics Plane or of the Object Buffer does not occur within the Epoch, and erasing and rendering of the graphics are performed only in a predetermined rectangular area on the Graphics Plane (a flash here means clearing of all of the stored contents of a plane or a buffer).
  • The size and position of the rectangular area are fixed during one Epoch. As long as the erasing and rendering of the graphics are performed only in the predetermined rectangular area on the Graphics Plane, synchronized reproduction of the picture and the graphics is guaranteed.
  • In other words, the Epoch is a unit on the reproduction timeline, and in this unit the picture and the graphics are guaranteed to be reproduced synchronously. When moving the area in which the graphics are erased and rendered to a different position, it is necessary to define a point on the timeline at which to move the area, and the period after that point becomes a new Epoch. The synchronized reproduction is not guaranteed at the border between two Epochs.
  • one Epoch is a time period in which subtitles are displayed in the same rectangular area on the screen.
  • FIG.6 illustrates a relation between the position of the subtitles and the Epochs.
  • the positions at which the five subtitles "Actually", “I was hiding”, “my feelings.”, “I always”, and “loved you.” are shown move according to the picture in the film.
  • a time period during which the subtitles appear at the bottom is an Epoch 1
  • A subsequent time period during which the subtitles appear at the top is an Epoch 2.
  • The Epochs 1 and 2 each have a different area in which the subtitles are rendered.
  • The area in the Epoch 1 is a Window 1 positioned at the bottom of the screen, and the area in the Epoch 2 is a Window 2 positioned at the top of the screen.
  • The memory management is consecutive in each of the Epochs 1 and 2, and accordingly rendering of the subtitles in the Windows 1 and 2 is synchronous with the pictures.
  • FIG.5 indicates which functional segments in the middle row belong to which Epoch.
  • A series of DS, "Epoch Start", "Acquisition Point", and "Normal Case", constitutes the Epoch in the top row.
  • The "Epoch Start", "Acquisition Point", and "Normal Case" are types of the DS, and the order between the "Acquisition Point" and the "Normal Case" does not matter; either of them may come first.
  • The Epoch Start is a DS that has a display effect of "new display", which indicates the start of a new Epoch. Because of this, the Epoch Start contains all the functional segments needed to display a new composition of the screen.
  • the Epoch Start is provided at a position which is a target of a skip operation of the AVClip, such as a chapter in a film.
  • The Acquisition Point is a DS that has a display effect of "display refresh", and is identical, in the content used for rendering the graphics, to the preceding Epoch Start DS.
  • the Acquisition Point is not provided at a starting point of the Epoch, but contains all functional segments needed to display the new composition of the screen. Therefore, it is possible to display the graphics without fail when a skip operation to the Acquisition Point is performed. Accordingly, with the Acquisition Point, it is possible to compose a screen in the middle of the Epoch.
  • the Acquisition Point is provided at a position that could be a target for the skip operation.
  • An example of such a position is a position that could be specified when performing a time search.
  • The time search is an operation in response to a user's input of a time, to start reproducing from a reproducing point corresponding to the time specified by the user.
  • The time is specified roughly, such as in units of 10 minutes or 10 seconds, and accordingly points at which the reproduction may start are provided at, for example, 10-minute or 10-second intervals.
  • The Normal Case is a DS that has a display effect of "display update", and contains only elements that are different from the preceding composition of the screen.
  • The DSv is provided so as to include only the PCS, which makes the DSv a Normal Case.
  • By this, it is not necessary to provide an ODS with the same content as the ODS in the preceding DS, and the data size on the BD-ROM may be reduced.
  • Because the DS as the Normal Case contains only the difference, it is not possible to compose the screen using the Normal Case alone.
  • the Object Definition Segment is a functional segment that defines the Graphics Object.
  • An explanation of the Graphics Object is given first.
  • A selling point of the AVClip recorded on the BD-ROM is its resolution, as high as hi-vision, and therefore the resolution for the Graphics Object is set at 1920x1080 pixels. Because of the high resolution of 1920x1080 pixels, it is possible to display a specific character style for the subtitles clearly on the screen.
  • The bit length of an index value for each pixel (Color Difference Red Cr, Color Difference Blue Cb, Luminance Y, and Transparency T) is 8 bits, and thus it is possible to choose any 256 colors out of full color (16,777,216 colors) for the subtitles.
  • the subtitles realized by the Graphics Object are rendered by positioning texts on a transparent background.
  • The ODS is made of segment_type indicating that the segment is the ODS, segment_length indicating a data length of the ODS, object_id uniquely identifying the Graphics Object corresponding to the ODS in the Epoch, object_version_number indicating a version of the ODS within the Epoch, last_in_sequence_flag, and object_data_fragment, which is a consecutive sequence of bytes corresponding to a part or all of the Graphics Object.
  • The object_id is for uniquely identifying the Graphics Object corresponding to the ODS in the Epoch.
  • the Epoch of the graphics stream contains more than one ODS having the same ID.
  • The ODS having the same ID also have the same width and height, and are assigned a common area in the Object Buffer. After one of the ODS having the same ID is read into the common area, it is overwritten by a subsequent ODS having the same ID. By overwriting the ODS that has been read into the Object Buffer with subsequent ODS having the same ID as the reproduction of the video stream proceeds, the graphics of the ODS are updated accordingly.
  • The size constraint that the width and height of the Graphics Objects having the same ID should be the same applies only during one Epoch; Graphics Objects in different Epochs may have different sizes.
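  • As a hedged illustration of the ODS fields listed above and of the shared Object Buffer area for Graphics Objects with the same object_id, here is a simplified model; it is not the binary segment syntax.

```python
# Simplified model of the ODS fields named above and of the Object Buffer
# behaviour in which a later ODS with the same object_id overwrites the
# graphics stored by an earlier one.
from dataclasses import dataclass
from typing import Dict


@dataclass
class ObjectDefinitionSegment:
    segment_type: int            # identifies the segment as an ODS
    segment_length: int          # data length of the ODS
    object_id: int               # identifies the Graphics Object within the Epoch
    object_version_number: int   # version of the ODS within the Epoch
    last_in_sequence_flag: bool  # whether this fragment completes the Graphics Object
    object_data_fragment: bytes  # part or all of the Graphics Object data


class ObjectBuffer:
    """Decompressed graphics keyed by object_id; a subsequent ODS with the
    same id overwrites the area used by the previous one."""

    def __init__(self) -> None:
        self._areas: Dict[int, bytes] = {}

    def store(self, ods: ObjectDefinitionSegment, decoded_pixels: bytes) -> None:
        # Within one Epoch, Graphics Objects sharing an id also share width
        # and height, so reusing the same area is safe.
        self._areas[ods.object_id] = decoded_pixels
```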
  • the PDS is used to define a palette for a color conversion .
  • FIG.7B shows syntax of the PDS.
  • The PDS is made of segment_type indicating that the segment is the PDS, segment_length indicating a data length of the PDS, palette_id uniquely identifying the palette contained in the PDS, palette_version_number indicating a version of the PDS within the Epoch, and palette_entry_id specifying an entry number of the palette.
  • The palette_entry_id indicates the Color Difference Red (Cr_value), the Color Difference Blue (Cb_value), the Luminance (Y_value), and the Transparency (T_value).
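  • A minimal sketch of the palette data carried by the PDS (the field names mirror those above; the binary layout is not reproduced):

```python
# Sketch of a palette entry and of the PDS fields described above. Each 8-bit
# index maps to Cr, Cb, Y and transparency values used for colour conversion.
from dataclasses import dataclass
from typing import Dict


@dataclass
class PaletteEntry:
    cr_value: int  # Colour Difference Red
    cb_value: int  # Colour Difference Blue
    y_value: int   # Luminance
    t_value: int   # Transparency


@dataclass
class PaletteDefinitionSegment:
    segment_type: int
    segment_length: int
    palette_id: int
    palette_version_number: int
    entries: Dict[int, PaletteEntry]  # keyed by palette_entry_id (0..255)
```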
  • WDS (Window Definition Segment)
  • the WDS is used to define the rectangular area on the Graphics Plane. As described in the above, the memory management is sequential only when erasing and rendering is performed within a certain area on the Graphics Plane.
  • The area on the Graphics Plane defined by the WDS is called a "Window".
  • FIG.8A illustrates syntax of the WDS.
  • The WDS is made of segment_type indicating that the segment is the WDS, segment_length indicating a data length of the WDS, window_id uniquely identifying the Window on the Graphics Plane, window_horizontal_position specifying a horizontal address of a top left pixel of the Window on the Graphics Plane, window_vertical_position specifying a vertical address of the top left pixel of the Window on the Graphics Plane, window_width specifying a width of the Window on the Graphics Plane, and window_height specifying a height of the Window on the Graphics Plane.
  • Ranges of values that the window_horizontal_position, window_vertical_position, window_width, and window_height may take are explained below.
  • The coordinate system for those values is the internal area of the Graphics Plane, whose size is given two-dimensionally by video_height for the height and video_width for the width.
  • The window_horizontal_position specifies the horizontal address of the top left pixel of the Window on the Graphics Plane, and is within a range of 0 to (video_width)-1.
  • The window_vertical_position specifies the vertical address of the top left pixel of the Window on the Graphics Plane, and is within a range of 0 to (video_height)-1.
  • The window_width specifies the width of the Window on the Graphics Plane. The specified width falls within a range of 1 to (video_width)-(window_horizontal_position).
  • The window_height specifies the height of the Window on the Graphics Plane, and the specified height is within a range of 1 to (video_height)-(window_vertical_position).
  • The position and size of the Window on the Graphics Plane for each Epoch are defined by the window_horizontal_position, window_vertical_position, window_width, and window_height. Accordingly, it is possible to adjust the position and size of the Window at authoring so that the Window in one Epoch appears at a position that does not get in the way of the picture when viewing the film. By this, the visibility of the subtitles becomes higher. Because the WDS is defined for each Epoch, it is possible to adjust the position of the Window according to the picture, even if the picture changes in the course of time. As a result, the quality of the film is maintained as high as in a case where the subtitles are incorporated in the main body of the film.
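  • The value ranges above translate directly into simple checks; the sketch below is an illustration of those ranges (video_width and video_height stand for the size of the Graphics Plane), not part of the recorded syntax.

```python
# Sketch of the WDS fields and the value ranges stated above.
from dataclasses import dataclass


@dataclass
class WindowDefinitionSegment:
    window_id: int
    window_horizontal_position: int
    window_vertical_position: int
    window_width: int
    window_height: int

    def check_ranges(self, video_width: int, video_height: int) -> None:
        """Raise AssertionError if any field is outside the stated ranges."""
        assert 0 <= self.window_horizontal_position <= video_width - 1
        assert 0 <= self.window_vertical_position <= video_height - 1
        assert 1 <= self.window_width <= video_width - self.window_horizontal_position
        assert 1 <= self.window_height <= video_height - self.window_vertical_position
```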
  • the End of Display Set Segment provides an indication that a transmission of the DS is completed.
  • The END is inserted into a stream immediately after a last ODS in one DS.
  • the End is made of segment_type indicating that the segment is the END and segment_length indicating a data length of the END.
  • The END does not include any other element that requires further explanation.
  • PCS (Presentation Composition Segment)
  • the PCS is a functional segment that is used for composing an interactive display.
  • FIG.8B illustrates syntax of the PCS. As shown in the drawing, the PCS is made of segment_type, segment_length, composition_number, composition_state, palette_update_flag, palette_id, and window information 1-m.
  • The composition_number identifies the Graphics Update in the DS by values in a range of 0 to 15. If a Graphics Update exists between the head of the Epoch and the PCS, the composition_number is incremented every time a Graphics Update occurs.
  • The composition_state indicates the type of the DS in which the PCS is contained: Normal Case, Acquisition Point, or Epoch Start.
  • the palette_update_flag indicates that the PCS describes a Palette only Display Update.
  • the Palette only Display Update indicates that only the palette is updated from an immediately previous palette.
  • The palette_update_flag field is set to "1" if the Palette only Display Update is performed.
  • the palette_id identifies the palette to be used in the Palette only Display Update.
  • The window information 1-m indicates how to control each Window in the DS to which the PCS belongs.
  • A broken line wd1 in FIG.8B details the internal syntax of window information i.
  • The window information i is made of object_id, window_id, object_cropped_flag, object_horizontal_position, object_vertical_position, and cropping_rectangle information 1-n.
  • The object_id identifies the ODS in the Window corresponding to the window information i.
  • the window_id identifies the Window to which the Graphics Object is allocated in the PCS. Up to two Graphics Objects may be assigned to one Window.
  • The object_cropped_flag is used to switch between display and no-display of a Graphics Object cropped in the Object Buffer.
  • When the object_cropped_flag is set to "1", the Graphics Object cropped in the Object Buffer is displayed, and when it is set to "0", the Graphics Object is not displayed.
  • The object_horizontal_position specifies a horizontal address of the top left pixel of the Graphics Object in the Graphics Plane.
  • the object_vertical_position specifies a vertical address of the top left pixel of the Graphics Object in the Graphics Plane.
  • The cropping_rectangle information 1-n are elements used when the object_cropped_flag is set to "1".
  • A broken line wd2 details the internal syntax of cropping_rectangle information i.
  • The cropping_rectangle information i is made of four fields: object_cropping_horizontal_position, object_cropping_vertical_position, object_cropping_width, and object_cropping_height.
  • The object_cropping_horizontal_position specifies a horizontal address of the top left corner of a cropping rectangle to be used during rendering of the Graphics Object in the Graphics Plane.
  • The cropping rectangle is a cropping frame that is used to specify and crop a part of the Graphics Object, and corresponds to a Region in the ETSI EN 300 743 standard.
  • The object_cropping_vertical_position specifies a vertical address of the top left corner of the cropping rectangle to be used during rendering of the Graphics Object in the Graphics Plane.
  • The object_cropping_width specifies a width of the cropping rectangle.
  • The object_cropping_height specifies a height of the cropping rectangle.
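  • Pulling the PCS fields above together, the following structural sketch (illustrative field names only, not the binary encoding) shows how the window information and cropping rectangles nest inside the PCS.

```python
# Structural sketch of the PCS, its per-window composition information and
# the cropping rectangle fields described above.
from dataclasses import dataclass, field
from typing import List


@dataclass
class CroppingRectangle:
    object_cropping_horizontal_position: int
    object_cropping_vertical_position: int
    object_cropping_width: int
    object_cropping_height: int


@dataclass
class WindowInformation:
    object_id: int
    window_id: int
    object_cropped_flag: bool
    object_horizontal_position: int
    object_vertical_position: int
    cropping_rectangles: List[CroppingRectangle] = field(default_factory=list)


@dataclass
class PresentationCompositionSegment:
    composition_number: int    # 0..15, incremented on each Graphics Update
    composition_state: str     # "Epoch Start", "Acquisition Point" or "Normal Case"
    palette_update_flag: bool  # True for a Palette only Display Update
    palette_id: int
    window_information: List[WindowInformation] = field(default_factory=list)
```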
  • a specific example of the PCS is detailed below.
  • the subtitles "Actually", "I was hiding", and "my feelings.” as shown in FIG.6 appear gradually by writing to the Graphics Piane 3 tim.es as tne picture proceeds .
  • FIG.9 is an example of description for realizing such a subtitle ⁇ ispiay.
  • An Epoch in the drawing includes a DS1 (Epoch Start) , a DS2 (Normal Case) , and a DS3 (Normal Case) .
  • the DS1 contains a WDS for specifying the Window in which the subtitles are displayed, an ODS for specifying the line "Actually... I was hiding m.y feelings.”, and a first PCS.
  • Tne DS2 contains a second PCS
  • the DS3 contains a third PCS.
  • FIGs.10-12 illustrate examples of the WDS and PCS contained in the DS.
  • FIG.10 shows an example of the PCS in the DS1.
  • The window_horizontal_position and the window_vertical_position of the WDS are indicated by LP1, a position of the top left pixel of the Window on the Graphics Plane.
  • The window_width and window_height indicate the width and height of the Window, respectively.
  • The object_cropping_horizontal_position and object_cropping_vertical_position indicate a reference point ST1 of the cropping rectangle in the coordinate system whose origin is the top left pixel of the Graphics Object.
  • The cropping rectangle is an area having the width from ST1 to the object_cropping_width and the height from ST1 to the object_cropping_height (a rectangle shown by a heavy-line frame).
  • The cropped Graphics Object is positioned within a rectangle shown by a broken-line frame cp1, with its reference point at the object_horizontal_position and object_vertical_position (the top left pixel of the Graphics Object) in the coordinate system of the Graphics Plane.
  • The subtitle "Actually..." is written to the Window on the Graphics Plane, and then composed with the movie picture and displayed on the screen.
  • FIG.11 shows an example of the PCS in the DS2.
  • The WDS in the DS2 is not explained, because it is the same as the WDS in the DS1.
  • a description of the cropping information in the DS2 is different from the description of the cropping information shown in FIG.10.
  • The object_cropping_horizontal_position and object_cropping_vertical_position in the cropping information indicate the top left pixel of the subtitle "I was hiding" out of "Actually... I was hiding my feelings."
  • The object_cropping_width and object_cropping_height indicate a width and a height of a rectangle containing the subtitle "I was hiding". By this, the subtitle "I was hiding" is written to the Window on the Graphics Plane, and then composed with the movie picture and displayed on the screen.
  • FIG.12 shows an example of the PCS in the DS3.
  • the WDS in the DS3 is not explained, because the WDS in the DS3 is the same as the WDS in the DS1.
  • a description of the cropping information in the DS3 is different from the description of the cropping information shown in FIG.10.
  • the object_cropping_horizontal_position and object_cropping_vertical_position in the cropping information indicate a top left pixel of the subtitle "my feelings." out of "Actually... I was hiding my feelings." in the Object Buffer.
  • the object_cropping_width and object_cropping_height indicate a width and a height of a rectangle containing the subtitle "my feelings.". By this, the subtitle "my feelings." is written to the Window on the Graphics Plane, and then composed with the movie picture and displayed on the screen. By describing the DS1, DS2, and DS3 as explained above, it is possible to display the subtitles so that they appear on the screen gradually.
  • FIG.13 shows an example of the description of the DS when Cut-In/Out is performed, illustrating along a timeline.
  • x and y in Window respectively indicate values of the window_vertical_position and window_horizontal_position.
  • u and v respectively indicate values of the window_width and window_height.
  • a and b in Cropping Rectangle respectively indicate values of the object_cropping_vertical_position and object_cropping_horizontal_position.
  • c and d indicate values of the object_cropping_width and object_cropping_height, respectively.
  • Display Sets DS11, DS12, and DS13 are at points t11, t12, and t13 on the reproduction timeline in the drawing.
  • the DS11 at the point t11 includes a PCS#0 in which the composition_state is "Epoch Start" and the object_cropped_flag is "0" (no_cropping_rectangle_visible), a WDS#0 having a statement for a Window in a width 700 x height 500 at (100,100) in the Graphics Plane, a PDS#0, an ODS#0 indicating a subtitle "Credits:", and an END.
  • the DS12 at the point t12 includes a PCS#1 whose composition_state is "Normal Case" and indicating a crop operation of the Graphics Object to be in a 600x400 size from (0,0) in the Object Buffer (cropping_rectangle#0(0,0,600,400)), and positioning the cropped Graphics Object at the coordinates (0,0) in the Graphics Plane (on Window#0(0,0)).
  • the DS13 at the point t13 includes a PCS#2 whose composition_state is "Normal Case" and in which the object_cropped_flag is set to "0" so as to erase the cropped Graphics Object (no_cropping_rectangle_visible).
  • FIG.14 shows an example of the description of the DS when Fade-In/Out is performed, illustrating along a timeline.
  • Display Sets DS21, DS22, DS23, and DS24 are at points t21, t22, t23, and t24 on the reproduction timeline in the drawing.
  • the DS21 at the point t21 includes a PCS#0 whose composition_state is "Epoch Start" and indicating the crop operation of the Graphics Object to be in a 600x400 size from (0,0) in the Object Buffer (cropping_rectangle#0(0,0,600,400)), and positioning the cropped Graphics Object at the coordinates (0,0) in the Graphics Plane (on Window#0(0,0)), a WDS#0 having a statement for a Window in a width 700 x height 500 at (100,100) in the Graphics Plane, a PDS#0, an ODS#0 indicating a subtitle "Fin", and an END.
  • the DS22 at the point t22 includes a PCS#1 whose composition_state is "Normal Case", and a PDS#1.
  • the PDS#1 indicates the same level of Cr and Cb as the PDS#0, but a luminance indicated by the PDS#1 is higher than the luminance in the PDS#0.
  • the DS23 at the point t23 includes a PCS#2 whose composition_state is "Normal Case", a PDS#2, and an END.
  • the PDS#2 indicates the same level of Cr and Cb as the PDS#1, but the luminance indicated by the PDS#2 is lower than the luminance in the PDS#1.
  • the DS24 at the point t24 includes a PCS#3 whose composition_state is "Normal Case" and in which the object_cropped_flag is "0" (no_cropping_rectangle_visible), and an END.
  • Each DS specifies a different PDS from a preceding DS, and accordingly, the luminance of the Graphics Object that is rendered with more than one PCS in one Epoch becomes gradually higher, or lower. By this, it is possible to realize the effect of Fade-In/Out.
  • FIG.15 shows an example of the description of the DS when Scrolling is performed, illustrating along a timeline.
  • Display Sets DS31, DS32, DS33, and DS34 are at points t31, t32, t33, and t34 on the reproduction timeline in the drawing.
  • the DS31 at the point t31 includes a PCS#0 whose composition_state is set to "Epoch Start" and object_cropped_flag is "0" (no_cropping_rectangle_visible), a WDS#0 having a statement for a Window in a width 700 x height 500 at (100,100) in the Graphics Plane, a PDS#0, an ODS#0 indicating a subtitle "Credits: Company", and an END.
  • the DS32 at the point t32 includes a PCS#1 whose composition_state is "Normal Case" and indicating the crop operation of the Graphics Object to be in a 600x400 size from (0,0) in the Object Buffer (cropping_rectangle#0(0,0,600,400)), and positioning the cropped Graphics Object at the coordinates (0,0) in the Graphics Plane (on Window#0(0,0)).
  • An area of the 600x400 size from (0,0) in the Object Buffer includes a part "Credits:" of the subtitle "Credits: Company" shown in two lines, and thus the part "Credits:" appears on the Graphics Plane.
  • the DS33 at the point t33 includes a PCS#2 whose composition_state is "Normal Case" and indicating the crop operation of the Graphics Object to be in a 600x400 size from (0,100) in the Object Buffer (cropping_rectangle#0(0,100,600,400)), and positioning the cropped Graphics Object at the coordinates (0,0) in the Graphics Plane (on Window#0(0,0)).
  • the area of the 600x400 size from (0,100) in the Object Buffer includes the part "Credits:" and a part "Company" of the subtitle "Credits: Company" shown in two lines, and thus the parts "Credits:" and "Company" appear in two lines on the Graphics Plane.
  • the DS34 at the point t34 includes a PCS#3 whose composition_state is "Normal Case" and indicating the crop operation of the Graphics Object to be in a 600x400 size from (0,200) in the Object Buffer (cropping_rectangle#0(0,200,600,400)), and positioning the cropped Graphics Object at the coordinates (0,0) in the Graphics Plane (on Window#0(0,0)).
  • the area of the 600x400 size from (0,200) in the Object Buffer includes the part "Company" of the subtitle "Credits: Company" shown in two lines, and thus the part "Company" appears on the Graphics Plane.
  • FIG.16 shows an example of the description of the DS when Wipe-In/Out is performed, illustrating along a timeline.
  • Display Sets DS51, DS52, DS53, and DS54 are at points t51, t52, t53, and t54 on the reproduction timeline in the drawing.
  • the DS51 at the point t51 includes a PCS#0 whose composition_state is set to "Epoch Start" and the object_cropped_flag is "0" (no_cropping_rectangle_visible), a WDS#0 having a statement for a Window in a width 700 x height 500 at (100,100) in the Graphics Plane, a PDS#0, an ODS#0 indicating a subtitle "Fin", and an END.
  • the DS52 at the point t52 includes a PCS#1 whose composition_state is "Normal Case" and indicating the crop operation of the Graphics Object to be in a 600x400 size from (0,0) in the Object Buffer (cropping_rectangle#0(0,0,600,400)), and positioning the cropped Graphics Object at the coordinates (0,0) in the Graphics Plane (on Window#0(0,0)).
  • An area of the 600x400 size from (0,0) in the Object Buffer includes the subtitle "Fin", and thus the subtitle "Fin" appears on the Graphics Plane.
  • the DS53 at the point t53 includes a PCS#2 whose composition_state is "Normal Case" and indicating the crop operation of the Graphics Object to be in a 400x400 size from (200,0) in the Object Buffer (cropping_rectangle#0(200,0,400,400)), and positioning the cropped Graphics Object at the coordinates (200,0) in the Graphics Plane (on Window#0(200,0)). By this, an area indicated by coordinates (200,0) and (400,400) in the Window becomes a display area, and an area indicated by coordinates (0,0) and (199,400) becomes a no-display area.
  • the DS54 at the point t54 includes a PCS#3 whose composition_state is "Normal Case" and indicating the crop operation of the Graphics Object to be in a 200x400 size from (400,0) in the Object Buffer (cropping_rectangle#0(400,0,200,400)), and positioning the cropped Graphics Object at the coordinates (400,0) in the Graphics Plane (on Window#0(400,0)).
  • an area indicated by coordinates (0,0) and (399,400) becomes the no-display area.
  • Constraints for realizing the above effects are as follows. In order to realize the Scrolling effect, operations for clearing and redrawing of the Window become necessary. Taking the example of FIG.15, it is necessary to perform a "window clear" to erase the Graphics Object "Credits:" at the t32 from the Graphics Plane, and then to perform a "window redraw" to write a lower part of "Credits:" and an upper part of "Company" to the Graphics Plane during an interval between the t32 and t33. Given that the interval is the same as an interval of video frames, the transfer rate between the Object Buffer and the Graphics Plane required for the Scrolling effect becomes an important consideration.
  • Rc is the transfer rate between the Object Buffer and the Graphics Plane.
  • a worst scenario here is to perform both of the Window clear and Window redraw at the rate Rc. In this case, each of the Window clear and Window redraw is required to be performed at a rate half of Rc (Rc/2).
  • the Window size accounts for at least 25% to 33% of the Graphics Plane.
  • a total number of pixels in the Graphics Plane is 1920x1080. Taking that an index bit length per pixel is 8 bits, a total capacity of the Graphics Plane is 2 Mbytes (approximately 1920x1080x1 byte).
  • If the rate for the Window clear and Window redraw is a half or a quarter of the frame rate, it is possible to double or quadruple the size of the Window even if the Rc is the same.
  • By setting the Window size to 25% to 33% of the Graphics Plane and displaying the subtitles at the transfer rate of 256 Mbps, it is possible to maintain the sync display between the graphics and the movie picture, no matter what kind of display effect is to be realized.
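  • the figures above can be checked with a short calculation. The sketch below assumes 1 byte (8-bit index colour) per pixel and an NTSC-like frame rate of 29.97 frames per second, neither of which is stated as an exact value in this section.
    # Hedged back-of-the-envelope check of the 256 Mbps figure.
    GRAPHICS_PLANE_PIXELS = 1920 * 1080      # total pixels in the Graphics Plane
    PLANE_BYTES = GRAPHICS_PLANE_PIXELS      # about 2 Mbytes at 1 byte (8-bit index) per pixel
    WINDOW_BYTES = PLANE_BYTES // 4          # Window limited to 25% of the Graphics Plane
    FRAME_RATE = 29.97                       # assumed video frame rate

    # Window clear and Window redraw each run at Rc/2, so one frame must carry
    # two Window-sized transfers: Rc = 2 x window size x frame rate (bits/s).
    required_rc = 2 * WINDOW_BYTES * 8 * FRAME_RATE
    print(f"required Rc is about {required_rc / 1e6:.0f} Mbps")   # roughly 249 Mbps, i.e. about 256 Mbps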
  • the position, size, and area of the Window are explained.
  • the position and area of the Window do not change in one Epoch.
  • the position and the size of the Window are set to be the same during one Epoch because it is necessary to change a target write address of the Graphics Plane if the position and the size change, and changing the address causes an overhead that lowers the transfer rate from the Object Buffer to the Graphics Plane.
  • the number of Graphics Objects per Window has a limitation.
  • the limitation of the number is provided in order to reduce the overhead in transferring decoded Graphics Objects.
  • the overhead here is generated when setting the address of an edge of the Graphics Object; the more edges there are, the more overhead is generated.
  • FIG.17 shows examples in comparison, an example in which a Window has four Graphics Objects and another example in which a Window has two Graphics Objects.
  • the number of the edges of the example with four Graphics Objects is twice the number of the edges of the example with two Graphics Objects.
  • By limiting the number of Graphics Objects, the transfer rate may be set taking up to 4 overheads into account. Accordingly, it is easier to determine a minimum transfer rate.
  • the Epoch is a period of time in which a memory management is consecutive along the reproduction timeline. Since the Epoch is made of more than one DS, how to assign the DS to the reproduction timeline of the AVClip is important.
  • the reproduction timeline of the AVClip is a timeline for specifying timings for decoding and reproducing of each piece of picture data that constitutes the video stream multiplexed to the AVClip. The decoding and reproducing timings on the reproduction timeline are expressed at an accuracy of 90 KHz.
  • a DTS and PTS that are attached to the PCS and ODS in the DS indicate timings for a synchronic control on the reproduction timeline.
  • the assigning of the Display Set to the reproduction timeline means performing the synchronic control using the DTS and PTS attached to the PCS and ODS.
  • the DTS indicates, at the accuracy of 90 KHz, a time when the decoding of the ODS starts, and the PTS indicates a time when the decoding ends .
  • the decoding of the ODS does not finish at once, and has a certain length of time.
  • the DTS and PTS of the ODS respectively indicate the times when the decoding starts and ends.
  • the value of the PTS indicates the deadline, and therefore it is necessary that the decoding of the ODS is completed and the decompressed Graphics Object is written to the Object Buffer on the reproduction apparatus by the time indicated by the PTS.
  • the decode starting time of any ODSj in a DSn is indicated by a DTS(DSn[ODSj]) at the accuracy of 90 KHz.
  • Adding a maximum length of the decode duration to the DTS(DSn[ODSj]) is the time when the decoding of the ODSj ends.
  • a size of the ODSj is "SIZE(DSn[ODSj])" and a decoding rate of the ODS is "Rd".
  • the maximum time required for decoding, indicated in seconds, is expressed as "SIZE(DSn[ODSj])//Rd".
  • the symbol "//” indicates an operator for a division with rounding up after a decimal place.
  • the PTS of the ODSj in the DSn is expressed in the following equation:
    PTS(DSn[ODSj]) = DTS(DSn[ODSj]) + 90,000 × (SIZE(DSn[ODSj]) // Rd)
  • the DTS of the PCS in the DSn is set at the decode starting time of the first ODS, and the PTS of the PCS is obtained by adding a decodeduration(DSn) to that DTS:
    DTS(DSn[PCS]) = DTS(DSn[ODS1])
    PTS(DSn[PCS]) = DTS(DSn[PCS]) + decodeduration(DSn)
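  • a minimal sketch of the ODS timing formula above, assuming the DTS is given in 90 KHz ticks, the size in bits, and Rd in bits per second; the function names are illustrative only.
    def ceil_div(a, b):
        # the "//" operator of the text: division with rounding up after the decimal place
        return -(-a // b)

    def pts_of_ods(dts_90khz, ods_size, rd):
        # PTS(DSn[ODSj]) = DTS(DSn[ODSj]) + 90,000 x (SIZE(DSn[ODSj]) // Rd)
        max_decode_seconds = ceil_div(ods_size, rd)
        return dts_90khz + 90_000 * max_decode_seconds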
  • the "decodeduration(DSn)" indicates a time duration for decoding all the Graphics Objects used for updating the PCS.
  • the decode duration is not a fixed value; however, it does not vary according to a status of the reproduction apparatus or a device or software mounted to the reproduction apparatus.
  • the decodeduration(DSn) is affected by time (i) needed for clearing the Window, decode durations (ii) for decoding a DSn.PCSn.OBJ, and time (iii) needed for writing of the DSn.PCSn.OBJ.
  • Accordingly, the decode_duration(DSn) is always the same. Therefore, the PTS is calculated by calculating lengths of these durations at authoring.
  • FIGs.19, 20A and 20B are flowcharts schematically showing algorithms of the program. An explanation about the calculation of the decode_duration is given below referring to these drawings.
  • a PLANEINITIALIZE function is called (Step S1 in FIG.19).
  • the PLANEINITIALIZE function is used for calling a function for calculating a time period necessary to initialize the Graphics Plane.
  • In Step S4, a time period necessary to clear Window[i] defined by the WDS is added to the initialize_duration for all Windows.
  • Given that the transfer rate Rc between the Object Buffer and the Graphics Plane is 256,000,000 as described above and a total size of Window[i] that belongs to the WDS is ΣSIZE(WDS.WIN[i]),
  • the time period necessary to clear is "ΣSIZE(WDS.WIN[i]) // 256,000,000".
  • FIG.20B is a flowchart showing an operation of the WAIT function .
  • the decode_duration of an invoker is set as a current_duration.
  • An object_definition_ready_time is a variable set to the PTS of the Graphics Object of the DS.
  • a current_time is a variable set to a total value of the current_duration and the DTS of the PCS in the DSn.
  • the decode_duration is set to the time period in which the return value of the WAIT function is added to the time period necessary for re-drawing the Window (90,000 × (SIZE(DSn.WDS.WIN[0])) // 256,000,000).
  • In Steps S15-S17, following the judgment in Step S11, the time necessary to redraw the Window to which OBJ[0] belongs (90,000 × (SIZE(DSn.WDS.OBJ[0].window_id)) // 256,000,000) is added to the decode_duration (Step S15), the WAIT function is called using OBJ[1] as an argument and a return value wait_duration is added to the decode_duration (Step S16), and then the time necessary to redraw the Window to which OBJ[1] belongs (90,000 × (SIZE(DSn.WDS.OBJ[1].window_id)) // 256,000,000) is added to the decode_duration (Step S17).
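  • the flow of FIGs.19-20 can be summarised by the rough sketch below; the names and the per-Object loop are illustrative simplifications (they correspond to the case in which each Graphics Object is drawn into its own Window), not a transcription of the exact flowchart steps.
    RC = 256_000_000                         # transfer rate between Object Buffer and Graphics Plane

    def plane_initialize(window_sizes_bytes):
        # time needed to clear the Windows before drawing, in 90 KHz ticks
        return sum(90_000 * size // RC for size in window_sizes_bytes)

    def wait(current_duration, pcs_dts, obj_decode_end_pts):
        # wait until the Graphics Object has finished decoding, if it has not yet
        current_time = pcs_dts + current_duration
        return max(0, obj_decode_end_pts - current_time)

    def decode_duration(pcs_dts, window_sizes_bytes, objects):
        # objects: list of (decode_end_pts, window_size_bytes), one entry per Graphics Object
        duration = plane_initialize(window_sizes_bytes)
        for decode_end_pts, window_size in objects:
            duration += wait(duration, pcs_dts, decode_end_pts)     # WAIT function
            duration += 90_000 * window_size // RC                  # redraw the Window the Object belongs to
        return duration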
  • the decode_duration is calculated by the above algorithm. A specific manner in which the PTS of the PCS is set is explained below.
  • FIG.21A illustrates a case in which one ODS is included in one Window.
  • FIGs.21B and 21C are timing charts showing values in an order of time that are referred to in FIG.18.
  • A bottom line "ODS Decode" and a middle line "Graphics Plane Access" in each chart indicate two operations that are performed simultaneously when reproducing. The above algorithm is described assuming that these two operations are performed in parallel.
  • the Graphics Plane Access includes a clearing period (1) and a write period (3) .
  • the clearing period (1) indicates either a time period necessary to clear an entire Graphics Plane (90,000 × (size of Graphics Plane // 256,000,000)), or a time period necessary to clear all Windows on the Graphics Plane (Σ(90,000 × (size of Window[i] // 256,000,000))).
  • the write period (3) indicates a time period necessary to render an entire Window (90,000 × (size of Window[i] // 256,000,000)). Further, a decode period (2) indicates a time period between the DTS and the PTS of the ODS.
  • Lengths of the clearing period (1) , the decode period (2) , and the write period ( 3 ) may vary depending on a range to be cleared, a size of ODS to be decoded, and a size of the Graphics Object to be written to the Graphics Plane.
  • a starting point of the decode period (2) in the drawing is the same as a starting point of the clearing period (1).
  • FIG.21B illustrates a case in which the decode period (2) is long, and the decode_duration equals a total of the decode period (2) and the write period (3).
  • FIG.21C illustrates a case in which the clearing period (1) is long, and the decode_duration equals a total of the clearing period (1) and the write period (3).
  • FIGs.22A to 22C illustrate a case in which two ODSs are included in one Window.
  • the decode period (2) in both FIGs.22B and 22C indicates a total time period necessary for decoding two Graphics .
  • the write period (3) indicates a total time period necessary for writing two Graphics to the Graphics Plane.
  • the decode_duration equals a total of the decode period (2) and the write period (3), or a total of the clearing period (1) and the write period (3), depending on which of the decode period (2) and the clearing period (1) is longer.
  • FIG.23A describes a case in which each of two Windows includes an ODS.
  • the decode_duration equals a total of the clearing period (1) and the decode period (2).
  • When the clearing period (1) is shorter than the decode period (2), it is possible to write to a first Window before the decode period (2) ends. Accordingly, the decode_duration does not equal either a total of the clearing period (1) and the write period (3), or a total of the decode period (2) and the write period (3).
  • FIG.23B illustrates a case in which the decode period (2) is longer than a total of the clearing period (1) and the write period (31).
  • the decode_duration equals a total of the decode period (2) and the write period (32).
  • FIG.23C illustrates a case in which a total of the clearing period (1) and the write period (31) is longer than the decode period (2).
  • the decode_duration equals a total of the clearing period (1), the write period (31), and the write period (32).
  • the size of the Graphics Plane is known from a model of the reproduction apparatus in advance. Also, the size of the Window and the size and number of the ODS are known at the time of authoring.
  • Accordingly, the decode_duration equals one of the following totals: the clearing period (1) and the write period (3); the decode period (2) and the write period (3); the decode period (2) and the write period (32); or the clearing period (1), the write period (31), and the write period (32).
  • the DTS of the WDS may be set so as to satisfy an equation below.
  • the PTS of the WDS in the DSn indicates a deadline to start writing to the Graphics Plane. Because it is sufficient to write only to the Window on the Graphics Plane, the time to start writing to the Graphics Plane is determined by subtracting a time period necessary for writing the WDS from the time indicated by the PTS of the PCS.
  • a total size of the WDS is ΣSIZE(WDS.WIN[i]),
  • the time necessary for clearing and re-drawing is "ΣSIZE(WDS.WIN[i]) // 256,000,000", and
  • the time is expressed in the following equation:
    PTS(DSn[WDS]) = PTS(DSn[PCS]) - 90,000 × ΣSIZE(WDS.WIN[i]) // 256,000,000
  • the PTS indicated in the WDS is the deadline, and it is possible to start writing to the Graphics Plane earlier than the PTS.
  • writing of the Graphics Object obtained by the decoding may start at this point.
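  • a minimal sketch of the WDS deadline above; window_sizes_bytes stands for the sizes of the Windows defined by the WDS, and the 256 Mbps rate is the Rc of this embodiment.
    def pts_of_wds(pts_of_pcs, window_sizes_bytes, rc=256_000_000):
        # PTS(DSn[WDS]) = PTS(DSn[PCS]) - 90,000 x sum(SIZE(WDS.WIN[i])) // Rc
        write_ticks = 90_000 * sum(window_sizes_bytes) // rc
        return pts_of_pcs - write_ticks   # latest point at which writing to the Graphics Plane may start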
  • FIGs.24-25 illustrate a specific example of settings of the DTS and PTS in a Display Set based on the settings described above.
  • the example is about a case in which subtitles are displayed by writing to the Graphics Plane four times, and an update is performed for displaying each of two subtitles "what is blu-ray." and "blu-ray is everywhere."
  • FIG.24 illustrates shifts in time of the update in the example.
  • FIG.25A illustrates four Display Sets that are described so as to perform the above explained update.
  • a DS1 includes a PCS1.1 for controlling an update at the t1, a PDS1 for coloring, an ODS1 corresponding to the subtitle "what is blu-ray.", and an END as an ending code of the DS1.
  • a DS2 includes a PCS1.2 for controlling an update at the t2, and an END.
  • a DS3 includes a PCS1.3 for controlling an update at a t3 and an END.
  • a DS4 includes a PCS2 for controlling an update at a t4, a PDS2 for color conversion, an ODS2 corresponding to the subtitle "blu-ray is everywhere.", and an END.
  • PTS(PCS1.1), PTS(PCS1.2), PTS(PCS1.3), and PTS(PCS2) are respectively set at a display point t1 for displaying "what", a display point t2 for displaying "what is", a display point t3 for displaying "what is blu-ray.", and a display point t4 for displaying "blu-ray is everywhere."
  • PTS(ODS1) and PTS(ODS2) are set so as to indicate points that are calculated by subtracting decode_duration from the points indicated by PTS(PCS1.1) and PTS(PCS2), respectively, because the PTS of the PCS is required to be set so as to satisfy the equation below:
    PTS(DSn[PCS]) ≥ DTS(DSn[PCS]) + decodeduration(DSn)
  • PTS(ODS2) is set so as to indicate a point t5 that comes before the point t4
  • PTS(ODS1) is set so as to indicate a point t0 that comes before the point t1.
  • DTS(ODS1) and DTS(ODS2) are set so as to indicate points that are calculated by subtracting decode_duration from the points indicated by PTS(ODS1) and PTS(ODS2), respectively, because DTS(ODS) is required to be set so as to satisfy an equation below.
  • DTS(ODS2) is set so as to indicate a point t0 that comes before the point t5, and
  • DTS(ODS1) is set so as to indicate a point that comes before the point t0.
  • the PTS of ODS1, the DTS of ODS2, and the DTS of the PCS1.2, PCS1.3, and PCS2 are set at the point t0, so as to satisfy a relation indicated by an equation below.
  • Data structures of the Display Sets (PCS, WDS, PDS, ODS) explained above are instances of a class structure described in a programming language.
  • Producers that perform authoring may obtain the data structures on the BD-ROM by describing the class structure according to the syntax provided in the Blu-ray Disc Prerecording Format.
  • FIG.26 illustrates an internal structure of the reproduction apparatus according to the present invention.
  • the reproduction apparatus according to the present invention is industrially produced based on the internal structure shown in the drawing.
  • the reproduction apparatus according to the present invention is mainly structured by three parts: a system LSI, a drive device, and a microcomputer system, and it is possible to industrially produce the reproduction apparatus by mounting the three parts to a cabinet and a substrate of the apparatus.
  • the system LSI is an integrated circuit in which various processing units for carrying out a function of the reproduction apparatus are integrated.
  • the reproduction apparatus manufactured in the above manner comprises a BD drive 1, a Read Buffer 2, a PID filter 3, Transport Buffers 4a-4c, a peripheral circuit 4d, a Video Decoder 5, a Video Plane 6, an Audio Decoder 7, a Graphics Plane 8, a CLUT unit 9, an adder 10, a Graphics Decoder 12, a Coded Data Buffer 13, a peripheral circuit 13a, a Stream Graphics Processor 14, an Object Buffer 15, a Composition Buffer 16, and a Graphical Controller 17.
  • the BD drive 1 performs load/read/eject of the BD-ROM, and accesses the BD-ROM.
  • the Read Buffer 2 is a FIFO memory for storing the TS packets read from the BD-ROM in a first-in first-out order.
  • the PID filter 3 filters more than one TS packet outputted from the Read Buffer 2.
  • the filtering by the PID filter 3 is to write only the TS packets having a desired PID to the Transport Buffers 4a-4c. Buffering is not necessary for the filtering by the PID filter 3, and accordingly, the TS packets inputted to the PID filter 3 are written to the Transport Buffers 4a-4c without delay.
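  • the filtering can be pictured with the short sketch below; it assumes standard 188-byte MPEG-2 TS packets (the 13-bit PID sits in bytes 1 and 2) and hypothetical Python containers standing in for the Transport Buffers.
    def pid_of(ts_packet: bytes) -> int:
        # the 13-bit PID occupies the low 5 bits of byte 1 and all of byte 2
        return ((ts_packet[1] & 0x1F) << 8) | ts_packet[2]

    def pid_filter(ts_packets, wanted_pids, transport_buffers):
        # write only the TS packets having a desired PID to the Transport Buffers
        for pkt in ts_packets:
            pid = pid_of(pkt)
            if pid in wanted_pids:
                transport_buffers[pid].append(pkt)   # passed on without extra buffering delay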
  • the Transport Buffers 4a-4c are for storing the TS packets outputted from the PID filter 3 in a first-in first-out order.
  • a speed at which the TS packets from the Transport Buffers 4a-4c are outputted is a speed Rx.
  • the peripheral circuit 4d is a wired logic for converting the TS packets read from the Transport Buffers 4a-4c into functional segments.
  • the functional segments obtained by the conversion are stored in the Coded Data Buffer 13.
  • the Video Decoder 5 decodes the more than one TS packets outputted from the PID filter 3 into a decompressed picture and writes to the Video Plane 6.
  • the Video Plane 6 is a plane memory for a moving picture.
  • The Audio Decoder 7 decodes the TS packets outputted from the PID filter 3 and outputs decompressed audio data.
  • the Graphics Plane 8 is a plane memory having an area for one screen, and is able to store decompressed graphics for one screen.
  • the CLUT unit 9 converts an index color of the decompressed Graphics stored in the Graphics Plane 8 based on the values for Y, Cr, and Cb indicated by the PDS.
  • the adder 10 multiplies the decompressed Graphics to which the color conversion has been performed by the CLUT unit 9 by the T value (Transparency) indicated by the PDS, adds the decompressed picture data stored in the Video Plane per pixel, and then obtains and outputs the composed image.
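  • a hedged sketch of the colour conversion and composition; the (1 - T) weighting of the video pixel is the usual alpha-blend convention and is an assumption here, as is the normalisation of T to the range 0.0-1.0.
    def compose_pixel(graphics_index, palette, video_ycrcb):
        # CLUT unit 9: look up Y, Cr, Cb and transparency T for the index colour (from the PDS)
        y, cr, cb, t = palette[graphics_index]
        vy, vcr, vcb = video_ycrcb
        # adder 10: graphics x T plus video x (1 - T), per component
        return (y * t + vy * (1 - t),
                cr * t + vcr * (1 - t),
                cb * t + vcb * (1 - t))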
  • the Graphics Decoder 12 decodes the Graphics Stream to obtain the decompressed graphics, and writes the decompressed graphics as the Graphics Object to the Graphics Plane 8. By decoding the Graphics Stream, the subtitles and menus appear on the screen.
  • the Graphics Decoder 12 includes the Coded Data Buffer 13, the peripheral circuit 13a, the Stream Graphics Processor 14, the Object Buffer 15, the Composition Buffer 16, and the Graphical Controller 17.
  • the Coded Data Buffer 13 is a buffer in which the functional segment is stored along with the DTS and PTS.
  • the functional segment is obtained by removing a TS packet header and a PES packet header from each TS packet in the Transport Stream stored in the Transport Buffers 4a-4c and by arranging the payloads sequentially.
  • the PTS and DTS out of the removed TS packet header and PES packet header are stored in correspondence with the PES packets.
  • the peripheral circuit 13a is a wired logic that realizes a transfer between the Coded Data Buffer 13 and the Stream Graphics Processor 14, and a transfer between the Coded Data Buffer 13 and the Composition Buffer 16.
  • the ODS is transferred from the Coded Data Buffer 13 to the Stream Graphics Processor 14.
  • when the current time is a time indicated by the DTS of the PCS and PDS, the PCS and PDS are transferred to the Composition Buffer 16.
  • the Stream Graphics Processor 14 decodes the ODS, and writes the decompressed graphics of the index color obtained by decoding as the Graphics Object to the Object Buffer 15.
  • the decoding by the Stream Graphics Processor 14 starts at the time of the DTS corresponding to the ODS, and ends by the decode end time indicated by the PTS corresponding to the ODS.
  • the decoding rate Rd of the Graphics Object is an output rate of the Stream Graphics Processor 14.
  • the Object Buffer 15 is a buffer corresponding to a pixel buffer in the ETSI EN 300 743 standard, in which the Graphics Object obtained by the decoding that the Stream Graphics Processor 14 performs is disposed.
  • the Object Buffer 15 needs to be set to twice or four times as large as the Graphics Plane 8, because in case the Scrolling effect is performed, the Object Buffer 15 needs to store a Graphics Object that is twice or four times as large as the Graphics Plane.
  • the Composition Buffer 16 is a memory in which the PCS and PDS are disposed.
  • the Graphical Controller 17 decodes the PCS disposed in the Composition Buffer 16, and performs a control based on the PCS. A timing for performing the control is based on the PTS attached to the PCS.
  • FIG.27 illustrates sizes of the write rates Rx, Rc, and Rd, the Graphics Plane 8, Coded Data Buffer 13, Object Buffer 15, and Composition Buffer 16.
  • the transfer rate Rd (Pixel Decoding Rate) between the Stream Graphics Processor 14 and Object Buffer 15 does not need to be updated every video frame cycle, and 1/2 or 1/4 of the Rc is sufficient for the Rd. Accordingly, the Rd is either 128 Mbps or 64 Mbps.
  • each element performs a decoding operation in a pipeline structure.
  • FIG.28 is a timing chart illustrating a pipeline processing by the reproduction apparatus .
  • a 5th row in the drawing shows the Display Set.
  • a 4th row shows read periods of the PCS, WDS, PDS, and ODS to the Coded Data Buffer 13.
  • a 3rd row shows decode periods of each ODS by the Stream Graphics Processor 14, and another row shows operations that the Graphical Controller 17 performs.
  • the DTS (decode starting time) attached to the ODS1 and ODS2 indicate t31 and t32 in the drawing, respectively. Because the decode starting time is set by the DTS, each ODS is required to be read out to the Coded Data Buffer 13 by that time. Accordingly, the reading of the ODS1 to the Coded Data Buffer 13 is completed before a decode period dp1 in which the ODS1 is decoded. Also, the reading of the ODS2 to the Coded Data Buffer 13 is completed before a decode period dp2 in which the ODS2 is decoded.
  • the PTS (decode ending time) attached to the ODS1 and ODS2 indicate t32 and t33 in the drawing, respectively. Decoding of the ODS1 by the Stream Graphics Processor 14 is completed by the t32, and decoding of the ODS2 is completed by a time indicated by the t33.
  • the Stream Graphics Processor 14 reads the ODS that has been read out to the Coded Data Buffer 13 by the time indicated by the DTS of the ODS, decodes it by the time indicated by the PTS of the ODS, and writes the decoded ODS to the Object Buffer 15.
  • a period cd1 at the 1st row in the drawing indicates a period necessary for the Graphics Controller 17 to clear the Graphics Plane.
  • a period td1 indicates a period necessary to write the Graphics Object obtained on the Object Buffer to the Graphics Plane 8.
  • the PTS of the WDS indicates the deadline to start writing, and the PTS of the PCS indicates ending of the write and a timing for display. At the time indicated by the PTS of the PCS, the decompressed graphics to compose an interactive screen is obtained on the Graphics Plane 8.
  • the Stream Graphics Processor 14 performs decoding continuously while the Graphics Controller 17 performs clearing of the Graphics Plane 8.
  • In FIG.28, a case in which the clearing of the Graphics Plane ends before completing the decoding of the ODS is explained.
  • FIG.29 illustrates a timing chart in a pipeline processing of a case in which the decoding of the ODS ends before the clearing of the Graphics Plane is completed. In this case, it is not possible to write to the Graphics Plane at a time of completion of the decoding of the ODS. When the clearing of the Graphics Plane is completed, it becomes possible to write the graphics obtained by the decode to the Graphics Plane.
  • the controlling unit 20 is implemented by writing a program that performs the operation shown in FIG.30 and having a general-purpose CPU execute the program.
  • the operation performed by the controlling unit 20 is explained by referring to FIG.30.
  • FIG.30 is a flowchart showing a process of a loading operation of the functional segment .
  • SegmentK is a variable indicating each of the Segments (PCS, WDS, PDS, and ODS) that is read out in reproducing the AVClip.
  • An ignore flag is a flag to determine if the SegmentK is ignored or loaded.
  • the flowchart has a loop structure, in which first the ignore flag is initialized to 0 and then Steps S21-S24 and Steps S27-S31 are repeated for each SegmentK (Step S25 and Step S26).
  • Step S21 is for judging if the SegmentK is the PCS, and if the SegmentK is the PCS, judgments in Step S27 and Step S28 are performed.
  • Step S22 is for judging if the ignore flag is 0. If the ignore flag is 0, the operation moves to Step S23, and if the ignore flag is 1, the operation moves to Step S24. If the ignore flag is 0 (Yes in Step S22), the SegmentK is loaded to the Coded Data Buffer 13 (Step S23).
  • If the ignore flag is 1 (No in Step S22), the SegmentK is ignored in Step S24. By this, the rest of the functional segments that belong to the DS are also ignored.
  • Steps S27-S31, S34, and S35 are steps for setting the ignore flag.
  • In Step S27, it is judged if the segment_type of the SegmentK is the Acquisition Point. If the SegmentK is the Acquisition Point, the operation moves to Step S28, and if the SegmentK is either the Epoch Start or Normal Case, then the operation moves to Step S31.
  • In Step S28, it is judged if a preceding DS exists in any of the buffers in the Graphics Decoder 12 (the Coded Data Buffer 13, Stream Graphics Processor 14, Object Buffer 15, and Composition Buffer 16).
  • the judgment in Step S28 is made when the judgment in Step S27 is Yes.
  • a case in which a preceding DS does not exist in the Graphics Decoder 12 indicates a case in which the skip operation is performed. In this case, the display starts from the DS that is the Acquisition Point, and therefore the operation moves to Step S30 (No in Step S28).
  • In Step S30, the ignore flag is set to 0 and the operation moves to Step S22.
  • A case in which a preceding DS exists in the Graphics Decoder 12 causes the operation to move to Step S29 (Yes in Step S28).
  • In Step S29, the ignore flag is set to 1 and the operation moves to Step S22.
  • In Step S31, it is judged if the segment_type of the PCS is the Normal Case; if it is the Normal Case, the operation moves to Step S34.
  • In Step S34, like in Step S28, it is judged if a preceding DS exists in any of the buffers in the Graphics Decoder 12. If the preceding DS exists, the ignore flag is set to 0 (Step S30). If the preceding DS does not exist, it is not possible to obtain sufficient functional segments to compose an interactive screen, and the ignore flag is set to 1 (Step S35).
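  • the branch structure of FIG.30 boils down to the decision sketched below (step numbers omitted; the function name is illustrative only).
    def should_load(segment_type: str, preceding_ds_in_decoder: bool) -> bool:
        # decide whether the functional segments of a DS are loaded to the Coded Data Buffer
        if segment_type == "Epoch Start":
            return True                              # always loaded
        if segment_type == "Acquisition Point":
            # loaded only after a skip, i.e. when no preceding DS is in the Graphics Decoder
            return not preceding_ds_in_decoder
        if segment_type == "Normal Case":
            # meaningful only as a difference from a preceding DS already in the decoder
            return preceding_ds_in_decoder
        return False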
  • the DS is multiplexed as shown in FIG.31
  • three DS are multiplexed with a moving picture.
  • the segment_type of a DS1 is Epoch Start,
  • the segment_type of a DS10 is Acquisition Point, and
  • the segment_type of a DS20 is Normal Case.
  • the DS10 is the closest to a skipping target, and therefore the DS10 is the DS described in the flowchart in FIG.30.
  • the ignore flag is set to 0 because no preceding DS exists in the Coded Data Buffer 13, and the DS10 is loaded to the Coded Data Buffer 13 of the reproduction apparatus as shown by an arrow md1 in FIG.32.
  • the DS20 is to be ignored, because the DS20 is a Normal Case Display Set and a preceding DS does not exist in the Coded Data Buffer 13 (an arrow md2 in FIG.32).
  • FIG.33 illustrates loading of the DSl, DS10, and DS20 in a normal reproduction.
  • the DS1, whose segment_type of the PCS is the Epoch Start, is loaded to the Coded Data Buffer 13 as it is (Step S23).
  • the ignore flag of the DS10, whose segment_type of the PCS is the Acquisition Point, is set to 1 (Step S29), and the functional segments that constitute the DS10 are ignored and not loaded to the Coded Data Buffer 13 (an arrow rd2 in FIG.34, and Step S24).
  • the DS20 is loaded to the Coded Data Buffer 13, because the segment_type of the PCS of the DS20 is the Normal Case (an arrow rd3 in FIG.34).
  • FIGs.35-37 illustrate a flowchart showing the operations performed by the Graphical Controller 17.
  • Steps S41-S44 are steps for a main routine of the flowchart, which waits until any of the events prescribed in Steps S41-S44 occurs.
  • Step S41 is to judge if a current reproducing time is a time indicated by the DTS of the PCS, and if the judging is Yes, then an operation in Steps S45-S53 is performed.
  • Step S45 is to judge if the composition_state of the PCS is the epoch_start, and if judged to be the epoch_start, the Graphics Plane 8 is all cleared in Step S46. If judged to be other than the epoch_start, the Window indicated by the window_horizontal_position, window_vertical_position, window_width, and window_height of the WDS is cleared (Step S47).
  • Step S48 is a step performed after the clearing performed in Step S46 or in Step S47, and is to judge if the time indicated by the PTS of any ODSx has passed.
  • the decoding of any ODSx could already be completed by the time the clearing ends, because the clearing of an entire Graphics Plane 8 takes time. Therefore, in Step S48, it is judged if the decoding of any ODSx is already completed by the time the clearing ends. If the judging is No, the operation returns to the main routine. If the time indicated by the PTS of any ODSx has already passed, an operation in Steps S49-S51 is performed. In Step S49, it is judged if object_crop_flag is 0, and if the flag indicates 0, then the Graphics Object is set to "no display" (Step S50).
  • If the flag is not 0 in Step S49, then an object cropped based on object_cropping_horizontal_position, object_cropping_vertical_position, object_cropping_width, and object_cropping_height is written to the Window in the Graphics Plane 8 (Step S51).
  • In Step S52, it is judged if the time corresponding to a PTS of another ODSy has passed.
  • If the decoding of the ODSy has already been completed, then the ODSy becomes the ODSx (Step S53), and the operation moves to Step S49. By this, the operation in Steps S49-S51 is also performed on another ODS.
  • Step S42 and Steps S54-S59 are explained below.
  • In Step S42, it is judged if the current reproducing point is at the PTS of the WDS. If the judging is that the current reproducing point is at the PTS of the WDS, then it is judged if the number of the Windows is one or not in Step S54. If the judging is two, the operation returns to the main routine. If the judging is one, a loop processing of Steps S55-S59 is performed. In the loop processing, operations in Steps S55-S59 are performed for each of the two Graphics Objects displayed in the Window. In Step S57, it is judged if object_crop_flag indicates 0. If it indicates 0, then the Graphics is not displayed (Step S58).
  • In Step S59, a cropped object based on object_cropping_horizontal_position, object_cropping_vertical_position, cropping_width, and cropping_height is written to the Window in the Graphics Plane 8 at the position indicated by object_cropping_horizontal_position and object_cropping_vertical_position.
  • In Step S44, it is judged if the current reproducing point is at the PTS of the PDS. If the judging is that the current reproducing point is at the PTS of the PDS, then it is judged if palette_update_flag is one or not in Step S60. If the judging is one, the PDS indicated by palette_id is set in the CLUT unit (Step S61). If the judging is 0, then Step S61 is skipped.
  • the CLUT unit performs the color conversion of the Graphics Object on the Graphics Plane 8 to be combined with the moving picture (Step S62).
  • Step S43 and Steps S64-S66 are explained below.
  • In Step S43, it is judged if the current reproducing point is at the PTS of the ODS. If the judging is that the current reproducing point is at the PTS of the ODS, then it is judged if the number of the Windows is two or not in Step S63. If the judging is one, the operation returns to the main routine. If the judging is two, operations in Steps S64-S66 are performed. In Step S64, it is judged if object_crop_flag indicates 0. If it indicates 0, then the Graphics is not displayed (Step S65).
  • In Step S66, a cropped object based on object_cropping_horizontal_position, object_cropping_vertical_position, cropping_width, and cropping_height is written to the Window in the Graphics Plane 8 at the position indicated by object_cropping_horizontal_position and object_cropping_vertical_position.
  • As to the PDS that belongs to the DSn, it is sufficient if the PDS is available in the CLUT unit 9 between the time the PCS is loaded to the Composition Buffer 16 (DTS(DSn[PCS])) and the decoding start point of a first ODS (DTS(DSn[ODS1])). Accordingly, a value of the PTS of each PDS (PDS1-PDSlast) in the DSn is required to be set so as to satisfy the following relations:
    DTS(DSn[PCS]) ≤ PTS(DSn[PDS1]) ≤ PTS(DSn[PDSlast]) ≤ DTS(DSn[ODS1])
  • Although the DTS of the PDS is not referred to during the reproducing, the DTS of the PDS is set to the same value as the PTS of the PDS in order to satisfy the MPEG2 standard.
  • FIG.38 illustrates the pipeline of the reproduction apparatus based on the PTS of the PDS.
  • FIG.38 is based on FIG.28.
  • a first row in FIG.38 indicates setting the PDS in the CLUT unit 9. The rows under the first row are the same as the first to fifth rows in FIG.28.
  • the setting of the PDS1-PDSlast to the CLUT unit 9 is performed after the transferring of the PCS and WDS and before the decoding of the ODS1, and accordingly the setting of the PDS1-PDSlast to the CLUT unit 9 is performed before a point indicated by the DTS of the ODS1, as shown by arrows up2 and up3. As described above, the setting of the PDS is performed prior to the decoding of the ODS.
  • the END that belongs to the DSn indicates the end of the DSn, and accordingly it is necessary that the PTS of the END indicates the decode ending time of the ODS2.
  • the decode ending time is indicated by the PTS (PTS(DSn[ODSlast])) of the ODS2 (ODSlast), and therefore the PTS of the END is required to be set at a value that satisfies the equation below.
  • PTS(DSn[END]) = PTS(DSn[ODSlast])
  • the PCS in the DSn is loaded to the Composition Buffer 16 before a loading time of the first ODS (ODS1), and therefore the PTS of the END should be after a loading time of the PCS in the DSn and before a loading time of the PCS that belongs to the DSn+1. Accordingly, the PTS of the END is required to satisfy the relation below:
    DTS(DSn[PCS]) ≤ PTS(DSn[END]) ≤ DTS(DSn+1[PCS])
  • On the other hand, the loading time of the first ODS is after a loading time of a last PDS (PDSlast), and therefore the PTS of the END (PTS(DSn[END])) should be after a loading time of the PDS that belongs to the DSn (PTS(DSn[PDSlast])).
  • the PTS of the END is required to satisfy the relation below: PTS(DSn[PDSlast]) ≤ PTS(DSn[END])
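  • collecting the constraints above, a candidate PTS for the END can be checked as sketched below (all values on the 90 KHz timeline; the loading times of the PCSs are taken to be their DTSs, as in the text, and the function name is illustrative).
    def end_pts_is_valid(pts_end, pts_ods_last, pts_pds_last, dts_pcs_n, dts_pcs_next):
        return (pts_end == pts_ods_last        # equals the decode ending time of the last ODS
                and pts_pds_last <= pts_end    # not earlier than the last PDS of the DSn
                and dts_pcs_n <= pts_end <= dts_pcs_next)   # between the PCSs of DSn and DSn+1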
  • FIG.39 is a diagram describing the significance of the END in the pipeline process of the reproduction apparatus.
  • FIG.39 is based on FIG.28, and each row in FIG.39 is substantially the same as in FIG.28 other than that a first row in FIG.39 indicates the content of the Composition Buffer 16.
  • Two Display Sets, DSn and DSn+1, are illustrated.
  • the ODSlast in the DSn is the last ODS of A-ODSs, and accordingly, the point indicated by the PTS of the END is before the DTS of the PCS in the DSn+1.
  • Although the DTS of the END is not referred to during reproduction, the DTS of the END is set to the same value as the PTS of the END in order to satisfy the MPEG2 standard.
  • the reproduction apparatus does not have to render the Graphics for an entire Plane.
  • the reproduction apparatus may render the Graphics for only a predetermined size of Window, such as 25% to 33% of the Graphics Plane. Because the rendering of the Graphics other than the Graphics in the Window is not necessary, the load for software in the reproduction apparatus decreases.
  • the size of the Window is set to 1/4 of an entire Graphics Plane and the writing rate Rc to the Graphics Plane is set to 256 Mbps, so as to update the Graphics for each video frame. Further, by setting the update rate to be 1/2 or 1/4 of the video frame rate, it becomes possible to update a larger size of the Graphics. However, when the update rate is 1/2 or 1/4 of the video frame rate, it takes 2 or 4 frames to write to the Graphics Plane.
  • FIG.40 illustrates an internal structure of a reproduction apparatus according to the second embodiment .
  • the reproduction apparatus in FIG.40 is new in comparison with the reproduction apparatus according to FIGs.24 and 25 in that the reproduction apparatus in FIG.40 has two Graphics Planes (a Graphics Plane 81 and a Graphics Plane 82 in the drawing), and the two Graphics Planes constitute a double buffer.
  • FIG.41 schematically illustrates an operation of reading out and writing to the Graphics Planes that constitute the double buffer.
  • An upper row indicates contents of the Graphics Plane 81, and a bottom row indicates contents of the Graphics Plane 82.
  • the contents of the both Graphics Planes per frame are illustrated from a first frame to a fifth frame (left to right) .
  • a part of the Graphics Planes 81 and 82 for each frame that are enclosed by a thick line is a target of the reading out.
  • a face mark is contained in the Graphics Plane 81, and the face mark is to be replaced by a sun mark that is in an Object Buffer 15.
  • a size of the sun mark is 4 Mbytes, which is a maximum size of the Object Buffer 15.
  • a third embodiment relates to a manufacturing process of the BD-ROM.
  • FIG.42 is a flowchart illustrating the manufacturing process of the BD-ROM according to the third embodiment.
  • the manufacturing of the BD-ROM includes a material manufacturing step S201 for producing material and recording movies and sound, an authoring step S202 for generating an application format using an authoring apparatus, and a pressing step S203 for manufacturing a master disc of the BD-ROM and pressing to finish the BD-ROM.
  • the authoring step of the BD-ROM includes Steps S204-S209 as follows.
  • In Step S204, the WDS is described so as to define the Window in which subtitles are displayed, and in Step S205, a period of time during which the Window is defined to appear at the same position in the same size is set as one Epoch, and the PCS for each Epoch is described.
  • the Graphics as material for subtitles is converted into the ODS, and the Display Set is obtained by combining the ODS with the PCS, WDS, and PDS in Step S206. Then, in Step S207, each functional segment in the Display Set is divided into PES packets, and the Graphics Stream is obtained by attaching the time stamps.
  • In Step S208, the AVClip is generated by multiplexing the graphics stream with the video stream and audio stream that are generated separately.
  • Finally, in Step S209, the application format is completed by adjusting the AVClip into the BD-ROM format.
  • inventions described in the Claims of the present application include the above embodiments as well as expansions or generalizations of the modified examples.
  • the degree of expansion and generalization is based on the technological level of the related art at the time of the application.
  • the inventions according to the Claims of the present application reflect the means to solve the technical problems in the conventional art, and therefore the scope of the invention does not exceed the technological scope that those skilled in the art would recognize as means to solve the technical problems in the conventional art.
  • the inventions according to the Claims of the present application substantially correspond to the descriptions of the details of the invention.
  • the BD-ROM is used in the explanations of all of the above embodiments.
  • characteristics of the present invention are in the Graphics Stream that is recorded in a media, and such characteristics do not depend on physical properties of the BD-ROM. Any recording medium that is capable of storing the Graphics Stream may realize the present invention.
  • Examples of such recording media include optical discs such as a DVD-ROM, a DVD-RAM, a DVD-RW, a DVD-R, a DVD+RW, a DVD+R, a CD-R, and a CD-RW, magneto-optical discs such as a PD and an MO, semiconductor memory cards such as a CompactFlash card, a SmartMedia, a Memory Stick, a MultiMediaCard, and a PCMCIA card, magnetic discs such as a flexible disc, a SuperDisk, a Zip, and a Clik!, and removable hard disk drives such as an ORB, a Jaz, a SparQ, a SyJet, an EZFlyer, and a Microdrive, in addition to built-in hard disks.
  • the reproduction apparatus described in all of the above embodiments decodes the AVClip recorded in the BD-ROM and outputs the decoded AVClip to a TV.
  • However, the reproduction apparatus may include only a BD-ROM drive, with the TV provided with the other elements.
  • In this case, the reproduction apparatus and the TV may be connected via IEEE1394 to create a home network.
  • Although the reproduction apparatus in the embodiments is used by connecting to the TV, the reproduction apparatus may be an all-in-one TV and reproduction apparatus.
  • the LSI (integrated circuit) alone that forms an essential part of the processing in the reproduction apparatus of each embodiment may be put into practice.
  • the reproduction apparatus and the LSI are both described in the present specification, and therefore manufacturing a reproduction apparatus based on the internal structure of the reproduction apparatus according to the first embodiment is an implementation of the present invention, no matter what working example it may take.
  • transferring, whether as a gift or for profit, lending, and importing of the reproduction apparatus according to the present invention are also considered to be implementations of the present invention. Offering such transfer and lending to general users by way of storefront display and distribution of brochures is also considered to be an implementation of the present invention.
  • the extension header is 4-byte data called TP_extra_header that includes an arrival_time_stamp and a copy_permission_indicator.
  • the TS packets having the TP_extra_header (hereinafter referred to as the TS packets with EX) are grouped into groups of 32 packets and written to 3 sectors.
  • the group of 32 TS packets with EX stored in 3 sectors is called an Aligned Unit.
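  • the packaging can be pictured with the sketch below; the 188-byte TS packet and 4-byte TP_extra_header follow the text, while the 2,048-byte sector size is an assumption (it is the value for which 32 x 192 bytes fill exactly 3 sectors), and the function itself is only an illustration.
    TS_PACKET_SIZE = 188
    EXTRA_HEADER_SIZE = 4                    # TP_extra_header
    SECTOR_SIZE = 2048                       # assumed sector size

    def to_aligned_units(packets_with_ex):
        # each "TS packet with EX" is 192 bytes; 32 of them (6,144 bytes) fill exactly 3 sectors
        assert all(len(p) == TS_PACKET_SIZE + EXTRA_HEADER_SIZE for p in packets_with_ex)
        units = []
        for i in range(0, len(packets_with_ex), 32):
            unit = b"".join(packets_with_ex[i:i + 32])
            if len(unit) == 3 * SECTOR_SIZE:
                units.append(unit)           # one Aligned Unit
        return units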
  • When the reproduction apparatus is used in a home network connected via the IEEE1394, the reproduction apparatus transmits the Aligned Unit in the following transmission procedure.
  • a sender obtains the TP_extra_header from each of the 32 TS packets with EX included in the Aligned Unit, and outputs the main body of the TS packets after encoding based on the DTCP standard.
  • isochronous packets are inserted between any two successive TS packets. Insertion points are positions based on the time indicated by the arrival_time_stamp in the TP_extra_header.
  • the reproduction apparatus outputs a DTCP_descriptor.
  • the DTCP_descriptor indicates settings for copy permission. By describing the DTCP_descriptor so as to indicate that copying is prohibited, the TS packets are not recorded by other devices when the reproduction apparatus is used in the home network connected via the IEEE1394.
  • the digital stream in the above embodiments is the AVClip.
  • the digital stream may be a Video Object (VOB) in the DVD-Video standard or DVD-Video Recording standard.
  • the VOB is an ISO/IEC 13818-1 standard-based program stream obtained by multiplexing the video stream and audio stream.
  • the video stream in the AVClip may also be based on the MPEG4 or WMV standard.
  • the audio stream may be based on Linear-PCM, Dolby-AC3, MP3, MPEG-AAC, or DTS standard.
  • the movie in the above embodiments may be obtained by encoding analog image signals transmitted via analog broadcasting, or may be stream data constituted by a transport stream transmitted via digital broadcasting.
  • It is also possible to obtain contents by encoding analog or digital image signals that are recorded on a videotape. Further, the contents may also be obtained by encoding analog or digital image signals that are directly loaded from a video camera. Moreover, contents may be a digital work delivered by a distributing server.
  • the Graphics Object in the first and second embodiments is raster data that is encoded based on run-length limited encoding.
  • the run-length limited encoding is adopted for compressing and encoding the Graphics Object because the run-length limited encoding is the most appropriate for compressing and decompressing the subtitles.
  • the subtitles have a characteristic that runs of identical pixels in a horizontal direction become relatively long, and accordingly, a high compression rate is obtained by using the run-length limited encoding.
  • the run-length limited encoding is preferable for making software for decoding because the load in decompression is low.
  • the same compression/decompression method as for the subtitles is employed for the Graphics Object.
  • the Graphics Object may be PNG data.
  • the Graphics Object is not required to be the raster data and may be vector data.
  • the Graphics Object may be transparent graphics.
  • a target for display effect by the PCS may be the graphics for the subtitles selected based on a language setting of the reproduction apparatus. Realizing such a display has a high utilitarian value, because it becomes possible to realize an effect, which is realized by the moving picture itself in the conventional DVD, by the subtitle graphics displayed according to the language setting of the reproduction apparatus.
  • the Window size is set to be 25% of an entire Graphics Plane in order to set the writing rate Rc to the Graphics Plane to the rate at which the clearing of the Graphics Plane and redrawing is performed in one frame.
  • the Rc may be set so that the clearing and re-drawing are completed during a vertical retrace period. Given that the vertical retrace period is 25% of 1/29.93 seconds, the Rc is 1 Gbps. Setting the Rc in such a way has a high utilitarian value, because it is possible to display the graphics more smoothly.
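  • the 1 Gbps figure can be reproduced with the small calculation below, assuming 1 byte per pixel and a Window of 25% of the 1920x1080 Graphics Plane.
    PLANE_BYTES = 1920 * 1080                    # 8-bit index colour, 1 byte per pixel
    WINDOW_BITS = (PLANE_BYTES // 4) * 8         # Window of 25% of the Graphics Plane, in bits
    RETRACE_SECONDS = 0.25 * (1 / 29.93)         # vertical retrace taken as 25% of one frame

    # clearing and re-drawing (two Window-sized transfers) must fit in the retrace period
    required_rc = 2 * WINDOW_BITS / RETRACE_SECONDS
    print(f"Rc is about {required_rc / 1e9:.2f} Gbps")   # roughly 0.99 Gbps, i.e. about 1 Gbps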
  • the Graphics Plane is mounted to the reproduction apparatus.
  • However, a line buffer for storing decompressed pixels for one line may be mounted to the reproduction apparatus in place of the Graphics Plane. Conversion into image signals is performed line by line, and therefore the conversion into the image signals may be carried out with the line buffer alone.
  • the explanations are given taking the text subtitles for the movie as the examples of the graphics.
  • the graphics may include, for example, a combination of devices, characters, and colors that constitute a trademark, a national crest, a national flag, a national emblem, a symbol and a great seal for supervision or certification that a national government uses, a crest, a flag, or an emblem of an international organization, or a mark of origin of a particular item.
  • the Window for rendering the subtitles is defined either at an upper side of the screen, or the bottom of the screen, assuming that the subtitles are written horizontally.
  • the Window may be defined to appear either on left or right side of the screen so as to display the subtitles on the left and right of the screen. In this way, it is possible to change text direction and display subtitles vertically.
  • the AVClip in the above embodiments constitutes the movie.
  • the AVClip may also be used for karaoke .
  • the PCS may perform the display effect such that the color of the subtitles changes along with a song.
  • a recording medium and a reproduction apparatus are capable of displaying subtitles with a display effect . Accordingly, it is possible to add higher values to movies supplied in the market, and to activate markets for films and consumer products.
  • the recording medium and the reproduction apparatus according to the present invention have high industrial applicability in industries such as the film industry and the consumer products industry.

Abstract

A recording medium storing an AVClip structured by multiplexing a video stream and a graphics stream. The video stream represents a moving picture made of a plurality of pictures, and the graphics stream includes graphics data representing graphics to be combined with the pictures. The graphics stream also includes window information (WDS) that specifies a window for rendering the graphics, and that indicates a width, a height and a position of the window on a plane which is a plane memory of a reproduction apparatus that combines the graphics with the pictures.

Description

DESCRIPTION
RECORDING MEDIUM, REPRODUCTION APPARATUS, RECORDING METHOD, REPRODUCING METHOD, PROGRAM, AND INTEGRATED CIRCUIT
Technical Field
The present invention relates to a recording medium such as a BD-ROM, and a reproduction apparatus, and more specifically, to a technique of subtitling by reproducing a digital stream constituted by multiplexing a video stream and a graphics stream.
Background Art
Subtitling realized by rendering graphics streams is an important technique for allowing people in different linguistic areas to appreciate a film produced in a language other than their native languages. An example of a conventional technique of subtitling is a memory allocation scheme for a Pixel Buffer based on the ETSI EN 300 743 standard set forth by the European Telecommunications Standards Institute (ETSI). The Pixel Buffer is a memory for temporarily storing decompressed graphics, and a reproduction apparatus writes the graphics in the Pixel Buffer to a display memory called a Graphics Plane, and thus the graphics is displayed. In the memory allocation scheme, a definition of a Region is included in the Pixel Buffer, and a part of the decompressed graphics that corresponds to the Region is written to the Graphics Plane. For example, when a subtitle "Goodbye..." is contained in the Pixel Buffer and a position and a size of the Region are defined so as to include a part "Go", then the part "Go" is written to the Graphics Plane and displayed on the screen. Likewise, when the position and size of the Region are defined so as to include a part "Good", then the part "Good" is displayed on the screen.
By repeating the defining of the Region and the writing to the Graphics Plane, the subtitle "Goodbye..." is displayed on the screen gradually, i.e., first "Go", next "Good", then "Goodbye", and finally the whole subtitle "Goodbye..." is displayed. By rendering a subtitle in such a way, it is possible to realize a wipe-in effect.
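To make the conventional mechanism concrete, the following minimal sketch shows how repeatedly widening the Region reveals the subtitle step by step. The character counts and the write_region_to_plane() helper are hypothetical stand-ins for illustration, not part of the ETSI EN 300 743 standard.

```python
# Illustrative sketch of the conventional Region-based wipe-in described above.
subtitle = "Goodbye..."

def write_region_to_plane(text: str, visible_chars: int) -> None:
    # Stand-in for copying the Region of the Pixel Buffer to the Graphics Plane.
    print(f"Graphics Plane now shows: {text[:visible_chars]!r}")

# Each step redefines the Region a little wider: "Go", "Good", "Goodbye", ...
for visible in (2, 4, 7, len(subtitle)):
    write_region_to_plane(subtitle, visible)
```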
The ETSI EN 300 743 standard, however, gives no consideration to guaranteeing the sync between a graphics display and a picture display when the burden for writing to the Graphics Plane is high. The graphics written to the Graphics Plane are not compressed, and accordingly, the burden for writing to the Graphics Plane increases as the resolution of the graphics becomes higher. The size of the graphics to be written to the Graphics Plane is up to 2 Mbytes when rendering the graphics in a resolution of 1920x1080, which is a proposed standard resolution for a BD-ROM, and a higher bandwidth for a graphics data transfer from the Pixel Buffer to the Graphics Plane is necessary in order to render graphics as large as 2 Mbytes synchronously with the picture display. However, demanding a high bandwidth for the data transfer to write the graphics to the Graphics Plane hinders an attempt at cost reduction in manufacturing the reproduction apparatus. It is possible to lower the necessary bandwidth in writing to the Graphics Plane by having the reproduction apparatus always perform a "reasonable write", in which only a difference from a previous display is written to the Graphics Plane. However, demanding the reproduction apparatus to always perform the "reasonable write" restricts the software applicable to the reproduction apparatus. As described in the above, the high burden for writing to the Graphics Plane demands that reproduction apparatuses operate with the high bandwidth or perform the reasonable write, and as a result, restricts product development of reproduction apparatuses.
Disclosure of the Invention
An object of the present invention is to provide a recording medium with which graphics may be updated synchronously with a picture display even when an amount of data to be written to a Graphics Plane is large.
In order to achieve the above object, an example of the recording medium according to the present invention is a recording medium, used for storing data, said recording medium comprising: a digital stream constituted by multiplexing a video stream and a graphics stream, wherein said video stream represents a moving picture made of a plurality of pictures, and the graphics stream includes: graphics data representing graphics to be combined with the pictures; and window information that specifies a window for rendering the graphics therein, the window information indicating a width, a height and a position of the window on a plane, the plane being a plane memory of a reproduction apparatus that combines the graphics with the pictures.
By specifying a part of the plane corresponding to each picture as the window for rendering the graphics, it is not necessary that the reproduction apparatus renders the graphics for the entire plane, and it is sufficient that the reproduction apparatus renders the graphics only in a window of limited size. Because it is not necessary to render the graphics outside the window in the plane, the load of software in the reproduction apparatus may be reduced. Further, by setting a size of the window so as to ensure a sync display between the graphics and the picture, it becomes possible for a producer who performs authoring to guarantee the sync display in any kind of reproduction apparatus, even when the update of the graphics is performed in a worst case. Moreover, by setting a position and a size of the window by the window information, it is possible to adjust the position and size of the window in the authoring, so that the subtitles are out of the way of pictures when viewing the screen. Therefore, the visibility of the graphics is maintained even when the picture on the screen changes as time passes, and thus it is possible to maintain the quality of a film.
The worst case in updating the graphics means a case in which the graphics is updated in a least efficient operation, i.e. all clear and re-drawing of the window. When setting the size of the window in order to prepare for the worst case, it is desirable that the above recording medium is such that the width and height of the window are set so that a size of the window is 1/x of the plane, the plane corresponding to a size of each picture and x being a real number based on a ratio between a window update rate and a picture display rate.
By setting the window size in this manner, a bandwidth on the reproduction apparatus that is necessary for writing to the graphics plane is set to a fixed value. By structuring the reproduction apparatus so as to satisfy this bandwidth, it is possible to realize the sync display between the graphics and the picture regardless of the software mounted to the reproduction apparatus .
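As a minimal sketch of this sizing rule, assuming a fixed transfer rate Rc and the worst case in which the whole Window is cleared and then redrawn for every window update, the fraction 1/x of the plane that the Window may occupy can be derived as follows. The concrete numbers are assumptions for illustration only, not normative values.

```python
# Minimal sketch of sizing the Window as 1/x of the plane (illustrative only).
PLANE_BYTES = 1920 * 1080       # 8 bits per pixel, roughly 2 Mbytes
RC_BYTES_PER_S = 32_000_000     # assumed fixed plane-write rate (256 Mbps)

def window_fraction(window_update_rate_hz: float) -> float:
    """Fraction of the plane the Window may occupy so that a worst-case clear
    and redraw both complete within one window update period at rate Rc."""
    window_bytes = RC_BYTES_PER_S / (2 * window_update_rate_hz)
    return window_bytes / PLANE_BYTES

print(window_fraction(29.97))        # ~0.26: about a quarter of the plane per frame
print(window_fraction(29.97 / 2))    # updating every other frame roughly doubles it
```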
As described above, it is possible to present a minimum standard for a structure of the reproduction apparatus. As long as the transfer rate is set so as to satisfy the minimum standard, a design of the reproduction apparatus is at the discretion of developers. Therefore, it is possible to expand the possibility in development of the reproduction apparatus.
Brief Description Of The Drawings
FIG.l illustrates an example of use of a recording medium according to the present invention.
FIG.2 illustrates a structure of a BD-ROM. FIG.3 is a diagram schematically illustrating a structure of an AVClip.
FIG.4A illustrates a structure of a presentation graphics stream.
FIG.4B illustrates a PES packet obtained after functional segments are converted. FIG.5 illustrates a logical structure that is made of various kinds of functional segments.
FIG.6 illustrates a relation between a display position of a subtitle and an Epoch.
FIG.7A illustrates syntax to define a Graphics Object in an Object Definition Segment (ODS) .
FIG.7B illustrates syntax of a Palette Definition Segment (PDS) .
FIG.8A illustrates syntax of a Window Definition Segment (WDS). FIG.8B illustrates syntax of a Presentation Composition Segment (PCS).
FIG.9 illustrates an example of a description of a Display Set for subtitling.
FIG.10 illustrates an example of a description of the WDS and PCS in a DS1. FIG.11 illustrates an example of a description of the PCS in a DS2.
FIG.12 illustrates an example of a description of the PCS in a DS3. FIG.13 is an example of a description of a Display Set when Cut-In/Out is performed, illustrating along a timeline.
FIG.14 is an example of a description of a Display Set when Fade-In/Out is performed, illustrating along a timeline.
FIG.15 is an example of a description of a Display Set when Scrolling is performed, illustrating along a timeline.
FIG.16 is an example of a description of a Display Set when Wipe-In/Out is performed, illustrating along a timeline.
FIG.17 is a diagram comparing two cases: a window has four Graphics Objects, and a window has two Graphics Objects. FIG.18 illustrates an example of an algorithm for calculating a decode duration.
FIG.19 is a flowchart of the algorithm of FIG.18. FIGs.20A and B are flowcharts of the algorithm of FIG.18. FIG.21A illustrates a case in which each window has an Object Definition Segment.
FIGs.21B and C are timing charts showing orders among numerals referred to in FIG.18.
FIG.22A illustrates a case in which each window has two Object Definition Segments. FIGs.22B and C are timing charts showing orders among numerals referred to in FIG.18.
FIG.23A describes a case in which each of two Windows includes an ODS.
FIG.23B illustrates a case in which a decode period (2) is longer than a total of a clearing period (1) and a write period (3).
FIG.23C illustrates a case in which a total of the clearing period (1) and the write period (3) is longer than the decode period (2). FIG.24 illustrates shifts in time of update described in an example in the present specification.
FIG.25A illustrates four Display Sets that are described so as to perform the above explained update.
FIG.25B is a timing chart showing settings of DTS and PTS of functional segments included in the four Display Sets.
FIG.26 illustrates an internal structure of a reproduction apparatus according to the present invention.
FIG.27 illustrates sizes of write rates Rx, Rc, and Rd, Graphics Plane 8, Coded Data Buffer 13, Object Buffer 15, and Composition Buffer 16.
FIG.28 is a timing chart illustrating a pipeline processing by the reproduction apparatus.
FIG.29 illustrates a timing chart in a pipeline processing of a case in which decoding of the ODS ends before clearing of the Graphics Plane is completed.
FIG.30 is a flowchart illustrating a process of a loading operation of a functional segment.
FIG.31 shows an example of multiplexing. FIG.32 illustrates a manner in which a DS10 is loaded to the Coded Data Buffer 13.
FIG.33 illustrates loading of a DS1, the DS10, and a DS20 in a normal reproduction.
FIG.34 illustrates loading of the DS1, DS10, and DS20 in the normal reproduction as shown in FIG.33. FIG.35 illustrates a flowchart showing a process performed by the Graphical Controller 17.
FIG.36 illustrates a flowchart showing the process performed by the Graphical Controller 17.
FIG.37 illustrates a flowchart showing the process performed by the Graphical Controller 17.
FIG.38 illustrates a pipeline process of the reproduction apparatus based on the PTS of the PDS.
FIG.39 is a diagram describing a significance of the END in the pipeline process of the reproduction apparatus. FIG.40 illustrates an internal structure of the reproduction apparatus according to a second embodiment.
FIG.41 schematically illustrates an operation of reading out and writing to the Graphics Planes that constitute a double buffer. FIG.42 is a flowchart illustrating the manufacturing process of the BD-ROM according to a third embodiment.
Best Mode for Carrying Out the Invention [First Embodiment] A First Embodiment of a recording medium according to the present invention is explained below.
FIG.1 illustrates an example of use of the recording medium.
In the drawing, BD-ROM 100 is the recording medium according to the present invention. The BD-ROM 100 is used for providing data of movie works to a Home Theatre System structured by a reproduction apparatus 200, a television 300, and a remote controller 400.
The recording medium according to the present invention is manufactured by an improvement in an application layer of a BD-ROM.
FIG.2 illustrates a structure of the BD-ROM. In the drawing, the BD-ROM is shown at the bottom of the drawing, and a track on the BD-ROM is shown above the BD-ROM. The track is actually in a spiral shape on the disc, but shown in a line in the drawing. The track includes a lead-in area, a volume area, and a lead-out area. The volume area in this drawing has a physical layer, a file system layer, and an application layer. At the top of the drawing, an application format of the BD-ROM is illustrated using a directory structure. As illustrated in the drawing, the BD-ROM has a directory BDMV under the root directory, and the BDMV directory contains a file for storing an AVClip with an extension M2TS (XXX.M2TS), a file for storing administrative information for the AVClip with an extension CLPI (XXX.CLPI), and a file for defining a logical Play List (PL) for the AVClip with an extension MPLS (YYY.MPLS). By forming the above application format, it is possible to manufacture the recording medium according to the present invention. In a case in which there is more than one file for each kind, it is preferable to provide three directories named STREAM, CLIPINF, and PLAYLIST under the BDMV to store the files with the same extension in one directory. Specifically, it is desirable to store the files with the extension M2TS in the STREAM, the files with the extension CLPI in the CLIPINF, and the files with the extension MPLS in the PLAYLIST.
An explanation about the AVClip (XXX.M2TS) in the above application format is given below.
The AVClip (XXX.M2TS) is a digital stream in MPEG-TS format (TS is Transport Stream) obtained by multiplexing a video stream, at least one audio stream, and a presentation graphics stream. The video stream represents pictures of the film, the audio stream represents sound of the film, and the presentation graphics stream represents subtitles of the film. FIG.3 is a diagram schematically illustrating a structure of the AVClip. The AVClip (XXX.M2TS) is structured in the following manner.
Each of the video stream, made of plural video frames (pictures pj1, pj2, and pj3), and the audio stream, made of plural audio frames (top row of the drawing), is converted into a line of PES packets (second row of the drawing), and then into a line of TS packets (third row of the drawing). The presentation graphics stream (bottom row of the drawing) is converted into PES packets (second to bottom row of the drawing), and then into TS packets (third to bottom row of the drawing). The three lines of TS packets are multiplexed, and thus the AVClip (XXX.M2TS) is constituted.
In the drawing, only one presentation graphics stream is multiplexed. However, in a case in which the BD-ROM is compatible with plural languages, a presentation graphics stream for each language is multiplexed to constitute the AVClip. The AVClip constituted in the above manner is divided into more than one extent, like ordinary computer files, and stored in areas in the BD-ROM. Next, the presentation graphics stream is explained. FIG.4A illustrates a structure of the presentation graphics stream. A top row indicates the TS packet line to be multiplexed to the AVClip. A second to the top row indicates the PES packet line that constitutes a graphics stream. The PES packet line is structured by retrieving payloads out of TS packets having a predetermined PID, and connecting the retrieved payloads.
A third to the top row indicates the structure of the graphics stream. The graphics stream is made of functional segments named a Presentation Composition Segment (PCS), a Window Definition Segment (WDS), a Palette Definition Segment (PDS), an Object Definition Segment (ODS), and an END of Display Set Segment (END). Among the above functional segments, the PCS is called a screen composition segment, and the WDS, PDS, ODS, and END are called definition segments. The PES packet and each of the functional segments correspond one to one, or one to plurality. In other words, one functional segment is recorded in the BD-ROM either after being converted into one PES packet, or after being divided into fragments and converted into more than one PES packet.
FIG.4B illustrates the PES packet obtained by converting the functional segments. As shown in the drawing, the PES packet is made of a packet header and the payload, and the payload is a substantial body of a functional segment. The packet header includes a DTS and a PTS corresponding to the functional segment. The DTS and PTS included in the packet header are hereinafter referred to as the DTS and PTS of the functional segment.
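As a purely illustrative sketch of how a reproduction apparatus might walk the functional segments carried in the PES payloads, the following assumes a 3-byte common header (a 1-byte segment_type followed by a 2-byte segment_length); these field widths are not spelled out in this description and are an assumption.

```python
# Hypothetical sketch of iterating over functional segments (PCS, WDS, PDS,
# ODS, END) concatenated in PES payloads. The 1-byte type / 2-byte length
# header layout is an assumption for illustration.
import struct
from typing import Iterator, Tuple

def iter_segments(payload: bytes) -> Iterator[Tuple[int, bytes]]:
    """Yield (segment_type, segment_body) pairs from a concatenated payload."""
    pos = 0
    while pos + 3 <= len(payload):
        segment_type, segment_length = struct.unpack_from(">BH", payload, pos)
        yield segment_type, payload[pos + 3 : pos + 3 + segment_length]
        pos += 3 + segment_length
```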
The above described various kinds of functional segments constitute a logical structure as illustrated in FIG.5. FIG.5 illustrates the logical structure that is made of the various kinds of functional segments. In the drawing, a top row illustrates Epochs, a middle row illustrates Display Sets (DS), and a bottom row illustrates the functional segments.
Each of the DS shown in the middle row is a group of functional segments that compose graphics for one screen, among all of the plural functional segments that constitute the graphics stream. Broken lines in the drawing indicate the DS to which the functional segments in the bottom row belong, and show that a series of the functional segments of the PCS, WDS, PDS, ODS, and END constitute one DS. The reproduction apparatus is able to generate graphics for one screen by reading the functional segments that constitute the DS.
The Epochs shown in the top row indicate time periods, and memory management is consecutive timewise along a timeline of the AVClip reproduction in one Epoch. One Epoch also represents a group of data that is assigned to the same period of time. The memory referred to here is the Graphics Plane that stores the graphics for one screen, and an Object Buffer that stores decompressed graphics data. The consecutiveness of the memory management means that a flash of the Graphics Plane or of the Object Buffer does not occur in the Epoch, and erasing and rendering of the graphics are only performed in a predetermined rectangular area on the Graphics Plane (the flash here indicates clearing of all contents of the stored data in a plane or a buffer). A size and a position of the rectangular area are fixed during one Epoch. As long as the erasing and rendering of the graphics are only performed in the predetermined rectangular area on the Graphics Plane, a sync reproduction between the picture and the graphics is guaranteed. In other words, the Epoch is a unit on the reproducing timeline, and in this unit, the picture and the graphics are guaranteed to be reproduced synchronously. When moving the area, in which the graphics are erased and rendered, to a different position, it is necessary to define a point on the timeline to move the area, and a period after the point becomes a new Epoch. The sync reproduction is not guaranteed at a border between two Epochs.
In viewing an actual film, one Epoch is a time period in which subtitles are displayed in the same rectangular area on the screen. FIG.6 illustrates a relation between the position of the subtitles and the Epochs. In an example illustrated by the drawing, the positions at which the five subtitles "Actually...", "I was hiding", "my feelings.", "I always", and "loved you." are shown move according to the picture in the film. Specifically, the subtitles "Actually...", "I was hiding", and "my feelings." appear at the bottom of the screen, while the subtitles "I always" and "loved you." are shown at the top of the screen. The position of the rectangular area moves in order that the subtitles are out of the way of pictures when viewing the screen, considering visibility of the film. A time period during which the subtitles appear at the bottom is an Epoch 1, and a subsequent time period during which the subtitles appear at the top is an Epoch 2. The Epochs 1 and 2 each have a different area in which the subtitles are rendered. The area in the Epoch 1 is a Window 1 positioned at the bottom of the screen, and the area in the Epoch 2 is a Window 2 positioned at the top of the screen. The memory management is consecutive in each of the Epochs 1 and 2, and accordingly, rendering of the subtitles in the Windows 1 and 2 is synchronous with the pictures.
Next, details about the Display Set (DS) are described. Broken lines hk11 and hk12 in FIG.5 indicate which functional segment at the middle row belongs to which Epoch. A series of DS "Epoch Start", "Acquisition Point", and "Normal Case" constitute the Epoch at the top row. The "Epoch Start", "Acquisition Point", and "Normal Case" are types of the DS, and an order between the "Acquisition Point" and "Normal Case" does not matter and either of them may come first.
The Epoch Start is a DS that has a display effect of "new display", which indicates a start of a new Epoch. Because of this, the Epoch Start contains all functional segments needed to display a new composition of the screen. The Epoch Start is provided at a position which is a target of a skip operation of the AVClip, such as a chapter in a film.
The Acquisition Point is a DS that has a display effect of "display refresh", and is identical in content used for rendering graphics with the Epoch Start which is a preceding DS. The Acquisition Point is not provided at a starting point of the Epoch, but contains all functional segments needed to display the new composition of the screen. Therefore, it is possible to display the graphics without fail when a skip operation to the Acquisition Point is performed. Accordingly, with the Acquisition Point, it is possible to compose a screen in the middle of the Epoch.
The Acquisition Point is provided at a position that could be a target for the skip operation. An example of such a position is a position that could be specified when performing a time search. The time search is an operation in response to a user's input of a time to start reproducing from a reproducing point corresponding to the time specified by the user. The time is specified roughly, such as by 10 minutes or by 10 seconds, and accordingly, points at which the reproduction starts are provided at such as a 10 minute interval, or a 10 second interval. By providing the Acquisition Point at the points at which the reproduction may start, it is possible to perform reproduction smoothly after the time search.
The Normal Case is a DS that has a display effect of "display update", and contains only elements that are different from the preceding composition of the screen. Specifically, when subtitles in a DSv are the same as subtitles in a DSu but the screen is displayed differently in the DSv and DSu, the DSv is provided so as to include only the PCS, and this makes the DSv the Normal Case. By this, it is not necessary to provide an ODS with the same content as the content of the ODS in the preceding DS, and a data size in the BD-ROM may be reduced. On the other hand, because the DS as the Normal Case contains only the difference, it is not possible to compose the screen using the Normal Case alone.
Details of the Definition Segments (ODS, WDS, and PDS) are explained below. The Object Definition Segment (ODS) is a functional segment that defines the Graphics Object. An explanation of the Graphics Object is given first. A selling point of the AVClip recorded in the BD-ROM is its resolution as high as hi-vision, and therefore the resolution for the Graphics Object is set at 1920x1080 pixels. Because of the high resolution of 1920x1080 pixels, it is possible to display a specific character style for the subtitles clearly on the screen. As for colors of the subtitles, a bit length of an index value for each pixel (Color Difference Red Cr, Color Difference Blue Cb, Luminance Y, and Transparency T) is 8 bits, and thus it is possible to choose any 256 colors out of full color (16,777,216 colors) for the subtitles. The subtitles realized by the Graphics Object are rendered by positioning texts on a transparent background.
Syntax of the ODS to define the Graphics Object is shown in FIG.7A. The ODS is made of segment_type indicating that the segment is the ODS, segment_length indicating a data length of the ODS, object_id uniquely identifying the Graphics Object corresponding to the ODS in the Epoch, object_version_number indicating a version of the ODS within the Epoch, last_in_sequence_flag, and object_data_fragment which is a consecutive sequence of bytes corresponding to a part or all of the Graphics Object.
The object_id is for uniquely identifying the Graphics Object corresponding to the ODS in the Epoch. The Epoch of the graphics stream contains more than one ODS having the same ID. The ODSs having the same ID also have the same width and height, and are assigned a common area in the Object Buffer. After one of the ODSs having the same ID is read in the common area, the read ODS is overwritten by a subsequent ODS having the same ID. By overwriting the ODS that is read to the Object Buffer by the subsequent ODS having the same ID as the reproduction of the video stream proceeds, the graphics by the ODS is updated accordingly. A size constraint that the width and height of the Graphics Objects having the same ID should be the same is applied only during one Epoch, and the Graphics Objects in different Epochs may have different sizes.
Explanations about last_in_sequence_flag and object_data_fragment are given next. In some cases, it is not possible to store the decompressed graphics that constitute the subtitle in one ODS due to a payload constraint of the PES packet. In such cases, the graphics is split into a series of consecutive fragments, and one fragment is set to the object_data_fragment. When one Graphics Object is stored as more than one fragment, every fragment except a last fragment has the same size. The last fragment is less than or equal to the size of the previous fragments. The ODSs carrying the fragments appear in the same sequential order in the DS, with an end of the sequence indicated by the ODS having the last_in_sequence_flag. Although the above described syntax of the ODS is based on a premise that the fragments are stacked in from the preceding PES, the fragments may be stacked so that each PES contains a blank part.
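The fragmentation rule above can be sketched as follows; the Ods container is a hypothetical stand-in holding only the fields named in the syntax, and the code is an illustration rather than a definitive implementation.

```python
# Minimal sketch of reassembling a Graphics Object split across several ODSs.
# The Ods class is a hypothetical container for the fields named above.
from dataclasses import dataclass
from typing import Iterable

@dataclass
class Ods:
    object_id: int
    last_in_sequence_flag: bool
    object_data_fragment: bytes

def reassemble(fragments: Iterable[Ods]) -> bytes:
    """Concatenate object_data_fragment fields, in stream order, until the ODS
    carrying last_in_sequence_flag marks the end of the sequence."""
    data = bytearray()
    for ods in fragments:
        data += ods.object_data_fragment
        if ods.last_in_sequence_flag:
            break
    return bytes(data)
```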
Next, the Palette Definition Segment (PDS) is explained. The PDS is used to define a palette for a color conversion. FIG.7B shows syntax of the PDS. The PDS is made of segment_type indicating that the segment is the PDS, segment_length indicating a data length of the PDS, palette_id uniquely identifying the palette contained in the PDS, palette_version_number indicating a version of the PDS within the Epoch, and palette_entry_id specifying an entry number of the palette. The palette_entry_id indicates the Color Difference Red (Cr_value), the Color Difference Blue (Cb_value), Luminance (Y_value), and Transparency (T_value). Next, an explanation about the Window Definition Segment (WDS) is given below.
The WDS is used to define the rectangular area on the Graphics Plane. As described in the above, the memory management is sequential only when erasing and rendering are performed within a certain area on the Graphics Plane. The area on the Graphics Plane is defined by the WDS and called "Window". FIG.8A illustrates syntax of the WDS. As shown by the drawing, the WDS is made of segment_type indicating that the segment is the WDS, segment_length indicating a data length of the WDS, window_id uniquely identifying the Window on the Graphics Plane, window_horizontal_position specifying a horizontal address of a top left pixel of the Window on the Graphics Plane, window_vertical_position specifying a vertical address of the top left pixel of the Window on the Graphics Plane, window_width specifying a width of the Window on the Graphics Plane, and window_height specifying a height of the Window on the Graphics Plane.
Ranges of values that the window_horizontal_position, window_vertical_position, window_width, and window_height may take are explained below. The coordinate system for those values is the internal area of the Graphics Plane, whose size is indicated two-dimensionally by the video_height for a height and the video_width for a width.
The window_horizontal_position specifies the horizontal address of the top left pixel of the Window on the Graphics Plane, and is within a range of 0 to (video_width)-1. Also, the window_vertical_position specifies the vertical address of the top left pixel of the Window on the Graphics Plane, and is within a range of 0 to (video_height)-1. The window_width specifies the width of the Window on the Graphics Plane. The specified width falls within a range of 1 to (video_width)-(window_horizontal_position). Further, the window_height specifies the height of the Window on the Graphics Plane, and the specified height is within a range of 1 to (video_height)-(window_vertical_position).
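A minimal sketch of these value ranges, assuming the plane coordinate system is video_width by video_height, is given below; the Wds container is a hypothetical stand-in for the fields defined above.

```python
# Minimal sketch of the WDS value ranges described above (illustrative only).
from dataclasses import dataclass

@dataclass
class Wds:
    window_horizontal_position: int
    window_vertical_position: int
    window_width: int
    window_height: int

def wds_is_valid(wds: Wds, video_width: int, video_height: int) -> bool:
    """Check that the Window lies entirely on the video_width x video_height plane."""
    return (0 <= wds.window_horizontal_position <= video_width - 1
            and 0 <= wds.window_vertical_position <= video_height - 1
            and 1 <= wds.window_width <= video_width - wds.window_horizontal_position
            and 1 <= wds.window_height <= video_height - wds.window_vertical_position)

# Example: a 700x500 Window with its top left pixel at (100, 100) on a
# 1920x1080 plane, like the Windows used in the Display Set examples later on.
print(wds_is_valid(Wds(100, 100, 700, 500), 1920, 1080))   # True
```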
The position and size of the Window on the Graphics Plane for each Epoch are defined by the window_horizontal_position, window_vertical_position, window_width, and window_height. Accordingly, it is possible to adjust the position and size of the Window at authoring, so that the Window in one Epoch appears at a position that does not come in the way of the picture when viewing the film. By this, the visibility of the subtitles becomes higher. Because the WDS is defined for each Epoch, it is possible to adjust the position of the Window according to the picture, even if the picture changes in the course of time. As a result, the quality of the film is maintained as high as in a case where the subtitles are incorporated in the main body of the film.
Next, the End of Display Set Segment (END) is explained. The END provides an indication that a transmission of the DS is completed. The END is inserted into a stream immediately after a last ODS in one DS. The END is made of segment_type indicating that the segment is the END and segment_length indicating a data length of the END. The END does not include any other element that requires a further explanation. Next, an explanation about the Presentation Composition Segment (PCS) is given below.
The PCS is a functional segment that is used for composing an interactive display. FIG.8B illustrates syntax of the PCS. As shown in the drawing, the PCS is made of segment_type, segment_length, composition_number, composition_state, palette_update_flag, palette_id, and window information 1-m.
The composition_number identifies the Graphics Update in the DS by values in a range of 0 to 15. If the Graphics Update exists between the head of the Epoch and the PCS, the composition_number is incremented every time the Graphics Update occurs.
The composition_state indicates the type of the DS in which the PCS is contained: Normal Case, Acquisition Point, or Epoch Start. The palette_update_flag indicates that the PCS describes a Palette only Display Update. The Palette only Display Update indicates that only the palette is updated from an immediately previous palette. The palette_update_flag field is set to "1" if the Palette only Display Update is performed. The palette_id identifies the palette to be used in the Palette only Display Update.
The window information 1-m indicates how to control each Window in the DS to which the PCS belongs. A broken line wd1 in FIG.8B is to detail an internal syntax for window information i. The window information i is made of object_id, window_id, object_cropped_flag, object_horizontal_position, object_vertical_position, and cropping_rectangle information 1-n.
The object_id identifies the ODS in a Window corresponding to the window information i.
The window_id identifies the Window to which the Graphics Object is allocated in the PCS. Up to two Graphics Objects may be assigned to one Window.
The object_cropped_flag is used to switch between display and no-display of a cropped Graphics Object in the Object Buffer. When the object_cropped_flag is set to "1", the cropped Graphics Object is displayed in the Object Buffer, and if set to "0", the Graphics Object is not displayed.
The object_horizontal_position specifies a horizontal address of a top left pixel of the Graphics Object in the Graphics Plane.
The object_vertical_position specifies a vertical address of the top left pixel of the Graphics Object in the Graphics Plane. The cropping_rectangle information 1-n are elements used when the object_cropped_flag is set to "1". A broken line wd2 is to detail an internal syntax for cropping_rectangle information i. As shown by the broken line wd2, the cropping_rectangle information i is made of four fields: object_cropping_horizontal_position, object_cropping_vertical_position, object_cropping_width, and object_cropping_height.
The object_cropping_horizontal_position specifies a horizontal address of a top left corner of a cropping rectangle to be used during rendering of the Graphics Object in the Graphics Plane. The cropping rectangle is a cropping frame that is used to specify and crop a part of the Graphics Object, and corresponds to the Region in the ETSI EN 300 743 standard.
The object_cropping_vertical_position specifies a vertical address of the top left corner of the cropping rectangle to be used during rendering of the Graphics Object in the Graphics Plane.
The object_cropping_width specifies a width of the cropping rectangle.
The object_cropping_height specifies a height of the cropping rectangle. A specific example of the PCS is detailed below. In the example, the subtitles "Actually...", "I was hiding", and "my feelings." as shown in FIG.6 appear gradually by writing to the Graphics Plane 3 times as the picture proceeds. FIG.9 is an example of a description for realizing such a subtitle display. An Epoch in the drawing includes a DS1 (Epoch Start), a DS2 (Normal Case), and a DS3 (Normal Case). The DS1 contains a WDS for specifying the Window in which the subtitles are displayed, an ODS for specifying the line "Actually... I was hiding my feelings.", and a first PCS. The DS2 contains a second PCS, and the DS3 contains a third PCS.
FIGs.10-12 illustrate examples of the WDS and PCS contained in the DS. FIG.10 shows an example of the PCS in the DS1.
In FIG.10, the window_horizontal_position and the window_vertical_position of the WDS are indicated by LP1, a position of the top left pixel of the Window on the Graphics Plane. The window_width and window_height indicate the width and height of the Window, respectively.
In FIG.10, the object_cropping_horizontal_position and object_cropping_vertical_position indicate a reference point ST1 of the cropping rectangle in the coordinate system in which an origin is the top left pixel of the Graphics Object. The cropping rectangle is an area having the width from the ST1 to the object_cropping_width, and the height from the ST1 to the object_cropping_height (a rectangle shown by a heavy-line frame). The cropped Graphics Object is positioned within a rectangle shown by a broken-line frame cp1, with a reference point in the coordinate system with an origin at the object_horizontal_position and object_vertical_position (the top left pixel of the Graphics Object) in the Graphics Plane. By this, the subtitle "Actually..." is written to the Window on the Graphics Plane, and then composed with the movie picture and displayed on the screen.
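The crop-and-place operation just described can be sketched as follows. The 2-D pixel lists and parameter names are illustrative stand-ins for the PCS fields (object_cropping_horizontal_position, object_cropping_vertical_position, object_cropping_width, object_cropping_height, object_horizontal_position, object_vertical_position), not a definitive implementation.

```python
# Hypothetical sketch of rendering a cropped Graphics Object: a cropping
# rectangle selects part of the decompressed object in the Object Buffer, and
# the cropped part is written at (object_horizontal_position,
# object_vertical_position) on the Graphics Plane.
def render_cropped_object(obj_pixels, plane,
                          cropping_horizontal_position, cropping_vertical_position,
                          cropping_width, cropping_height,
                          horizontal_position, vertical_position):
    for row in range(cropping_height):
        for col in range(cropping_width):
            plane[vertical_position + row][horizontal_position + col] = \
                obj_pixels[cropping_vertical_position + row][cropping_horizontal_position + col]
```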
FIG.11 shows an example of the PCS in the DS2. The WDS in the DS2 is not explained, because the WDS in the DS2 is the same as the WDS in the DS1. A description of the cropping information in the DS2 is different from the description of the cropping information shown in FIG.10.
In FIG.11, the object_cropping_horizontal_position and object_cropping_vertical_position in the cropping information indicate a top left pixel of the subtitle "I was hiding" out of "Actually... I was hiding my feelings." in the Object Buffer. The object_cropping_width and object_cropping_height indicate a width and a height of a rectangle containing the subtitle "I was hiding". By this, the subtitle "I was hiding" is written to the Window on the Graphics Plane, and then composed with the movie picture and displayed on the screen.
FIG.12 shows an example of the PCS in the DS3. The WDS in the DS3 is not explained, because the WDS in the DS3 is the same as the WDS in the DS1. A description of the cropping information in the DS3 is different from the description of the cropping information shown in FIG.10.
In FIG.12, the object_cropping_horizontal_position and object_cropping_vertical_position in the cropping information indicate a top left pixel of the subtitle "my feelings." out of "Actually... I was hiding my feelings." in the Object Buffer. The object_cropping_width and object_cropping_height indicate a width and a height of a rectangle containing the subtitle "my feelings.". By this, the subtitle "my feelings." is written to the Window on the Graphics Plane, and then composed with the movie picture and displayed on the screen. By describing the DS1, DS2, and DS3 as explained above, it is possible to achieve an effect of displaying the subtitles on the screen. It is also possible to achieve other kinds of effects, and description protocols for realizing other effects are explained below. First, a description protocol for a Cut-In/Out effect is explained. FIG.13 shows an example of the description of the DS when Cut-In/Out is performed, illustrating along a timeline.
In the drawing, x and y in Window(x, y, u, v) respectively indicate values of the window_vertical_position and window_horizontal_position, and u and v respectively indicate values of the window_width and window_height. Also in the drawing, a and b in Cropping Rectangle(a, b, c, d) respectively indicate values of the object_cropping_vertical_position and object_cropping_horizontal_position, and c and d indicate values of the object_cropping_width and object_cropping_height, respectively. Display Sets DS11, DS12, and DS13 are at points t11, t12, and t13 on the reproduction timeline in the drawing.
The DS11 at the point t11 includes a PCS#0 in which the composition_state is "Epoch Start" and the object_cropped_flag is "0" (no_cropping_rectangle_visible), a WDS#0 having a statement for a Window in a width 700 x height 500 at (100,100) in the Graphics Plane, a PDS#0, an ODS#0 indicating a subtitle "Credits:", and an END.
The DS12 at the point t12 includes a PCS#1 whose composition_state is "Normal Case" and indicating a crop operation of the Graphics Object to be in a 600x400 size from (0,0) in the Object Buffer (cropping_rectangle#0(0,0,600,400)), and positioning the cropped Graphics Object at the coordinates (0,0) in the Graphics Plane (on Window#0(0,0)). The DS13 at the point t13 includes a PCS#2 whose composition_state is "Normal Case" and in which the object_cropped_flag is set to "0" so as to erase the cropped Graphics Object (no_cropping_rectangle_visible).
With the above explained Display Sets, the subtitle "Credits:" is no-display at the t11, appears at the t12, then becomes no-display at the t13 again, and the Cut-In/Out effect is realized.
Secondly, a description protocol for a Fade-In/Out effect is explained. FIG.14 shows an example of the description of the DS when Fade-In/Out is performed, illustrating along a timeline.
Display Sets DS21, DS22, DS23, and DS24 are at points t21, t22, t23, and t24 on the reproduction timeline in the drawing.
The DS21 at the point t21 includes a PCS#0 whose composition_state is "Epoch Start" and indicating the crop operation of the Graphics Object to be in a 600x400 size from (0,0) in the Object Buffer (cropping_rectangle#0(0,0,600,400)), and positioning the cropped Graphics Object at the coordinates (0,0) in the Graphics Plane (on Window#0(0,0)), a WDS#0 having a statement for a Window in a width 700 x height 500 at (100,100) in the Graphics Plane, a PDS#0, an ODS#0 indicating a subtitle "Fin", and an END.
The DS22 at the point t22 includes a PCS#1 whose composition_state is "Normal Case", and a PDS#1. The PDS#1 indicates the same level of Cr and Cb as the PDS#0, but a luminance indicated by the PDS#1 is higher than the luminance in the PDS#0. The DS23 at the point t23 includes a PCS#2 whose composition_state is "Normal Case", a PDS#2, and an END. The PDS#2 indicates the same level of Cr and Cb as the PDS#1, but the luminance indicated by the PDS#2 is lower than the luminance in the PDS#1.
The DS24 at the point t24 includes a PCS#3 whose composition_state is "Normal Case" and in which the object_cropped_flag is "0" (no_cropping_rectangle_visible), and an END.
Each DS specifies a different PDS from a preceding DS, and accordingly, the luminance of the Graphics Object that is rendered with more than one PCS in one Epoch becomes gradually higher or lower. By this, it is possible to realize the effect of Fade-In/Out.
Next, a description protocol for a Scrolling is explained.
FIG.15 shows an example of the description of the DS when Scrolling is performed, illustrating along a timeline. Display Sets DS31,
DS32, DS33, and DS34 are at points t31, t32, t33, and t34 on the reproduction timeline in the drawing.
The DS31 at the point t31 includes a PCS#0 whose composition_state is set to "Epoch Start" and object_cropped_flag is "0" (no_cropping_rectangle_visible), a WDS#0 having a statement for a Window in a width 700 x height 500 at (100,100) in the Graphics Plane, a PDS#0, an ODS#0 indicating a subtitle "Credits: Company", and an END.
The DS32 at the point t32 includes a PCS#1 whose composition_state is "Normal Case" and indicating the crop operation of the Graphics Object to be in a 600x400 size from (0,0) in the Object Buffer (cropping_rectangle#0(0,0,600,400)), and positioning the cropped Graphics Object at the coordinates (0,0) in the Graphics Plane (on Window#0(0,0)). An area of the 600x400 size from (0,0) in the Object Buffer includes a part "Credits:" of the subtitle "Credits: Company" shown in two lines, and thus the part "Credits:" appears on the Graphics Plane.
The DS33 at the point t33 includes a PCS#2 whose composition_state is "Normal Case" and indicating the crop operation of the Graphics Object to be in a 600x400 size from (0,100) in the Object Buffer (cropping_rectangle#0(0,100,600,400)), and positioning the cropped Graphics Object at the coordinates (0,0) in the Graphics Plane (on Window#0(0,0)). The area of the 600x400 size from (0,100) in the Object Buffer includes the part "Credits:" and a part "Company" of the subtitle "Credits: Company" shown in two lines, and thus the parts "Credits:" and "Company" appear in two lines on the Graphics Plane.
The DS34 at the point t34 includes a PCS#3 whose composition_state is "Normal Case" and indicating the crop operation of the Graphics Object to be in a 600x400 size from (0,200) in the Object Buffer (cropping_rectangle#0(0,200,600,400)), and positioning the cropped Graphics Object at the coordinates (0,0) in the Graphics Plane (on Window#0(0,0)). The area of the 600x400 size from (0,200) in the Object Buffer includes the part "Company" of the subtitle "Credits: Company" shown in two lines, and thus the part "Company" appears on the Graphics Plane. By the above PCS description, it is possible to scroll down the subtitle in two lines.
Finally, a description protocol for a Wipe-In/Out effect is explained. FIG.16 shows an example of the description of the DS when Wipe-In/Out is performed, illustrating along a timeline. Display Sets DS51, DS52, DS53, and DS54 are at points t51, t52, t53, and t54 on the reproduction timeline in the drawing.
The DS51 at the point t51 includes a PCS#0 whose composition_state is set to "Epoch Start" and the object_cropped_flag is "0" (no_cropping_rectangle_visible), a WDS#0 having a statement for a Window in a width 700 x height 500 at (100,100) in the Graphics Plane, a PDS#0, an ODS#0 indicating a subtitle "Fin", and an END.
The DS52 at the point t52 includes a PCS#1 whose composition_state is "Normal Case" and indicating the crop operation of the Graphics Object to be in a 600x400 size from (0,0) in the Object Buffer (cropping_rectangle#0(0,0,600,400)), and positioning the cropped Graphics Object at the coordinates (0,0) in the Graphics Plane (on Window#0(0,0)). An area of the 600x400 size from (0,0) in the Object Buffer includes the subtitle "Fin", and thus the subtitle "Fin" appears on the Graphics Plane.
The DS53 at the point t53 includes a PCS#2 whose composition_state is "Normal Case" and indicating the crop operation of the Graphics Object to be in a 400x400 size from (200,0) in the Object Buffer (cropping_rectangle#0(200,0,400,400)), and positioning the cropped Graphics Object at the coordinates (200,0) in the Graphics Plane (on Window#0(200,0)). By this, an area indicated by coordinates (200,0) and (400,400) in the Window becomes a display area, and an area indicated by coordinates (0,0) and (199,400) becomes a no-display area. The DS54 at the point t54 includes a PCS#3 whose composition_state is "Normal Case" and indicating the crop operation of the Graphics Object to be in a 200x400 size from (400,0) in the Object Buffer (cropping_rectangle#0(400,0,200,400)), and positioning the cropped Graphics Object at the coordinates (400,0) in the Graphics Plane (on Window#0(400,0)). By this, an area indicated by coordinates (0,0) and (399,400) becomes the no-display area.
By this, as the no-display area becomes larger, the display area becomes smaller, and thus the Wipe-In/Out effect is realized. As described above, various effects such as Cut-In/Out, Fade-In/Out, Wipe-In/Out, and Scrolling may be realized using corresponding scripts, and therefore it is possible to make various arrangements in rendering the subtitles.
Constraints for realizing the above effects are as follows. In order to realize the Scrolling effect, operations for clearing and redrawing of the Window become necessary. Taking the example of FIG.15, it is necessary to perform a "window clear" to erase the Graphics Object "Credits:" at the t32 from the Graphics Plane, and then to perform a "window redraw" to write a lower part of "Credits:" and an upper part of "Company" to the Graphics Plane during an interval between the t32 and t33. Given that the interval is the same as an interval of video frames, the transfer rate between the Object Buffer and the Graphics Plane desirable for the Scrolling effect becomes an important point. Here, a constraint about how large the Window may be is looked into. The Rc is the transfer rate between the Object Buffer and the Graphics Plane. A worst scenario here is to perform both the Window clear and the Window redraw at the rate Rc. In this case, each of the Window clear and the Window redraw is required to be performed at a rate half of Rc (Rc/2).
In order to make the Window clear and Window redraw synchronized with a video frame, the following equation needs to be satisfied.
Window size x Frame Rate ≤ Rc/2
If the Frame Rate is 29.97, Rc is expressed by the following equation.
Rc = Window size x 2 x 29.97
In rendering the subtitles, the Window size accounts for at least 25% to 33% of the Graphics Plane. A total number of pixels in the Graphics Plane is 1920x1080. Taking that an index bit length per pixel is 8 bits, a total capacity of the Graphics Plane is 2 Mbytes (=1920x1080x8 bits).
Taking that the Window size is 1/4 of the total capacity of the Graphics Plane, the Window size becomes 500 Kbytes (=2 Mbytes/4). By substituting this value into the above equation, Rc is calculated to be 256 Mbps (=500 Kbytes x 2 x 29.97). If the rate for the Window clear and Window redraw may be a half or a quarter of the frame rate, it is possible to double or quadruple the size of the Window even if the Rc is the same. By keeping the Window size 25% to 33% of the Graphics Plane and displaying the subtitles at the transfer rate of 256 Mbps, it is possible to maintain the sync display between the graphics and the movie picture, no matter what kind of display effect is to be realized. Next, the position, size, and area of the Window are explained. As explained above, the position and area of the Window do not change in one Epoch. The position and the size of the Window are set to be the same during one Epoch because it is necessary to change a target write address of the Graphics Plane if the position and the size change, and changing the address causes an overhead that lowers the transfer rate from the Object Buffer to the Graphics Plane.
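Returning to the transfer-rate calculation above, the arithmetic can be checked with the short sketch below. It is illustrative only, and the small gap between the computed value and 256 Mbps reflects the rounding the text itself applies.

```python
# Worked check of the Window-size / transfer-rate relation above.
PLANE_BYTES = 1920 * 1080            # 8 bits per pixel, about 2 Mbytes
WINDOW_BYTES = PLANE_BYTES // 4      # Window taken as 1/4 of the Graphics Plane
FRAME_RATE = 29.97

# Window clear and Window redraw each run at Rc/2, so Rc = Window size x 2 x 29.97.
rc_bits_per_s = WINDOW_BYTES * 2 * FRAME_RATE * 8
print(f"Rc = {rc_bits_per_s / 1e6:.0f} Mbps")
# Prints about 249 Mbps; the text rounds this up to the fixed figure of
# 256 Mbps (256,000,000), which is the value used in the duration formulas below.
```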
A number of Graphics Objects per Window has a limitation. The limitation of the number is provided in order to reduce the overhead in transferring decoded Graphics Objects. The overhead here is generated when setting the address of an edge of the Graphics Object, and the more the number of edges, the more overhead is generated.
FIG.17 shows examples in comparison: an example in which a Window has four Graphics Objects and another example in which a Window has two Graphics Objects. The number of the edges of the example with four Graphics Objects is twice the number of the edges of the example with two Graphics Objects.
Without the limitation in the number of the Graphics Objects, it becomes unknown how many overheads could be generated in transferring the graphics, and thus the load for the transfer increases and decreases drastically. On the other hand, when a maximum number of the Graphics Objects in a Window is two, the transfer rate may be set taking up to 4 overheads into account. Accordingly, it is easier to set a minimum transfer rate.
Next, an explanation is given about how the DS having the PCS and ODS is assigned to the timeline of the AVClip. The Epoch is a period of time in which the memory management is consecutive along the reproduction timeline. Since the Epoch is made of more than one DS, how to assign the DS to the reproduction timeline of the AVClip is important. The reproduction timeline of the AVClip is a timeline for specifying timings for decoding and reproducing each piece of picture data that constitutes the video stream multiplexed to the AVClip. The decoding and reproducing timings on the reproduction timeline are expressed at an accuracy of 90 KHz. A DTS and PTS that are attached to the PCS and ODS in the DS indicate timings for a synchronic control on the reproduction timeline. The assigning of the Display Set to the reproduction timeline means performing the synchronic control using the DTS and PTS attached to the PCS and ODS.
First, how the synchronic control is performed using the DTS and PTS attached to the ODS is explained below.
The DTS indicates, at the accuracy of 90 KHz, a time when the decoding of the ODS starts, and the PTS indicates a time when the decoding ends .
The decoding of the ODS does not finish at once, and has a certain length of time. In response to a request for clearly indicating a starting point and an ending point of a decode duration, the DTS and PTS of the ODS respectively indicate the times when the decoding starts and ends. The value of the PTS indicates the deadline, and therefore the decoding of the ODS has to be completed by the time indicated by the PTS and the decompressed Graphics Object has to be written to the Object Buffer on the reproduction apparatus.
The decode starting time of any ODSj in a DSn is indicated by DTS(DSn[ODSj]) at the accuracy of 90 KHz. Adding the maximum length of the decode duration to the DTS(DSn[ODSj]) gives the time when the decoding of the ODSj ends. When a size of the ODSj is SIZE(DSn[ODSj]) and a decoding rate of the ODS is Rd, the maximum time required for decoding, expressed in seconds, is SIZE(DSn[ODSj])//Rd. The symbol "//" indicates an operator for a division with rounding up after a decimal place. By converting the maximum time period into a number expressed at the accuracy of 90 KHz and adding it to the DTS of the ODSj, the time when the decoding ends (90 KHz) indicated by the PTS is calculated.
The PTS of the ODSj in the DSn is expressed by the following equation.
PTS(DSn[ODSj]) = DTS(DSn[ODSj]) + 90,000 x (SIZE(DSn[ODSj])//Rd)
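A minimal sketch of this PTS setting, implemented literally as the equation is written, follows. The decoding rate Rd is an assumed illustrative value, and "//" is the rounding-up division defined in the text.

```python
# Minimal sketch of PTS(DSn[ODSj]) = DTS(DSn[ODSj]) + 90,000 x (SIZE // Rd).
import math

RD = 64_000_000   # assumed decoding rate Rd (illustrative value only)

def ods_pts(dts_90khz: int, ods_size: int, rd: int = RD) -> int:
    """Decode end time of an ODS at 90 KHz accuracy, per the equation above.

    ods_size is SIZE(DSn[ODSj]), expressed in the same units as Rd.
    """
    decode_seconds = math.ceil(ods_size / rd)   # "//": division rounded up
    return dts_90khz + 90_000 * decode_seconds
```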
Further, it is necessary that a relation between two succeeding ODSs, ODSj and ODSj+1, satisfies the following equation.
PTS(DSn[ODSj]) ≤ DTS(DSn[ODSj+1])
Next, settings of the DTS and PTS of the PCS are explained.
It is necessary that the PCS is loaded to the Object Buffer on the reproduction apparatus before the decode starting time (DTS(DSn[ODS1])) of a first ODS (ODS1) in the DSn, and before the time (PTS(DSn[PDS1])) when a first PDS (PDS1) in the DSn becomes effective. Accordingly, it is necessary that the DTS is set so as to satisfy the following equations.
DTS(DSn[PCS]) ≤ DTS(DSn[ODS1])
DTS(DSn[PCS]) ≤ PTS(DSn[PDS1])
Further, the PTS of the PCS in the DSn is expressed by the following equation.
PTS(DSn[PCS]) ≥ DTS(DSn[PCS]) + decodeduration(DSn)
The "decodeduration (DSn) " indicates a time duration for decoding all the Graphics Objects used for updating PCS . The decode duration is not a fixed value, but does not vary according to a status of the reproduction apparatus and a device or a software mounted to the reproduction apparatus. When the Object used for composing a screen of a DSn.PCSn is a DSn . PCSn . OBJ [j ] , the decodeduration (DSn) is affected by time (i) needed for clearing the Window, decode durations (ii) for decoding a DSn. PCSn. OBJ, and time (iii) needed for writing of the DSn. PCSn . OBJ. When the Rd and Re are set, the decode_duraticn (DSn) is always the same. Therefore, the PTS is calculated by calculating lengths of these durations in authoring.
The calculation of the decode_duration is performed based on a program shown in FIG.18. FIGs.19, 20A and 20B are flowcharts schematically showing algorithms of the program. An explanation about the calculation of the decode_duration is given below referring to these drawings. In the flowchart shown in FIG.19, first, a PLANEINITIALIZE function is called (Step S1 in FIG.19). The PLANEINITIALIZE function is used for calling a function for calculating a time period necessary to initialize the Graphics Plane for rendering the DS. In the Step S1 in FIG.19, the function is called with arguments DSn, DSn.PCS.OBJ[0], and decode_duration.
The following explains the PLANEINITIALIZE function in reference to FIG.20A. In the drawing, initialize_duration is a variable indicating a return value of the PLANEINITIALIZE function. Step S2 in FIG.20A is an if statement for switching operations depending on whether or not the page_state in the PCS in the DSn indicates the Epoch Start. If the page_state indicates the Epoch Start (DSn.PCS.page_state==epoch_start, Step S2=Yes in FIG.18), a time period necessary to clear the Graphics Plane is set to the initialize_duration (Step S3).
When the transfer rate Rc between the Object Buffer and the Graphics Plane is 256,000,000 as described in the above, and the total size of the Graphics Plane is set to video_width*video_height, the time period necessary to clear is "video_width*video_height//256,000,000". When multiplied by 90,000 Hz so as to express it at the time accuracy of the PTS, the time period necessary to clear the Graphics Plane is "90,000 x video_width*video_height//256,000,000". This time period is added to the initialize_duration.
If the page_state does not indicate the Epoch Start (Step S2=No), a time period necessary to clear Window[i] defined by the WDS is added to the initialize_duration for all Windows (Step S4). When the transfer rate Rc between the Object Buffer and the Graphics Plane is 256,000,000 as described in the above and the total size of Window[i] that belongs to the WDS is ΣSIZE(WDS.WIN[i]), the time period necessary to clear is "ΣSIZE(WDS.WIN[i])//256,000,000". When multiplied by 90,000 Hz so as to express it at the time accuracy of the PTS, the time period necessary to clear the Windows that belong to the WDS is "90,000 x ΣSIZE(WDS.WIN[i])//256,000,000". This time period is added to the initialize_duration, and the initialize_duration as a result is returned. The above is the PLANEINITIALIZE function. Step S5 in FIG.19 switches operations depending on whether the number of the Graphics Objects in the DSn is 2 or 1 (if(DSn.PCS.num_of_object==2), if(DSn.PCS.num_of_object==1) in FIG.18), and if the number is 1 (Step S5), a waiting time for decoding the Graphics Object is added to the decode_duration (Step S6). Calculation of the waiting time is performed by calling a WAIT function (decode_duration += WAIT(DSn, DSn.PCS.OBJ[0], decode_duration) in FIG.18). The function is called using arguments set to DSn, DSn.PCS.OBJ[0], and decode_duration, and the return value is wait_duration. FIG.20B is a flowchart showing an operation of the WAIT function.
In the flowchart, the decode_duration of the invoker is set as a current_duration. An object_definition_ready_time is a variable set to the PTS of the Graphics Object of the DS. A current_time is a variable set to the total value of the current_duration and the DTS of the PCS in the DSn. When the object_definition_ready_time is larger than the current_time (Yes to Step S7, if(current_time < object_definition_ready_time)), the wait_duration as the return value is set to the difference between the object_definition_ready_time and the current_time (Step S8, wait_duration = object_definition_ready_time - current_time). The decode_duration is set to the time period obtained by adding the return value of the WAIT function to the time period necessary for re-drawing the Window (90,000*(SIZE(DSn.WDS.WIN[0]))//256,000,000).
The above explanation is for the case in which the number of the Graphics Objects is one. Step S5 in FIG.19 judges whether the number of the Graphics Objects is two. If the number of the Graphics Objects in the DSn is two (if(DSn.PCS.num_of_object==2) in FIG.18), the WAIT function is called using OBJ[0] in the PCS as an argument, and the return value is added to the decode_duration (Step S10).
In a succeeding Step S11, it is judged whether the Window to which OBJ[0] of the DSn belongs is the same as the Window to which the Graphics Object[1] belongs (if(DSn.PCS.OBJ[0].window_id==DSn.PCS.OBJ[1].window_id)). If the Window is the same, the WAIT function is called using OBJ[1] as an argument and the return value wait_duration is added to the decode_duration (Step S12), and the time necessary to redraw the Window to which OBJ[0] belongs (90,000*(SIZE(DSn.WDS.OBJ[0].window_id))//256,000,000) is added to the decode_duration (Step S13).
If it is judged that the Windows are different (Step S11, "different"), the time necessary to redraw the Window to which OBJ[0] belongs (90,000*(SIZE(DSn.WDS.OBJ[0].window_id))//256,000,000) is added to the decode_duration (Step S15), then the WAIT function is called using OBJ[1] as an argument and the return value wait_duration is added to the decode_duration (Step S16), and the time necessary to redraw the Window to which OBJ[1] belongs (90,000*(SIZE(DSn.WDS.OBJ[1].window_id))//256,000,000) is added to the decode_duration (Step S17).
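To make the above algorithm easier to follow, a minimal sketch of the calculation is given below in Python. It assumes that each Display Set is supplied as a simple record carrying its page_state, the DTS of its PCS, and its Graphics Objects with their PTS values (all time values in 90 kHz ticks), and that sizes are expressed in the same unit basis as the rate 256,000,000 used in the formulas above; the names ds, plane_size, window_sizes, and window_size_of are illustrative and not taken from the specification.

RC = 256_000_000  # transfer rate between the Object Buffer and the Graphics Plane

def plane_initialize(ds, plane_size, window_sizes):
    # Time to clear the whole plane (Epoch Start) or all Windows (otherwise).
    if ds["page_state"] == "epoch_start":
        return 90_000 * plane_size // RC
    return sum(90_000 * size // RC for size in window_sizes)

def wait(ds, obj, current_duration):
    # Wait until the Graphics Object has been decoded (its PTS), if necessary.
    current_time = ds["pcs_dts"] + current_duration
    return max(0, obj["pts"] - current_time)

def decode_duration(ds, plane_size, window_sizes, window_size_of):
    # window_size_of(obj) returns the size of the Window the object belongs to.
    d = plane_initialize(ds, plane_size, window_sizes)
    objs = ds["objects"]
    if len(objs) == 1:
        d += wait(ds, objs[0], d)
        d += 90_000 * window_size_of(objs[0]) // RC
    elif len(objs) == 2:
        d += wait(ds, objs[0], d)
        if objs[0]["window_id"] == objs[1]["window_id"]:
            d += wait(ds, objs[1], d)
            d += 90_000 * window_size_of(objs[0]) // RC
        else:
            d += 90_000 * window_size_of(objs[0]) // RC
            d += wait(ds, objs[1], d)
            d += 90_000 * window_size_of(objs[1]) // RC
    return d

The two-object branch mirrors Steps S10-S17: when both objects share a Window the Window is re-drawn only once, whereas with different Windows each Window is re-drawn in turn.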
The decode_duration is calculated by the above algorithm. A specific manner in which the PTS of the ODS is set is explained below.
FIG.21A illustrates a case in which one ODS is included in one Window. FIGs.21B and 21C are timing charts showing, in time order, the values that are referred to in FIG.18. A bottom line "ODS Decode" and a middle line "Graphics Plane Access" in each chart indicate two operations that are performed simultaneously when reproducing. The above algorithm is described assuming that these two operations are performed in parallel. The Graphics Plane Access includes a clearing period (1) and a write period (3). The clearing period (1) indicates either the time period necessary to clear the entire Graphics Plane (90,000×(size of the Graphics Plane//256,000,000)), or the time period necessary to clear all Windows on the Graphics Plane (Σ(90,000×(size of Window[i]//256,000,000))).
The write period (3) indicates the time period necessary to render an entire Window (90,000×(size of Window[i]//256,000,000)). Further, the decode period (2) indicates the time period between the DTS and the PTS of the ODS.
The lengths of the clearing period (1), the decode period (2), and the write period (3) may vary depending on the range to be cleared, the size of the ODS to be decoded, and the size of the Graphics Object to be written to the Graphics Plane. For convenience, the starting point of the decode period (2) in the drawing is the same as the starting point of the clearing period (1).
FIG.21B illustrates a case in which the decode period (2) is long, and the decode_duration equals the total of the decode period (2) and the write period (3).
FIG.21C illustrates a case in which the clearing period (1) is long, and the decode_duration equals the total of the clearing period (1) and the write period (3).
FIGs.22A to 22C illustrate a case in which two ODS are included in one Window. The decode period (2) in both FIGs.22B and 22C indicates the total time period necessary for decoding the two Graphics.
Likewise, the write period (3) indicates a total time period necessary for writing two Graphics to the Graphics Plane.
Even though the number of ODS is two, it is possible to calculate the decode_duration in the same manner as in the case of FIG.21. When the decode period (2) for decoding the two ODS is long, the decode_duration equals the total of the decode period (2) and the write period (3), as shown in FIG.22B. When the clearing period (1) is long, the decode_duration equals the total of the clearing period (1) and the write period (3).
FIG.23A describes a case in which each of two Windows includes an ODS. As in the previous cases, when the clearing period (1) is longer than the decode period (2) for decoding the two ODS, the decode_duration equals the total of the clearing period (1) and the write period (3). However, when the clearing period (1) is shorter than the decode period (2), it is possible to write to the first Window before the decode period (2) ends. Accordingly, the decode_duration equals neither the total of the clearing period (1) and the write period (3), nor the total of the decode period (2) and the write period (3).
When the time period necessary for writing a first ODS is a write period (31) and the time period necessary for writing a second ODS is a write period (32), FIG.23B illustrates a case in which the decode period (2) is longer than the total of the clearing period (1) and the write period (31). In this case, the decode_duration equals the total of the decode period (2) and the write period (32).
FIG.23C illustrates a case in which the total of the clearing period (1) and the write period (31) is longer than the decode period (2). In this case, the decode_duration equals the total of the clearing period (1), the write period (31), and the write period (32).
The size of the Graphics Plane is known in advance from the model of the reproduction apparatus. The size of the Window, and the size and number of the ODS, are also known at authoring.
Accordingly, it is possible to find which combination of time periods the decode_duration equals: the clearing period (1) and the write period (3); the decode period (2) and the write period (3); the decode period (2) and the write period (32); or the clearing period (1), the write period (31), and the write period (32).
By setting the PTS of the ODS based on the calculation of the decode_duration explained above, it is possible to display the graphics synchronously with the picture data at high accuracy. Such highly accurate synchronized display becomes possible by defining the Window and limiting the area to be re-drawn to the Window.
Thus, introducing the concept of a Window into the authoring environment has great significance.
The following is an explanation about the settings of the DTS and PTS of the WDS in the DSn. The DTS of the WDS may be set so as to satisfy the formula below.
DTS(DSn[WDS]) > DTS(DSn[PCS])
On the other hand, the PTS of the WDS in the DSn indicates the deadline to start writing to the Graphics Plane. Because it is sufficient to write only to the Window on the Graphics Plane, the time to start writing to the Graphics Plane is determined by subtracting the time period necessary for writing the WDS from the time indicated by the PTS of the PCS. When the total size of the WDS is ΣSIZE(WDS.WIN[i]), the time necessary for clearing and re-drawing is "ΣSIZE(WDS.WIN[i])//256,000,000". When expressed at the time accuracy of 90,000 Hz, the time is "90,000×ΣSIZE(WDS.WIN[i])//256,000,000".
Accordingly, it is possible to calculate the PTS of the WDS by the following equation.
PTS(DSn[WDS]) = PTS(DSn[PCS]) - 90,000×ΣSIZE(WDS.WIN[i])//256,000,000
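As a concrete illustration of this equation, the following Python sketch computes the PTS of the WDS; the time stamps are in 90 kHz ticks, and the window sizes are assumed to be expressed in the same unit basis as the rate 256,000,000, since the description divides a size directly by that figure.

RC = 256_000_000

def wds_pts(pcs_pts, window_sizes):
    # PTS(DSn[WDS]) = PTS(DSn[PCS]) - 90,000 * sum of SIZE(WDS.WIN[i]) // Rc
    return pcs_pts - 90_000 * sum(window_sizes) // RC

# e.g. wds_pts(pcs_pts=900_000, window_sizes=[4_000_000]) -> 898_594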
The PTS indicated in the WDS is the deadline, and it is possible to start writing to the Graphics Plane earlier than the PTS. In other words, as shown in FIG.23, once the ODS to be rendered in one of the Windows has been decoded, the writing of the Graphics Object obtained by the decoding may start at that point.
As described above, it is possible to assign the Window to any point of time on the reproduction timeline of the AVClip using the DTS and PTS added to the WDS. An example of settings of the DTS and PTS in a Display Set based on these settings is explained below, referring to the specific example illustrated in FIGs.24-25. The example is a case in which subtitles are displayed by writing to the Graphics Plane four times, and an update is performed for displaying each of two subtitles, "what is blu-ray." and "blu-ray is everywhere.". FIG.24 illustrates the shifts in time of the update in the example. Until a point t1, "what" is displayed; "what is" is displayed after t1 until t2; and then "what is blu-ray." is displayed at t3. After the whole sentence of the first subtitle has appeared, the second subtitle "blu-ray is everywhere." is displayed at t4.
FIG.25A illustrates the four Display Sets that are described so as to perform the above explained update. A DS1 includes a PCS1.1 for controlling an update at t1, a PDS1 for coloring, an ODS1 corresponding to the subtitle "what is blu-ray.", and an END as an ending code of the DS1.
A DS2 includes a PCS1.2 for controlling an update at t2 and an END. A DS3 includes a PCS1.3 for controlling an update at t3 and an END. A DS4 includes a PCS2 for controlling an update at t4, a PDS2 for color conversion, an ODS2 corresponding to the subtitle "blu-ray is everywhere.", and an END.
Referring to the timing chart in FIG.25B, the settings of the DTS and PTS for each functional segment in the four Display Sets are explained. The reproduction timeline in the timing chart is the same as the timeline in FIG.24. In the timing chart of FIG.25B, PTS(PCS1.1), PTS(PCS1.2), PTS(PCS1.3), and PTS(PCS2) are respectively set at the display point t1 for displaying "what", the display point t2 for displaying "what is", the display point t3 for displaying "what is blu-ray.", and the display point t4 for displaying "blu-ray is everywhere.". Each PTS is set as above because it is necessary that the control, such as the cropping described in each PCS, is performed at the display point of each subtitle.
PTS(ODS1) and PTS(ODS2) are set so as to indicate points that are calculated by subtracting the decode_duration from the points indicated by PTS(PCS1.1) and PTS(PCS2), respectively, because PTS(PCS) is required to be set so as to satisfy the formula below.
PTS(DSn[PCS]) ≥ DTS(DSn[PCS]) + decode_duration(DSn)
In FIG.25B, PTS(ODS2) is set so as to indicate a point t5 that comes before the point t4, and PTS(ODS1) is set so as to indicate a point t0 that comes before the point t1.
DTS(ODS1) and DTS(ODS2) are set so as to indicate points that are calculated by subtracting the decode duration from the points indicated by PTS(ODS1) and PTS(ODS2), respectively, because DTS(ODS) is required to be set so as to satisfy the equation below.
PTS(DSn[ODSj]) = DTS(DSn[ODSj]) + 90,000×(SIZE(DSn[ODSj])//Rd)
In FIG.25B, DTS(ODS2) is set so as to indicate a point that comes before the point t5, and DTS(ODS1) is set so as to indicate a point that comes before the point t0. A relation indicated by DTS(ODS2)=PTS(ODS1) is satisfied here. By setting the PTS of an ODS immediately after the PTS of the preceding ODS that is displayed earlier, the reproduction apparatus performs an operation in which the ODS is read out to the memory so as to overwrite the preceding ODS, and thus the reproduction process can be performed with a small amount of memory. By realizing such a reproduction process, the choices for the memory size of a reproduction apparatus become wider.
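A corresponding sketch for the ODS time stamps follows from the equation above. It is a hedged illustration: the division is carried out after scaling to 90 kHz ticks to avoid truncating to whole seconds, and ods_size and rd are assumed to be in consistent units.

def ods_dts(ods_pts, ods_size, rd):
    # Rearranged from PTS(DSn[ODSj]) = DTS(DSn[ODSj]) + 90,000 * (SIZE(DSn[ODSj]) // Rd)
    return ods_pts - 90_000 * ods_size // rd

def ods_pts_from_dts(ods_dts_value, ods_size, rd):
    return ods_dts_value + 90_000 * ods_size // rd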
The DTS of PCS1.1 is set so as to be DTS(PCS1.1)=DTS(ODS1), because the value of the DTS of PCS1.1 may be any point at or before the point indicated by DTS(ODS1).
The PTS of ODS1, the DTS of ODS2, and the DTS of PCS1.2, PCS1.3, and PCS2 are set at the point t0, so as to satisfy the relation indicated by the equation below.
PTS(ODS1) = DTS(ODS2) = DTS(PCS1.2) = DTS(PCS1.3) = DTS(PCS2)
This is because the values of the DTS of PCS1.2 and PCS1.3 may be any points before the points indicated by PTS(PCS1.2) and PTS(PCS1.3), and the DTS of PCS2 may be any point before the point indicated by PTS(PCS2).
As explained above, it is possible to perform an update by a succeeding PCS as soon as the update by the previous PCS is completed, by reading out more than one PCS at the same time.
It is sufficient that the DTS and PTS of the PCS and the DTS and PTS of the ODS satisfy the relations indicated by the formulae above. Accordingly, it becomes possible to set the values to be DTS(ODS2)=PTS(ODS1) or PTS(ODS1)=DTS(ODS2)=DTS(PCS1.2)=DTS(PCS1.3)=DTS(PCS2). By such settings for the time stamps, it is possible to adjust the length of the period in which the decoding load increases or more buffers are needed. Such adjustment expands the possibilities of control during reproduction, and is advantageous for those who perform authoring or manufacture reproduction apparatuses.
The data structures of the Display Sets (PCS, WDS, PDS, ODS) explained above are instances of a class structure described in a programming language. Producers that perform authoring may obtain the data structures on the BD-ROM by describing the class structure according to the syntax provided in the Blu-ray Disc Prerecording Format.
Next, a practical example of a reproduction apparatus according to the present invention is explained below. FIG.26 illustrates an internal structure of the reproduction apparatus according to the present invention. The reproduction apparatus according to the present invention is industrially produced based on the internal structure shown in the drawing. The reproduction apparatus according to the present invention is mainly structured by three parts: a system LSI, a drive device, and a microcomputer system, and it is possible to industrially produce the reproduction apparatus by mounting the three parts to a cabinet and a substrate of the apparatus. The system LSI is an integrated circuit in which various processing units for carrying out the functions of the reproduction apparatus are integrated. The reproduction apparatus manufactured in the above manner comprises a BD drive 1, a Read Buffer 2, a PID filter 3, Transport Buffers 4a-4c, a peripheral circuit 4d, a Video Decoder 5, a Video Plane 6, an Audio Decoder 7, a Graphics Plane 8, a CLUT unit 9, an adder 10, a Graphics Decoder 12, a Coded Data Buffer 13, a peripheral circuit 13a, a Stream Graphics Processor 14, an Object Buffer 15, a Composition Buffer 16, and a Graphical Controller 17.
The BD drive 1 performs loading/reading/ejecting of the BD-ROM, and accesses the BD-ROM. The Read Buffer 2 is a FIFO memory for storing the TS packets read from the BD-ROM in a first-in first-out order.
The PID filter 3 filters the TS packets outputted from the Read Buffer 2. The filtering by the PID filter 3 writes only the TS packets having a desired PID to the Transport Buffers 4a-4c. Buffering is not necessary for the filtering by the PID filter 3, and accordingly, the TS packets inputted to the PID filter 3 are written to the Transport Buffers 4a-4c without delay.
The Transport Buffers 4a-4c are for storing the TS packets outputted from the PID filter 3 in a first-in first-out order. The speed at which the TS packets are outputted from the Transport Buffers 4a-4c is the speed Rx.
The peripheral circuit 4d is a wired logic for converting the TS packets read from the Transport Buffers 4a-4c into functional segments. The functional segments obtained by the conversion are stored in the Coded Data Buffer 13.
The Video Decoder 5 decodes the TS packets outputted from the PID filter 3 into decompressed pictures and writes them to the Video Plane 6. The Video Plane 6 is a plane memory for a moving picture.
The Audio Decoder 7 decodes the TS packets outputted from the PID filter 3 and outputs decompressed audio data.
The Graphics Plane 8 is a plane memory having an area for one screen, and is able to store decompressed graphics for one screen.
The CLUT unit 9 converts an index color of the decompressed Graphics stored in the Graphics Plane 8 based on the values for Y, Cr, and Cb indicated by the PDS.
The adder 10 multiplies the decompressed Graphics, to which the color conversion has been performed by the CLUT unit 9, by the T value (Transparency) indicated by the PDS, adds the decompressed picture data stored in the Video Plane per pixel, and then obtains and outputs the composited image.
The Graphics Decoder 12 decodes the Graphics Stream to obtain the decompressed graphics, and writes the decompressed graphics as the Graphics Object to the Graphics Plane 8. By decoding the Graphics Stream, the subtitles and menus appear on the screen. The Graphics Decoder 12 includes the Coded Data Buffer 13, the peripheral circuit 13a, the Stream Graphics Processor 14, the Object Buffer 15, the Composition Buffer 16, and the Graphical Controller 17.
The Coded Data Buffer 13 is a buffer in which the functional segments are stored along with the DTS and PTS. The functional segments are obtained by removing the TS packet header and PES packet header from each TS packet in the Transport Stream stored in the Transport Buffers 4a-4c and by arranging the payloads sequentially. The PTS and DTS from the removed TS packet headers and PES packet headers are stored in correspondence with the PES packets. The peripheral circuit 13a is a wired logic that realizes a transfer between the Coded Data Buffer 13 and the Stream Graphics Processor 14, and a transfer between the Coded Data Buffer 13 and the Composition Buffer 16. In the transfer operation, when the current time is the time indicated by the DTS of the ODS, the ODS is transferred from the Coded Data Buffer 13 to the Stream Graphics Processor 14. When the current time is the time indicated by the DTS of the PCS or PDS, the PCS or PDS is transferred to the Composition Buffer 16.
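The DTS-driven transfers performed by the peripheral circuit 13a can be sketched as follows. Each functional segment is assumed to be represented as a dictionary carrying a type, a DTS, and its payload, and the three containers are simply Python lists standing in for the Coded Data Buffer 13, the Stream Graphics Processor 14, and the Composition Buffer 16.

def transfer_segments(coded_data_buffer, current_time, stream_graphics_processor,
                      composition_buffer):
    # At DTS(ODS) the ODS goes to the Stream Graphics Processor; at DTS(PCS)/DTS(PDS)
    # the PCS/PDS go to the Composition Buffer.
    for seg in list(coded_data_buffer):
        if seg["dts"] > current_time:
            continue
        if seg["type"] == "ODS":
            stream_graphics_processor.append(seg)
            coded_data_buffer.remove(seg)
        elif seg["type"] in ("PCS", "PDS"):
            composition_buffer.append(seg)
            coded_data_buffer.remove(seg)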
The Stream Graphics Processor 14 decodes the ODS, and writes the decompressed graphics of the index color obtained by decoding as the Graphics Object to the Object Buffer 15. The decoding by the Stream Graphics Processor 14 starts at the time of the DTS corresponding to the ODS, and ends by the decode end time indicated by the PTS corresponding to the ODS. The decoding rate Rd of the Graphics Object is an output rate of the Stream Graphics Processor 14.
The Object Buffer 15 is a buffer corresponding to a pixel buffer in the ETSI EN 300 743 standard, in which the Graphics Object obtained by the decoding performed by the Stream Graphics Processor 14 is disposed. The Object Buffer 15 needs to be set to twice or four times as large as the Graphics Plane 8, because in case the scrolling effect is performed, the Object Buffer 15 needs to store a Graphics Object that is twice or four times as large as the Graphics Plane. The Composition Buffer 16 is a memory in which the PCS and PDS are disposed.
The Graphical Controller 17 decodes the PCS disposed in the Composition Buffer 16, and performs a control based on the PCS. The timing for performing the control is based on the PTS attached to the PCS.
Next, recommended values for the transfer rates and buffer sizes for structuring the PID filter 3, the Transport Buffers 4a-4c, the Graphics Plane 8, the CLUT unit 9, the Coded Data Buffer 13, and the Graphical Controller 17 are explained. FIG.27 illustrates the transfer rates Rx, Rc, and Rd, and the sizes of the Graphics Plane 8, the Coded Data Buffer 13, the Object Buffer 15, and the Composition Buffer 16.
The transfer rate Rc between the Object Buffer 15 and the Graphics Plane 8 is the highest transfer rate in the reproduction apparatus of the present embodiment, and is calculated as 256 Mbps (=500 Kbytes × 29.97 × 2) from the window size and the frame rate. Unlike the Rc, the transfer rate Rd (Pixel Decoding Rate) between the Stream Graphics Processor 14 and the Object Buffer 15 does not need to be updated every video frame cycle, and 1/2 or 1/4 of the Rc is sufficient for the Rd. Accordingly, the Rd is either 128 Mbps or 64 Mbps.
The Transport Buffer Leak Rate Rx between the Transport Buffers 4a-4c and the Coded Data Buffer 13 is the transfer rate of the ODS in a compressed state. Accordingly, the transfer rate Rd multiplied by the compression rate is sufficient for the Transport Buffer Leak Rate Rx. Given that the compression rate of the ODS is 25%, 16 Mbps (=64 Mbps × 25%) is sufficient.
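The rate budget described here can be reproduced with a short calculation. The sketch below assumes the 500-Kbyte window, the 29.97 Hz frame rate, and the 25% compression rate named above; the exact figures come out slightly below 256, 64, and 16 Mbps and are rounded up to those values.

def rate_budget(window_bytes=500_000, frame_rate=29.97, compression=0.25):
    # Rc: clear + write one window every frame -> window size in bits * frame rate * 2
    rc = window_bytes * 8 * frame_rate * 2      # about 239.8 Mbps -> 256 Mbps
    rd = rc / 4                                 # 1/4 of Rc (1/2 would give 128 Mbps)
    rx = rd * compression                       # leak rate for the compressed ODS
    return rc, rd, rx

print(rate_budget())  # approximately (2.40e8, 6.00e7, 1.50e7) bits per second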
The transfer rates and buffer sizes shown in the drawing are the minimum standard, and it is also possible to set higher rates and larger sizes. In the reproduction apparatus structured as above, the elements perform the decoding operation in a pipeline structure.
FIG.28 is a timing chart illustrating pipeline processing by the reproduction apparatus. A 5th row in the drawing is the Display Set in the BD-ROM, and a 4th row shows the periods during which the PCS, WDS, PDS, and ODS are read to the Coded Data Buffer 13. A 3rd row shows the decode periods of each ODS by the Stream Graphics Processor 14. A 1st row shows the operations that the Graphical Controller 17 performs.
The DTS (decode starting times) attached to the ODS1 and ODS2 indicate t31 and t32 in the drawing, respectively. Because the decode starting time is set by the DTS, each ODS is required to be read out to the Coded Data Buffer 13 by that time. Accordingly, the reading of the ODS1 to the Coded Data Buffer 13 is completed before the decode period dp1 in which the ODS1 is decoded. Also, the reading of the ODS2 to the Coded Data Buffer 13 is completed before the decode period dp2 in which the ODS2 is decoded. On the other hand, the PTS (decode ending times) attached to the ODS1 and ODS2 indicate t32 and t33 in the drawing, respectively. Decoding of the ODS1 by the Stream Graphics Processor 14 is completed by t32, and decoding of the ODS2 is completed by the time indicated by t33. As explained above, the ODS is read to the Coded Data Buffer 13 by the time the DTS of the ODS indicates, the Stream Graphics Processor 14 decodes the ODS read to the Coded Data Buffer 13 by the time the PTS of the ODS indicates, and writes the decoded ODS to the Object Buffer 15. A period cd1 in the 1st row in the drawing indicates the period necessary for the Graphics Controller 17 to clear the Graphics Plane. Also, a period td1 indicates the period necessary to write the Graphics Object obtained on the Object Buffer to the Graphics Plane 8. The PTS of the WDS indicates the deadline to start the writing, and the PTS of the PCS indicates the ending of the write and the timing for display. At the time indicated by the PTS of the PCS, the decompressed graphics to compose an interactive screen are obtained on the Graphics Plane 8.
After the CLUT unit 9 performs the color conversion of the decompressed graphics, and the adder 10 performs the composition of the decompressed graphics and the decompressed picture stored in the Video Plane 6, a composite image is obtained.
In the Graphics Decoder 12, the Stream Graphics Processor 14 performs decoding continuously while the Graphics Controller 17 performs the clearing of the Graphics Plane 8. By the above pipeline processing, it is possible to perform a prompt display of the graphics.
In FIG.28, the case in which the clearing of the Graphics Plane ends before the decoding of the ODS is completed has been explained. FIG.29 illustrates a timing chart of pipeline processing for a case in which the decoding of the ODS ends before the clearing of the Graphics Plane is completed. In this case, it is not possible to write to the Graphics Plane at the time the decoding of the ODS is completed. When the clearing of the Graphics Plane is completed, it becomes possible to write the graphics obtained by the decoding to the Graphics Plane.
Next, how the controlling unit 20 and the Graphics Decoder 12 are implemented is explained below. The controlling unit 20 is implemented by writing a program that performs the operation shown in FIG.30 and having a general-purpose CPU execute the program. The operation performed by the controlling unit 20 is explained by referring to FIG.30.
FIG.30 is a flowchart showing the process of the loading operation of the functional segments. In the flowchart, SegmentK is a variable indicating each of the Segments (PCS, WDS, PDS, and ODS) that are read out in reproducing the AVClip. An ignore flag is a flag for determining whether the SegmentK is ignored or loaded. The flowchart has a loop structure, in which first the ignore flag is initialized to 0 and then Steps S21-S24 and Steps S27-S31 are repeated for each SegmentK (Step S25 and Step S26).
Step S21 is for judging if the SegmentK is the PCS, and if the SegmentK is the PCS, judgments in Step S27 and Step S28 are performed.
Step S22 is for judging if the ignore flag is 0. If the ignore flag is 0, the operation moves to Step S23, and if the ignore flag is 1, the operation moves to Step S24. If the ignore flag is 0 (Yes in Step S22), the SegmentK is loaded to the Coded Data Buffer 13 in Step S23.
If the ignore flag is 1 (No in Step S22), the SegmentK is ignored in Step S24. By this, all the remaining functional segments that belong to the DS are also ignored, because Step S22 is No (Step S24).
As explained above, whether the SegmentK is ignored or loaded is determined by the ignore flag. Steps S27-S31, S34, and S35 are steps for setting the ignore flag.
In Step S27, it is judged if the segment_type of the SegmentK is the Acquisition Point. If the SegmentK is the Acquisition Point, the operation moves to Step S28, and if the SegmentK is either the Epoch Start or the Normal Case, then the operation moves to Step S31.
In Step S28, it is judged if a preceding DS exists in any of the buffers in the Graphics Decoder 12 (the Coded Data Buffer 13, the Stream Graphics Processor 14, the Object Buffer 15, and the Composition Buffer 16). The judgment in Step S28 is made when the judgment in Step S27 is Yes. A case in which a preceding DS does not exist in the Graphics Decoder 12 indicates a case in which a skip operation has been performed. In this case, the display starts from the DS that is the Acquisition Point, and therefore the operation moves to Step S30 (No in Step S28). In Step S30, the ignore flag is set to 0 and the operation moves to Step S22.
A case in which a preceding DS exists in the Graphics Decoder 12 indicates a case in which normal reproduction is performed. In this case, the operation moves to Step S29 (Yes in Step S28). In Step S29, the ignore flag is set to 1 and the operation moves to Step S22.
In Step S31, it is judged if the segment_type of the PCS is the Normal Case. If the PCS is the Normal Case, the operation moves to Step S34, and if the PCS is the Epoch Start, then the ignore flag is set to 0 in Step S30. In Step S34, as in Step S28, it is judged if a preceding DS exists in any of the buffers in the Graphics Decoder 12. If the preceding DS exists, the ignore flag is set to 0 (Step S30). If the preceding DS does not exist, it is not possible to obtain sufficient functional segments to compose an interactive screen, and the ignore flag is set to 1 (Step S35).
By setting the ignore flag in the above manner, the functional segments that constitute the Normal Case are ignored when the preceding DS does not exist in the Graphics Decoder 12.
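The loading operation of FIG.30 can be summarized in the following Python sketch. Each segment is assumed to be a dictionary with a type field and, for a PCS, a segment_type of epoch_start, acquisition_point, or normal_case; decoder_has_preceding_ds stands in for the judgment made in Steps S28 and S34.

def load_segments(segments, decoder_has_preceding_ds):
    coded_data_buffer = []
    ignore = False
    for seg in segments:
        if seg["type"] == "PCS":
            state = seg["segment_type"]
            if state == "epoch_start":
                ignore = False                         # an Epoch Start DS is always loaded
            elif state == "acquisition_point":
                ignore = decoder_has_preceding_ds      # skip operation -> load; normal playback -> ignore
            elif state == "normal_case":
                ignore = not decoder_has_preceding_ds  # no preceding DS -> cannot compose -> ignore
        if not ignore:
            coded_data_buffer.append(seg)
    return coded_data_buffer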
Taking as an example a case in which the DS are multiplexed as shown in FIG.31, the manner in which the reading of the DS is performed is explained. In the example of FIG.31, three DS are multiplexed with a moving picture. The segment_type of a DS1 is Epoch Start, the segment_type of a DS10 is Acquisition Point, and the segment_type of a DS20 is Normal Case. Given that, in an AVClip in which the three DS and the moving picture are multiplexed, a skip operation to a picture data pt10 as shown by an arrow am1 is performed, the DS10 is the closest to the skipping target, and therefore the DS10 is the DS described in the flowchart in FIG.30. Although the segment_type is judged to be the Acquisition Point in Step S27, the ignore flag is set to 0 because no preceding DS exists in the Coded Data Buffer 13, and the DS10 is loaded to the Coded Data Buffer 13 of the reproduction apparatus as shown by an arrow md1 in FIG.32. On the other hand, in a case in which the skipping target is after the DS10 (an arrow am2 in FIG.31), the DS20 is to be ignored, because the DS20 is a Normal Case Display Set and because a preceding DS does not exist in the Coded Data Buffer 13 (an arrow md2 in FIG.32).
FIG.33 illustrates the loading of the DS1, DS10, and DS20 in normal reproduction. The DS1, whose segment_type of the PCS is the Epoch Start, is loaded to the Coded Data Buffer 13 as it is (Step S23). However, because the ignore flag of the DS10, whose segment_type of the PCS is the Acquisition Point, is set to 1 (Step S29), the functional segments that constitute the DS10 are ignored and not loaded to the Coded Data Buffer 13 (an arrow rd2 in FIG.34, and Step S24). Further, the DS20 is loaded to the Coded Data Buffer 13, because the segment_type of the PCS of the DS20 is the Normal Case (an arrow rd3 in FIG.34).
Next, the operations by the Graphical Controller 17 are explained. FIGs.35-37 illustrate flowcharts showing the operations performed by the Graphical Controller 17.
Steps S41-S44 constitute the main routine of the flowcharts, which waits until any of the events prescribed in Steps S41-S44 occurs.
Step S41 is to judge if a current reproducing time is a time indicated by the DTS of the PCS, and if the judging is Yes, then an operation in Steps S45-S53 is performed.
Step S45 is to judge if the composition_state of the PCS is the epoch_start, and if it is judged to be the epoch_start, the entire Graphics Plane 8 is cleared in Step S46. If it is judged to be other than the epoch_start, the Window indicated by the window_horizontal_position, window_vertical_position, window_width, and window_height of the WDS is cleared in Step S47.
Step S48 is a step performed after the clearing performed in Step S46 or in Step S47, and judges whether the time indicated by the PTS of any ODSx has passed. The decoding of some ODSx could already be completed by the time the clearing ends, because the clearing of an entire Graphics Plane 8 takes time. Therefore, in Step S48, it is judged whether the decoding of any ODSx is already completed by the time the clearing ends. If the judgment is No, the operation returns to the main routine. If the time indicated by the PTS of any ODSx has already passed, the operations in Steps S49-S51 are performed. In Step S49, it is judged if the object_crop_flag is 0, and if the flag indicates 0, then the Graphics Object is set to "no display" (Step S50).
If the flag is not 0 in Step S49, then an object cropped based on the object_cropping_horizontal_position, object_cropping_vertical_position, cropping_width, and cropping_height is written to the Window in the Graphics Plane 8 at the position indicated by the object_cropping_horizontal_position and object_cropping_vertical_position (Step S51). By the above operation, one or more Graphics Objects are rendered in the Window.
In Step S52, it is judged if the time corresponding to the PTS of another ODSy has passed. When writing the ODSx to the Graphics Plane 8, if the decoding of the ODSy has already been completed, then the ODSy becomes the ODSx (Step S53), and the operation moves to Step S49. By this, the operations in Steps S49-S51 are also performed for another ODS.
Next, by referring to FIG.36, Step S42 and Steps S54-S59 are explained below.
In Step S42, it is judged if the current reproducing point is at the PTS of the WDS. If the current reproducing point is at the PTS of the WDS, then it is judged in Step S54 whether the number of Windows is one. If the number is two, the operation returns to the main routine. If the number is one, loop processing of Steps S55-S59 is performed. In the loop processing, the operations in Steps S55-S59 are performed for each of the two Graphics Objects displayed in the Window. In Step S57, it is judged if the object_crop_flag indicates 0. If it indicates 0, then the Graphics is not displayed (Step S58).
If it does not indicate 0, then an object cropped based on the object_cropping_horizontal_position, object_cropping_vertical_position, cropping_width, and cropping_height is written to the Window in the Graphics Plane 8 at the position indicated by the object_cropping_horizontal_position and object_cropping_vertical_position (Step S59). By repeating the above operations, more than one Graphics Object is rendered in the Window.
In Step S44, it is judged if the current reproducing point is at the PTS of the PDS. If the current reproducing point is at the PTS of the PDS, then it is judged in Step S60 whether the pallet_update_flag is one. If it is one, the PDS indicated by pallet_id is set in the CLUT unit (Step S61). If it is 0, then Step S61 is skipped.
After that, the CLUT unit performs the color conversion of the Graphics Object on the Graphics Plane 8 to be combined with the moving picture (Step S62) .
Next, by referring to FIG.37, Step S43 and Steps S64-S66 are explained below.
In Step S43, it is judged if the current reproducing point is at the PTS of the ODS. If the current reproducing point is at the PTS of the ODS, then it is judged in Step S63 whether the number of Windows is two. If the number is one, the operation returns to the main routine. If the number is two, the operations in Steps S64-S66 are performed. In Step S64, it is judged if the object_crop_flag indicates 0. If it indicates 0, then the Graphics is not displayed (Step S65).
If it does not indicate 0, then an object cropped based on the object_cropping_horizontal_position, object_cropping_vertical_position, cropping_width, and cropping_height is written to the Window in the Graphics Plane 8 at the position indicated by the object_cropping_horizontal_position and object_cropping_vertical_position (Step S66). By repeating the above operations, the Graphics Object is rendered in each Window.
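The cropping-and-writing steps (S49-S51, S57-S59, and S64-S66) can be sketched as follows. The Graphics Object is assumed to be a two-dimensional list of index-color pixels, and the fields object_horizontal_position and object_vertical_position used as the rendering position are an assumption for illustration, since the description above names only the cropping fields.

def render_cropped_object(graphics_plane, obj, entry):
    # entry carries the cropping rectangle and rendering position taken from the PCS.
    if entry["object_crop_flag"] == 0:
        return  # "no display"
    cx = entry["object_cropping_horizontal_position"]
    cy = entry["object_cropping_vertical_position"]
    cw = entry["cropping_width"]
    ch = entry["cropping_height"]
    px = entry["object_horizontal_position"]  # assumed name for the render position
    py = entry["object_vertical_position"]
    for row in range(ch):
        for col in range(cw):
            graphics_plane[py + row][px + col] = obj[cy + row][cx + col]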
The above explanations concern the DTS and PTS of the PCS and the DTS and PTS of the ODS that belong to the DSn. The DTS and PTS of the PDS and the DTS and PTS of the END have not yet been explained.
First, the DTS and PTS of the PDS that belongs to the DSn are explained.
As for the PDS that belongs to the DSn, it is sufficient if the PDS is available in the CLUT unit 9 by the decoding start point of the first ODS (DTS(DSn[ODS1])) after the PCS is loaded to the Composition Buffer 16 (DTS(DSn[PCS])). Accordingly, the value of the PTS of each PDS (PDS1-PDSlast) in the DSn is required to be set so as to satisfy the following relations.
DTS(DSn[PCS]) ≤ PTS(DSn[PDS1])
PTS(DSn[PDSj]) ≤ PTS(DSn[PDSj+1]) ≤ PTS(DSn[PDSlast])
PTS(DSn[PDSlast]) ≤ DTS(DSn[ODS1])
Note that although the DTS of the PDS is not referred to during reproduction, the DTS of the PDS is set to the same value as the PTS of the PDS in order to satisfy the MPEG2 standard.
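These constraints are easy to check mechanically; the following is a small sketch that validates the PTS values of PDS1-PDSlast against the DTS of the PCS and the DTS of the first ODS, all in 90 kHz ticks.

def pds_pts_valid(pcs_dts, pds_pts_list, ods1_dts):
    if not pds_pts_list:
        return True
    within = pcs_dts <= pds_pts_list[0] and pds_pts_list[-1] <= ods1_dts
    ordered = all(a <= b for a, b in zip(pds_pts_list, pds_pts_list[1:]))
    return within and ordered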
The following is an explanation about the roles of the DTS and PTS in the pipeline processing of the reproduction apparatus when the DTS and PTS of the PDS are set so as to satisfy the above relations. FIG.38 illustrates the pipeline of the reproduction apparatus based on the PTS of the PDS. FIG.38 is based on FIG.26. A first row in FIG.38 indicates the setting of the PDS in the CLUT unit 9. The rows below the first row are the same as the first to fifth rows in FIG.26. The setting of PDS1-PDSlast in the CLUT unit 9 is performed after the transfer of the PCS and WDS and before the decoding of the ODS1, and accordingly the setting of PDS1-PDSlast in the CLUT unit 9 is performed before the point indicated by the DTS of the ODS1, as shown by arrows up2 and up3. As described above, the setting of the PDS is performed prior to the decoding of the ODS.
Next, the setting of the PTS of the END of Display Set segment in the DSn is explained. The END that belongs to the DSn indicates the end of the DSn, and accordingly it is necessary that the PTS of the END indicates the decode ending time of the last ODS (ODSlast). The decode ending time is indicated by the PTS of the ODSlast (PTS(DSn[ODSlast])), and therefore the PTS of the END is required to be set to a value that satisfies the equation below.
PTS(DSn[END]) = PTS(DSn[ODSlast])
In terms of the relation between the DSn and the PCS that belongs to the DSn+1, the PCS in the DSn is loaded to the Composition Buffer 16 before the loading time of the first ODS (ODS1), and therefore the PTS of the END should be after the loading time of the PCS in the DSn and before the loading time of the PCS that belongs to the DSn+1. Accordingly, the PTS of the END is required to satisfy the relation below.
DTS(DSn[PCS]) ≤ PTS(DSn[END]) < DTS(DSn+1[PCS])
On the other hand, the loading time of the first ODS (ODS1) is after the loading time of the last PDS (PDSlast), and therefore the PTS of the END (PTS(DSn[END])) should be after the loading time of the PDS that belongs to the DSn (PTS(DSn[PDSlast])). Accordingly, the PTS of the END is required to satisfy the relation below.
PTS(DSn[PDSlast]) ≤ PTS(DSn[END])
The following is an explanation about the significance of the PTS of the END in the pipeline processing of the reproduction apparatus.
FIG.39 is a diagram describing the significance of the END in the pipeline processing of the reproduction apparatus. FIG.39 is based on FIG.26, and each row in FIG.39 is substantially the same as in FIG.26, other than that a first row in FIG.39 indicates the content of the Composition Buffer 16. Further, in FIG.39, two Display Sets, the DSn and the DSn+1, are illustrated. The ODSlast in the DSn is the last of the ODSs in the DSn, and accordingly the point indicated by the PTS of the END is before the DTS of the PCS in the DSn+1.
By the PTS of the END, it is possible to find when the loading of the ODS in the DSn is completed during reproduction.
Note that although the DTS of the END is not referred to during reproduction, the DTS of the END is set to the same value as the PTS of the END in order to satisfy the MPEG2 standard.
As described above, a part of the Graphics Plane is specified as the Window for displaying the Graphics according to the present embodiment, and therefore the reproduction apparatus does not have to render the Graphics for the entire Plane. The reproduction apparatus may render the Graphics for only a Window of a predetermined size, such as 25% to 33% of the Graphics Plane. Because the rendering of Graphics other than the Graphics in the Window is not necessary, the load on the software in the reproduction apparatus decreases. Even in a worst case in which the updating of the Graphics is performed on, for example, 1/4 of the Graphics Plane, it is possible to display the Graphics synchronously with the picture by the reproduction apparatus performing the write to the Graphics Plane at a predetermined transfer rate such as 256 Mbps, and by setting the size of the Window so as to ensure the sync display with the picture.
Thus, it is possible to realize a high resolution subtitle display for various reproduction apparatuses, because the sync display is easily ensured.

[Second Embodiment]

In the first embodiment, the size of the Window is set to 1/4 of the entire Graphics Plane and the writing rate Rc to the Graphics Plane is set to 256 Mbps, so as to update the Graphics for each video frame. Further, by setting the update rate to 1/2 or 1/4 of the video frame rate, it becomes possible to update a larger size of Graphics. However, when the update rate is 1/2 or 1/4 of the video frame rate, it takes 2 or 4 frames to write to the Graphics Plane. When one Graphics Plane is provided, the process of writing the Graphics during the 2 or 4 frames becomes visible to a user. In such a case, a display effect such as switching from one Graphics to a larger Graphics in a moment may not be effectively realized. Therefore, in the second embodiment, two Graphics Planes are provided. FIG.40 illustrates an internal structure of a reproduction apparatus according to the second embodiment. The reproduction apparatus in FIG.40 is new in comparison with the reproduction apparatus according to the first embodiment in that the reproduction apparatus in FIG.40 has two Graphics Planes (a Graphics Plane 81 and a Graphics Plane 82 in the drawing), and the two Graphics Planes constitute a double buffer. Accordingly, it is possible to write to one of the Graphics Planes while reading is performed from the other Graphics Plane. Further, a Graphical Controller 17 according to the second embodiment switches the Graphics Plane that is read out at the point indicated by the PTS of the PCS. FIG.41 schematically illustrates the operation of reading out from and writing to the Graphics Planes that constitute the double buffer. An upper row indicates the contents of the Graphics Plane 81, and a bottom row indicates the contents of the Graphics Plane 82. The contents of both Graphics Planes per frame are illustrated from a first frame to a fifth frame (left to right). The part of the Graphics Planes 81 and 82 for each frame that is enclosed by a thick line is the target of the reading out. In the drawing, a face mark is contained in the Graphics Plane 81, and the face mark is to be replaced by a sun mark that is in the Object Buffer 15. The size of the sun mark is 4 Mbytes, which is a maximum size of the Object Buffer 15.
To write the sun mark to the Graphics Plane 82 at the writing rate to the Graphics Plane (Rc=256 Mbps), it takes 4 frames until the writing is completed; only 1/4 of the sun mark is written to the Graphics Plane 82 during the first frame, 2/4 during the second frame, and 3/4 during the third frame. Because the Graphics Plane 81 is the target to be displayed on the screen, however, the process of writing the sun mark to the Graphics Plane is not visible to the user. At the fifth frame, when the target of display switches to the Graphics Plane 82, the contents of the Graphics Plane 82 become visible to the user. Thus, the switching from the face mark to the sun mark is completed.
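The four-frame figure can be verified with a quick calculation; the sketch below assumes the 4-Mbyte object size and the 256 Mbps writing rate given above, together with a 29.97 Hz frame rate.

import math

def frames_to_write(object_bytes=4 * 1024 * 1024, rc_bits_per_sec=256_000_000,
                    frame_rate=29.97):
    seconds = object_bytes * 8 / rc_bits_per_sec   # time to transfer the object at Rc
    return math.ceil(seconds * frame_rate)         # expressed in whole video frames

print(frames_to_write())  # -> 4, matching the four frames described above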
As described above, according to the second embodiment, it is possible to switch the display on the screen to another graphics at once even when a large-size graphics takes four frames to be written to the Graphics Plane, which is therefore useful when displaying, for example, credits, an outline of a movie, or a warning at once on the entire screen.

[Third Embodiment]

A third embodiment relates to a manufacturing process of the BD-ROM. FIG.42 is a flowchart illustrating the manufacturing process of the BD-ROM according to the third embodiment.
The manufacturing of the BD-ROM includes a material production step S201 for producing material and recording movies and sound, an authoring step S202 for generating an application format using an authoring apparatus, and a pressing step S203 for manufacturing a master disc of the BD-ROM and pressing it to finish the BD-ROM.
The authoring step of the BD-ROM includes Steps S204-S209 as follows.
In Step S204, the WDS is described so as to define the Window in which the subtitles are displayed, and in Step S205, a period of time during which the Window appears at the same position and in the same size is set as one Epoch, and the PCS for each Epoch is described.
After obtaining the PCS in the above manner, the Graphics as material for the subtitles is converted into the ODS, and the Display Set is obtained by combining the ODS with the PCS, WDS, and PDS in Step S206. Then, in Step S207, each functional segment in the Display Set is divided into PES packets, and the Graphics Stream is obtained by attaching the time stamps.
Finally, in Step S208, the AVClip is generated by multiplexing the Graphics Stream with the video stream and audio stream that are generated separately. After obtaining the AVClip, the application format is completed by adapting the AVClip to the BD-ROM format.

[Other Matters]
The above explanations do not illustrate all embodiments according to the present invention. The present invention may also be realized by the modified examples shown below. The inventions described in the Claims of the present application include the above embodiments as well as expansions or generalizations of the modified examples. Although the degree of expansion and generalization is based on the characteristics of the technological level of the related art at the time of the application, the inventions according to the Claims of the present application reflect the means to solve the technical problems in the conventional art, and therefore the scope of the invention does not exceed the technological scope that those skilled in the art would recognize as means to solve the technical problems in the conventional art. Thus, the inventions according to the Claims of the present application substantially correspond to the descriptions of the details of the invention.
(1) The BD-ROM is used in the explanations of all of the above embodiments. However, the characteristics of the present invention lie in the Graphics Stream that is recorded on a medium, and such characteristics do not depend on the physical properties of the BD-ROM. Any recording medium that is capable of storing the Graphics Stream may realize the present invention. Examples of such recording media include optical discs such as a DVD-ROM, a DVD-RAM, a DVD-RW, a DVD-R, a DVD+RW, a DVD+R, a CD-R, and a CD-RW, magneto-optical discs such as a PD and an MO, semiconductor memory cards such as a CompactFlash card, a SmartMedia, a Memory Stick, a MultiMediaCard, and a PCMCIA card, magnetic discs such as a flexible disk, a SuperDisk, a Zip, and a Clik!, and removable hard disk drives such as an ORB, a Jaz, a SparQ, a SyJet, an EZFlyer, and a Microdrive, in addition to built-in hard disks.
(2) The reproduction apparatus described in all of the above embodiments decodes the AVClip recorded on the BD-ROM and outputs the decoded AVClip to a TV. However, it is also possible to realize the present invention by a reproduction apparatus that includes only a BD-ROM drive, with the TV provided with the other elements. In this case, the reproduction apparatus and the TV may be connected via IEEE1394 to create a home network. Moreover, although the reproduction apparatus in the embodiments is used by connecting to a TV, the reproduction apparatus may be an all-in-one TV and reproduction apparatus. Further, the LSI (integrated circuit) alone that forms an essential part of the processing in the reproduction apparatus of each embodiment may be put into practice. Such a reproduction apparatus and such an LSI are both described in the present specification, and therefore manufacturing a reproduction apparatus based on the internal structure of the reproduction apparatus according to the first embodiment is an implementation of the present invention, no matter what working example it may take. Moreover, transferring, whether as a gift or for profit, lending, and importing of the reproduction apparatus according to the present invention are also considered to be implementations of the present invention. Offering such transfer and lending to general users by way of storefront display and distribution of brochures is also considered to be an implementation of the present invention.
(3) Information processing executed by a program shown in the flowcharts is realized using hardware resources, and accordingly, the program whose processing is shown in each flowchart is established alone as an invention. Although all the above embodiments describe the program according to the present invention as built into the reproduction apparatus, the program according to the first embodiment alone may be implemented. Examples of implementation of the program alone include (i) producing the programs, (ii) transferring the programs as a gift or for profit, (iii) lending the programs, (iv) importing the programs, (v) providing the general public with the programs via an interactive electronic communications line, and (vi) offering the transfer and lending to general users by way of storefront display and distribution of brochures.

(4) The time elements of the steps that are performed in a sequential order in each flowchart are essential characteristics of the present invention, and it is clear that the process shown in each flowchart discloses a method of reproduction. Performing the processes illustrated by the flowcharts, by carrying out the operation in each step sequentially so as to obtain the object of the present invention and realize the effects, is an implementation of the recording method according to the present invention.
(5) It is desirable to add an extension header to each packet constituting the AVClip when recording to the BD-ROM. The extension header is 4-byte data called TP_extra_header, which includes an arrival_time_stamp and a copy_permission_indicator. The TS packets having the TP_extra_header (hereinafter referred to as the TS packets with EX) are grouped every 32 packets and written to 3 sectors. A group including 32 TS packets with EX has 6144 bytes (=32×192), which is the same size as 3 sectors, 6144 bytes (=2048×3). The group of 32 TS packets with EX stored in 3 sectors is called an Aligned Unit.
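The grouping into Aligned Units can be illustrated with the following sketch, which assumes that the input is a list of 192-byte TS packets with EX (as byte strings) whose count is a multiple of 32.

TS_PACKET_WITH_EX = 192        # 188-byte TS packet + 4-byte TP_extra_header
SECTOR_SIZE = 2048
PACKETS_PER_UNIT = 32

def to_aligned_units(ts_packets_with_ex):
    units = []
    for i in range(0, len(ts_packets_with_ex), PACKETS_PER_UNIT):
        unit = b"".join(ts_packets_with_ex[i:i + PACKETS_PER_UNIT])
        assert len(unit) == 3 * SECTOR_SIZE  # 32 x 192 = 6144 = 2048 x 3
        units.append(unit)
    return units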
When the reproduction apparatus is used in a home network connected via IEEE1394, the reproduction apparatus transmits the Aligned Unit by the following transmission procedure. A sender obtains the TP_extra_header from each of the 32 TS packets with EX included in the Aligned Unit, and outputs the main body of the TS packets after encoding them based on the DTCP standard. When outputting the TS packets, isochronous packets are inserted between any two successive TS packets. The insertion points are positions based on the time indicated by the arrival_time_stamp in the TP_extra_header. Along with the output of the TS packets, the reproduction apparatus outputs a DTCP_descriptor. The DTCP_descriptor indicates the settings for copy permission. By describing the DTCP_descriptor so as to indicate that copying is prohibited, the TS packets are not recorded by other devices when used in the home network connected via IEEE1394.
(6) The digital stream in the above embodiments is the AVClip. However, the digital stream may be a Video Object (VOB) in the DVD-Video standard or the DVD-Video Recording standard. The VOB is an ISO/IEC 13818-1-standard-based program stream obtained by multiplexing the video stream and audio stream. Further, the video stream in the AVClip may also be based on the MPEG4 or WMV standard. Moreover, the audio stream may be based on the Linear PCM, Dolby AC-3, MP3, MPEG-AAC, or DTS standard.
(7) The movie in the above embodiments may be obtained by encoding analog image signals transmitted via analog broadcasting, or may be stream data constituted by a transport stream transmitted via digital broadcasting.
It is also possible to obtain contents by encoding analog or digital image signals that are recorded on a videotape. Further, the contents may also be obtained by encoding analog or digital image signals that are directly loaded from a video camera. Moreover, the contents may be a digital work delivered by a distributing server.
(8) The Graphics Object in the first and second embodiments is raster data that is encoded by run-length encoding. Run-length encoding is adopted for compressing and encoding the Graphics Object because run-length encoding is the most appropriate for compressing and decompressing subtitles. Subtitles have the characteristic that their length in the horizontal direction becomes relatively long, and accordingly, a high compression rate is obtained by using run-length encoding. In addition, run-length encoding is preferable for making software for decoding, because the load in decompression is low. Further, in order to share the apparatus structure for decoding between the subtitles and the Graphics Object, the same compression/decompression method as for the subtitles is employed for the Graphics Object. However, using run-length encoding is not an essential part of the present invention, and the Graphics Object may be PNG data. Moreover, the Graphics Object is not required to be raster data and may be vector data. In addition, the Graphics Object may be transparent graphics.
(9) A target for display effect by the PCS may be the graphics for the subtitles selected based on a language setting of the reproduction apparatus. Realizing such a display has a high utilitarian value, because it becomes possible to realize an effect, which is realized by the moving picture itself in the conventional DVD, by the subtitle graphics displayed according to the language setting of the reproduction apparatus.
(10) A target of a display effect by the PCS may be the graphics for the subtitles selected based on a display setting of the reproduction apparatus. Specifically, Graphics for various display modes such as wide-vision, pan-scan, and letterbox are recorded on the BD-ROM, and the reproduction apparatus selects one of the recorded settings based on the setting of the TV to which the reproduction apparatus is connected. In this case, because the display effect based on the PCS is applied to the subtitle graphics displayed according to the display setting, the subtitles look more impressive and professional. Realizing such a display has a high utilitarian value, because it becomes possible to realize an effect similar to the effect realized by the moving picture itself in the conventional DVD, by the subtitle graphics being displayed according to the display setting of the reproduction apparatus.
(11) In the first embodiment, the Window size is set to 25% of the entire Graphics Plane in order to set the writing rate Rc to the Graphics Plane to a rate at which the clearing of the Graphics Plane and the redrawing are performed within one frame. However, the Rc may be set so that the clearing and re-drawing are completed during a vertical retrace period. Given that the vertical retrace period is 25% of 1/29.97 seconds, the Rc is 1 Gbps. Setting the Rc in such a way has a high utilitarian value, because it is possible to display the graphics more smoothly.
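One way to arrive at the roughly 1 Gbps figure is sketched below; it assumes that a 500-Kbyte Window must be both cleared and re-drawn (two transfers) within a vertical retrace period of 25% of a 29.97 Hz frame, which is consistent with, though not spelled out in, the description.

def retrace_rate(window_bytes=500_000, frame_rate=29.97, retrace_fraction=0.25):
    bits = window_bytes * 8 * 2                    # clear + write the window
    retrace_seconds = retrace_fraction / frame_rate
    return bits / retrace_seconds

print(retrace_rate() / 1e9)  # about 0.96, i.e. roughly 1 Gbps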
Further, it is also possible to perform the writing synchronously with line scanning, in addition to the writing during the vertical retrace period. By this, it is possible to display the graphics more smoothly even when the writing rate Rc is 256 Mbps.
(12) In the above embodiments, the Graphics Plane is mounted on the reproduction apparatus. However, it is also possible to mount, in place of the Graphics Plane, a line buffer for storing decompressed pixels for one line on the reproduction apparatus. The conversion into image signals is performed line by line, and therefore the conversion into the image signals may be carried out with the line buffer alone.
(13) In the above embodiments, the explanations are given taking the text subtitles for a movie as the examples of the graphics. However, the graphics may include, for example, a combination of devices, characters, and colors that constitute a trademark, a national crest, a national flag, a national emblem, a symbol or a great seal used by a national government for supervision or certification, a crest, a flag, or an emblem of an international organization, or a mark of origin of a particular item.

(14) In the first embodiment, the Window for rendering the subtitles is defined either at the upper side of the screen or at the bottom of the screen, assuming that the subtitles are written horizontally. However, the Window may be defined to appear on either the left or right side of the screen so as to display the subtitles on the left or right of the screen. In this way, it is possible to change the text direction and display the subtitles vertically.
(15) The AVClip in the above embodiments constitutes a movie. However, the AVClip may also be used for karaoke. In this case, the PCS may perform a display effect such that the color of the subtitles changes along with a song.
Reference Numbers
1 BD drive
2 Read Buffer
3 PID Filter
4 Transport Buffer
5 Video Decoder
6 Video Plane
7 Audio Decoder
8 Graphics Plane
9 CLUT unit
10 adder
12 Graphics Decoder
13 Coded Data Buffer
14 Stream Graphics Processor
16 Composition Buffer
17 Graphics Controller
200 reproduction apparatus
300 TV
400 remote controller
Industrial Applicability
A recording medium and a reproduction apparatus according to the present invention are capable of displaying subtitles with display effects. Accordingly, it is possible to add higher value to movies supplied to the market, and to activate the markets for films and consumer products. Thus, the recording medium and the reproduction apparatus according to the present invention have high industrial applicability in industries such as the film industry and the consumer products industry.

Claims

CLAIMS
1. A recording medium used for storing data, said recording medium comprising: a digital stream constituted by multiplexing a video stream and a graphics stream, wherein said video stream represents a moving picture made of a plurality of pictures, and the graphics stream includes: graphics data representing graphics to be combined with the pictures; and window information that specifies a window for rendering the graphics therein, the window information indicating a width, a height and a position of the window on a plane, the plane being a plane memory of a reproduction apparatus that combines the graphics with the pictures.
2. A recording medium according to Claim 1, wherein the width and height of the window are set so that a size of the window is 1/x of the plane, the plane corresponding to a size of each picture and x being a real number based on a ratio between a window update rate and a picture display rate.
3. A recording medium according to Claim 1, wherein: the graphics stream includes control information that contains crop information specifying a cropping frame within a graphics object obtained by decoding the graphics data; and the graphics to be rendered in the window is a part of the graphics object within the cropping frame.
4. A recording medium according to Claim 3, wherein the control information contains position information specifying a position in the window for rendering the part of the graphics object within the cropping frame.
5. A recording medium according to Claim 4, wherein: the graphics stream includes a plurality of pieces of control information for realizing one of scroll, wipe-in, wipe-out, cut-in, and cut-out display effects; and each of the pieces of control information includes the crop information and the position information that respectively specify a different cropping frame and a position.
6. A reproduction apparatus used for reproducing a digital stream constituted by multiplexing a video stream and a graphics stream, said reproduction apparatus comprising: a video decoder operable to decode the video stream so as to obtain a moving picture made of a plurality of pictures; a graphics decoder operable to render graphics so as to be synchronously displayed with the pictures; and a plane memory corresponding to a plane and being used for combining the graphics with the pictures, wherein: the graphics stream includes window information that specifies a part of the plane as a window for rendering the graphics therein; and the rendering of the graphics by said graphics decoder includes a clearing of the graphics in the window in said plane memory, and a writing of the graphics to the window in said plane memory.
7. A reproduction apparatus according to Claim 6, wherein: the graphics stream includes compressed graphics data; and said graphics decoder includes a processor operable to decode the compressed graphics data, and a controlling unit operable to perform the clearing operation and the writing operation.
8. A reproduction apparatus according to Claim 7, wherein: a size of the window is set so as to be 1/x of the plane, x being a real number based on a ratio between a window update rate and a display rate of the video stream; and the writing operation performed by said controlling unit is performed at a transfer rate based on the update rate of the window and the size of the window.
9. A reproduction apparatus according to Claim 7, wherein: said graphics decoder includes an object buffer operable to store decompressed graphics data decoded by said processor; the graphics stream includes control information that contains crop information specifying a cropping frame within a graphics object obtained by decoding the graphics data in the object buffer; said controlling unit is operable to crop a part of the graphics object within the cropping frame; and the graphics to be synchronously displayed with the pictures is the part of the graphics object within the cropping frame.
10. A reproduction apparatus according to Claim 9, wherein: the control information contains position information specifying a position in the window for rendering the part within the cropping frame; and the part within the cropping frame is written to the window at the position specified by the position information.
11. A reproduction apparatus according to Claim 10, wherein: the graphics stream includes a plurality of pieces of control information; the cropped part and the position indicated by the crop information and the position information respectively in each piece of control information are different; and said controlling unit is operable to realize one of scroll, wipe-in, wipe-out, cut-in, and cut-out display effects by performing the clearing and writing of the graphics based on the crop information and the position information in each piece of control information.
12. A reproduction apparatus according to Claim 6, further comprising two plane memories constituting a double buffer, wherein the graphics is displayed by switching the displayed graphics from contents stored in one of said plane memories of said double buffer to contents stored in the other one of said plane memories.
13. A method of recording onto a recording medium, said method comprising: producing application data; and recording the produced application data in the recording medium, wherein: the application data includes a digital stream constituted by multiplexing a video stream and a graphics stream; the video stream represents a moving picture made of a plurality of pictures, and the graphics stream includes: graphics data representing graphics to be combined with the pictures; and window information that specifies a window for rendering the graphics therein, the window information indicating a width, a height and a position of the window on a plane, the plane being a plane memory of a reproduction apparatus that combines the graphics with the pictures.
14. A program used for enabling a computer to reproduce a digital stream constituted by multiplexing a video stream and a graphics stream, said program comprising: code operable to cause the computer to decode the video stream so as to obtain a moving picture made of a plurality of pictures; and code operable to cause the computer to render graphics so as to be synchronously displayed with the pictures, wherein: the graphics stream includes window information that specifies a part of the plane as a window for rendering the graphics therein; and said code operable to cause the computer to render the graphics includes code operable to cause the computer to perform a clearing of the graphics in the window in a plane memory used for combining the graphics with the picture, and a writing of the graphics to the window in the plane memory.
15. A method of reproducing a digital stream constituted by multiplexing a video stream and a graphics stream, said method comprising: decoding the video stream so as to obtain a moving picture made of a plurality of pictures; and rendering graphics so as to be synchronously displayed with the pictures, wherein: the graphics stream includes window information that specifies a part of the plane as a window for rendering the graphics therein; and said rendering of the graphics includes a clearing of the graphics in the window in a plane memory used for combining the graphics with the picture, and a writing of the graphics to the window in the plane memory.
16. An integrated circuit used for reproducing a digital stream constituted by multiplexing a video stream and a graphics stream, said integrated circuit comprising: a video decoder operable to decode the video stream so as to obtain a moving picture made of a plurality of pictures; a graphics decoder operable to render graphics so as to be synchronously displayed with the pictures; and a plane memory corresponding to a plane and being used for combining the graphics with the pictures, wherein: the graphics stream includes window information that specifies a part of the plane as a window for rendering the graphics therein; and the rendering of the graphics by said graphics decoder includes a clearing of the graphics in the window in the plane memory, and a writing of the graphics to the window in the plane memory.
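As a non-authoritative illustration of the claims above, the sketch below models the window information of Claim 1, the size constraint of Claim 2, and the clear-then-write rendering of Claim 6; every field and function name is hypothetical, and the plane dimensions are assumed rather than taken from the claim language.

from dataclasses import dataclass

PLANE_W, PLANE_H = 1920, 1080   # assumed plane (and picture) dimensions

@dataclass
class WindowInformation:        # Claim 1: width, height and position of the window on the plane
    x: int
    y: int
    width: int
    height: int

def satisfies_size_constraint(win: WindowInformation, x_factor: float) -> bool:
    """Checks that the window occupies no more than 1/x of the plane, in the
    spirit of Claim 2, where x is derived from the ratio between the window
    update rate and the picture display rate."""
    return win.width * win.height <= (PLANE_W * PLANE_H) / x_factor

def render(plane, win: WindowInformation, graphics_rows):
    """Claim 6: rendering = clearing the graphics in the window of the plane
    memory, then writing the decoded graphics into that window."""
    for row in range(win.y, win.y + win.height):                 # clearing
        for col in range(win.x, win.x + win.width):
            plane[row][col] = 0                                  # transparent pixel
    for dy, src_row in enumerate(graphics_rows[:win.height]):    # writing
        for dx, pixel in enumerate(src_row[:win.width]):
            plane[win.y + dy][win.x + dx] = pixel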
PCT/JP2004/006074 2003-04-28 2004-04-27 Recording medium, reproduction apparatus, recording method, reproducing method, program, and integrated circuit for recording a video stream and graphics with window information over graphics display WO2004098193A2 (en)

Priority Applications (11)

Application Number Priority Date Filing Date Title
CNB2004800115550A CN100521752C (en) 2003-04-28 2004-04-27 Recording medium and method,reproduction apparatus and method,program and integrated circuit
EP20040729754 EP1620855B1 (en) 2003-04-28 2004-04-27 Recording medium, reproduction apparatus, recording method, reproducing method, program, and integrated circuit for recording / reproducing a video stream and graphics with window information over graphics display
JP2005518547A JP3803366B1 (en) 2003-04-28 2004-04-27 Recording medium, reproducing apparatus, recording method, reproducing method, program, integrated circuit
AU2004234798A AU2004234798B2 (en) 2003-04-28 2004-04-27 Recording medium, reproduction apparatus, recording method, reproducing method, program, and integrated circuit for recording a video stream and graphics with window information over graphics display
KR1020057020409A KR100760118B1 (en) 2003-04-28 2004-04-27 Recording medium, reproduction apparatus, recording method, reproducing method, and integrated circuit
CA 2523597 CA2523597C (en) 2003-04-28 2004-04-27 Recording medium, reproduction apparatus, recording method, reproducing method, program, and integrated circuit
US10/554,627 US7505050B2 (en) 2003-04-28 2004-04-27 Recording medium, reproduction apparatus, recording method, reproducing method, program and integrated circuit for recording a video stream and graphics with window information over graphics display
ES04729754T ES2383893T3 (en) 2003-04-28 2004-04-27 Recording medium, playback device, recording procedure, playback procedure, program, and integrated circuit to record / play a video and graphics stream with window information on a graphic display
AT04729754T ATE557392T1 (en) 2003-04-28 2004-04-27 RECORDING MEDIUM, PLAYBACKER, RECORDING METHOD, PLAYBACK METHOD, PROGRAM AND INTEGRATED CIRCUIT FOR STORING/PLAYBACKING A VIDEO STREAM AND GRAPHICS WITH INFORMATION THROUGH A GRAPHICS WINDOW
US12/341,265 US8350870B2 (en) 2003-04-28 2008-12-22 Recording medium, reproduction apparatus, recording method, reproducing method, program, and integrated circuit
US12/626,498 US20100067876A1 (en) 2003-04-28 2009-11-25 Recording medium, reproduction apparatus, recording method, reproducing method, program and integrated circuit

Applications Claiming Priority (2)

Application Number Priority Date Filing Date Title
US46597203P 2003-04-28 2003-04-28
US60/465,972 2003-04-28

Related Child Applications (1)

Application Number Title Priority Date Filing Date
US12/341,265 Division US8350870B2 (en) 2003-04-28 2008-12-22 Recording medium, reproduction apparatus, recording method, reproducing method, program, and integrated circuit

Publications (2)

Publication Number Publication Date
WO2004098193A2 true WO2004098193A2 (en) 2004-11-11
WO2004098193A3 WO2004098193A3 (en) 2005-02-24

Family

ID=33418317

Family Applications (1)

Application Number Title Priority Date Filing Date
PCT/JP2004/006074 WO2004098193A2 (en) 2003-04-28 2004-04-27 Recording medium, reproduction apparatus, recording method, reproducing method, program, and integrated circuit for recording a video stream and graphics with window information over graphics display

Country Status (14)

Country Link
US (3) US7505050B2 (en)
EP (5) EP2369589B1 (en)
JP (7) JP3803366B1 (en)
KR (1) KR100760118B1 (en)
CN (6) CN101702756B (en)
AT (1) ATE557392T1 (en)
AU (1) AU2004234798B2 (en)
CA (1) CA2523597C (en)
DK (4) DK2369588T3 (en)
ES (5) ES2536681T3 (en)
RU (2) RU2323489C2 (en)
TW (1) TWI302697B (en)
WO (1) WO2004098193A2 (en)
ZA (1) ZA200508211B (en)

Cited By (14)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US20050155058A1 (en) * 2004-01-13 2005-07-14 Samsung Electronics Co., Ltd. Storage medium having interactive graphic stream and apparatus for reproducing the same
EP1618562A1 (en) * 2003-04-29 2006-01-25 Lg Electronics Inc. Recording medium having a data structure for managing reproduction of graphic data and methods and apparatuses of recording and reproducing
EP1641260A1 (en) * 2003-06-30 2006-03-29 Matsushita Electric Industrial Co., Ltd. Recording medium, reproduction device, recording method, program, and reproduction method
EP1642284A1 (en) * 2003-07-01 2006-04-05 Lg Electronics Inc. Recording medium having data structure including graphic data and recording and reproducing methods and apparatuses
WO2006090908A1 (en) * 2005-02-25 2006-08-31 Kabushiki Kaisha Toshiba Content reproduction apparatus and subtitle reproduction method
EP1713075A1 (en) * 2005-01-28 2006-10-18 Matsushita Electric Industrial Co., Ltd. Recording medium, reproduction device, program, reproduction method
JP2007086317A (en) * 2005-09-21 2007-04-05 Samii Kk Device, program, and method for displaying dynamic image
US7366405B2 (en) 2003-07-11 2008-04-29 Matsushita Electric Industrial Co., Ltd. Recording medium, recording method, reproduction apparatus and method, and computer-readable program
US7415192B2 (en) * 2003-07-11 2008-08-19 Matsushita Electric Industrial Co., Ltd. Recording medium, recording method, reproduction apparatus and method, and computer-readable program
JP2009508277A (en) * 2005-08-29 2009-02-26 ソニー株式会社 Effects for interactive graphic data in disk authoring
US7769275B2 (en) 2002-10-04 2010-08-03 Lg Electronics, Inc. Recording medium having a data structure for managing reproduction of graphic data and recording and reproducing methods and apparatuses
US8233779B2 (en) 2004-07-09 2012-07-31 Panasonic Corporation Recording medium, recording method, reproduction apparatus and method, and computer-readable program
JP2015111833A (en) * 2008-12-19 2015-06-18 コーニンクレッカ フィリップス エヌ ヴェ Method and device for overlaying 3d graphics over 3d video
US9905268B2 (en) 2016-03-24 2018-02-27 Fujitsu Limited Drawing processing device and method

Families Citing this family (23)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
KR100448452B1 (en) * 2000-06-09 2004-09-13 엘지전자 주식회사 Method for supporting menu of a high-density recording medium
CN101106729B (en) * 2002-10-02 2012-12-19 Lg电子株式会社 Recording and reproducing method for controlling image data reproduction data structure
CA2523597C (en) * 2003-04-28 2011-09-27 Matsushita Electric Industrial Co., Ltd. Recording medium, reproduction apparatus, recording method, reproducing method, program, and integrated circuit
US7616865B2 (en) * 2003-04-30 2009-11-10 Lg Electronics Inc. Recording medium having a data structure for managing reproduction of subtitle data and methods and apparatuses of recording and reproducing
KR20050004339A (en) * 2003-07-02 2005-01-12 엘지전자 주식회사 Method for managing grahics data of high density optical disc, and high density optical disc therof
KR20050064150A (en) * 2003-12-23 2005-06-29 엘지전자 주식회사 Method for managing and reproducing a menu information of high density optical disc
JP4692950B2 (en) * 2004-06-11 2011-06-01 ソニー株式会社 Data processing apparatus, data processing method, program, program recording medium, and data recording medium
JP4830535B2 (en) 2006-02-22 2011-12-07 ソニー株式会社 Playback apparatus, playback method, and playback program
US20080036758A1 (en) * 2006-03-31 2008-02-14 Intelisum Inc. Systems and methods for determining a global or local position of a point of interest within a scene using a three-dimensional model of the scene
WO2008044191A2 (en) 2006-10-11 2008-04-17 Koninklijke Philips Electronics N.V. Creating three dimensional graphics data
JP5010233B2 (en) 2006-10-20 2012-08-29 株式会社東芝 Video playback apparatus and video playback method
KR101433950B1 (en) * 2008-04-02 2014-09-23 엘지전자 주식회사 Method for calling synchronousness
US8878870B1 (en) * 2008-08-01 2014-11-04 Marvell International Ltd. Graphic processing techniques and configurations
US9275684B2 (en) * 2008-09-12 2016-03-01 At&T Intellectual Property I, L.P. Providing sketch annotations with multimedia programs
CN101778238A (en) 2009-01-12 2010-07-14 晨星软件研发(深圳)有限公司 Subtitle window display device and display method thereof
WO2010096030A1 (en) * 2009-02-18 2010-08-26 Thomson Licensing Method and apparatus for preparing subtitles for display
JP5158225B2 (en) * 2011-04-18 2013-03-06 ソニー株式会社 Playback apparatus, playback method, and playback program
US20160286225A1 (en) * 2012-09-27 2016-09-29 Dolby Laboratories Licensing Corporation Inter-layer reference picture processing for coding standard scalability
EP2728886A1 (en) * 2012-10-31 2014-05-07 EyeTrackShop AB Registering of timing data in video sequences
CN105025316A (en) * 2015-07-28 2015-11-04 Tcl集团股份有限公司 Method and system for judging whether or not video is under full screen play
CN105530532B (en) * 2016-03-02 2018-09-07 深圳市茁壮网络股份有限公司 A kind of method for displaying roll titles, device and set-top box
JP2018072414A (en) * 2016-10-25 2018-05-10 ソニーセミコンダクタソリューションズ株式会社 Display controller, display system, control method of display controller, and program
CN111836093B (en) * 2019-04-16 2022-05-31 百度在线网络技术(北京)有限公司 Video playing method, device, equipment and medium

Citations (5)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
EP0920014A1 (en) * 1997-04-12 1999-06-02 Sony Corporation Editing device and editing method
US6275852B1 (en) * 1988-07-15 2001-08-14 International Business Machines Corp. Interactive computer network and method of operation
US20010026561A1 (en) * 2000-03-31 2001-10-04 U. S. Philips Corporation Methods and apparatus for making and replaying digital video recordings, and recordings made by such methods
EP1145218A2 (en) * 1998-11-09 2001-10-17 Broadcom Corporation Display system for blending graphics and video data
US6473102B1 (en) * 1998-05-11 2002-10-29 Apple Computer, Inc. Method and system for automatically resizing and repositioning windows in response to changes in display

Family Cites Families (24)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
GB9012326D0 (en) 1990-06-01 1990-07-18 Thomson Consumer Electronics Wide screen television
US6062863A (en) * 1994-09-22 2000-05-16 Kirksey; William E. Method of associating oral utterances meaningfully with word symbols seriatim in an audio-visual work and apparatus for linear and interactive application
US5741136A (en) * 1993-09-24 1998-04-21 Readspeak, Inc. Audio-visual work with a series of visual word symbols coordinated with oral word utterances
US5938447A (en) * 1993-09-24 1999-08-17 Readspeak, Inc. Method and system for making an audio-visual work with a series of visual word symbols coordinated with oral word utterances and such audio-visual work
RU2195708C2 (en) 1993-09-24 2002-12-27 Ридспик, Инк. Inscription-bearing audio/video presentation structure, method for ordered linking of oral utterances on audio/video presentation structure, and device for linear and interactive presentation
WO1996019077A1 (en) 1994-12-14 1996-06-20 Philips Electronics N.V. Subtitling transmission system
JPH0981118A (en) * 1995-09-11 1997-03-28 Casio Comput Co Ltd Image control device
JP3362856B2 (en) * 1996-05-09 2003-01-07 松下電器産業株式会社 No matter how the main image is arranged on the screen, a recording medium, a reproducing apparatus, and a reproducing method capable of superimposing the sub-image on the main image with good balance
JP3345414B2 (en) 1996-05-09 2002-11-18 松下電器産業株式会社 A recording medium, a reproducing apparatus, and a reproducing method capable of superimposing a sub-video on a main video in a well-balanced manner, regardless of how the main video is arranged on a screen.
ID29305A (en) * 1997-12-15 2001-08-16 Matsushita Electric Ind Co Ltd OPTICAL PLATE, RECORDING APARATUS, COMPUTER-CAN-READING STORAGE MEDIA, STORING RECORDING PROGRAMS, AND RECORDING METHODS
WO2000046988A2 (en) * 1999-02-08 2000-08-10 United Video Properties, Inc. Electronic program guide with support for rich program content
JP2001069458A (en) * 1999-08-27 2001-03-16 Toshiba Corp Recording and reproducing device with grouping processing function of still picture
JP2001332006A (en) * 2000-05-17 2001-11-30 Toshiba Corp Background image capturing system
US7133449B2 (en) * 2000-09-18 2006-11-07 Broadcom Corporation Apparatus and method for conserving memory in a fine granularity scalability coding system
US20040095831A1 (en) * 2000-10-02 2004-05-20 Yasumori Hino Record medium, its recorder, its recording method, its reproducing apparatus, and its reproducing apparatus
US7019864B2 (en) * 2001-06-22 2006-03-28 Xeikon International N.V. Page composition in an image reproduction system using segmented page elements
US6952236B2 (en) * 2001-08-20 2005-10-04 Ati Technologies, Inc. System and method for conversion of text embedded in a video stream
JP4136383B2 (en) 2002-01-28 2008-08-20 キヤノン株式会社 Television receiver and control method thereof
US7196733B2 (en) 2002-01-28 2007-03-27 Canon Kabushiki Kaisha Apparatus for receiving broadcast data, method for displaying broadcast program, and computer program
US7142225B1 (en) * 2002-01-31 2006-11-28 Microsoft Corporation Lossless manipulation of media objects
US20040081434A1 (en) * 2002-10-15 2004-04-29 Samsung Electronics Co., Ltd. Information storage medium containing subtitle data for multiple languages using text data and downloadable fonts and apparatus therefor
US20040213542A1 (en) * 2003-04-22 2004-10-28 Hiroshi Hamasaka Apparatus and method to reproduce multimedia content for a multitude of resolution displays
CA2523597C (en) * 2003-04-28 2011-09-27 Matsushita Electric Industrial Co., Ltd. Recording medium, reproduction apparatus, recording method, reproducing method, program, and integrated circuit
RU2388073C2 (en) * 2003-04-29 2010-04-27 Эл Джи Электроникс Инк. Recording medium with data structure for managing playback of graphic data and methods and devices for recording and playback

Patent Citations (5)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US6275852B1 (en) * 1988-07-15 2001-08-14 International Business Machines Corp. Interactive computer network and method of operation
EP0920014A1 (en) * 1997-04-12 1999-06-02 Sony Corporation Editing device and editing method
US6473102B1 (en) * 1998-05-11 2002-10-29 Apple Computer, Inc. Method and system for automatically resizing and repositioning windows in response to changes in display
EP1145218A2 (en) * 1998-11-09 2001-10-17 Broadcom Corporation Display system for blending graphics and video data
US20010026561A1 (en) * 2000-03-31 2001-10-04 U. S. Philips Corporation Methods and apparatus for making and replaying digital video recordings, and recordings made by such methods

Non-Patent Citations (1)

* Cited by examiner, † Cited by third party
Title
"Digital Video Broadcasting (DVB); Subtitling systems; Final draft ETSI EN 300 743" ETSI STANDARDS, EUROPEAN TELECOMMUNICATIONS STANDARDS INSTITUTE, SOPHIA-ANTIPO, FR, vol. BC, no. V121, June 2002 (2002-06), XP014001876 ISSN: 0000-0001 cited in the application *

Cited By (37)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US7769275B2 (en) 2002-10-04 2010-08-03 Lg Electronics, Inc. Recording medium having a data structure for managing reproduction of graphic data and recording and reproducing methods and apparatuses
EP1618562A4 (en) * 2003-04-29 2011-03-16 Lg Electronics Inc Recording medium having a data structure for managing reproduction of graphic data and methods and apparatuses of recording and reproducing
EP1618562A1 (en) * 2003-04-29 2006-01-25 Lg Electronics Inc. Recording medium having a data structure for managing reproduction of graphic data and methods and apparatuses of recording and reproducing
US8020117B2 (en) 2003-06-30 2011-09-13 Panasonic Corporation Recording medium, reproduction apparatus, recording method, program, and reproduction method
US7664370B2 (en) * 2003-06-30 2010-02-16 Panasonic Corporation Recording medium, reproduction device, recording method, program, and reproduction method
US8010908B2 (en) 2003-06-30 2011-08-30 Panasonic Corporation Recording medium, reproduction apparatus, recording method, program, and reproduction method
US8006173B2 (en) 2003-06-30 2011-08-23 Panasonic Corporation Recording medium, reproduction apparatus, recording method, program and reproduction method
US7913169B2 (en) 2003-06-30 2011-03-22 Panasonic Corporation Recording medium, reproduction apparatus, recording method, program, and reproduction method
EP1641260A1 (en) * 2003-06-30 2006-03-29 Matsushita Electric Industrial Co., Ltd. Recording medium, reproduction device, recording method, program, and reproduction method
US7716584B2 (en) 2003-06-30 2010-05-11 Panasonic Corporation Recording medium, reproduction device, recording method, program, and reproduction method
US7680394B2 (en) * 2003-06-30 2010-03-16 Panasonic Corporation Recording medium, recording method, reproduction apparatus and method, and computer-readable program
US7668440B2 (en) * 2003-06-30 2010-02-23 Panasonic Corporation Recording medium, recording method, reproduction apparatus and method, and computer-readable program
US7620297B2 (en) * 2003-06-30 2009-11-17 Panasonic Corporation Recording medium, recording method, reproduction apparatus and method, and computer-readable program
EP1641260A4 (en) * 2003-06-30 2009-12-16 Panasonic Corp Recording medium, reproduction device, recording method, program, and reproduction method
EP1642284A1 (en) * 2003-07-01 2006-04-05 Lg Electronics Inc. Recording medium having data structure including graphic data and recording and reproducing methods and apparatuses
EP1642284A4 (en) * 2003-07-01 2008-10-01 Lg Electronics Inc Recording medium having data structure including graphic data and recording and reproducing methods and apparatuses
US8121463B2 (en) 2003-07-11 2012-02-21 Panasonic Corporation Recording medium, recording method, reproduction apparatus and method, and computer-readable program
US7366405B2 (en) 2003-07-11 2008-04-29 Matsushita Electric Industrial Co., Ltd. Recording medium, recording method, reproduction apparatus and method, and computer-readable program
US8139915B2 (en) 2003-07-11 2012-03-20 Panasonic Corporation Recording medium, recording method, reproduction apparatus and method, and computer-readable program
US7415192B2 (en) * 2003-07-11 2008-08-19 Matsushita Electric Industrial Co., Ltd. Recording medium, recording method, reproduction apparatus and method, and computer-readable program
US8126316B2 (en) 2003-07-11 2012-02-28 Panasonic Corporation Recording medium, recording method, reproduction apparatus and method, and computer readable program
US8126317B2 (en) 2003-07-11 2012-02-28 Panasonic Corporation Recording medium, recording method, reproduction apparatus and method, and computer readable program
US20050155058A1 (en) * 2004-01-13 2005-07-14 Samsung Electronics Co., Ltd. Storage medium having interactive graphic stream and apparatus for reproducing the same
US20080163294A1 (en) * 2004-01-13 2008-07-03 Samsung Electronics Co., Ltd. Storage medium having interactive graphic stream and apparatus for reproducing the same
US20080163115A1 (en) * 2004-01-13 2008-07-03 Samsung Electronics Co., Ltd. Storage medium having interactive graphic stream and apparatus for reproducing the same
US8233779B2 (en) 2004-07-09 2012-07-31 Panasonic Corporation Recording medium, recording method, reproduction apparatus and method, and computer-readable program
US8280233B2 (en) 2005-01-28 2012-10-02 Panasonic Corporation Reproduction device, program, reproduction method
EP1713075A4 (en) * 2005-01-28 2009-10-21 Panasonic Corp Recording medium, reproduction device, program, reproduction method
US7873264B2 (en) 2005-01-28 2011-01-18 Panasonic Corporation Recording medium, reproduction apparatus, program, and reproduction method
EP1713075A1 (en) * 2005-01-28 2006-10-18 Matsushita Electric Industrial Co., Ltd. Recording medium, reproduction device, program, reproduction method
US8571390B2 (en) 2005-01-28 2013-10-29 Panasonic Corporation Reproduction device, program, reproduction method
US8655145B2 (en) 2005-01-28 2014-02-18 Panasonic Corporation Recording medium, program, and reproduction method
WO2006090908A1 (en) * 2005-02-25 2006-08-31 Kabushiki Kaisha Toshiba Content reproduction apparatus and subtitle reproduction method
JP2009508277A (en) * 2005-08-29 2009-02-26 ソニー株式会社 Effects for interactive graphic data in disk authoring
JP2007086317A (en) * 2005-09-21 2007-04-05 Samii Kk Device, program, and method for displaying dynamic image
JP2015111833A (en) * 2008-12-19 2015-06-18 コーニンクレッカ フィリップス エヌ ヴェ Method and device for overlaying 3d graphics over 3d video
US9905268B2 (en) 2016-03-24 2018-02-27 Fujitsu Limited Drawing processing device and method

Also Published As

Publication number Publication date
ES2536684T3 (en) 2015-05-27
RU2005136872A (en) 2006-06-10
ATE557392T1 (en) 2012-05-15
CN101582982A (en) 2009-11-18
ES2536683T3 (en) 2015-05-27
EP2369588B1 (en) 2015-03-04
EP2369589A1 (en) 2011-09-28
DK2369590T3 (en) 2015-05-04
AU2004234798A1 (en) 2004-11-11
JP2008278533A (en) 2008-11-13
US20070057969A1 (en) 2007-03-15
CN101702750B (en) 2013-03-06
EP2369590A1 (en) 2011-09-28
JP4828618B2 (en) 2011-11-30
CN100521752C (en) 2009-07-29
EP2369588A1 (en) 2011-09-28
CN1781153A (en) 2006-05-31
KR20050121271A (en) 2005-12-26
RU2323489C2 (en) 2008-04-27
JP4880645B2 (en) 2012-02-22
JP5094986B2 (en) 2012-12-12
CN101702757B (en) 2013-07-31
DK2369589T3 (en) 2015-05-04
EP1620855A2 (en) 2006-02-01
US20090185789A1 (en) 2009-07-23
JP2011176855A (en) 2011-09-08
DK2369588T3 (en) 2015-06-08
EP2367174A1 (en) 2011-09-21
JP2006246496A (en) 2006-09-14
JP3803366B1 (en) 2006-08-02
US7505050B2 (en) 2009-03-17
EP2367174B1 (en) 2015-03-04
CA2523597A1 (en) 2004-11-11
ES2536680T3 (en) 2015-05-27
ES2536681T3 (en) 2015-05-27
US8350870B2 (en) 2013-01-08
RU2442290C2 (en) 2012-02-10
KR100760118B1 (en) 2007-09-18
EP2369590B1 (en) 2015-02-25
CN101702757A (en) 2010-05-05
JP2009268113A (en) 2009-11-12
JP4173512B2 (en) 2008-10-29
RU2007140686A (en) 2009-05-10
CA2523597C (en) 2011-09-27
JP2008295065A (en) 2008-12-04
CN101702756A (en) 2010-05-05
TW200501108A (en) 2005-01-01
JP4245651B2 (en) 2009-03-25
ES2383893T3 (en) 2012-06-27
CN101702756B (en) 2013-04-17
ZA200508211B (en) 2007-03-28
DK2367174T3 (en) 2015-06-08
TWI302697B (en) 2008-11-01
EP2369589B1 (en) 2015-02-25
JP2009021997A (en) 2009-01-29
AU2004234798B2 (en) 2007-10-18
JP2006518946A (en) 2006-08-17
CN101582983B (en) 2014-01-22
EP1620855B1 (en) 2012-05-09
US20100067876A1 (en) 2010-03-18
CN101702750A (en) 2010-05-05
CN101582982B (en) 2011-11-09
WO2004098193A3 (en) 2005-02-24
CN101582983A (en) 2009-11-18
JP4245652B2 (en) 2009-03-25

Similar Documents

Publication Publication Date Title
EP1620855B1 (en) Recording medium, reproduction apparatus, recording method, reproducing method, program, and integrated circuit for recording / reproducing a video stream and graphics with window information over graphics display
JP4569931B2 (en) Recording method, playback device, program, and playback method

Legal Events

Date Code Title Description
AK Designated states

Kind code of ref document: A2

Designated state(s): AE AG AL AM AT AU AZ BA BB BG BR BW BY BZ CA CH CN CO CR CU CZ DE DK DM DZ EC EE EG ES FI GB GD GE GH GM HR HU ID IL IN IS JP KE KG KP KR KZ LC LK LR LS LT LU LV MA MD MG MK MN MW MX MZ NA NI NO NZ OM PG PH PL PT RO RU SC SD SE SG SK SL SY TJ TM TN TR TT TZ UA UG US UZ VC VN YU ZA ZM ZW

AL Designated countries for regional patents

Kind code of ref document: A2

Designated state(s): GM KE LS MW MZ NA SD SL SZ TZ UG ZM ZW AM AZ BY KG KZ MD RU TJ TM AT BE BG CH CY CZ DE DK EE ES FI FR GB GR HU IE IT LU MC NL PL PT RO SE SI SK TR BF BJ CF CG CI CM GA GN GQ GW ML MR NE SN TD TG

121 Ep: the epo has been informed by wipo that ep was designated in this application
WWE Wipo information: entry into national phase

Ref document number: 2005518547

Country of ref document: JP

WWE Wipo information: entry into national phase

Ref document number: 2004234798

Country of ref document: AU

WWE Wipo information: entry into national phase

Ref document number: 2005/08211

Country of ref document: ZA

Ref document number: 200508211

Country of ref document: ZA

ENP Entry into the national phase

Ref document number: 2004234798

Country of ref document: AU

Date of ref document: 20040427

Kind code of ref document: A

WWP Wipo information: published in national office

Ref document number: 2004234798

Country of ref document: AU

WWE Wipo information: entry into national phase

Ref document number: 2523597

Country of ref document: CA

WWE Wipo information: entry into national phase

Ref document number: 1020057020409

Country of ref document: KR

WWE Wipo information: entry into national phase

Ref document number: 20048115550

Country of ref document: CN

WWE Wipo information: entry into national phase

Ref document number: 2004729754

Country of ref document: EP

WWE Wipo information: entry into national phase

Ref document number: 2005136872

Country of ref document: RU

WWP Wipo information: published in national office

Ref document number: 1020057020409

Country of ref document: KR

WWP Wipo information: published in national office

Ref document number: 2004729754

Country of ref document: EP

WWE Wipo information: entry into national phase

Ref document number: 2007057969

Country of ref document: US

Ref document number: 10554627

Country of ref document: US

WWP Wipo information: published in national office

Ref document number: 10554627

Country of ref document: US