WO2006109673A1 - Information processing apparatus, information processing method, program storage medium, program, data structure, and recording medium manufacturing method - Google Patents
- Publication number
- WO2006109673A1 (PCT/JP2006/307334)
- Authority
- WO
- WIPO (PCT)
- Prior art keywords
- data
- font style
- outline
- character object
- font
- Prior art date
Classifications
-
- H—ELECTRICITY
- H04—ELECTRIC COMMUNICATION TECHNIQUE
- H04N—PICTORIAL COMMUNICATION, e.g. TELEVISION
- H04N5/00—Details of television systems
- H04N5/76—Television signal recording
- H04N5/84—Television signal recording using optical recording
- H04N5/85—Television signal recording using optical recording on discs or drums
-
- G—PHYSICS
- G11—INFORMATION STORAGE
- G11B—INFORMATION STORAGE BASED ON RELATIVE MOVEMENT BETWEEN RECORD CARRIER AND TRANSDUCER
- G11B27/00—Editing; Indexing; Addressing; Timing or synchronising; Monitoring; Measuring tape travel
-
- H—ELECTRICITY
- H04—ELECTRIC COMMUNICATION TECHNIQUE
- H04N—PICTORIAL COMMUNICATION, e.g. TELEVISION
- H04N5/00—Details of television systems
- H04N5/222—Studio circuitry; Studio devices; Studio equipment
- H04N5/262—Studio circuits, e.g. for mixing, switching-over, change of character of image, other special effects ; Cameras specially adapted for the electronic generation of special effects
- H04N5/278—Subtitling
-
- H—ELECTRICITY
- H04—ELECTRIC COMMUNICATION TECHNIQUE
- H04N—PICTORIAL COMMUNICATION, e.g. TELEVISION
- H04N5/00—Details of television systems
- H04N5/76—Television signal recording
- H04N5/765—Interface circuits between an apparatus for recording and another apparatus
-
- G—PHYSICS
- G11—INFORMATION STORAGE
- G11B—INFORMATION STORAGE BASED ON RELATIVE MOVEMENT BETWEEN RECORD CARRIER AND TRANSDUCER
- G11B2220/00—Record carriers by type
- G11B2220/20—Disc-shaped record carriers
- G11B2220/25—Disc-shaped record carriers characterised in that the disc is based on a specific recording technology
- G11B2220/2537—Optical discs
- G11B2220/2541—Blu-ray discs; Blue laser DVR discs
-
- H—ELECTRICITY
- H04—ELECTRIC COMMUNICATION TECHNIQUE
- H04N—PICTORIAL COMMUNICATION, e.g. TELEVISION
- H04N21/00—Selective content distribution, e.g. interactive television or video on demand [VOD]
- H04N21/40—Client devices specifically adapted for the reception of or interaction with content, e.g. set-top-box [STB]; Operations thereof
- H04N21/47—End-user applications
- H04N21/488—Data services, e.g. news ticker
- H04N21/4884—Data services, e.g. news ticker for displaying subtitles
-
- H—ELECTRICITY
- H04—ELECTRIC COMMUNICATION TECHNIQUE
- H04N—PICTORIAL COMMUNICATION, e.g. TELEVISION
- H04N5/00—Details of television systems
- H04N5/76—Television signal recording
- H04N5/78—Television signal recording using magnetic recording
- H04N5/781—Television signal recording using magnetic recording on disks or drums
-
- H—ELECTRICITY
- H04—ELECTRIC COMMUNICATION TECHNIQUE
- H04N—PICTORIAL COMMUNICATION, e.g. TELEVISION
- H04N9/00—Details of colour television systems
- H04N9/79—Processing of colour television signals in connection with recording
- H04N9/80—Transformation of the television signal for recording, e.g. modulation, frequency changing; Inverse transformation for playback
- H04N9/804—Transformation of the television signal for recording, e.g. modulation, frequency changing; Inverse transformation for playback involving pulse code modulation of the colour picture signal components
- H04N9/8042—Transformation of the television signal for recording, e.g. modulation, frequency changing; Inverse transformation for playback involving pulse code modulation of the colour picture signal components involving data reduction
-
- H—ELECTRICITY
- H04—ELECTRIC COMMUNICATION TECHNIQUE
- H04N—PICTORIAL COMMUNICATION, e.g. TELEVISION
- H04N9/00—Details of colour television systems
- H04N9/79—Processing of colour television signals in connection with recording
- H04N9/80—Transformation of the television signal for recording, e.g. modulation, frequency changing; Inverse transformation for playback
- H04N9/804—Transformation of the television signal for recording, e.g. modulation, frequency changing; Inverse transformation for playback involving pulse code modulation of the colour picture signal components
- H04N9/806—Transformation of the television signal for recording, e.g. modulation, frequency changing; Inverse transformation for playback involving pulse code modulation of the colour picture signal components with processing of the sound signal
- H04N9/8063—Transformation of the television signal for recording, e.g. modulation, frequency changing; Inverse transformation for playback involving pulse code modulation of the colour picture signal components with processing of the sound signal using time division multiplex of the PCM audio and PCM video signals
-
- H—ELECTRICITY
- H04—ELECTRIC COMMUNICATION TECHNIQUE
- H04N—PICTORIAL COMMUNICATION, e.g. TELEVISION
- H04N9/00—Details of colour television systems
- H04N9/79—Processing of colour television signals in connection with recording
- H04N9/80—Transformation of the television signal for recording, e.g. modulation, frequency changing; Inverse transformation for playback
- H04N9/82—Transformation of the television signal for recording, e.g. modulation, frequency changing; Inverse transformation for playback the individual colour picture signal components being recorded simultaneously only
- H04N9/8205—Transformation of the television signal for recording, e.g. modulation, frequency changing; Inverse transformation for playback the individual colour picture signal components being recorded simultaneously only involving the multiplexing of an additional signal and the colour video signal
-
- H—ELECTRICITY
- H04—ELECTRIC COMMUNICATION TECHNIQUE
- H04N—PICTORIAL COMMUNICATION, e.g. TELEVISION
- H04N9/00—Details of colour television systems
- H04N9/79—Processing of colour television signals in connection with recording
- H04N9/80—Transformation of the television signal for recording, e.g. modulation, frequency changing; Inverse transformation for playback
- H04N9/82—Transformation of the television signal for recording, e.g. modulation, frequency changing; Inverse transformation for playback the individual colour picture signal components being recorded simultaneously only
- H04N9/8205—Transformation of the television signal for recording, e.g. modulation, frequency changing; Inverse transformation for playback the individual colour picture signal components being recorded simultaneously only involving the multiplexing of an additional signal and the colour video signal
- H04N9/8211—Transformation of the television signal for recording, e.g. modulation, frequency changing; Inverse transformation for playback the individual colour picture signal components being recorded simultaneously only involving the multiplexing of an additional signal and the colour video signal the additional signal being a sound signal
-
- H—ELECTRICITY
- H04—ELECTRIC COMMUNICATION TECHNIQUE
- H04N—PICTORIAL COMMUNICATION, e.g. TELEVISION
- H04N9/00—Details of colour television systems
- H04N9/79—Processing of colour television signals in connection with recording
- H04N9/80—Transformation of the television signal for recording, e.g. modulation, frequency changing; Inverse transformation for playback
- H04N9/82—Transformation of the television signal for recording, e.g. modulation, frequency changing; Inverse transformation for playback the individual colour picture signal components being recorded simultaneously only
- H04N9/8205—Transformation of the television signal for recording, e.g. modulation, frequency changing; Inverse transformation for playback the individual colour picture signal components being recorded simultaneously only involving the multiplexing of an additional signal and the colour video signal
- H04N9/8233—Transformation of the television signal for recording, e.g. modulation, frequency changing; Inverse transformation for playback the individual colour picture signal components being recorded simultaneously only involving the multiplexing of an additional signal and the colour video signal the additional signal being a character code signal
Definitions
- The present invention relates to an information processing apparatus, an information processing method, a program storage medium, a program, a data structure, and a recording medium manufacturing method, and in particular to an information processing apparatus, an information processing method, a program storage medium, a program, a data structure, and a recording medium manufacturing method that enable a user to reliably distinguish subtitles.
- On a DVD (Digital Versatile Disc), content data such as video and audio is multiplexed together with various sub-picture data such as subtitles and recorded as a program stream on the disc.
- Navigation data for interactive playback of the stream is also recorded on the disc.
- Such technology is disclosed in, for example, Patent Document 1 and Patent Document 2.
- Here, interactive playback refers to playback that makes use of functions such as playback from a desired position or in a desired order, a multi-angle function that enables playback of images taken from various angles, a multi-language function that enables playback in the language of the user's choice, and a parental control function that restricts playback of, for example, violent scenes.
- The program stream can multiplex a plurality of audio streams and a plurality of subtitle data streams. This makes it possible, for example, to record subtitle data in a plurality of different languages on a single disc for one piece of video content such as a movie, and the user can select audio and subtitles in a preferred language when starting video playback or during playback.
- Patent Document 1: Japanese Patent Application Laid-Open No. 2003-140662
- Patent Document 2: Japanese Patent Application Laid-Open No. 2002-311967
- each of the sequentially played back frames includes video and corresponding subtitles.
- However, because the subtitles are combined with the video, there was a problem that the user could not reliably distinguish the subtitles.
- The present invention has been made in view of such a situation, and makes it possible for a user to reliably distinguish captions.
- An information processing apparatus of the present invention is an information processing apparatus that generates subtitle data for displaying subtitles corresponding to a predetermined AV stream, and includes: character object holding means for holding a character object for displaying a subtitle; attribute data holding means for holding attribute data that includes font style specification information capable of specifying at least an outlined font style as the font style to be applied to the character object, and width specification information capable of specifying, when an outlined font style is specified by the font style specification information, the width of the outline of the character object corresponding to that font style; and conversion means that, when an outlined font style is specified by the held font style specification information, acquires the font style data of that outlined font style, updates the font style data based on the width specification information held by the attribute data holding means, and converts the character object held by the character object holding means into subtitle data using at least the updated font style data.
- The conversion means can include: acquisition means that, when the font style specification information specifies an outlined font style, acquires first font style data of that outlined font style; width change means that, based on the width specification information, expands or contracts the width of the outline of the character object corresponding to the acquired first font style data at the same rate in both the inner direction, toward the filled part of the character, and the opposite outer direction, and outputs the result as second font style data of an outlined font style; and compositing means that composites the character object data corresponding to the first font style data onto the character object corresponding to the second font style data output from the width change means, and outputs, as subtitle data, character object data of an outlined font style whose outline width has been expanded or contracted only in the outer direction.
- The attribute data held by the attribute data holding means can further include color specification information capable of specifying, when an outlined font style is specified by the font style specification information, the outline color for the character object corresponding to that font style. In that case, the conversion means updates the font style data based on the color specification information in addition to the width specification information held by the attribute data holding means, and converts the character object held by the character object holding means into subtitle data using at least the updated font style data.
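The width change and compositing described above can be sketched on a tiny bitmap model. This is an illustrative reconstruction, not the patent's implementation: glyphs are boolean grids, the outline band is grown symmetrically (which also eats into the filled part of the character), and the original character object is then composited on top so that only the outward growth remains visible.

```python
def dilate(mask, steps):
    """Grow a boolean bitmap by `steps` pixels (4-neighbourhood)."""
    h, w = len(mask), len(mask[0])
    cur = [row[:] for row in mask]
    for _ in range(steps):
        nxt = [row[:] for row in cur]
        for y in range(h):
            for x in range(w):
                if not cur[y][x] and any(
                        0 <= y + dy < h and 0 <= x + dx < w and cur[y + dy][x + dx]
                        for dy, dx in ((1, 0), (-1, 0), (0, 1), (0, -1))):
                    nxt[y][x] = True
        cur = nxt
    return cur

def render_with_outline(fill, base_width, extra):
    """Widen a glyph's outline so that only its outer edge moves."""
    # first font style data: an outline band of base_width around the glyph
    band = [[g and not f for g, f in zip(gr, fr)]
            for gr, fr in zip(dilate(fill, base_width), fill)]
    # "width change means": grow the band by `extra` px on BOTH sides,
    # which also eats into the filled part of the character
    band = dilate(band, extra)
    # "compositing means": draw the original character object on top, so the
    # inward growth is hidden and the outline is widened only outward
    return [['F' if f else ('O' if b else '.') for f, b in zip(fr, br)]
            for fr, br in zip(fill, band)]

# example: a single filled pixel with a 1-px outline, widened by 1 px
fill = [[False] * 9 for _ in range(9)]
fill[4][4] = True
glyph = render_with_outline(fill, base_width=1, extra=1)
```

After compositing, the filled pixel survives unchanged and the outline occupies the two rings around it, i.e. the widening is visible only in the outer direction.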
- An information processing method of the present invention is an information processing method of an information processing device that generates caption data for displaying captions corresponding to a predetermined AV stream, and includes: a character object retention control step for controlling retention of a character object for displaying captions; an attribute data retention control step for controlling retention of attribute data that includes font style specification information capable of specifying at least an outlined font style as the font style to be applied to the character object, and width specification information capable of specifying, when an outlined font style is specified by the font style specification information, the width of the outline of the character object corresponding to that font style; and a conversion step that, when an outlined font style is specified by the font style specification information retained by the processing of the attribute data retention control step, acquires the font style data of that outlined font style, updates the font style data based on the retained width specification information, and converts the retained character object into subtitle data using at least the updated font style data.
- Each of the program storage medium and the program of the present invention is a program storage medium and a program corresponding to the above-described information processing method of the present invention.
- In the present invention, caption data for displaying a caption corresponding to a predetermined AV stream is generated as follows.
- A character object for displaying subtitles is held, together with attribute data that includes font style specification information capable of specifying at least an outlined font style as the font style to be applied to the character object and, when an outlined font style is specified by the font style specification information, width specification information capable of specifying the width of the outline of the corresponding character object.
- When the held font style specification information specifies an outlined font style, the font style data of that style is acquired and updated based on the held width specification information, and the held character object is converted into subtitle data using at least the updated font style data.
- The data structure of the present invention is a data structure of information necessary for generating subtitle data for displaying subtitles corresponding to a predetermined AV stream, and includes a character object for displaying subtitles and attribute data that includes font style specification information capable of specifying at least an outlined font style as the font style to be applied to the character object and, when an outlined font style is specified by the font style specification information, width specification information capable of specifying the width of the outline of the character object corresponding to that font style.
- The data structure can be included in a predetermined stream, and the Clip Information file that manages the predetermined stream can contain specification information that specifies the file for the character object.
- The data structure can be stored in a file different from the file in which the AV stream is stored, and can be specified by a SubPlayItem stored in the PlayList file that is the playback management information of the AV stream.
- The recording medium of the present invention records data having a data structure of information necessary for generating caption data for displaying captions corresponding to a predetermined AV stream, the data structure including a character object for displaying captions and attribute data that includes font style specification information capable of specifying at least an outlined font style as the font style to be applied to the character object and, when an outlined font style is specified by the font style specification information, width specification information capable of specifying the width of the outline of the character object corresponding to that outlined font style.
- The recording medium manufacturing method of the present invention comprises the steps of: generating data having a data structure of information necessary for generating subtitle data for causing a playback device to display subtitles corresponding to a predetermined AV stream, the data structure including a character object for displaying the subtitles and attribute data that includes font style specification information capable of specifying at least an outlined font style as the font style applied to the character object and, when an outlined font style is specified by the font style specification information, width specification information for specifying the width of the outline of the character object corresponding to that font style; and recording the generated data on a recording medium to be played back by the playback device.
- In the present invention, subtitle data for displaying subtitles corresponding to the AV stream is generated using a character object for displaying the subtitles and attribute data that includes font style specification information capable of specifying at least an outlined font style as the font style applied to the character object and, when an outlined font style is specified by the font style specification information, width specification information capable of specifying the width of the outline of the corresponding character object.
- According to the present invention, it is possible to display subtitles corresponding to a predetermined AV stream while that AV stream is being reproduced.
- In particular, the user can reliably distinguish the subtitles.
- FIG. 1 is a block diagram showing a configuration of a playback device to which the present invention is applied.
- FIG. 2 is a diagram showing an example of an application format on a recording medium mounted on a playback device to which the present invention is applied.
- FIG. 3 is a diagram for explaining a different example of an application format on a recording medium mounted on a playback apparatus to which the present invention is applied.
- FIG. 4 is a functional block diagram for explaining functions of the controller of FIG. 1.
- FIG. 5 is a diagram showing an example of an optical disk file system and a local storage file system.
- FIG. 6 is a diagram showing an example of a file system obtained by merging the two file systems in FIG. 5.
- FIG. 7 is a diagram for explaining a playlist and data that can be referred to by the playlist.
- FIG. 8 is a diagram for explaining data to be additionally distributed.
- FIG. 9 is a diagram illustrating the syntax of SubPlayItem.
- FIG. 10 is a diagram showing a configuration example of a text subtitle file.
- FIG. 11 is a diagram showing the syntax of the text subtitle file of FIG.
- FIG. 12 is a diagram showing the syntax of each segment constituting the text subtitle in FIG.
- FIG. 13 is a diagram for explaining the syntax of segment_descriptor among the components of the segment in FIG. 12.
- FIG. 14 is a diagram for explaining values that can be specified with segment_type in FIG. 13.
- FIG. 15 is a diagram showing the syntax of dialog_style_segment among the segments constituting the text subtitle of FIG.
- FIG. 16 is a diagram for explaining the syntax of dialog_style_set among the components of dialog_style_segment in FIG.
- FIG. 17 is a diagram for explaining the syntax of dialog_style_set among the components of dialog_style_segment in FIG.
- FIG. 18 is a diagram for explaining values that can be specified with player_style_flag in FIG.
- FIG. 19 is a diagram for explaining values that can be specified with text_flow in FIG.
- FIG. 20 is a diagram for explaining values that can be specified in text_horizontal_alignment in FIG.
- FIG. 21 is a diagram for explaining values that can be specified with text_vertical_alignment in FIG.
- FIG. 22 is a diagram for explaining values that can be specified with font_style in FIG.
- FIG. 23 is a diagram showing an example of a table in which Table 9-59 (Data-type definition) of the conventional Blu-ray Disc Read-Only Format Version 0.89r3 is updated with the parts necessary for applying the present invention.
- FIG. 24 is a diagram showing a virtual storage area of the local storage in FIG. 1.
- FIG. 25 is a block diagram for explaining a detailed configuration of the decoder in FIG. 1.
- FIG. 26 is a block diagram for explaining a detailed configuration of a text subtitle processing unit in FIG. 25.
- FIG. 27 is a block diagram for explaining a detailed configuration of the font rasterizer of FIG. 26.
- FIG. 28 is a diagram illustrating an example of character object data in a normal font style.
- FIG. 29 shows an example of outline-style character object data.
- FIG. 30 is a diagram for explaining an example of a conventional method among outline size (width) changing methods.
- FIG. 31 is a diagram for explaining an example of a technique to which the present invention is applied among the outline size (width) changing techniques.
- FIG. 32 is a flowchart for explaining playback processing.
- FIG. 33 is a flowchart for explaining content reproduction processing.
- FIG. 34 is a flowchart for explaining subtitle display processing.
- FIG. 35 is a flowchart for explaining subtitle display processing.
- FIG. 36 is a diagram for explaining the manufacture of a recording medium on which data reproducible by the reproducing device of FIG. 1 is recorded.
- FIG. 37 is a diagram for explaining the manufacture of a recording medium on which data that can be played back by the playback device of FIG. 1 is recorded.
- FIG. 1 is a block diagram showing a configuration example of a playback device 1 as an information processing device to which the present invention is applied.
- The controller 21 controls the overall operation of the playback device 1 by executing a control program prepared in advance, or by controlling the optical disc drive 22 to read a navigation program (described later) recorded on the optical disc 11, expanding it in the memory 23, and executing it. For example, the controller 21 can display a predetermined menu screen on an external display device when the optical disc 11 is loaded.
- The optical disc drive 22 reads data from the optical disc 11 under the control of the controller 21 and outputs the read data to the controller 21, the memory 23, or the decoder 26. If the information read from the optical disc 11 is a navigation program or a PlayList, it is output to the controller 21 or the memory 23. If the information read from the optical disc 11 is an AV stream or text data, it is output to the decoder 26.
- FIG. 2 is a diagram illustrating an example of the application format of the recording medium.
- the recording medium referred to here includes, in addition to the optical disk 11 mounted on the reproducing apparatus 1 to which the present invention is applied, for example, a magnetic disk, a semiconductor memory, and the like.
- This application format has two layers, PlayList and Clip, for AV (Audio Visual) stream management.
- a pair of one AV stream or text data and Clip Information, which is information accompanying it, is considered as one object, and these are collectively called a Clip.
- an AV stream data file is referred to as an AV stream file.
- the Clip Information data file is called a Clip Information file.
- While a file used in a computer or the like is handled as a byte string, the content of an AV stream file is expanded on the time axis, and an access point in a Clip is mainly designated by a PlayList with a time stamp.
- In this case, the Clip Information file is used to find, from the time stamp, the address information at which to start decoding in the AV stream file.
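In other words, the Clip Information file plays the role of an entry-point table: given a time stamp from the PlayList, it yields the address in the AV stream file at which decoding can start. A minimal sketch, assuming a sorted list of hypothetical (time, byte offset) pairs:

```python
import bisect

# hypothetical entry-point map: (presentation time in ticks, byte offset);
# the real format's encoding is not reproduced here
ep_map = [(0, 0), (90000, 188 * 512), (180000, 188 * 2048), (270000, 188 * 5000)]

def address_for_time(ep_map, pts):
    """Return the byte address at which decoding should start for the
    requested time stamp: the last entry point at or before pts."""
    times = [t for t, _ in ep_map]
    i = bisect.bisect_right(times, pts) - 1
    return ep_map[max(i, 0)][1]
```

For example, a request for time 100000 falls between the second and third entry points, so decoding starts at the second entry's byte offset.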
- the PlayList is a set of information indicating the playback section of the AV stream.
- Information indicating one playback section in an AV stream is called a PlayItem, and a PlayItem is represented by a pair of an IN point (playback start point) and an OUT point (playback end point) of the playback section on the time axis. Therefore, a PlayList is composed of one or more PlayItems, as shown in FIG. 2.
- In FIG. 2, the first PlayList shown at the left is composed of two PlayItems, which respectively reference the first half and the second half of the AV stream included in the first Clip shown on the left side.
- The second PlayList, shown second from the left, is composed of one PlayItem, which references the entire AV stream included in the second Clip shown on the right.
- The third PlayList, shown third from the left, is composed of two PlayItems, which respectively reference a predetermined portion of the AV stream included in the first Clip shown on the left side and a predetermined portion of the AV stream included in the second Clip shown on the right side.
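The PlayList/PlayItem relationship above can be sketched as follows; the field names, and the use of seconds rather than the format's actual time encoding, are illustrative assumptions:

```python
from dataclasses import dataclass

@dataclass
class PlayItem:
    clip_name: str   # the Clip whose AV stream this item references
    in_time: float   # IN point: playback start point (seconds, illustrative)
    out_time: float  # OUT point: playback end point

# like the third example above: one PlayList referencing parts of two Clips
playlist = [PlayItem("clip1", 0.0, 30.0), PlayItem("clip2", 45.0, 90.0)]

def total_duration(playlist):
    """Total playback time of a PlayList: the sum of its sections."""
    return sum(item.out_time - item.in_time for item in playlist)
```

Playing the PlayList means playing each referenced section of each Clip in order; the PlayList itself stores no AV data, only these references.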
- the navigation program is a program for causing the controller 21 to execute the function of controlling the PlayList playback order and PlayList interactive playback.
- the navigation program also has a function of displaying a menu screen for the user to instruct execution of various reproductions.
- the navigation program is written in a programming language such as Java (registered trademark), and is recorded on a recording medium such as the optical disk 11.
- For example, when the first PlayItem included in the first PlayList shown at the left in FIG. 2 is designated as information indicating the playback position at that time, the first half of the AV stream included in the first Clip shown on the left side, which is referenced by that PlayItem, is played back.
- The PlayList described with reference to FIG. 2 can also include information on a sub path (Sub Path) specified by using a sub play item (SubPlayItem), as shown in FIG. 3.
- When a SubPlayItem is defined, for example, an independent data stream that is not multiplexed with the Clip (e.g., an MPEG-2 transport stream) specified by a PlayItem can be played back in synchronization with the AV stream playback.
- For example, by preparing, in correspondence with the Clip AV stream of the main path (Main Path) specified by a PlayItem, subtitle-related information composed of a text subtitle file and a font file necessary for rendering, and a SubPlayItem that specifies the playback section of that subtitle-related information, subtitles corresponding to the data described in the text subtitle file can be displayed on the display device, in the font style described in the font file, at the same time as the playback of the Clip AV stream.
- The subtitle-related information composed of the text subtitle file and the font file necessary for rendering, and the SubPlayItem, may be recorded in advance on the optical disc 11, downloaded from the server 3 via the network 2, or obtained using removable media (for example, the removable medium 28 described later).
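A minimal sketch of the synchronization this enables: while the main-path Clip AV stream plays, the player looks up which text subtitle, if any, covers the current presentation time and renders it with the font data. All names and the seconds-based timing below are hypothetical:

```python
from dataclasses import dataclass
from typing import List, Optional

@dataclass
class TextSubtitle:
    start: float   # presentation start time in seconds (illustrative)
    end: float     # presentation end time
    text: str      # character objects described in the text subtitle file

def subtitle_at(subs: List[TextSubtitle], t: float) -> Optional[str]:
    """Return the subtitle text to rasterize at playback time t, or None.
    This mirrors keeping the SubPath text subtitles in sync with the
    main-path Clip AV stream."""
    for s in subs:
        if s.start <= t < s.end:
            return s.text
    return None

subs = [TextSubtitle(0.0, 4.0, "Hello."), TextSubtitle(5.0, 9.0, "Goodbye.")]
```

Because the subtitles are stored as text plus styling rather than pre-rendered into the video frames, they remain separable from the picture, which is what lets the user distinguish them reliably.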
- the memory 23 appropriately stores data necessary for the controller 21 to execute various processes.
- the local storage 24 is composed of, for example, an HDD (Hard Disk Drive).
- The Internet interface 25 is connected to the network 2 by wire or wirelessly, communicates with the server 3 via the network 2 under the control of the controller 21, and supplies data downloaded from the server 3 to the local storage 24. From the server 3, for example, data that updates the data described with reference to FIG. 2, recorded on the optical disc 11 mounted on the playback device 1 at that time, is downloaded as content.
- the local storage 24 can record contents downloaded from the server 3 via the network 2.
- The decoder 26 decodes the AV stream or text data supplied from the optical disc drive 22 or the local storage 24, and outputs the obtained video signal and audio signal to an external display device.
- On the display device, based on the signals decoded by the decoder 26, the content recorded on the optical disc 11 is output (the video is displayed and the audio is played).
- The operation input unit 29 includes an input device such as buttons, keys, a touch panel, a jog dial, and a mouse, and a receiving unit that receives a signal such as infrared rays transmitted from a predetermined remote commander; it acquires the user's operation input and supplies it to the controller 21.
- A drive 27 is connected to the controller 21 as necessary, and a removable medium 28 is mounted on the drive 27, such as a magnetic disk (including a flexible disk), an optical disc (including a CD-ROM (Compact Disc-Read Only Memory) and a DVD), a magneto-optical disk (including an MD (registered trademark) (Mini-Disc)), or a semiconductor memory.
- the playback device 1 includes a local storage 24 including an HDD (Hard Disk Drive).
- The playback device 1 is connected to the network 2 by wire or wirelessly, and content downloaded from the server 3 via the network 2 can be recorded in the local storage 24. From the server 3, for example, data for updating content such as a movie recorded on the optical disc 11 mounted on the playback device 1 at that time can be downloaded.
- FIG. 4 is a block diagram illustrating a functional configuration example of the controller 21 in FIG.
- Each configuration in FIG. 4 is realized by the controller 21 executing a control program prepared in advance, or by the controller 21 executing a navigation program recorded on the optical disc 11.
- The menu screen display control unit 31 causes an external display device to display a menu screen including buttons operated by the user when selecting the audio or subtitle language or the video angle of the content recorded on the optical disc 11, and buttons operated by the user when selecting an update file to download.
- The operation input acquisition unit 32 acquires a signal indicating an operation input by the user from the operation input unit 29, and outputs the signal to the corresponding one of the menu screen display control unit 31, the data acquisition unit 33, and the playback control unit 37.
- the data acquisition unit 33 controls communication performed in the Internet interface 25 in FIG. 1 or information exchange with the removable medium 28 by the drive 27. For example, the data acquisition unit 33 downloads and acquires the update file specified by the user from the server 3 and outputs the acquired file to the local storage directory management unit 34.
- the local storage directory management unit 34 manages a directory of the local storage 24, and controls data writing to the local storage 24 and data reading from the local storage 24.
- under the control of the local storage directory management unit 34, the PlayList read from the local storage 24 is output to the memory 23, and the audio data and video data of the AV stream and the text data of the text subtitle file read from the local storage 24 are output to the decoder 26.
- when the file system merge processing unit 36 merges the file system of the optical disc 11 with the file system of the local storage 24, the local storage directory management unit 34 outputs information related to the file system of the local storage 24 to the file system merge processing unit 36.
- the optical disc directory management unit 35 manages a directory of the optical disc 11 and controls reading of each data from the optical disc 11.
- Studio_id and Content_id, which are identification information, are set on the optical disc 11, and the Studio_id and Content_id read from the optical disc 11 under the control of the optical disc directory management unit 35 are output to the data acquisition unit 33 and the local storage directory management unit 34.
- under the control of the optical disc directory management unit 35, the PlayList read from the optical disc 11 is output to the memory 23, and the audio data and video data of the AV stream and the text data of the text subtitle file read from the optical disc 11 are output to the decoder 26.
- when the file system merge processing unit 36 merges the file system of the optical disc 11 with the file system of the local storage 24, the optical disc directory management unit 35 outputs information related to the file system of the optical disc 11 to the file system merge processing unit 36.
- the file system merge processing unit 36 merges the file system of the optical disc 11 supplied from the optical disc directory management unit 35 and the file system of the local storage 24 supplied from the local storage directory management unit 34 to create one virtual file system.
- the file system merge processing unit 36 outputs the virtual file system generated by merging to the reproduction control unit 37.
- hereinafter, the single file system generated by the merging performed by the file system merge processing unit 36 in the first embodiment will be referred to, as appropriate, as the first virtual file system.
- the playback control unit 37 executes the navigation program specified by the first virtual file system supplied from the file system merge processing unit 36, and controls playback of the content. Specifically, the playback control unit 37 refers to the PlayList supplied to and stored in the memory 23, controls the local storage directory management unit 34 or the optical disc directory management unit 35 so that the audio data and video data of the AV stream and, if necessary, the text data of the text subtitle file recorded on the optical disc 11 or in the local storage 24 are read out, and controls the decoder 26 in FIG. 1 so that the read audio data and video data of the AV stream and the text data of the text subtitle file are decoded (reproduced) as necessary.
- the merge of the file system of the optical disc 11 performed by the file system merge processing unit 36 and the file system recorded in the local storage 24 by downloading from the server 3 will be described.
- this merging is performed when reproduction of content recorded on the optical disc 11 is instructed.
- FIG. 5 is a diagram showing an example of the file system of the optical disc 11 (left side) and the file system of the local storage 24 (right side). As shown in the figure, each file system has a directory structure.
- a folder with the name “BDMV” is prepared under “root” on the optical disc 11.
- under the BDMV folder, a file with the name “info.bdmv” and a file with the name “Navigation.class” are stored.
- hereinafter, these files will be referred to as the info.bdmv file and the Navigation.class file, respectively.
- similarly, other files and folders are referred to with “file” appended to the file name or “folder” appended to the folder name.
- the info.bdmv file describes Studio_id, which is the identification information of the producer of the optical disc 11, and Content_id, which is the identification information of the content.
- Studio_id is “xxx” and Content_id is “yyy”. Studio_id and Content_id are also used to identify the update file to be downloaded.
- the Navigation.class file is a navigation program described in a predetermined program language.
- the BDMV folder also stores a folder with the name “PLAYLIST” (PLAYLIST folder), a folder with the name “CLIPINF” (CLIPINF folder), and a folder with the name “STREAM” (STREAM folder).
- the PLAYLIST folder stores a file with the name “11111.mpls”, a file with the name “22222.mpls”, and the like. These files are PlayLists that designate, with time stamps, playback sections of files such as AV stream files.
- the CLIPINF folder stores a file in which the name "01000.clpi" is set, a file in which the name "02000.clpi” is set, and the like. These files are Clip Information indicating the correspondence between the time stamp and the address information of the AV stream file or caption related information.
- the STREAM folder stores a file with the name “01000.m2ts”, a file with the name “02000.m2ts”, and the like. These files are AV streams and subtitle-related information.
- a folder with a name “xxx-yyy” is stored under “root” of the local storage 24.
- the folder name “xxx-yyy” represents that the data stored therein corresponds to the content recorded on the optical disc 11 and identified by Studio_id “xxx” and Content_id “yyy”.
- the xxx-yyy folder is created when the optical disc 11 with Studio_id “xxx” and Content_id “yyy” is loaded into the playback device 1 and the update files (the files stored in the xxx-yyy folder) are downloaded.
- in the xxx-yyy folder, an info.bdmv file and a Navigation.class file are stored.
- This info.bdmv file is the same as the info.bdmv file on the optical disc 11
- the Navigation.class file is a file obtained by updating the Navigation.class file on the optical disc 11.
- that is, the Navigation.class file in the local storage 24 is a file describing a navigation program that has been upgraded as compared with the one recorded on the optical disc 11.
- the xxx-yyy folder further stores a PLAYLIST folder, a CLIPINF folder, and a STREAM folder.
- a file with the name “11111.mpls” and a file with the name “22222.mpls” are stored in the PLAYLIST folder of the local storage 24.
- the file with the name “22222.mpls” is an updated file of the file with the same name on the optical disc 11.
- the file with the name “22222.mpls” in the local storage 24 represents a downloaded updated PlayList, and this file describes data in which a SubPlayItem is added to the PlayList.
- the CLIPINF folder of the local storage 24 stores a file with the name “04000.clpi” and the like. That is, the file with the name “04000.clpi” is a file newly acquired by downloading.
- the STREAM folder of the local storage 24 stores a file with the name "04000.m2ts". This file is a newly acquired file.
- under “root” in the local storage 24, a folder with the name “xxx-aaa” and a folder with the name “yyy-bbb” are also stored. These were created when the optical disc identified by Studio_id “xxx” and Content_id “aaa” and the optical disc identified by Studio_id “yyy” and Content_id “bbb” were each mounted on the playback device 1, and each stores files corresponding to its content.
- the file system merge processing unit 36 of the controller 21 merges the file system of the optical disc 11 and the file system of the local storage 24 on the memory 23 to create the first virtual file system.
- the file system merge processing unit 36 performs the merging based on the time stamp (creation date) and the version described in each file, so that a file obtained by downloading is used as the file to be referred to when the content is played back. Further, the file system merge processing unit 36 performs the merging so that a file present only in the local storage 24 and associated with the optical disc 11 is used as a file to be referred to when the content is played back.
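- the merge rule above can be sketched, purely as an illustration, as follows. The function name and the data layout (a path mapped to a version number) are assumptions introduced here and are not part of the specification:

```python
def merge_file_systems(disc_files, local_files):
    """Hypothetical sketch of the first virtual file system merge.

    disc_files / local_files: dict mapping file path -> version number.
    A downloaded (local storage) file replaces the disc file of the same
    name when its version is newer, and a local-only file is added.
    """
    virtual = dict(disc_files)  # start from the optical disc's file system
    for path, version in local_files.items():
        if path not in virtual or version > virtual[path]:
            virtual[path] = version  # downloaded file wins
    return virtual
```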
- FIG. 6 is a diagram showing an example of a first virtual file system obtained from the two file systems of FIG.
- in the first virtual file system, the Navigation.class file and the file with the name “22222.mpls” are updated (replaced) with the downloaded files.
- a file with the name “04000.clpi” and a file with the name “04000.m2ts” that are not recorded on the optical disc 11 are added.
- when the downloaded files include an updated navigation program or PlayList, the file of the same name on the optical disc 11 is updated (replaced) by that file. If a Clip file (a Clip Information file and an AV stream file) that is not on the optical disc 11 is downloaded, it is added to the file system.
- the subtitle-related information for displaying subtitles corresponding to the AV stream of a predetermined section includes text subtitle data constituting text subtitle files corresponding to subtitles in a plurality of languages, and font data.
- based on the user's operation input, the subtitle data in the language desired by the user, among the plural-language subtitles that can be displayed using the subtitle-related information stored in advance on the optical disc 11 and the newly downloaded or copied subtitle-related information stored in the local storage 24, is processed and displayed in association with the AV stream stored in advance on the optical disc 11.
- this allows the optical disc seller (distributor, producer) to, for example, sell a disc in a state where only English subtitles can be displayed, and to provide Arabic subtitle-related information later via the network 2.
- in other words, optical disc sellers can sell discs at an earlier timing than if they translated into multiple languages at once, and can provide additional subtitle-related information for other languages as needed.
- it is also possible to sell optical discs first in the regions corresponding to the main languages and, after translation into other languages is completed, to add sales regions later and start a text subtitle file download service for the corresponding languages.
- to allow the new subtitle data to be displayed in association with the AV stream stored in advance on the optical disc 11, the optical disc seller (distributor, producer) must distribute at least an updated PlayList file and a text subtitle file.
- in addition, a font file that defines the display format of the subtitles corresponding to the text data described in the text subtitle file may also be distributed.
- FIG. 8 shows an archive of such additionally distributed data (corresponding to the data stored in the local storage 24).
- the data to be additionally distributed may include a PlayList file (PlayList_file), 8-bit number information (number_of_TextSubTitle) indicating the number of text subtitle files, and the text subtitle files (text_subtitle_file) described above; a font file (font_file) may also be included.
- FIG. 9 is a diagram illustrating the syntax of SubPlayltem.
- the 8-bit field of ref_to_STC_id specifies the identifier of the STC sequence referred to by the Clip.
- SubPlayItem_IN_time specifies the playback start time of the SubPlayItem.
- SubPlayItem_OUT_time specifies the playback end time of the SubPlayItem. That is, SubPlayItem_IN_time and SubPlayItem_OUT_time can specify the playback section of the SubPlayItem.
- the time information described in SubPlayItem_IN_time and SubPlayItem_OUT_time is expressed based on the 45 kHz clock used in STC.
- the 8-bit field of number_of_ClipTextSubtitle specifies the total number of subtitle texts defined in the SubPlayItem.
- the 8-bit field of language_id specifies the identifier of the language used for the subtitles.
- the data in the language_id field shall conform to the ISO/IEC 639-1 standard.
- the 8-bit field of character_code_id specifies an identifier for character data encoding.
- the 8-bit field of font_format_id specifies the font format identifier.
- font_file_path_length specifies the number of bytes in the path name of the font file described in font_file_path. In this field, 0 can be specified as the number of bytes of font_file_path. If 0 is specified, it is processed as if the built-in font stored in advance on the playback device 1 is specified.
- font_file_path specifies the name of the font file used to draw the caption data. If an empty string is specified in the font_file_path field, it is processed as if the built-in font stored in advance on the playback device 1 is specified.
- the value of the font_file_path field shall be encoded in ISO/IEC 646 format.
- subtitle_file_path_length specifies the number of bytes of the path name of the text subtitle file described in subtitle_file_path.
- subtitle_file_path specifies the path name of the text caption file.
- the subtitle_file_path field value shall be encoded in ISO / IEC 646 format.
- the 16-bit field of comment_length specifies the number of bytes of the information described in comment. comment describes a comment on the text subtitle. The comment described in comment is written using the ISO/IEC 646 character set.
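- two of the rules in the fields above lend themselves to a short illustration: the 45 kHz clock on which SubPlayItem_IN_time and SubPlayItem_OUT_time are expressed, and the rule that an empty font_file_path (font_file_path_length of 0) selects the playback device's built-in font. This is a hypothetical sketch; the function names are illustrative, not taken from the specification:

```python
CLOCK_HZ = 45_000  # SubPlayItem_IN_time / SubPlayItem_OUT_time use a 45 kHz clock

def playback_interval_seconds(in_time, out_time):
    """Length of the SubPlayItem playback section in seconds."""
    return (out_time - in_time) / CLOCK_HZ

def resolve_font(font_file_path):
    """An empty path (font_file_path_length == 0) selects the built-in font."""
    return font_file_path if font_file_path else "<built-in font>"
```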
- the text subtitle file (text subtitle file, text_subtitle_file referred to in FIG. 8) is configured as a stream file as shown in FIG.
- the text subtitle file configured as a stream file in this way is referred to as a text subtitle stream file (Text_Subtitle_Stream File).
- during transmission, the text subtitle stream file takes the form of a transport stream composed of a plurality of transport packets (packets described as TP in FIG. 10; also referred to as TPs).
- however, this transport stream is composed only of TPs having the same PID, 0x1800, that is, only of TPs corresponding to parts of a single text subtitle stream file. In other words, in the example of FIG. 10, the transport stream is composed only of the text subtitle stream file.
- such a transport stream is converted, at the time of data processing (for example, recording or playback processing), into a stream in units of PES (Packetized Elementary Stream) packets, as shown in the second diagram from the top of FIG. 10. That is, the second diagram from the top of FIG. 10 shows an example of the structure of the text subtitle stream file in units of PES packets.
- the first PES packet among the multiple PES packets that make up the text subtitle stream file is called the Dialog Style Segment.
- this Dialog Style Segment is a PES packet corresponding to the attribute data described later.
- Each of a plurality of PES packets following the Dialog Style Segment is called a Dialog Presentation Segment.
- This Dialog Presentation Segment is a PES packet corresponding to a character object described later.
- FIG. 11 is a diagram showing the syntax of such a text subtitle stream (Text_subtitle_stream) file.
- the 16-bit field of number_of_dialog_presentation_segments specifies the number of Dialog Presentation Segments following the Dialog Style Segment.
- that is, the text subtitle stream (Text_subtitle_stream) file is configured with the Dialog Style Segment placed at the head, followed by as many Dialog Presentation Segments as specified by number_of_dialog_presentation_segments.
- FIG. 12 is a diagram showing a general syntax of one segment (Segment).
- one segment is composed of segment_descriptor and segment_data.
- FIG. 13 is a diagram illustrating the syntax of segment_descriptor.
- the 8-bit field of segment_type specifies the type of the segment. Specifically, segment_type is specified by the value corresponding to one of the segment types shown in FIG. 14 (the value on the same line in FIG. 14). That is, when the segment type is Dialog Style Segment, 0x81 is set in segment_type. When the segment type is Dialog Presentation Segment, 0x82 is set in segment_type. Note that 0x00 to 0x13, 0x19 to 0x7F, and 0x83 to 0xFF are reserved. In addition, 0x14 to 0x18 and 0x80 are prepared as values designating segments for graphics streams.
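- the segment_type values above can be sketched as a small classifier, purely for illustration (the constant and function names are assumptions, not identifiers from the specification):

```python
# segment_type values per FIG. 14:
DIALOG_STYLE_SEGMENT = 0x81
DIALOG_PRESENTATION_SEGMENT = 0x82

def classify_segment(segment_type):
    """Map a segment_type byte to a human-readable category."""
    if segment_type == DIALOG_STYLE_SEGMENT:
        return "Dialog Style Segment"
    if segment_type == DIALOG_PRESENTATION_SEGMENT:
        return "Dialog Presentation Segment"
    if 0x14 <= segment_type <= 0x18 or segment_type == 0x80:
        return "graphics stream segment"
    return "reserved"  # 0x00-0x13, 0x19-0x7F, 0x83-0xFF
```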
- in the case of the Dialog Style Segment, segment_data in FIG. 12 becomes the dialog_style_set shown in FIG. 15.
- FIGS. 16 and 17 show the syntax of this dialog_style_set. Although the details will be described later, this syntax makes it possible to realize “adding an outline (contour) to a character object used in the text constituting a subtitle”, which is one of the features of the present invention.
- the 1-bit field of player_style_flag specifies whether or not this dialog_style_set is allowed to be changed to the user's own style.
- 0b is specified as player_style_flag when the change is not permitted (prohibited), and 1b is specified as player_style_flag when it is permitted.
- the 8-bit field of number_of_region_styles specifies the number of region_style entries (described later) used in this dialog_style_set.
- the 8-bit field of number_of_user_styles specifies the number of user_style (described later) used in this dialog_style_set.
- region_style_id specifies the identifier of the target region_style (region style).
- a region indicates a drawing area.
- the 16-bit field of region_horizontal_position specifies the horizontal coordinate of the target region.
- the 16-bit field of region_vertical_position specifies the vertical coordinate of the target region.
- the 16-bit field of region_width specifies the width of the target region.
- the 16-bit field of region_height specifies the height of the target region.
- the 8-bit field of region_bg_palette_entry_id_ref specifies the background color of the target region.
- the 16-bit field of text_box_horizontal_position specifies the horizontal coordinate of the text box in the target region (the range in which the text composing the subtitle is displayed).
- the 16-bit field of text_box_vertical_position specifies the vertical coordinate of the text box.
- the 16-bit field of text_box_width specifies the width of the text box.
- the 16-bit field of text_box_height specifies the height of the text box.
- the 8-bit field of text_flow specifies the direction in which the text is displayed in the text box in the target region.
- text_flow is specified by the value corresponding to one of the display directions shown in FIG. 19 (the value on the same line in FIG. 19). That is, when the display direction is from left to right in the horizontal direction, 1 is specified for text_flow. When the display direction is from right to left in the horizontal direction, 2 is specified for text_flow. When the display direction is from top to bottom in the vertical direction, 3 is specified for text_flow. Note that 0 and values other than 1 to 3 are reserved.
- the 8-bit field of text_horizontal_alignment specifies the right alignment, left alignment, or center alignment (horizontal alignment) of the text displayed in the text box in the target region.
- specifically, a value as shown in FIG. 20 is specified for text_horizontal_alignment. That is, for left alignment, 1 is specified for text_horizontal_alignment. For center alignment, 2 is specified for text_horizontal_alignment. For right alignment, 3 is specified for text_horizontal_alignment. Note that 0 and values other than 1 to 3 are reserved.
- the 8-bit field of text_vertical_alignment specifies the top alignment, bottom alignment, or center alignment (vertical alignment) of the text displayed in the text box in the target region. Specifically, a value as shown in FIG. 21 is specified for text_vertical_alignment. That is, for top alignment, 1 is specified for text_vertical_alignment. For center alignment, 2 is specified for text_vertical_alignment. For bottom alignment, 3 is specified for text_vertical_alignment. Note that 0 and values other than 1 to 3 are reserved.
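- the alignment values above (1 = left/top, 2 = center, 3 = right/bottom; other values reserved) can be illustrated with a hypothetical positioning helper. The mapping tables follow FIGS. 20 and 21; the offset function itself is an assumption introduced here for illustration:

```python
# Alignment values per FIGS. 20 and 21 (other values reserved).
HORIZONTAL_ALIGNMENT = {1: "left", 2: "center", 3: "right"}
VERTICAL_ALIGNMENT = {1: "top", 2: "center", 3: "bottom"}

def text_x_offset(box_x, box_width, text_width, text_horizontal_alignment):
    """Hypothetical x-position of a text line inside its text box."""
    align = HORIZONTAL_ALIGNMENT.get(text_horizontal_alignment, "left")
    if align == "center":
        return box_x + (box_width - text_width) // 2
    if align == "right":
        return box_x + box_width - text_width
    return box_x  # left alignment
```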
- the 8-bit field of line_space specifies the interval between baselines in the target region.
- the 8-bit field of font_id_ref specifies the font of the text displayed in the text box in the target region.
- the 8-bit field of font_style specifies the font style of the text displayed in the text box in the target region. Specifically, a value as shown in FIG. 22 is specified for font_style. That is, when the font style is Normal (standard), 0x00 is specified for font_style. When the font style is Bold, 0x01 is specified for font_style. When the font style is Italic, 0x02 is specified for font_style. When the font style is Bold and Italic, 0x03 is specified for font_style. When the font style is Outline-bordered (outlined), 0x04 is specified for font_style.
- likewise, 0x05, 0x06, and 0x07 are specified for font_style for the remaining outline font styles shown in FIG. 22 (combinations of the outline with Bold and/or Italic).
- the font styles specified by 0x04 to 0x07 are the font styles necessary to realize “adding an outline (contour) to a character object used in the text constituting a subtitle or the like”, which is one of the features of the present invention. That is, to add an outline to a character object, one of 0x04 to 0x07 must be specified as the value of font_style.
- hereinafter, font styles for which 0x00 to 0x03 is specified as font_style are collectively referred to as normal font styles.
- font styles for which 0x04 to 0x07 is specified as font_style are collectively referred to as outline font styles.
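- the split between normal (0x00 to 0x03) and outline (0x04 to 0x07) font styles can be sketched as follows; the constant and function names are illustrative, not identifiers from the specification:

```python
# font_style values per FIG. 22.
FONT_STYLE_NORMAL = 0x00
FONT_STYLE_BOLD = 0x01
FONT_STYLE_ITALIC = 0x02
FONT_STYLE_BOLD_ITALIC = 0x03
FONT_STYLE_OUTLINE = 0x04  # first of the outline font styles (0x04-0x07)

def is_outline_style(font_style):
    """0x04-0x07 are the outline font styles; 0x00-0x03 are normal."""
    return 0x04 <= font_style <= 0x07

def needs_outline_fields(font_style):
    """outline_palette_entry_id_ref / outline_size apply only to outline styles."""
    return is_outline_style(font_style)
```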
- the 8-bit field of font_size specifies the font size of the text displayed in the text box in the target region.
- the font_palette_entry_id_ref field specifies the color of the font.
- outline_palette_entry_id_ref specifies the color of the outline (the contour of the text) when the text is displayed in an outline font style.
- outline_size specifies the size (width) of the outline. The outline color and size will be described later with reference to FIGS.
- in the dialog_style_set of the examples of FIG. 16 and FIG. 17, following the syntax of region_style described above, the syntax of the user_style change settings (hereinafter referred to as user_changeable_style_set) is shown in FIG. 17.
- the 8-bit field of user_style_id specifies the identifier of the target User_control_style.
- the 15-bit field of region_vertical_position_delta specifies the amount of vertical movement of region_vertical_position.
- the 1-bit field of font_size_inc_dec specifies the direction of change of the font size, that is, whether the font size becomes larger or smaller after the change. Specifically, 0b is specified for font_size_inc_dec when the font size becomes larger, and 1b when it becomes smaller.
- the 7-bit field of font_size_delta specifies the amount of change in font size.
- the 1-bit field of text_box_horizontal_position_direction specifies the horizontal direction of movement of the text box. Specifically, 0 is specified for text_box_horizontal_position_direction when the moving direction is rightward, and 1 when it is leftward.
- the 15-bit field of text_box_horizontal_position_delta specifies the amount of horizontal movement of the text box.
- the 1-bit field of text_box_vertical_position_direction specifies the vertical direction of movement of the text box. Specifically, 0 is specified for text_box_vertical_position_direction when the moving direction is downward, and 1 is specified when the moving direction is upward.
- the 15-bit field of text_box_vertical_position_delta specifies the amount of vertical movement of the text box.
- the 1-bit field of text_box_width_inc_dec specifies the direction of change of the width of the text box, that is, whether the width becomes larger or smaller after the change. Specifically, 0b is specified for text_box_width_inc_dec when the width becomes larger, and 1b when it becomes smaller.
- the 15-bit field of text_box_width_delta specifies the amount of text box width change.
- the 1-bit field of text_box_height_inc_dec specifies the direction of change of the height of the text box, that is, whether the height becomes larger or smaller after the change. Specifically, 0b is specified for text_box_height_inc_dec when the height becomes larger, and 1b when it becomes smaller.
- the 15-bit field of text_box_height_delta specifies the amount of text box height change.
- the 1-bit field of line_space_inc_dec specifies the direction of change of line_space, that is, whether it increases or decreases after the change. Specifically, 0b is specified for line_space_inc_dec when it increases, and 1b when it decreases.
- the 7-bit field of line_space_delta specifies the amount of change of line_space.
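- the *_inc_dec / *_delta pairs above (for font size, text box position, width, height, and line spacing) all follow the same pattern, which can be sketched as follows. The function name is an assumption introduced for illustration:

```python
def apply_delta(value, inc_dec, delta):
    """Combine an *_inc_dec flag with its *_delta amount.

    inc_dec: 0 (0b) means the value becomes larger after the change,
             1 (1b) means it becomes smaller.
    delta:   the unsigned amount of change (e.g. font_size_delta).
    """
    return value + delta if inc_dec == 0 else value - delta
```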
- Palette() is composed of the following fields (not shown): length, which specifies the length of this Palette(); palette_entry_id, which specifies the identifier of a specific color in the palette; Y_value, which specifies the Y value of (Y, Cb, Cr); Cb_value, which specifies the Cb value of (Y, Cb, Cr); Cr_value, which specifies the Cr value of (Y, Cb, Cr); and T_value, which specifies the transparency.
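- the palette lookup implied by fields such as font_palette_entry_id_ref and outline_palette_entry_id_ref can be sketched as follows; the class and function names are illustrative, not part of the specification:

```python
from dataclasses import dataclass

@dataclass
class PaletteEntry:
    """One Palette() entry: an id plus (Y, Cb, Cr) and transparency T."""
    palette_entry_id: int
    y_value: int   # luminance
    cb_value: int  # blue-difference chroma
    cr_value: int  # red-difference chroma
    t_value: int   # transparency

def lookup_color(palette, entry_id_ref):
    """Resolve a *_palette_entry_id_ref field (e.g. font or outline color)."""
    return palette[entry_id_ref]
```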
- FIG. 24 is a diagram showing a virtual storage area of the local storage 24.
- the local storage 24 stores various information downloaded or copied in the file format described with reference to FIG.
- the local storage 24 can be provided, as virtual storage areas, with a navigation program storage unit 51 that is a storage area for downloaded navigation programs, a playlist storage unit 52 that is an area for storing downloaded playlists, a text subtitle file storage unit 53 that is an area for storing downloaded text subtitle files, and a font file storage unit 54 that is an area for storing downloaded font files.
- the data files stored in the navigation program storage unit 51 and the playlist storage unit 52 are read out and supplied to the memory 23 under the control of the local storage directory management unit 34 described with reference to FIG.
- the data files stored in the text subtitle file storage unit 53 and the font file storage unit 54 are read out and supplied to the decoder 26 under the control of the local storage directory management unit 34 described with reference to FIG. 4.
- the storage areas of the navigation program storage unit 51, the playlist storage unit 52, the text subtitle file storage unit 53, and the font file storage unit 54 of the local storage 24 illustrated in FIG. 24 are virtual areas; therefore, the storage area of the local storage 24 need not be physically divided according to the type of stored information as shown in FIG. 24.
- FIG. 25 is a block diagram for explaining a detailed configuration of the decoder 26.
- the decoder 26 includes a control unit 81, a disc data acquisition unit 82, buffers 83 and 84, a PID filter 85, an audio decoder 86, an MPEG video decoder 87, a video plane processing unit 88, a GUI data decoder 89, a GUI graphics plane processing unit 90, a storage data acquisition unit 91, a text subtitle processing unit 92, a subtitle graphics plane processing unit 93, and a synthesis processing unit 94.
- the control unit 81 controls the processing of each unit of the decoder 26 based on the control of the reproduction control unit 37.
- the disc data acquisition unit 82 acquires the data supplied to the decoder 26 out of the data read from the optical disc 11, supplies a multiplexed stream such as the AV stream specified by the PlayItem described above to the buffer 83, which is a read buffer for the data specified by the PlayItem constituting the main path, and supplies the data of the text subtitle file or the font file specified by the SubPlayItem (when font file data is specified) to the buffer 84, which is a read buffer for the data specified by the SubPlayItem constituting the sub path.
- the stream data read from the buffer 83 is output to the PID filter 85 in the subsequent stage at a predetermined timing.
- the PID filter 85 distributes the input multiplexed stream, according to the PID, to the audio decoder 86, the MPEG video decoder 87, or the GUI data decoder 89, which are the decoders for the respective elementary streams in the subsequent stage.
- that is, the PID filter 85 supplies the audio stream to the audio decoder 86, supplies the video stream to the MPEG video decoder 87, and supplies image data related to the user interface to the GUI data decoder 89.
- the audio decoder 86 decodes the audio stream and outputs data of the decoded audio stream.
- the MPEG video decoder 87 decodes the video stream and outputs the decoded video data to the video plane processing unit 88.
- based on the decoded video data, the video plane processing unit 88 generates a video plane corresponding to the image displayed on one page (or one frame) (an image constituting a moving image), and outputs it to the synthesis processing unit 94.
- the GUI data decoder 89 decodes the interactive graphics stream and supplies the decoded GUI data to the GUI graphics plane processing unit 90.
- the GUI graphics plane processing unit 90 generates a graphics plane corresponding to the GUI displayed on one screen, and outputs it to the synthesis processing unit 94.
- the storage data acquisition unit 91 acquires the data supplied to the decoder 26 (that is, the data of the text subtitle file or the font file) out of the data read from the local storage 24, and supplies it to the text subtitle processing unit 92.
- under the control of the control unit 81, the text subtitle processing unit 92 decodes the text data supplied from the storage data acquisition unit 91 or read from the buffer 84, converts (rasterizes) it into raster data (character object data) based on predetermined font style data, and supplies the result to the subtitle graphics plane processing unit 93. Details of the text subtitle processing unit 92 will be described later with reference to FIG. 26.
- the subtitle graphics plane processing unit 93 generates a subtitle graphics plane corresponding to the subtitles displayed on one page (one frame) based on the decoded and rendered text data, and outputs it to the synthesis processing unit 94.
- the synthesis processing unit 94 synthesizes the video plane supplied from the video plane processing unit 88, the graphics plane corresponding to the GUI supplied from the GUI graphics plane processing unit 90, and the subtitle graphics plane supplied from the subtitle graphics plane processing unit 93, and outputs the result as a video signal.
- FIG. 26 is a block diagram for explaining a detailed configuration of the text subtitle processing unit 92.
- the text data decoder 121 decodes the data of the text subtitle file (text subtitle file), and supplies the character object to the character object buffer 122 and the attribute (attribute) data to the attribute data buffer 123.
- the text subtitle file mentioned here refers to, for example, the text subtitle stream file shown in FIG. 10 described above. Accordingly, the character object refers to, for example, the Dialog Presentation Segment in FIG. 10 described above.
- the attribute data here refers to, for example, the Dialog Style Segment shown in FIG. 10 described above.
- the attribute data includes, for example, the font_style shown in FIG. 22; in an outline font style, the outline color is specified by outline_palette_entry_id_ref and the outline size (width) is specified by outline_size.
- the attribute data stored in the attribute data buffer 123 is changed based on the control of the control unit 81 in response to user operation input. For example, when the user instructs a change of font size or character color, the corresponding User_control_style() (FIG. 17) of the attribute data stored in the attribute data buffer 123 is changed under the control of the control unit 81.
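The change amounts to overriding fields of the buffered attribute data, restricted to the fields that are exposed as user-changeable. A minimal Python sketch of that idea (the dict fields and function name are hypothetical stand-ins; the real attribute data is the binary User_control_style() structure of FIG. 17, not a dict):

```python
def apply_user_control_style(attrs, user_changes, controllable):
    """Override buffered attribute data, but only for the fields that
    User_control_style() marks as changeable by the user."""
    updated = dict(attrs)
    for field, value in user_changes.items():
        if field in controllable:  # non-controllable fields are left untouched
            updated[field] = value
    return updated

# Hypothetical example: the user may change size and color, but not the style.
attrs = {"font_size": 24, "font_color": 0xFFFFFF, "font_style": "outline"}
user_changes = {"font_size": 32, "font_style": "normal"}
new_attrs = apply_user_control_style(attrs, user_changes,
                                     controllable={"font_size", "font_color"})
```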
- the font rasterizer 124 converts the character object read from the character object buffer 122 into raster data (character object data) such as a bitmap, based on the attribute designation read from the attribute data buffer 123 and on the font style data supplied from the buffer 84 or the storage data acquisition unit 91, and outputs it to the subtitle graphics plane processing unit 93.
- the font rasterizer 124 also detects, based on the attributes read from the attribute data buffer 123, character objects to which a bookmark is attached and, using the bookmark buffer 125, does not redundantly rasterize character objects to which the same bookmark ID is assigned. Details of the font rasterizer 124 will be described later with reference to FIG. 27.
- the bookmark buffer 125 holds the raster data of the character object to which the bookmark is given by the processing of the font rasterizer 124, and the held raster data is read by the font rasterizer 124.
- FIG. 27 is a block diagram for explaining the detailed configuration of the font rasterizer 124.
- the character object acquisition unit 151 acquires the character object read from the character object buffer 122 (FIG. 26) and supplies it to the bookmark detection unit 152.
- the bookmark detection unit 152 determines whether the character object supplied from the character object acquisition unit 151 has already been bookmarked and stored in the bookmark buffer 125 (Fig. 26). It is detected whether or not a character object to which an ID is assigned is already stored in the bookmark buffer 125.
- when the bookmark detection unit 152 detects that a character object to which the same bookmark ID is assigned is stored in the bookmark buffer 125, it acquires that character object (rasterized raster data) from the bookmark buffer 125 and outputs it to the subtitle graphics plane processing unit 93 (FIG. 25).
- when the bookmark detection unit 152 detects that a character object with the same bookmark ID is not yet stored in the bookmark buffer 125, it supplies the character object supplied from the character object acquisition unit 151 to the rasterizing unit 153 as it is.
- based on the attribute designation read from the attribute data buffer 123 and on the normal font style data supplied from the normal font style acquisition unit 156 or the outline style font style data supplied from the outline style processing unit 157, the rasterizing unit 153 converts the character object supplied from the bookmark detection unit 152 into raster data (character object data) such as a bitmap, and supplies it to the subtitle graphics plane processing unit 93 and, at the same time, to the bookmark storage control unit 154.
- the outline style font style data supplied from the outline style processing unit 157, which will be described later with reference to FIGS. 28 to 31, is font style data in which the outline color and size (width) of the original font style data have been changed (processed) as necessary.
- the bookmark storage control unit 154 determines, based on the attribute data of the character object supplied from the rasterizing unit 153, whether or not a new bookmark is described; when a bookmark is described, it stores the raster data in the bookmark buffer 125, and when no bookmark is described, it does not.
- the font style detection unit 155 refers to the corresponding font_style of the attribute data stored in the attribute data buffer 123, and detects the font style required to rasterize the character object acquired by the character object acquisition unit 151. When the font style detection unit 155 detects that the font style is the normal font style, it notifies the normal font style acquisition unit 156 of the detection result. On the other hand, when the font style detection unit 155 detects that the font style is the outline style, it notifies the outline style acquisition unit 161 of the outline style processing unit 157 of the detection result.
- the normal font style acquisition unit 156 acquires, from the font style data supplied from the buffer 84 or the storage data acquisition unit 91, the font style data of the normal font style specified by the detection result of the font style detection unit 155, and supplies it to the rasterizing unit 153.
- the outline style processing unit 157 acquires, from the font style data supplied from the buffer 84 or the storage data acquisition unit 91, the font style data of the font style specified by the detection result of the font style detection unit 155, that is, the outline style. Further, the outline style processing unit 157 refers to the corresponding outline_palette_entry_id_ref of the attribute data stored in the attribute data buffer 123 and updates the outline color, and refers to the corresponding outline_size of the attribute data stored in the attribute data buffer 123 and updates the size (width) of the outline. Then, the outline style processing unit 157 supplies the font style data whose outline color and size (width) have been updated to the rasterizing unit 153 (more precisely, to a synthesizing unit 164 described later).
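The updates performed inside the outline style processing unit 157 can be viewed as two successive transformations of the outline style font style data: a color change driven by outline_palette_entry_id_ref, then a width change driven by outline_size. A simplified Python sketch (the dataclass fields and the palette lookup are hypothetical stand-ins; real font style data is binary outline data):

```python
from dataclasses import dataclass, replace

@dataclass(frozen=True)
class OutlineFontStyle:
    # Simplified stand-in for outline style font style data (outline data).
    outline_color: int    # color resolved from outline_palette_entry_id_ref
    outline_width: float  # width resolved from outline_size

def change_color(style, palette, entry_id_ref):
    """Color changing unit 162: update the outline color from the palette reference."""
    return replace(style, outline_color=palette[entry_id_ref])

def change_both_widths(style, outline_size):
    """Both-width changing unit 163: set the outline width, which is applied
    equally in the inward and outward directions."""
    return replace(style, outline_width=outline_size)

# Outline style processing unit 157: color first, then width, then hand the
# updated font style data to the rasterizing unit.
palette = {0: 0x000000, 3: 0xFF0000}   # hypothetical palette
style = OutlineFontStyle(outline_color=0x000000, outline_width=1.0)
style = change_color(style, palette, entry_id_ref=3)
style = change_both_widths(style, outline_size=4.0)
```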
- FIG. 28 shows alphabet A as an example of character object data in the normal font style.
- the character object data A of the normal font style is referred to as character object data 171.
- FIG. 29 shows an alphabet A as an example of outline-style character object data.
- the A character object data in the outline style is referred to as character object data 181.
- the character object data 181 is generated based on the outline data.
- outline data is font style data in which the shape of a character is expressed as an outline (contour); it has been used in outline fonts.
- the outline font is one of the conventional font formats: it holds outline data and performs rendering based on it. Outline fonts have the characteristic of being more robust to enlargement and reduction than bitmap fonts (a format that holds bitmaps).
- FIG. 30 is a diagram for explaining a conventional method among the changing methods for changing the outline (contour) width of the character object data 181.
- in the conventional method, the width of the outline 181-L0 of the character object data 181 is expanded or contracted at an equal rate in both directions perpendicular to the outline (in FIG. 30, stretched to the left and right in the horizontal direction), and is thereby changed to outline 181-L1.
- that is, in the conventional method the outline width of the character object data 181 is thickened at a uniform rate in both directions perpendicular to the outline, and as a result the character object data 181 is changed to character object data 182 as shown in FIG. 30.
- FIG. 31 is a diagram for explaining an example of a change method that can solve the problems of the conventional method, among the change methods of the outline size (width).
- the method of FIG. 31 is applied to the combining unit 164 (FIG. 27) of the rasterizing unit 153.
- first, the synthesizing unit 164 generates character object data 181 based on the outline data.
- next, in accordance with the conventional method described with reference to FIG. 30, the synthesizing unit 164 changes the outline width of the character object data 181 at an equal rate in both directions perpendicular to the outline, thereby generating character object data 182.
- the synthesizing unit 164 uses the outline style font style data supplied from the outline style processing unit 157 in order to generate the character object data 182.
- this outline style font style data will be described later.
- then, the synthesizing unit 164 superimposes the character object data 181 on the character object data 182 (executes synthesis processing).
- hereinafter, the outline of the character object data 181 is referred to as the center line.
- the composition unit 164 uses the outline style font style data supplied from the outline style processing unit 157 in order to generate the character object data 182.
- this character object data 182 is generated from outline data (the outline style font style data used in conventional outline fonts) in which the outline width of the character object data 181 has been changed in both the inward and outward directions and the outline color has been changed.
- to this end, the outline style processing unit 157 needs to change the outline data necessary for generating the character object data 181 into the font style data necessary for generating the character object data 182.
- the outline style processing unit 157 is therefore configured as shown in FIG. 27. That is, in the example of FIG. 27, the outline style processing unit 157 includes an outline style acquisition unit 161, a color changing unit 162, and a both-width changing unit 163.
- the outline style acquisition unit 161 acquires, from the font style data supplied from the buffer 84 or the storage data acquisition unit 91, the font style data of the font style specified by the detection result of the font style detection unit 155, that is, the outline style font style data (outline data), and supplies it to the color changing unit 162.
- the color changing unit 162 refers to the corresponding outline_palette_entry_id_ref of the attribute data stored in the attribute data buffer 123, and changes the outline color of the outline style font style data supplied from the outline style acquisition unit 161.
- the outline style font style data in which the outline color has been changed is supplied to the both-width changing unit 163 and to the synthesizing unit 164 of the rasterizing unit 153.
- based on the font style data supplied from the color changing unit 162, the synthesizing unit 164 generates character object data 181 having an outline of the color corresponding to that font style data.
- the both-width changing unit 163 refers to the corresponding outline_size of the attribute data stored in the attribute data buffer 123 and, for the outline style font style data supplied from the color changing unit 162, expands or contracts the size (width) of the outline at an equal rate in both the outward and inward directions.
- the outline style font style data in which the size (width) of the outline has been changed in both the outward and inward directions is supplied from the both-width changing unit 163 to the synthesizing unit 164.
- the synthesizing unit 164 executes processing according to the conventional method shown in FIG. 30. Specifically, for example, when the character object data 181 of FIG. 31 has already been generated, the synthesizing unit 164 generates, based on the font style data supplied from the both-width changing unit 163, character object data 182 having an outline whose width is expanded or contracted at an equal rate in the outward and inward directions relative to the outline width of the character object data 181 (and whose color is changed).
- then, the synthesizing unit 164 combines the character object data 181 and the character object data 182 so that the outline width of the character object data 181 appears to expand or contract only in the outward direction, thereby generating character object data 183 having an outline of the specified width (and of the changed color).
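The contrast between FIGS. 30 and 31 can be imitated on a toy binary glyph: the conventional method widens the outline both inward and outward (eating into the character body), while the method of FIG. 31 overlays the original glyph (181) on the both-ways widened version (182) so that only the outward half of the widening remains (183). A purely illustrative Python sketch using set-based dilation and erosion (the patent operates on real outline data, not pixel grids):

```python
def dilate(cells, r):
    """All cells within Chebyshev distance r of the glyph."""
    return {(x + dx, y + dy)
            for (x, y) in cells
            for dx in range(-r, r + 1)
            for dy in range(-r, r + 1)}

def erode(cells, r):
    """Cells whose whole (2r+1)x(2r+1) neighborhood lies inside the glyph."""
    return {(x, y) for (x, y) in cells
            if all((x + dx, y + dy) in cells
                   for dx in range(-r, r + 1)
                   for dy in range(-r, r + 1))}

def outline_both_ways(fill, r):
    # Conventional method (FIG. 30): widen r cells inward AND outward,
    # so the outline eats into the character body.
    return dilate(fill, r) - erode(fill, r)

def outline_outward_only(fill, r):
    # FIG. 31 method: overlaying the original glyph (181) on the both-ways
    # widened version (182) leaves only the outward half of the widening.
    return dilate(fill, r) - fill

# 4x4 toy glyph; the final composite 183 corresponds to
# fill | outline_outward_only(fill, r).
glyph = {(x, y) for x in range(2, 6) for y in range(2, 6)}
```

The outward-only outline is disjoint from the glyph body, whereas the conventional both-ways outline overlaps it, which is exactly the problem the patent's compositing step avoids.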
- step S1 When playback of the content recorded on the optical disc 11 is instructed in a state where the update file downloaded as described above is recorded in the local storage 24, in step S1 the optical disc directory management unit 35 reads Studio_id and Content_id from the optical disc 11 and outputs the read Studio_id and Content_id to the local storage directory management unit 34.
- step S2 the local storage directory management unit 34 searches for the file system of the local storage 24 corresponding to the file system of the optical disc 11 based on the Studio_id and Content_id supplied from the optical disc directory management unit 35, and supplies it to the file system merge processing unit 36.
- the file system corresponding to the file system of the optical disc 11 is stored in a directory in the local storage 24 whose name includes Studio_id and Content_id (FIG. 5).
- the file system of the optical disc 11 is output from the optical disc directory management unit 35 to the file system merge processing unit 36.
- step S3 the file system merge processing unit 36 merges the file system of the optical disc 11 and the file system of the local storage 24 as described with reference to FIGS. 5 and 6, and generates a first virtual file system.
- the generated first virtual file system is output to the playback control unit 37 and used for playback of the AV stream file.
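The merge of step S3 behaves like a union of two file tables in which local storage entries take precedence, so downloaded updates shadow or extend the pre-recorded files. A minimal sketch (plain dicts stand in for the Studio_id/Content_id-scoped directory trees of FIGS. 5 and 6):

```python
def merge_file_systems(disc_fs, local_fs):
    """Build the first virtual file system: start from the optical disc's
    file table and let local storage entries add to or shadow it."""
    virtual = dict(disc_fs)      # pre-recorded files
    virtual.update(local_fs)     # downloaded updates take precedence
    return virtual

# Hypothetical file names echoing the example in the text.
disc_fs = {"01000.m2ts": "optical disc", "02000.m2ts": "optical disc"}
local_fs = {"04000.m2ts": "local storage"}  # added by downloading
vfs = merge_file_systems(disc_fs, local_fs)
```

The playback control unit can then resolve any referenced file through `vfs` without caring whether it lives on the disc or in the local storage.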
- step S4 the playback control unit 37 designates and executes the navigation program from the supplied first virtual file system. Therefore, as shown in FIG. 6, when an updated navigation program is in the first virtual file system, that navigation program (the navigation program recorded in the local storage 24) is executed.
- step S5 the playback control unit 37 acquires the PlayList and SubPlayItem specified as the playback section by the navigation program, proceeds to step S6, controls the local storage directory management unit 34 and the optical disc directory management unit 35 to read the files referenced by the acquired PlayList and SubPlayItem (AV stream files, text subtitle files, font files, etc.), and supplies them to the decoder 26.
- here, the time stamps represented by the PlayList and SubPlayItem are converted into addresses by the Clip Information, and the AV stream or the like is accessed.
- the playback control unit 37 reads the AV stream file from the local storage 24 when the AV stream file referred to by the PlayList or SubPlayItem exists in the local storage 24, and reads the AV stream file from the optical disc 11 when it does not.
- for example, the playback control unit 37 controls the local storage directory management unit 34 and the optical disc directory management unit 35 so that, in accordance with the file system of FIG. 6, the AV stream file with the preset file name “01000.m2ts” and the AV stream file with the preset file name “02000.m2ts” are read from the optical disc 11, while the AV stream file with the file name “04000.m2ts”, added by downloading, is read from the local storage 24.
- step S7 the content playback process described later with reference to FIG. 33 is executed; the read AV stream file, text subtitle stream file, and the like are decoded, video, audio, or subtitles are output from the display device, and the process is terminated.
- step S21 the playback control unit 37 (FIG. 4) controls the decoder 26 to play back the AV stream data specified by the PlayItem.
- step S22 the operation input acquisition unit 32 determines whether or not an operation input commanding the display of caption data has been received. If it is determined in step S22 that an operation input commanding the display of caption data has not been received, the process proceeds to step S28 described later. If it is determined in step S22 that an operation input commanding the display of caption data has been received, in step S23 the operation input acquisition unit 32 supplies a signal corresponding to the user's operation input to the menu screen display control unit 31, and the menu screen display control unit 31 displays on the display device a list menu of the caption data that can be displayed.
- step S24 the operation input acquisition unit 32 determines whether or not it has received an operation input designating the language of the caption data to be displayed. If it is determined in step S24 that an operation input designating the language of the caption data to be displayed has not been received, the process returns to step S23, and the subsequent processes are repeated.
- step S25 the operation input acquisition unit 32 outputs a signal corresponding to the user operation input to the reproduction control unit 37.
- the playback control unit 37 controls the local storage directory management unit 34 to read the text subtitle data referenced by the SubPlayItem specified by the user's operation input and supply it to the storage data acquisition unit 91 of the decoder 26; font style data corresponding to the font (outline font or the like) designated by the user is also read out and supplied to the storage data acquisition unit 91 of the decoder 26 as needed.
- step S26 the decoder 26 executes a caption display process to be described later with reference to FIGS. 34 and 35.
- step S27 the operation input acquisition unit 32 determines whether or not it has received an operation input instructing to change the language of the caption data to be displayed. If it is determined in step S27 that an operation input commanding the change of the language of the caption data to be displayed has been received, the process returns to step S25, and the subsequent processes are repeated.
- step S28 the operation input acquisition unit 32 determines whether an operation input commanding the end of content playback has been received, or whether the AV stream data being played has ended. If it is determined in step S28 that an operation input commanding the end of content playback has not been received and the AV stream data being played has not ended, the process proceeds to step S29.
- step S29 the operation input acquisition unit 32 determines whether or not it has received an operation input for stopping the display of subtitles.
- step S29 If it is determined in step S29 that an operation input for stopping the display of subtitles has not been received, the process returns to step S26, and the subsequent processes are repeated. If it is determined in step S29 that an operation input for stopping the display of subtitles has been received, the process returns to step S21, and the subsequent processes are repeated.
- step S28 If it is determined in step S28 that an operation input for instructing the end of content playback has been received or that the AV stream data being played back has ended, the processing ends.
- as described above, subtitles based on a text subtitle file recorded in advance on the optical disc 11, or downloaded from the server 3 and stored in the local storage 24, are displayed together with the video and audio of the content, and the text subtitle file to be read is changed based on the user's operation input so as to change the language of the displayed subtitles.
- step S51 the text data decoder 121 (FIG. 26) of the text subtitle processing unit 92 of the decoder 26 (FIG. 25) decodes the text subtitle data acquired by the storage data acquisition unit 91.
- step S52 the text data decoder 121 supplies the character object included in the decoded text subtitle data to the character object buffer 122, so that the character object is buffered in the character object buffer 122.
- step S53 the text data decoder 121 supplies the attribute data included in the decoded text subtitle data, such as the font style (font_style) and the outline color (outline_palette_entry_id_ref) and size (outline_size), to the attribute data buffer 123, so that the attribute data is buffered in the attribute data buffer 123.
- step S54 the font style detection unit 155 of the font rasterizer 124 (FIG. 27) determines whether the font style buffered in step S53 is the outline style.
- step S54 If it is determined in step S54 that the font style is not the outline style, that is, that it is the normal font style, in step S55 the normal font style acquisition unit 156 acquires the font style data of the normal font style.
- step S54 on the other hand, when it is determined in step S54 that the font style is the outline style, in step S56 the outline style processing unit 157 acquires the font style data of the outline style.
- step S57 the outline style processing unit 157 updates the font style data of the outline style based on the attribute data such as the size and color of the outline buffered in the process of step S53. That is, the font style data of the outline style with the outline color and width changed is generated.
- step S58 the character object acquisition unit 151 acquires a character object from the character object buffer 122 (FIG. 26) and supplies it to the bookmark detection unit 152.
- step S59 the rasterizing unit 153 acquires the font style data from the normal font style acquisition unit 156 or the outline style processing unit 157.
- step S60 the bookmark detection unit 152 refers to the attribute data buffered in the process of step S53 (FIG. 34) and to the bookmark buffer 125, and determines whether the subtitle data corresponding to the character object acquired in the process of step S58 is bookmarked and already buffered in the bookmark buffer 125.
- step S61 If it is determined in step S60 that the subtitle data is bookmarked and buffered, in step S61 the bookmark detection unit 152 reads the bookmarked subtitle image data (raster data) from the bookmark buffer 125 and outputs it. The process then proceeds to step S65.
- step S60 On the other hand, when it is determined in step S60 that the caption data is not already bookmarked, the character object acquired in the process of step S58 is supplied from the bookmark detection unit 152 to the rasterizing unit 153, and the process proceeds to step S62.
- step S62 the rasterizing unit 153 rasterizes the character object acquired in the process of step S58, based on the font style data acquired in the process of step S55 or S56 of FIG. 34 (and updated in the process of step S57 as necessary) and on the attribute data buffered in step S53 of FIG. 34, outputs the resulting raster data (character object data) to the subtitle graphics plane processing unit 93 (FIG. 25), and supplies it to the bookmark storage control unit 154.
- the subtitle graphics plane processing unit 93 generates a subtitle graphics plane based on the supplied raster data, and supplies it to the composition processing unit 94.
- step S63 the bookmark storage control unit 154 determines, based on the attribute data of the caption data (character object data, i.e., raster data) rasterized in the process of step S62, whether or not a new bookmark is described in the attribute data.
- if it is determined that a new bookmark is described, the bookmark storage control unit 154 stores the rasterized subtitle image data (raster data) in the bookmark buffer 125 in step S64.
- step S65 If it is determined in step S63 that no bookmark is described, or after the process of step S61 or step S64 is completed, in step S65 the composition processing unit 94 (FIG. 25) combines the supplied subtitle image data with the video data and outputs it. The subtitle display process, which is the process of step S26 in FIG. 33, thereby ends, and the process proceeds to step S27.
- as described above, the reading of the text subtitle file and the font file data is controlled so that the subtitle data in the language desired by the user is displayed in the font desired by the user with the attributes desired by the user, and the buffered attribute data is changed and rasterized as necessary. Therefore, even when the currently played frame is, for example, a frame in which the color of the subtitles and the color of the video are the same or similar, that is, a frame in which, from the user's point of view, the subtitles blend into the video and cannot be identified, the user can specify a desired outline style so that the characters composing the subtitles are displayed in that outline style, and the subtitles can thereby be reliably identified.
- next, the case where the recording medium 11 is a disc-shaped recording medium will be described as an example.
- a master disc made of, for example, glass is prepared, and a recording material made of, for example, a photoresist is applied thereon. As a result, a recording master is produced.
- video data in a format that can be played back by the playback device 1, encoded by a video encoder, is stored in a temporary buffer; audio data encoded by an audio encoder is stored in a temporary buffer; and data other than streams (e.g., Indexes, PlayList, PlayItem, etc.) encoded by a data encoder is likewise stored in a temporary buffer.
- the video data, audio data, and non-stream data stored in the respective buffers are multiplexed together with a synchronization signal by a multiplexer (MPX), and an error correction code is added by an error correction coding circuit (ECC).
- a predetermined modulation is applied by a modulation circuit (MOD), the result is recorded once on a magnetic tape or the like according to a predetermined format, and the software to be recorded on the recording medium 11 reproducible by the playback device 1 is thereby produced.
- the software is edited (premastered) as necessary, and a signal of a format to be recorded on the optical disc is generated.
- a laser beam is modulated in accordance with this recording signal, and this laser beam is irradiated onto the photoresist on the master.
- the photoresist on the master is exposed corresponding to the recording signal.
- the master is developed, and pits appear on the master.
- the master prepared in this way is subjected to a treatment such as electroforming, to produce a metal master to which the pits on the glass master have been transferred.
- a metal stamper is further produced from this metal master, and this is used as a molding die.
- a material such as PMMA (acrylic) or PC (polycarbonate) is injected into the molding die by, for example, injection molding, and fixed.
- 2 P (ultraviolet curable resin) or the like is applied on a metal stamper and then cured by irradiating with ultraviolet rays. As a result, the pits on the metal stamper can be transferred onto a replica made of resin.
- next, a reflective film is formed on the generated replica by vapor deposition or sputtering.
- alternatively, a reflective film is formed on the generated replica by spin coating.
- the inner and outer diameters of the disk are processed, and necessary measures such as bonding two disks are performed.
- a label is attached or a hub is attached, and the disc is inserted into a cartridge. In this way, the recording medium 11 on which data reproducible by the playback device 1 is recorded is completed.
- the series of processes described above can be executed by hardware or by software. When the series of processes is executed by software, the programs constituting the software are installed from a program storage medium into a computer built into dedicated hardware, or into a computer capable of executing various functions by installing various programs.
- this program storage medium is constituted not only by removable media on which the program is recorded and which are distributed separately from the computer in order to provide the program to the user, such as a magnetic disk (including a flexible disk), an optical disc (including a CD-ROM (Compact Disc-Read Only Memory) and a DVD (Digital Versatile Disc)), a magneto-optical disc (including an MD (Mini-Disc) (trademark)), or the semiconductor memory 28, but also by the memory 23 such as a ROM or RAM in which the program is recorded, the local storage 24 such as a hard disk, and the like, which are provided to the user in a state of being pre-installed in the main unit of the device.
- in this specification, the steps describing the program recorded on the program storage medium include not only processing performed in time series in the described order, but also processing that is not necessarily processed in time series and is executed in parallel or individually.
- the system represents the entire apparatus composed of a plurality of apparatuses.
Priority Applications (5)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
EP06731282A EP1871097A4 (en) | 2005-04-11 | 2006-04-06 | INFORMATION PROCESSING DEVICE, INFORMATION PROCESSING METHOD, PROGRAM MEMORY, PROGRAM, DATA STRUCTURE AND MANUFACTURING METHOD FOR A RECORDING MEDIUM |
KR1020137019610A KR101383290B1 (ko) | 2005-04-11 | 2006-04-06 | 정보 처리 장치, 정보 처리 방법 및 데이터가 기록된 기록 매체 |
KR1020067025974A KR101324129B1 (ko) | 2005-04-11 | 2006-04-06 | 정보 처리 장치 및 정보 처리 방법, 프로그램 격납 매체,프로그램, 데이터 구조와, 기록 매체의 제조 방법 |
US11/629,082 US8208531B2 (en) | 2005-04-11 | 2006-04-06 | Information processing device, information processing method, program storage medium, program, data structure, and recording medium manufacturing method |
US13/490,076 US10075668B2 (en) | 2005-04-11 | 2012-06-06 | Information processing device and information processing method, program storage medium, program, data structure, and manufacturing method for storage medium |
Applications Claiming Priority (2)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
JP2005-113302 | 2005-04-11 | ||
JP2005113302A JP4715278B2 (ja) | 2005-04-11 | 2005-04-11 | 情報処理装置および情報処理方法、プログラム格納媒体、プログラム、並びに提供装置 |
Related Child Applications (2)
Application Number | Title | Priority Date | Filing Date |
---|---|---|---|
US11/629,082 A-371-Of-International US8208531B2 (en) | 2005-04-11 | 2006-04-06 | Information processing device, information processing method, program storage medium, program, data structure, and recording medium manufacturing method |
US13/490,076 Continuation US10075668B2 (en) | 2005-04-11 | 2012-06-06 | Information processing device and information processing method, program storage medium, program, data structure, and manufacturing method for storage medium |
Publications (1)
Publication Number | Publication Date |
---|---|
WO2006109673A1 true WO2006109673A1 (ja) | 2006-10-19 |
Family
ID=37086949
Family Applications (1)
Application Number | Title | Priority Date | Filing Date |
---|---|---|---|
PCT/JP2006/307334 WO2006109673A1 (ja) | 2005-04-11 | 2006-04-06 | 情報処理装置および情報処理方法、プログラム格納媒体、プログラム、データ構造、並びに記録媒体の製造方法。 |
Country Status (7)
Country | Link |
---|---|
US (2) | US8208531B2 (ja) |
EP (1) | EP1871097A4 (ja) |
JP (1) | JP4715278B2 (ja) |
KR (2) | KR101383290B1 (ja) |
CN (1) | CN100525399C (ja) |
TW (2) | TW200705220A (ja) |
WO (1) | WO2006109673A1 (ja) |
Families Citing this family (39)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
JP5119566B2 (ja) * | 2004-02-16 | 2013-01-16 | ソニー株式会社 | 再生装置および再生方法、プログラム記録媒体、並びにプログラム |
TW200707417A (en) | 2005-03-18 | 2007-02-16 | Sony Corp | Reproducing apparatus, reproducing method, program, program storage medium, data delivery system, data structure, and manufacturing method of recording medium |
JP5194335B2 (ja) * | 2005-04-26 | 2013-05-08 | ソニー株式会社 | 情報処理装置および情報処理方法、並びにプログラム |
KR20080070201A (ko) * | 2007-01-25 | 2008-07-30 | 삼성전자주식회사 | 추가적인 자막 제공 방법 및 그 재생 장치 |
JP2009049726A (ja) | 2007-08-21 | 2009-03-05 | Sony Corp | 情報変換装置、情報変換方法およびプログラム |
JP2010140459A (ja) * | 2008-02-22 | 2010-06-24 | Ricoh Co Ltd | プログラムと印刷データ変換装置とコンピュータ読み取り可能な記録媒体 |
JP2010154053A (ja) * | 2008-12-24 | 2010-07-08 | Canon Inc | 映像処理装置、映像処理方法並びにプログラム |
CN102082921A (zh) * | 2009-11-27 | 2011-06-01 | 新奥特(北京)视频技术有限公司 | 一种多样化字幕飞播的方法及装置 |
CN102082918B (zh) * | 2009-11-27 | 2015-06-17 | 新奥特(北京)视频技术有限公司 | 一种进行多样化字幕飞播的字幕机 |
JP5159916B2 (ja) | 2011-04-28 | 2013-03-13 | 株式会社東芝 | ホスト |
CN102739990B (zh) * | 2011-05-06 | 2017-02-01 | 新奥特(北京)视频技术有限公司 | 一种具有独立播出特性的字幕素材编辑方法和装置 |
US8584167B2 (en) | 2011-05-31 | 2013-11-12 | Echostar Technologies L.L.C. | Electronic programming guides combining stored content information and content provider schedule information |
US8660412B2 (en) | 2011-08-23 | 2014-02-25 | Echostar Technologies L.L.C. | System and method for dynamically adjusting recording parameters |
US8763027B2 (en) | 2011-08-23 | 2014-06-24 | Echostar Technologies L.L.C. | Recording additional channels of a shared multi-channel transmitter |
US8437622B2 (en) | 2011-08-23 | 2013-05-07 | Echostar Technologies L.L.C. | Altering presentation of received content based on use of closed captioning elements as reference locations |
US8606088B2 (en) | 2011-08-23 | 2013-12-10 | Echostar Technologies L.L.C. | System and method for memory jumping within stored instances of content |
US8850476B2 (en) | 2011-08-23 | 2014-09-30 | Echostar Technologies L.L.C. | Backwards guide |
US8959566B2 (en) * | 2011-08-23 | 2015-02-17 | Echostar Technologies L.L.C. | Storing and reading multiplexed content |
US8627349B2 (en) | 2011-08-23 | 2014-01-07 | Echostar Technologies L.L.C. | User interface |
US9621946B2 (en) | 2011-08-23 | 2017-04-11 | Echostar Technologies L.L.C. | Frequency content sort |
US9357159B2 (en) | 2011-08-23 | 2016-05-31 | Echostar Technologies L.L.C. | Grouping and presenting content |
US8447170B2 (en) | 2011-08-23 | 2013-05-21 | Echostar Technologies L.L.C. | Automatically recording supplemental content |
US9185331B2 (en) | 2011-08-23 | 2015-11-10 | Echostar Technologies L.L.C. | Storing multiple instances of content |
CN103067678A (zh) * | 2011-10-20 | 2013-04-24 | Sichuan Changhong Electric Co., Ltd. | Television subtitle display method and device |
US8819722B2 (en) | 2012-03-15 | 2014-08-26 | Echostar Technologies L.L.C. | Smartcard encryption cycling |
US8959544B2 (en) | 2012-03-15 | 2015-02-17 | Echostar Technologies L.L.C. | Descrambling of multiple television channels |
US9489981B2 (en) | 2012-03-15 | 2016-11-08 | Echostar Technologies L.L.C. | Successive initialization of television channel recording |
US8989562B2 (en) | 2012-03-15 | 2015-03-24 | Echostar Technologies L.L.C. | Facilitating concurrent recording of multiple television channels |
CN103358727B (zh) * | 2012-03-26 | 2017-09-19 | Seiko Epson Corp. | Recording device and method of controlling a recording device |
US9172737B2 (en) * | 2012-07-30 | 2015-10-27 | New York University | Streamloading content, such as video content for example, by both downloading enhancement layers of the content and streaming a base layer of the content |
US20140046923A1 (en) | 2012-08-10 | 2014-02-13 | Microsoft Corporation | Generating queries based upon data points in a spreadsheet application |
KR20140049832A (ko) * | 2012-10-18 | 2014-04-28 | Samsung Electronics Co., Ltd. | Blu-ray disc, Blu-ray disc playback device for playing it, and subtitle display method therefor |
US8793724B2 (en) | 2012-11-08 | 2014-07-29 | Eldon Technology Limited | Image domain compliance |
US9628838B2 (en) | 2013-10-01 | 2017-04-18 | Echostar Technologies L.L.C. | Satellite-based content targeting |
US9756378B2 (en) | 2015-01-07 | 2017-09-05 | Echostar Technologies L.L.C. | Single file PVR per service ID |
US20160307603A1 (en) * | 2015-04-15 | 2016-10-20 | Sony Corporation | Information processing device, information recording medium, information processing method, and program |
US10614108B2 (en) * | 2015-11-10 | 2020-04-07 | International Business Machines Corporation | User interface for streaming spoken query |
US11295497B2 (en) * | 2019-11-25 | 2022-04-05 | International Business Machines Corporation | Dynamic subtitle enhancement |
CN112070860A (zh) * | 2020-08-03 | 2020-12-11 | 广东以诺通讯有限公司 | Image processing method |
Citations (4)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
JPS62216969A (ja) * | 1986-03-17 | 1987-09-24 | Sumitomo Electric Industries, Ltd. | Fiber-reinforced Si3N4 ceramics and method for producing the same |
WO2004034398A1 (en) * | 2002-10-11 | 2004-04-22 | Thomson Licensing S.A. | Method and apparatus for synchronizing data streams containing audio, video and/or other data |
WO2004036574A1 (en) * | 2002-10-15 | 2004-04-29 | Samsung Electronics Co., Ltd. | Information storage medium containing subtitle data for multiple languages using text data and downloadable fonts and apparatus therefor |
WO2004049710A1 (ja) * | 2002-11-28 | 2004-06-10 | Sony Corporation | Playback device, playback method, playback program, and recording medium |
Family Cites Families (21)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
JPS6282472 (ja) | 1985-10-07 | 1987-04-15 | Canon Inc | Image processing method |
JP2979109B2 (ja) * | 1992-12-03 | 1999-11-15 | IBM Japan, Ltd. | Method and apparatus for creating information on recognized characters |
US6744921B1 (en) * | 1993-12-29 | 2004-06-01 | Canon Kabushiki Kaisha | Image processing apparatus and method that determines the thickness of characters and lines |
US5680619A (en) * | 1995-04-03 | 1997-10-21 | Mfactory, Inc. | Hierarchical encapsulation of instantiated objects in a multimedia authoring system |
US6393145B2 (en) * | 1999-01-12 | 2002-05-21 | Microsoft Corporation | Methods apparatus and data structures for enhancing the resolution of images to be rendered on patterned display devices |
JP3532781B2 (ja) * | 1999-02-12 | 2004-05-31 | MegaChips Corporation | Image processing circuit for an image input device |
US6678410B1 (en) * | 1999-02-17 | 2004-01-13 | Adobe Systems Incorporated | Generating a glyph |
JP3589620B2 (ja) | 1999-06-29 | 2004-11-17 | Matsushita Electric Industrial Co., Ltd. | Bitmap data generation device and pointing device |
US7239318B2 (en) * | 2001-03-23 | 2007-07-03 | Rise Kabushikikaisha | Method and computer software program product for processing characters based on outline font |
JP2002311967A (ja) * | 2001-04-13 | 2002-10-25 | Casio Computer Co., Ltd. | Parody-song creation device, parody-song creation program, and parody-song creation method |
JP2003037792A (ja) | 2001-07-25 | 2003-02-07 | Toshiba Corp | Data playback device and data playback method |
JP3906345B2 (ja) * | 2001-11-01 | 2007-04-18 | Tokai Dentsu Co., Ltd. | Sound effect and video distribution system |
US20040081434A1 (en) * | 2002-10-15 | 2004-04-29 | Samsung Electronics Co., Ltd. | Information storage medium containing subtitle data for multiple languages using text data and downloadable fonts and apparatus therefor |
KR100970727B1 (ko) * | 2002-10-15 | 2010-07-16 | Samsung Electronics Co., Ltd. | Information storage medium containing subtitle data for multiple-language support using text data and downloadable fonts, and apparatus therefor |
CN1781149B (zh) * | 2003-04-09 | 2012-03-21 | LG Electronics Inc. | Recording medium having a data structure for managing reproduction of text subtitle data, and recording and reproducing methods and apparatuses |
JP2004326491A (ja) * | 2003-04-25 | 2004-11-18 | Canon Inc | Image processing method |
JP5119566B2 (ja) | 2004-02-16 | 2013-01-16 | Sony Corporation | Playback device, playback method, program recording medium, and program |
US7529467B2 (en) * | 2004-02-28 | 2009-05-05 | Samsung Electronics Co., Ltd. | Storage medium recording text-based subtitle stream, reproducing apparatus and reproducing method for reproducing text-based subtitle stream recorded on the storage medium |
WO2005088635A1 (en) * | 2004-03-18 | 2005-09-22 | Lg Electronics Inc. | Recording medium and method and apparatus for reproducing text subtitle stream recorded on the recording medium |
US7571386B2 (en) * | 2004-05-03 | 2009-08-04 | Lg Electronics Inc. | Recording medium having a data structure for managing reproduction of text subtitle data and methods and apparatuses associated therewith |
US9298675B2 (en) * | 2004-09-30 | 2016-03-29 | Adobe Systems Incorporated | Smart document import |
- 2005
  - 2005-04-11 JP JP2005113302A patent/JP4715278B2/ja active Active
- 2006
  - 2006-04-06 KR KR1020137019610A patent/KR101383290B1/ko not_active IP Right Cessation
  - 2006-04-06 US US11/629,082 patent/US8208531B2/en not_active Expired - Fee Related
  - 2006-04-06 EP EP06731282A patent/EP1871097A4/en not_active Ceased
  - 2006-04-06 CN CNB2006800003304A patent/CN100525399C/zh active Active
  - 2006-04-06 KR KR1020067025974A patent/KR101324129B1/ko not_active IP Right Cessation
  - 2006-04-06 WO PCT/JP2006/307334 patent/WO2006109673A1/ja active Application Filing
  - 2006-04-11 TW TW095112775A patent/TW200705220A/zh unknown
  - 2006-04-11 TW TW099139400A patent/TWI488055B/zh not_active IP Right Cessation
- 2012
  - 2012-06-06 US US13/490,076 patent/US10075668B2/en not_active Expired - Fee Related
Non-Patent Citations (1)
Title |
---|
See also references of EP1871097A4 * |
Also Published As
Publication number | Publication date |
---|---|
CN1969540A (zh) | 2007-05-23 |
EP1871097A4 (en) | 2008-12-03 |
CN100525399C (zh) | 2009-08-05 |
US20120242898A1 (en) | 2012-09-27 |
US20080291206A1 (en) | 2008-11-27 |
TWI341473B (ja) | 2011-05-01 |
KR101324129B1 (ko) | 2013-11-01 |
EP1871097A1 (en) | 2007-12-26 |
KR20130096766A (ko) | 2013-08-30 |
KR20080006432A (ko) | 2008-01-16 |
JP4715278B2 (ja) | 2011-07-06 |
JP2006295531A (ja) | 2006-10-26 |
TW200705220A (en) | 2007-02-01 |
US8208531B2 (en) | 2012-06-26 |
KR101383290B1 (ko) | 2014-04-09 |
TWI488055B (zh) | 2015-06-11 |
TW201112009A (en) | 2011-04-01 |
US10075668B2 (en) | 2018-09-11 |
Similar Documents
Publication | Publication Date | Title |
---|---|---|
WO2006109673A1 (ja) | Information processing device, information processing method, program storage medium, program, data structure, and recording medium manufacturing method | |
US8498515B2 (en) | Recording medium and recording and reproducing method and apparatuses | |
RU2378722C2 (ru) | Носитель записи, способ и устройство для воспроизведения потоков текстовых субтитров | |
EP1730730B1 (en) | Recording medium and method and apparatus for reproducing text subtitle stream recorded on the recording medium | |
US20070140667A1 (en) | Reproducing apparatus, reproducing method, reproducing program, recording medium, data structure, authoring apparatus, authoring method, and authoring program | |
US8849094B2 (en) | Data structure, recording medium, authoring apparatus, method, and program, recording apparatus, method, and program, verification apparatus, method, and program, and manufacturing, apparatus and method for recording medium | |
US20050147387A1 (en) | Recording medium and method and apparatus for reproducing and recording text subtitle streams | |
KR20070052643A (ko) | Data playback method and playback device | |
US8326118B2 (en) | Recording medium storing a text subtitle stream including a style segment and a plurality of presentation segments, method and apparatus for reproducing a text subtitle stream including a style segment and a plurality of presentation segments | |
JP5354316B1 (ja) | Playback device, playback method, and recording medium | |
JP5339002B2 (ja) | Information processing device, playback method, and recording medium | |
JP5517012B2 (ja) | Playback device, playback method, and recording medium | |
JP5321566B2 (ja) | Information processing device and information processing method | |
JP5194335B2 (ja) | Information processing device, information processing method, and program | |
RU2378720C2 (ru) | Носитель записи и способ и устройство воспроизведения и записи потоков текстовых субтитров |
Legal Events
Date | Code | Title | Description |
---|---|---|---|
121 | EP: the EPO has been informed by WIPO that EP was designated in this application | |
WWE | Wipo information: entry into national phase |
Ref document number: 7389/DELNP/2006 Country of ref document: IN |
|
WWE | Wipo information: entry into national phase |
Ref document number: 1020067025974 Country of ref document: KR |
|
REEP | Request for entry into the european phase |
Ref document number: 2006731282 Country of ref document: EP |
|
WWE | Wipo information: entry into national phase |
Ref document number: 2006731282 Country of ref document: EP Ref document number: 200680000330.4 Country of ref document: CN Ref document number: 11629082 Country of ref document: US |
|
NENP | Non-entry into the national phase |
Ref country code: DE |
|
NENP | Non-entry into the national phase |
Ref country code: RU |
|
WWP | Wipo information: published in national office |
Ref document number: 2006731282 Country of ref document: EP |