JP2004194131A - Caption display method, reproducer, recording device, recording medium, and outputting device - Google Patents

Info

Publication number
JP2004194131A
Authority
JP
Japan
Prior art keywords
information
subtitle
display
display format
video
Prior art date
Legal status
Pending
Application number
JP2002361603A
Other languages
Japanese (ja)
Inventor
Masuo Oku
Naozumi Sugimura
Original Assignee
Hitachi Ltd
Priority date
Filing date
Publication date
Application filed by Hitachi Ltd
Priority to JP2002361603A
Publication of JP2004194131A
Application status: Pending

Abstract

[PROBLEMS] When a high-definition image is down-converted for display, the subtitles shown with the image are conventionally reduced at the same time, making them very difficult to read. When the down-converted image is shown as a letterbox, areas in which no image is displayed appear at the top and bottom of the screen. The invention controls subtitle display at down-conversion so that subtitles appear in these image-free areas, and provides a subtitle display method and a data structure for this purpose.
A plurality of pieces of display format information are recorded on the recording medium, and one is selected according to the output format. At high-definition display, the subtitle is shown with the character size and position for high definition; at down-conversion, it is shown with the character size and position for NTSC.
[Selection diagram] Fig. 1

Description

[0001]
TECHNICAL FIELD OF THE INVENTION
The present invention relates to a technology for recording and reproducing information, and more particularly to a technology for combining subtitles with a television signal.
[0002]
[Prior art]
Video contents such as movies are widely distributed on recording media typified by the DVD (Digital Versatile Disc). Higher-density recording media, the digitization of broadcasting, and the higher communication speeds of broadband have made it possible to enjoy video in high definition.
[0003]
A device for displaying such high-quality video is the high-definition television (HDTV, known in Japan as Hi-Vision). It can display a clearer, higher-resolution image than a conventional NTSC television. In addition, its 16:9 aspect ratio gives a wider screen, so more impressive images can be enjoyed.
[0004]
On the other hand, conventional NTSC televisions are still widely used. An NTSC television generally has a 4:3 aspect ratio and a narrower screen than a high-definition television.
[0005]
When a user who owns a conventional NTSC television reproduces a disc on which high-definition video is recorded, or watches a high-definition broadcast, a function called down-conversion reduces the image and lowers its resolution so that it can be displayed on that television. Because the aspect ratios of the high-definition and NTSC televisions differ, the image appears on the NTSC television in a wide format called a letterbox. In letterbox display, band-shaped regions in which no image is displayed appear at the top and bottom of the NTSC screen.
[0006]
[Patent Document 1]
JP-A-8-317306
[0007]
[Problems to be solved by the invention]
Foreign films are often shown with subtitles. However, when letterbox display is performed on a conventional television, the subtitles are reduced in size along with the video and become very difficult to read.
[0008]
An object of the present invention is to provide an apparatus that can display subtitles in a legible manner even when a screen is reduced by down-conversion.
[0009]
[Means for Solving the Problems]
Display format information corresponding to a plurality of display formats is transmitted or recorded together with the video for the high-definition television. When the video is displayed, one piece of display format information is selected from the plurality, a caption corresponding to it is generated, and the caption is combined with the video.
[0010]
BEST MODE FOR CARRYING OUT THE INVENTION
FIG. 1 shows a recording format of subtitle information in the present invention.
In FIG. 1, 100 is a caption information file, 101 is the caption information size, 102 is a display start time, 103 is a display end time, 104 is first display format information, 105 is second display format information, and 106 is character information.
[0011]
The subtitle information file 100 includes subtitle information for one program (program / movie). Of course, the caption information for one program may be divided into a plurality of files and recorded as necessary. Conversely, subtitle information of a plurality of programs may be combined into one subtitle information file. Each piece of caption information includes information necessary to display one caption.
[0012]
The caption information size 101 indicates the data size of each piece of caption information. By using this, the start position of the next subtitle information can be easily obtained.
[0013]
The display start time 102 and the display end time 103 each have a 5-byte data area, of which 33 bits are valid data. Here, the display start time 102 indicates the display start time of the subtitle. The display end time 103 indicates the display end time of the caption. The time information corresponds to a display start time (PTS) of video and audio in an MPEG (Moving Picture Experts Group) stream, and is a 33-bit value measured by a clock of 90 kHz.
[0014]
The video signal and the audio signal are subjected to image compression processing by the MPEG system, and the processing of returning them to the original video / audio is performed by the decoder. The time serving as a reference for decoding images and audio is written in a packet called a PCR (Program Clock Reference) packet in an MPEG-TS (Transport Stream). The decoder generates a reference clock by so-called PLL control based on time information in the PCR packet. Using this clock, audio and video decoding operations are performed.
[0015]
A clock obtained by dividing the above clock to 90 kHz is used for video display control, and a time stamp value is obtained by counting a 33-bit counter using this clock. A PTS designating a display time is added to each image in the video signal, and the image is displayed when the time and the time according to the time stamp match.
[0016]
Also in the subtitle information, the above time stamp value is used in order to synchronize with the video. Specifically, the system control unit described later refers to the time stamp value and compares the time stamp value with the display start time in the caption information. When the value of the clock counter matches the display start time, a subtitle display process is performed. Similarly, when the subtitle display ends, the value of the clock counter is compared with the display end time, and the display ends when the two match.
By the above processing, the subtitles and the image / audio can be synchronized.
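As a rough illustration (not part of the patent; the function names are assumptions), the 90 kHz time-stamp arithmetic and the start/end comparison described above can be sketched in Python:

```python
PTS_CLOCK_HZ = 90_000          # 90 kHz time-stamp clock
FRAME_RATE = 30                # 30 frames per second (HD and NTSC)
PTS_MASK = (1 << 33) - 1       # time stamps are 33-bit values

def to_pts(hours, minutes, seconds, frame):
    """Convert h:m:s plus a frame number (0-29) to a 33-bit 90 kHz tick count."""
    total_seconds = hours * 3600 + minutes * 60 + seconds
    ticks = total_seconds * PTS_CLOCK_HZ + frame * (PTS_CLOCK_HZ // FRAME_RATE)
    return ticks & PTS_MASK

def subtitle_visible(clock_pts, start_pts, end_pts):
    """A caption is shown while the reference clock lies in [start, end)."""
    return start_pts <= clock_pts < end_pts
```

For example, one second corresponds to 90,000 ticks and one frame to 3,000 ticks, so the clock counter can be compared directly with the recorded display start and end times.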
[0017]
The first subtitle display format information 104 and the second subtitle display format information 105 each have information such as display coordinates, character size, and character decoration as subtitle display format information.
[0018]
The display coordinates indicate the display start position of the caption, and are represented by 2-byte values in the order of the X-axis coordinates and the Y-axis coordinates.
[0019]
The character size represents the size of one character in pixels, and the number of pixels may be recorded in one byte. Of course, the vertical and horizontal pixel counts may each be recorded in one byte so that the character size can be set independently in the vertical and horizontal directions.
[0020]
The character decoration information specifies conditions for character modification, such as italic, bold, border, and shadow. A 1-byte area is used, and each bit is given a meaning such as italic or bold.
[0021]
Here, the first subtitle display format information 104 will be described as a subtitle display format for HDTV, and the second subtitle display format information 105 will be described as a subtitle display format at the time of down-conversion.
[0022]
The character information 106 includes information indicating the number of characters (the number of bytes) and actual character string data. The number of characters is represented by one byte. When Japanese is used, the character string data has a 16-bit character code such as UTF-16 format. For character data in a language using alphabets such as English, an 8-bit character code such as ASCII or UTF-8 may be used.
[0023]
One subtitle information is configured by a combination of the above data. The subtitle information is recorded on the disc as a subtitle information file 100 for one program.
[0024]
FIG. 2 shows an example of caption display at high-definition output and at down-conversion.
In FIG. 2, reference numeral 201 denotes a display example of high-definition subtitles, and reference numeral 202 denotes a subtitle display example at the time of down-conversion.
[0025]
An HDTV image has 1080 vertical pixels and 1920 horizontal pixels, and the displayed image uses all of them. At this time, the caption is displayed over the image at the lower part of the screen, as shown at 201.
[0026]
On the other hand, the image at the time of down-conversion is a conventional NTSC format image, and image display is performed using 480 pixels vertically and 720 pixels horizontally.
[0027]
At this time, because the aspect ratios of the high-definition and conventional televisions differ, a display format called a letterbox is used so that the entire high-definition image can be shown. In the letterbox format, areas in which no image is displayed appear above and below the picture.
[0028]
When a subtitle is displayed on a down-converted image, displaying it in an area where no image appears, as shown at 202, lets the subtitle be shown without blocking the image, giving good visibility.
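The size of these empty regions follows from simple arithmetic on the pixel counts stated above. The following sketch (illustrative, not from the patent) computes the active picture height and the bar height when a 16:9 image is letterboxed into a 480-line 4:3 frame:

```python
def letterbox_bars(frame_h=480, frame_aspect=(4, 3), video_aspect=(16, 9)):
    """Return (active picture height, height of each black bar) when a
    wide image is letterboxed into a narrower frame. Illustrative only."""
    fa = frame_aspect[0] / frame_aspect[1]
    va = video_aspect[0] / video_aspect[1]
    active = round(frame_h * fa / va)   # 480 * (4/3)/(16/9) = 360 lines
    bar = (frame_h - active) // 2       # 60 lines top and bottom
    return active, bar
```

With the numbers above, the 16:9 picture occupies 360 of the 480 NTSC lines, leaving 60-line bars at the top and bottom in which subtitles can be placed.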
[0029]
Since the first display format information 104 is subtitle information for the high-definition television, information such as the character size and display coordinates is set so that the subtitle is displayed in the subtitle display area of the high-definition image, as shown at 201.
[0030]
On the other hand, since the second display format information 105 is subtitle information for down-conversion, information such as the character size and display coordinates is set so that the subtitle is displayed in the subtitle display area at down-conversion, as shown at 202 on the NTSC television screen.
[0031]
FIG. 3 shows a recording example of caption information corresponding to the caption display of FIG. The symbols in FIG. 3 are equivalent to those in FIG.
The caption information size 101 indicates the number of bytes of the caption information. The display start time 102 and the display end time 103 use a 5-byte data area. The first display format information 104 includes a display X coordinate of 2 bytes, a display Y coordinate of 2 bytes, and a character size of 1 byte. Similarly, the second display format information 105 is a display X coordinate of 2 bytes, a display Y coordinate of 2 bytes, and a character size of 1 byte. The character information 106 includes 1-byte data indicating the number of characters and character string data.
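A hypothetical parser for one such binary record might look as follows. The width of the caption-information-size field is not stated in the text, so a 2-byte big-endian value is assumed here, and UTF-16 text is assumed per paragraph [0022]; all such choices are illustrative:

```python
import struct

def parse_caption(buf, offset=0):
    """Parse one caption record laid out as in Fig. 3 (field widths partly assumed)."""
    size = struct.unpack_from(">H", buf, offset)[0]    # assumed 2-byte size field
    pos = offset + 2
    # 5-byte time fields, of which the low 33 bits are the 90 kHz PTS/ETS.
    start = int.from_bytes(buf[pos:pos + 5], "big") & ((1 << 33) - 1)
    end = int.from_bytes(buf[pos + 5:pos + 10], "big") & ((1 << 33) - 1)
    pos += 10
    fmt_hd = struct.unpack_from(">HHB", buf, pos)      # X, Y, char size (HDTV)
    fmt_sd = struct.unpack_from(">HHB", buf, pos + 5)  # X, Y, char size (down-conversion)
    pos += 10
    n = buf[pos]                                       # character count in bytes
    text = buf[pos + 1:pos + 1 + n].decode("utf-16-be")
    return {"size": size, "start": start, "end": end,
            "hd": fmt_hd, "sd": fmt_sd, "text": text}, offset + size
```

Because the record carries its own size, the returned second value is the offset of the next record, matching the use of the caption information size described in paragraph [0012].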
[0032]
In the example shown in FIG. 3, the subtitle information is recorded in a binary data format, but this is not a limitation, and of course, it may be recorded in a data format such as a text format.
[0033]
FIG. 4 shows an example of recording in a text format.
The first line is a tag indicating the start of subtitle information data. One piece of caption information data is represented by the caption information end tag on the seventh line. The second line shows the value of the display start time 102 (PTS), and the third line shows the value of the display end time 103 (ETS). Here, the PTS and the ETS have a display format of hour, minute, second, and frame number. Since the video in the high-definition format and the NTSC format is composed of an image of 30 frames per second, the frame number takes a value from 0 to 29. The fourth line indicates the display format information 104 for HDTV. Here, LOCATION represents the display position coordinates, and SIZE represents the character size. The fifth line is display format information 105 used at the time of down-conversion. The sixth line indicates a character string 106 displayed as subtitles. Using these caption information, each caption image is generated and displayed.
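Since the exact markup of FIG. 4 is not reproduced here, the following sketch invents a plausible syntax using only the element names mentioned above (PTS, ETS, LOCATION, SIZE, and an hour:minute:second:frame time format) to show how such a text format could be parsed; every detail of the syntax is an assumption:

```python
import re

SAMPLE = """\
<SUBTITLE>
PTS=00:01:05:12
ETS=00:01:08:00
HD: LOCATION=0384,0900 SIZE=40
SD: LOCATION=0128,0420 SIZE=20
TEXT=Hello, world
</SUBTITLE>
"""

def parse_time(t):
    """Convert hh:mm:ss:frame (30 fps) to 90 kHz ticks."""
    h, m, s, f = (int(x) for x in t.split(":"))
    return (h * 3600 + m * 60 + s) * 90_000 + f * 3_000

def parse_text_caption(block):
    pts = parse_time(re.search(r"PTS=([\d:]+)", block).group(1))
    ets = parse_time(re.search(r"ETS=([\d:]+)", block).group(1))
    fmts = re.findall(r"LOCATION=(\d+),(\d+) SIZE=(\d+)", block)
    text = re.search(r"TEXT=(.*)", block).group(1)
    return pts, ets, [tuple(map(int, f)) for f in fmts], text
```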
[0034]
FIG. 5 shows a file structure on an optical disc according to the present invention.
Each of the data recorded on the optical disk is handled as a file. As a result, information such as the name and recording position of each file can be easily obtained from the file management information, which facilitates data management. Each file is collectively managed in a directory format as needed.
[0035]
The “ROOT” directory 501 has directories of “MENU”, “SUBTITLE”, and “STREAM”.
[0036]
In the “MENU” directory, information on a menu screen at the time of starting reproduction is recorded. This includes, for example, information on elements such as display of a title screen to be presented to a user at the start of reproduction, selection of a language used in audio and subtitles, and selection of a chapter (chapter) to be reproduced.
[0037]
Subtitle information is recorded in the “SUBTITLE” directory. A plurality of pieces of subtitle information can be prepared and switched according to the language, for example. If Japanese subtitle information is recorded as "JAPANESE.STL", English subtitle information as "ENGLISH.STL", and Spanish subtitle information as "SPANISH.STL", the distinct file names make switching easy to handle.
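The language-based file selection described above can be sketched as a simple lookup. The file names follow the convention in the text; the mapping keys and function are illustrative:

```python
# Mapping from a user language setting to the subtitle file on the disc.
SUBTITLE_FILES = {
    "ja": "SUBTITLE/JAPANESE.STL",
    "en": "SUBTITLE/ENGLISH.STL",
    "es": "SUBTITLE/SPANISH.STL",
}

def subtitle_path(lang, default="en"):
    """Return the subtitle file for the language, falling back to a default."""
    return SUBTITLE_FILES.get(lang, SUBTITLE_FILES[default])
```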
[0038]
In the “STREAM” directory, an MPEG stream including video and audio data is recorded. Here, only one MPEG stream is described, but this is not a limitation. There is no problem if a stream is created for each program (program / movie) and a plurality of streams are recorded on the optical disk. If a plurality of directories are prepared for each program, a plurality of programs can be recorded on one optical disc. For example, directories “001”, “002”,... May be created, and the data of FIG. 5 may be recorded in each of the directories.
[0039]
Alternatively, if different file names such as “STREAM001.MPG” and “STREAM002.MPG” are given to the stream data, the data can be easily identified. In this case, menus and subtitles can likewise be identified and selected by distinguishing file names such as “MENU001.DAT” and “JAPANESE001.STL”. Of course, information for a plurality of languages may exist in one data file.
[0040]
FIG. 6 shows a block diagram of a reproducing apparatus according to the present invention.
In FIG. 6, 601 is an optical disk, 602 is an optical pickup, 603 is a reproduction signal processing unit, 604 is an output timing control unit, 605 is a servo, 606 is a drive control unit, 607 is an audio decoder, 608 is an audio output terminal, 609 is a video decoder, 610 is a subtitle synthesizing unit, 611 is a video output terminal, 612 is a time stamp generation unit, 613 is a system control unit composed of a CPU, and 614 is a remote control receiving unit.
[0041]
The reproduction signal processing unit 603, the output timing control unit 604, the audio decoder 607, the video decoder 609, the subtitle synthesizing unit 610, and the time stamp generation unit 612 may each be configured as circuits that perform the processing described later in hardware. Alternatively, the processing described later may be performed in software, as programs stored in the system control unit 613.
[0042]
When starting reproduction of the optical disk 601, the user inputs a reproduction start command to the reproduction device using a remote controller (not shown) or the like. A signal from the remote controller is captured by the remote controller receiving unit 614 and input to the system control unit 613. The system control unit 613 receives a reproduction start command from the remote controller, and instructs the drive control unit 606 to read data from the optical disc.
[0043]
Specifically, first, in order to obtain the name of the data file recorded on the disk, an instruction is issued to read the file management information area. The drive control unit 606 receives an instruction from the system control unit 613 and controls the servo 605. The servo 605 controls the rotation speed and rotation phase of the optical disc 601 and controls the position of the optical pickup. As a result, the rotation of the disk is controlled, and data in a predetermined file management information area is read by the optical pickup 602.
[0044]
The data read from the optical disk is sent to the reproduction signal processing unit 603, where the data is subjected to predetermined demodulation processing, error correction processing, data rearrangement, and the like, and is converted into sector data. By the above processing, the sector data of the area instructed to be reproduced by the system control unit is read out, and sent back from the drive control unit 606 to the system control unit 613. The system control unit 613 analyzes the read sector data in the file management information area, and obtains information on the file recorded on the optical disc.
[0045]
Next, the system control unit 613 reads a file related to the menu screen from the file recorded on the optical disc, and displays the menu screen according to the information of the file. Reading of the file is performed by the control as described above. The user selects a program to be reproduced, a language, and the like according to the instructions on the menu screen. When the program to be reproduced is selected by the user, the system control unit starts reproducing the stream.
[0046]
Prior to the start of stream reproduction, the system control unit 613 reads the subtitle information corresponding to the language specified by the user. Since the subtitle information is recorded as a file, for example “JAPANESE.STL”, the desired file is read into memory in advance. Of course, the subtitle information may instead be read little by little in parallel with the reading of the stream; the control becomes more complicated, but the memory capacity needed to hold the data can be reduced. The description here assumes that the data is read in advance.
[0047]
As described above, the user can select a language used for subtitles when playing the optical disc. In this case, when the disc is inserted, the user may be prompted to make a selection from the menu screen, or the user's preferred language may be stored in the playback device in advance.
[0048]
After completing the reading of the subtitle data, the system control unit 613 instructs the drive control unit 606 to read the MPEG stream. Since the MPEG stream is also managed as a file, the recording position of the file can be obtained from the file management information of the file in which the MPEG stream is recorded. The drive control unit 606 controls the servo 605 so that data can be reproduced from the designated recording position of the MPEG stream, and the optical pickup 602 reads data from the optical disk. The read data is subjected to predetermined demodulation processing, error correction processing, data rearrangement, and the like by the reproduction signal processing unit 603, and is converted into sector data. The read data of the MPEG stream is input to the output timing control unit 604 and output at a predetermined timing. Here, a time stamp for controlling the output timing is added to each packet of the MPEG stream. Hereinafter, the recording format of the MPEG stream will be described.
[0049]
FIG. 7 shows a recording format of MPEG-TS according to the present invention.
MPEG-TS is a data string of 188-byte MPEG packets (MPEG PACKET in the figure). MPEG packets are not sent continuously but at intervals, and the interval is not always constant. A packet transmission time must therefore be added to each packet when recording, so that during reproduction from the optical disk the packets can be output at the same intervals as during transmission. On the optical disc of the present invention, a 30-bit time stamp (Time Stamp in the figure) indicating the transmission time is added to each packet. The time stamp is a value counted by a 27 MHz clock. The recording device has a free-running 30-bit counter clocked at 27 MHz; when an MPEG packet is input to the recording device, the counter value at the moment the head data of the packet arrives is added to the head of the packet as its time stamp, producing the recording format shown in FIG. 7.
[0050]
Since the time stamp is 30 bits, 2 bits of the 4-byte header remain vacant. This area can be used for additional information (AUX in the figure) for each packet, for example copyright management information.
[0051]
As described above, packet data of 192 bytes including the 4-byte header is recorded in sectors on the optical disk. Information about the area where the packet data is recorded is held in the file management area, and the data is treated as a file.
[0052]
At the time of reproduction, an MPEG packet is reproduced together with a 4-byte header (Header in the figure), and the output timing of the packet is controlled according to the value of the time stamp in the header.
[0053]
The output timing control unit 604 controls the output timing of the packet using the time stamp included in the packet header.
[0054]
The output timing control unit 604 in the playback device has a 30-bit time stamp counter that runs by itself at 27 MHz, like the recording device.
[0055]
The data supplied from the reproduction signal processing unit 603 to the output timing control unit 604 is in a 192-byte packet format including the 4-byte header. The output timing control unit 604 temporarily stores the supplied packet data in a buffer memory and compares the time stamp in the header with the value of the time stamp counter described above. When the two match, it is time to output the packet, and the packet is output. At this time, the 4-byte packet header is removed, and only the 188-byte MPEG packet is output. As a result, the packets of the MPEG stream are output at the same intervals as during recording.
[0056]
The MPEG stream packet output from the output timing control unit 604 is input to the audio decoder 607, the video decoder 609, and the time stamp generation unit 612, respectively.
[0057]
The audio decoder 607 decodes audio information using the MPEG stream, and outputs an audio signal to an audio output terminal 608.
[0058]
The video decoder 609 decodes video information from the MPEG stream and outputs a video signal. Here, the video decoder is a high-definition decoder, and the output video signal is also a high-definition signal. The video signal output from the video decoder is sent to the video signal output terminal 611 after the caption synthesizing unit 610 superimposes the caption image described later.
[0059]
Here, generation of a subtitle image will be described.
The contents of the subtitle information file 100 are read in the system control unit 613 in advance. The subtitle information is the information shown in FIG.
[0060]
The system control unit 613 generates a caption image in parallel with the reproduction of the MPEG stream. First, the head caption information is read, and the display start time 102, display end time 103, display format information 104 and 105, and character information 106 are obtained. Whether the first display format information 104 or the second display format information 105 is used is determined by whether the output is connected to a high-definition television or is down-converted to an NTSC signal. This may also be selected by the user in advance. Of course, two outputs may be provided, one high-definition and one down-converted, with separate subtitles combined into each. The description here assumes that a subtitle for high definition is generated.
[0061]
The first display format information 104 is used to generate subtitles for HDTV. A bitmap image is generated from the character information 106 according to information such as the character size and the character decoration in the display format information.
[0062]
In converting character information into an image in a bitmap format, font data in which the shape of each character is converted into data is required. Font data has a data format such as a bitmap format, a stroke data format, and an outline data format. These font data may be stored in the ROM on the reproducing apparatus in advance, or may be recorded as a data file on a disc, and may be controlled so as to be read out and used together with subtitle information at the start of reproduction.
[0063]
Using these font data, character information, character size, character decoration, and other information, a required bitmap image is generated.
[0064]
Next, the system control unit 613 controls the display timing of the subtitle image using the time stamp generated by the time stamp generation unit 612 and the display start time 102 and the display end time 103 in the subtitle information.
[0065]
Here, the operation of the time stamp generation unit will be described.
[0066]
FIG. 8 shows a configuration diagram of the time stamp generation unit.
[0067]
In FIG. 8, reference numeral 801 denotes an input terminal, 802 denotes a PCR extraction unit, 803 denotes a comparator, 804 denotes a PCR counter, 805 denotes a voltage controlled oscillator (Voltage Controlled Oscillator), 806 denotes a divider, and 807 denotes an output terminal.
[0068]
An MPEG stream packet reproduced from the optical disk is input to the input terminal 801 at the prescribed timing. The PCR extraction unit 802 selects only packets containing a PCR from the MPEG stream and extracts the PCR value they carry. The extracted PCR value is input to the comparator 803.
[0069]
On the other hand, the PCR counter 804 counts the number of clocks generated by the voltage controlled oscillator 805.
[0070]
The comparator 803 compares the PCR value extracted by the PCR value extraction unit 802 with the count value counted by the PCR counter 804. Here, the PCR value extracted by the PCR value extraction unit 802 and the count value counted by the PCR counter 804 are both 33-bit values.
[0071]
The comparator 803 changes the control voltage of the voltage controlled oscillator 805 according to the comparison result. Specifically, when the count value of the PCR counter 804 is smaller than the PCR value extracted by the PCR value extraction unit 802, the counter is judged to be running late, and the control voltage of the voltage controlled oscillator 805 is raised.
[0072]
The voltage controlled oscillator 805 changes its oscillation frequency according to the control voltage: at an input of 0 V it oscillates at 27 MHz, at a higher input voltage the frequency rises, and at a lower input voltage the frequency falls.
[0073]
When the count value of the PCR counter 804 is smaller than the PCR value extracted by the PCR value extraction unit 802, the control voltage output from the comparator 803 increases, so the oscillation frequency of the voltage controlled oscillator 805 rises. As a result, the time counted by the PCR counter 804 advances faster and approaches the PCR value in the MPEG stream.
[0074]
Conversely, when the count value of the PCR counter 804 is larger than the PCR value extracted by the PCR value extraction unit 802, the PCR counter is running ahead of the PCR in the MPEG stream, and the control voltage output from the comparator 803 decreases. The oscillation frequency of the voltage controlled oscillator 805 then falls, the time counted by the PCR counter 804 advances more slowly, and the count again approaches the PCR value in the MPEG stream.
[0075]
By the above-described processing, control is performed so that the count value counted by the PCR counter 804 and the PCR value extracted by the PCR value extraction unit 802 match.
[0076]
In this way the PCR counter 804 counts at a rate substantially coinciding with the PCR in the MPEG stream, and therefore indicates the time within the stream. Subtitle display is performed in synchronization with the count value of the PCR counter 804. However, while the PCR counter 804 counts at 27 MHz, the PTS and ETS in the caption information are counted at 90 kHz, so the PCR count value is divided to 1/300 by the divider 806 and supplied to the system control unit 613 via the output terminal 807.
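The clock relationship is simple arithmetic: 27 MHz divided by 300 gives the 90 kHz time base used by the PTS and ETS values. A short sketch (illustrative names):

```python
PCR_CLOCK_HZ = 27_000_000      # PCR counter clock
PTS_CLOCK_HZ = 90_000          # PTS/ETS clock in the caption information
DIVIDER = PCR_CLOCK_HZ // PTS_CLOCK_HZ   # the divider 806 divides by 300

def pcr_to_pts_ticks(pcr_count):
    """Convert a 27 MHz PCR count to 90 kHz PTS ticks."""
    return pcr_count // DIVIDER
```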
[0077]
The system control unit 613 compares the time stamp value read from the time stamp generation unit 612 with the display start time 102 in the caption information. When the time stamp value of the time stamp generation unit 612 is smaller than the display start time 102 in the subtitle information, the subtitle display start time has not been reached, so that the subtitle display is not performed and the process stands by. When the time stamp value of the time stamp generation unit 612 matches the display start time 102 in the subtitle information, the subtitle image is written to the memory of the subtitle synthesis unit 610. At this time, the address of the memory to be written follows the display coordinates in the display format information. As a result, subtitles are synthesized at predetermined positions in synchronization with video and audio.
[0078]
After displaying the caption, the system control unit 613 reads the time stamp again from the time stamp generation unit 612 and compares the time stamp with the display end time 103 in the caption information. If the timestamp read from the timestamp generation unit 612 is smaller than the display end time 103 in the subtitle information, the subtitle display end time has not been reached, so that the subtitle is not erased and waits. When the time stamp value of the time stamp generation unit 612 matches the display end time 103 in the caption information, the memory of the caption synthesizing unit 610 is cleared. Thus, the display of the caption is completed.
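The start/end control in the two paragraphs above can be sketched as a small polling loop. The names are illustrative, and where the text compares the clock for equality with the start and end times, a practical poll usually tests whether the threshold has been passed:

```python
def caption_events(clock_samples, start_pts, end_pts):
    """Yield 'show'/'hide' events as the 90 kHz clock passes start and end times."""
    shown = False
    for t in clock_samples:
        if not shown and t >= start_pts:
            shown = True
            yield "show"   # write the caption bitmap into the synthesizer memory
        elif shown and t >= end_pts:
            shown = False
            yield "hide"   # clear the synthesizer memory
```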
[0079]
The caption synthesizing unit 610 superimposes the caption image provided by the system control unit 613 on the video decoded by the video decoder 609 and outputs the result to the video output terminal 611. A high-definition monitor television is connected outside the playback device, and the video signal output from the video output terminal 611 is displayed on it.
[0080]
As described above, the subtitle information recorded on the optical disc is displayed in synchronization with the high-definition video and audio.
[0081]
Next, a case where the video output is a down-converted output will be described.
When the down-conversion output is selected, the video decoder 609, instead of outputting the decoded high-definition video signal, down-converts it to the NTSC format before output. That is, the signal input to the caption synthesizing unit 610 is in the NTSC format.
[0082]
The system control unit 613 generates captions corresponding to the down-converted output. Specifically, a subtitle image is generated using the second display format information 105 and the character information 106 in the subtitle information.
[0083]
The second display format information 105 is display format information corresponding to the NTSC format at the time of down-conversion. As shown in FIG. 2, it specifies that subtitles are displayed outside the letterbox area of the down-converted image. The control of the subtitle display start and end timings is the same as in the operation described above.
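As a piece of illustrative arithmetic (the patent does not give these numbers), when 16:9 content is letterboxed into a 4:3 NTSC frame of 480 active lines, the black bars above and below the image are 60 lines each, which is the region where the second display format can place the subtitles:

```python
def letterbox_bar_height(frame_h: int = 480,
                         frame_aspect: float = 4 / 3,
                         content_aspect: float = 16 / 9) -> int:
    """Lines of black bar above and below 16:9 content in a 4:3 frame."""
    content_h = round(frame_h * frame_aspect / content_aspect)  # 480 -> 360 lines
    return (frame_h - content_h) // 2                            # 60 lines each

def subtitle_y_region(frame_h: int = 480) -> tuple[int, int]:
    """Vertical row range of the bottom bar, outside the letterboxed image."""
    bar = letterbox_bar_height(frame_h)
    return (frame_h - bar, frame_h)
```

Display coordinates recorded in the second display format information would then fall inside this bottom-bar range so the subtitle never overlaps the picture.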
[0084]
The subtitle synthesizing unit 610 synthesizes the NTSC subtitle image input from the system control unit 613 with the NTSC video signal decoded and down-converted by the video decoder 609, and outputs the result from the video signal output terminal 611. The video signal is displayed on an externally connected NTSC television monitor.
[0085]
As described above, at the time of down-conversion, subtitles generated for down-conversion, different from those used at the time of high-definition output, are synthesized and output. The subtitles therefore do not overlap the displayed image, improving their visibility.
[0086]
With the above configuration, it is possible to realize a playback device that displays subtitles in a specified display format in each of the high-definition format and the down-convert format.
[0087]
In the reproducing apparatus described above, the user switches the output between the high-definition video and the down-converted NTSC video, but the present invention is not limited to this. For example, a playback device can be configured with two output systems, a high-definition video output and a down-convert output. An example of a reproducing apparatus having these two outputs is shown below.
[0088]
FIG. 9 shows an example of a reproducing apparatus provided with two outputs, a high-definition output and a down-converted output.
In FIG. 9, reference numeral 901 denotes a down converter, 902 an NTSC subtitle synthesizing unit, and 903 an NTSC video output terminal. The other reference numerals are equivalent to 601 to 614 in FIG. 6.
[0089]
At the time of reproduction, the operation of reading subtitle data from the optical disc and the operation of reproducing the MPEG stream are the same as those of the reproducing apparatus shown in FIG.
[0090]
In the reproducing apparatus shown in FIG. 9, the generation of subtitles for high definition and of subtitles for NTSC used at the time of down-conversion are performed in parallel. The description here assumes the caption information file shown in FIG. 1 for generating these captions.
[0091]
In parallel with the reproduction of the MPEG stream, the system control unit 613 generates caption images from the caption information. It generates a first subtitle image for high definition using the first display format information 104 and the character information 106, and a second subtitle image for NTSC using the second display format information 105 and the character information 106.
[0092]
Next, the system control unit 613 monitors the time stamp input from the time stamp generation unit 612 and compares it with the display start time 102. When the time stamp matches the display start time, the generated subtitle images are output to the subtitle synthesizing units. Specifically, at the display start time, the first subtitle image for high definition is written to the memory in the high-definition subtitle synthesizing unit 610, and the second subtitle image for NTSC is written to the memory in the NTSC subtitle synthesizing unit 902.
[0093]
Here, a high-definition image synchronized with the display time is output from the high-definition video decoder 609, and the high-definition image is input to the high-definition subtitle synthesizing unit 610 and the down converter 901.
[0094]
The high-definition subtitle synthesizing unit 610 synthesizes the subtitle image written by the system control unit 613 with the high-definition video input from the high-definition video decoder 609, and outputs the result to the high-definition video output terminal 611.
[0095]
On the other hand, the downconverter 901 downconverts the high definition video input from the high definition video decoder 609 to the NTSC format. The NTSC video generated by the down-conversion is input to the NTSC subtitle synthesizing unit 902.
[0096]
The NTSC subtitle synthesizing unit 902 synthesizes the NTSC subtitle image input from the system control unit 613 with the NTSC video generated by down-conversion, and outputs an NTSC video signal to the NTSC video output terminal 903.
[0097]
As described above, a subtitle image is combined with each of the high-definition video signal and the NTSC video signal. Because each subtitle image is in a format suited to its video signal, the subtitles are easy to see both when watching the video in high definition and when viewing it on an NTSC-format monitor.
[0098]
In addition, since the down-conversion process takes time, the display of the high-definition video and the NTSC video may be shifted by several frames. In this case, if the timing of writing data to the NTSC subtitle synthesizing unit is shifted in anticipation of this frame shift, the offset between video and subtitle in the NTSC synthesis can be prevented.
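The compensation described in paragraph [0098] amounts to delaying the NTSC write time by the converter latency. A minimal sketch (the delay in frames is a property of the down converter 901, not a value given in the patent; 3003 ticks at 90 kHz is approximately one NTSC frame, 90,000 / 29.97):

```python
def ntsc_write_ticks(display_start: int, delay_frames: int,
                     ticks_per_frame: int = 3003) -> int:
    """Shift the NTSC subtitle write time by the down-converter delay.

    display_start is the 90 kHz display start time 102; the returned value
    is when the second subtitle image should be written to the NTSC
    subtitle synthesizing unit so video and subtitle stay aligned.
    """
    return display_start + delay_frames * ticks_per_frame
```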
[0099]
In the above example, the subtitle information is recorded on the optical disc as a single subtitle information file 100. However, the present invention is not limited to this; the subtitle information may instead be superimposed on the MPEG stream. FIG. 12 shows an example in which subtitle information is superimposed on an MPEG stream and transmitted.
[0100]
First, each piece of subtitle information shown in FIG. 1 is divided into sizes that fit in an MPEG packet. Each divided piece is converted into MPEG packet format by adding an MPEG packet header. The MPEG packet header includes a PID (packet ID) for identifying the type of the packet; a unique PID is added to the caption information packets so that they can be distinguished from the video information packets and the audio information packets.
[0101]
Information on which type of information packet each PID corresponds to is recorded in a packet called a PMT (Program Map Table), which is superimposed in the MPEG stream at a predetermined timing.
[0102]
The caption information packet (S) to which the unique PID is added is superimposed on the MPEG stream together with the video packet (V) and the audio packet (A), and is transmitted and recorded.
[0103]
At the time of reproduction, each packet is separated from the MPEG stream based on the PID. The subtitle information packet is distinguished from other information packets by the PID of the subtitle information packet, and only the subtitle information packet is extracted. The header is removed from the extracted subtitle information packet, and only the subtitle information data is extracted. As a result, subtitle information can be extracted in the same manner as in the case of independently recording in a file.
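The PID-based separation of paragraph [0103] can be sketched as a filter over the packet sequence (an illustration only; packets are modeled here as (pid, payload) pairs, whereas a real MPEG-TS parser would read the PID from each 188-byte packet header and consult the PMT to learn the subtitle PID):

```python
def extract_subtitle_payloads(packets, subtitle_pid: int) -> bytes:
    """Select subtitle packets by PID and strip their headers.

    Returns the concatenated subtitle information data, reproducing the
    same byte stream that independent recording in a file would yield.
    """
    return b"".join(payload for pid, payload in packets if pid == subtitle_pid)
```

For example, demultiplexing a stream of video (V), audio (A), and subtitle (S) packets by the subtitle PID recovers only the subtitle data, in order.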
[0104]
As described above, by superimposing subtitle information on the MPEG stream, it is possible to transmit subtitle information according to a plurality of display formats even in digital broadcasting. Therefore, the present invention can be applied to not only a recording device but also a broadcasting device of a broadcasting station or a transmission device using a network.
[0105]
In the example of the apparatus described above, two pieces of display format information (104 and 105) are recorded in one piece of caption information, but the present invention is not limited to this. For example, as shown in FIG. 10, one subtitle information file may consist of the first display format information 104 for high definition and the character information 106; this is the first subtitle information file 110. Similarly, another subtitle information file consists of the second display format information 105 for down-conversion and the character information 106; this is the second subtitle information file 111. These two can each be recorded on the optical disc as files and selected according to the output format to be used.
[0106]
At this time, the first subtitle information file 110 and the second subtitle information file 111 are recorded under different file names, such as “JAPANESE.HST” and “JAPANESE.SST”, so that they can be easily distinguished and used. The file structure in this case is shown in FIG. 11.
[0107]
When superimposing and recording on an MPEG-TS, the subtitle information for high definition and the subtitle information used at the time of down-conversion can be easily distinguished if they are recorded as packets having different PIDs.
[0108]
Alternatively, the determination may be performed by including in the data an identification flag for determining whether the information is subtitle information for high-definition or subtitle information to be used at the time of down-conversion.
[0109]
Incidentally, when the subtitle information 110 for high definition and the subtitle information 111 for down-conversion are recorded as two completely separate files as described above, the character information 106 must be recorded in each file, which wastes recording area. To prevent this, the display format information (104 and 105) and the character information 106 may be separated and recorded as separate files.
[0110]
Specifically, the display start time 102, the display end time 103, and the character information 106 are recorded together as one file, the subtitle information file 120. On the other hand, only the first display format information 104 is collected and recorded as the first display format information file 121; similarly, only the second display format information 105 is collected and recorded as the second display format information file 122. This is illustrated in FIG. 13.
[0111]
At this time, the first display format information file 121 is given a file name such as “JAPANESE.HDS”, the second display format information file 122 “JAPANESE.SDS”, and the subtitle information file 120 “JAPANESE.STX”. By assigning file names in this way, each data file can be identified. FIG. 14 shows the file structure with these file names added.
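Using the example file names of paragraph [0111], the file selection for each output mode can be sketched as follows (an illustration only; the extensions ".STX", ".HDS", and ".SDS" are the examples given in the patent, and the helper name is hypothetical):

```python
def files_for_output(language: str, downconvert: bool) -> tuple[str, str]:
    """Return the files to read for the separated layout of FIG. 13:
    the shared subtitle text file plus one display-format file."""
    fmt_ext = ".SDS" if downconvert else ".HDS"  # down-convert vs high definition
    return (language + ".STX", language + fmt_ext)
```

Reading the shared ".STX" file once and pairing it with either format file avoids recording the character information twice.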
[0112]
As a result, it is possible to eliminate redundant recording of character information and reduce the recording capacity of subtitle information. At this time, when displaying the subtitle information for high definition, the first display format information file 121 and the subtitle information file 120 in FIG. 13 may be read and used. Similarly, at the time of down conversion, the subtitle display in the NTSC signal is realized by reading and using the second display format information file 122 and the subtitle information file 120.
[0113]
Incidentally, in the reproducing apparatus described above, the recorded subtitle information consists of character information and display format information, from which a bitmap subtitle image is generated and displayed. However, the data may be recorded in advance as bitmap subtitle images. In this case, a subtitle image for high definition and a subtitle image for down-conversion can be created separately and switched according to the output format. Alternatively, only one subtitle image may be recorded and displayed as different subtitle images by changing the image size or the like between high-definition output and down-converted output. In this case, information such as the display position, display start time, display end time, and enlargement ratio may be recorded as the display format information. FIG. 15 shows the configuration of the subtitle information file when subtitle information is recorded in bitmap format.
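Resizing a single recorded bitmap by the enlargement ratio mentioned in paragraph [0113] can be sketched with nearest-neighbour scaling (an illustration, not the patent's method; the bitmap is modeled as a list of pixel rows and the ratio as a rational num/den):

```python
def scale_bitmap(bitmap: list[list[int]], num: int, den: int) -> list[list[int]]:
    """Nearest-neighbour resize of a bitmap subtitle image by num/den.

    With num/den > 1 the image is enlarged (e.g. for high-definition
    output); with num/den < 1 it is reduced (e.g. for NTSC output).
    """
    h, w = len(bitmap), len(bitmap[0])
    new_h, new_w = h * num // den, w * num // den
    return [[bitmap[y * den // num][x * den // num] for x in range(new_w)]
            for y in range(new_h)]
```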
[0114]
【The invention's effect】
According to the present invention, subtitles can be displayed in an easy-to-see manner even when the screen is reduced by down-conversion.
[Brief description of the drawings]
FIG. 1 is a diagram showing a recording method of a caption information file.
FIG. 2 is a diagram showing a display example of captions.
FIG. 3 is a diagram showing a specific example of a caption information file.
FIG. 4 is a diagram showing a second specific example of a caption information file.
FIG. 5 is a diagram showing a file structure.
FIG. 6 is a diagram showing a block diagram of a playback device.
FIG. 7 is a diagram showing a recording structure of an MPEG packet.
FIG. 8 is a block diagram illustrating a time stamp generation unit.
FIG. 9 is a diagram showing a second block diagram of the playback device.
FIG. 10 is a diagram showing a second recording method of a subtitle information file.
FIG. 11 is a diagram showing a second file structure.
FIG. 12 is a diagram illustrating a method of superimposing subtitle information on an MPEG packet.
FIG. 13 is a diagram illustrating a third recording method of a subtitle information file.
FIG. 14 is a diagram showing a third file structure.
FIG. 15 is a diagram illustrating a fourth recording method of a caption information file.
[Explanation of symbols]
100: closed caption information file, 101: closed caption information size, 102: display start time, 103: display end time, 104: first display format information, 105: second display format information, 106: character information, 110: first subtitle information file, 111: second subtitle information file, 120: subtitle information file, 121: first display format information file, 122: second display format information file, 601: optical disk, 602: optical pickup, 603: playback signal processing unit, 604: output timing control unit, 605: servo unit, 606: drive control unit, 607: audio decoder, 608: audio output terminal, 609: video decoder, 610: subtitle synthesis unit, 611: video output terminal, 612: time stamp generation unit, 613: system control unit, 614: remote control reception unit, 801: input terminal, 802: PCR value extraction unit, 803: comparator, 804: PCR counter, 805: voltage controlled oscillator, 806: divider, 807: output terminal, 901: down converter, 902: NTSC subtitle synthesis unit, 903: NTSC video output terminal.

Claims (12)

  1. A subtitle display method for displaying subtitles synchronized with video,
    Caption information includes multiple display format information,
    When displaying a video, a subtitle display method characterized by selecting display format information from the plurality of display format information, generating subtitles according to the display format information, and combining the generated subtitles with the video.
  2. The subtitle display method according to claim 1, wherein
    When selecting display format information from a plurality of display format information, a subtitle display method characterized by selecting display format information according to the type of device to be connected.
  3. Subtitle information used in the subtitle display method according to claim 1,
    A subtitle display method, wherein the display format information includes subtitle display position information.
  4. A playback device that plays back video and subtitle information from a recording medium and combines and outputs subtitles to the video,
    The subtitle information recorded on the recording medium includes display format information corresponding to a plurality of display formats,
    A playback apparatus characterized by selecting display format information from the plurality of display format information, generating subtitles according to the selected display format information, synthesizing the caption with a video, and outputting the video.
  5. The playback device according to claim 4, wherein
    A playback device, wherein the display format information includes display position information of a subtitle.
  6. The playback device according to claim 4, wherein
    A playback device, wherein the display format information includes character size information of a subtitle.
  7. The playback device according to claim 4, wherein
    A reproducing apparatus characterized in that subtitle information includes character code information to be displayed in subtitles, and the character code information is commonly used regardless of display format information.
  8. The playback device according to claim 7, wherein
    A reproducing apparatus characterized in that font data meaning a character shape corresponding to a character code is recorded on a recording medium, and the font data is read and used.
  9. A recording device for recording subtitle information together with video information on a recording medium,
    A recording device for recording a plurality of display format information in caption information.
  10. A recording medium on which subtitle information is recorded together with video information,
    A recording medium characterized by recording a plurality of display format information in caption information.
  11. A subtitle display device that performs subtitle display synchronized with the video using the input video and subtitle information,
    A subtitle display device characterized in that the input subtitle information includes display format information corresponding to a plurality of display formats, display format information is selected from the plurality of display format information, subtitles are generated according to the selected display format information, and the subtitles are synthesized with the video and output.
  12. An output device that outputs subtitle information together with a video,
    An output device, wherein the subtitle information to be output includes display format information corresponding to a plurality of display formats.
JP2002361603A 2002-12-13 2002-12-13 Caption display method, reproducer, recording device, recording medium, and outputting device Pending JP2004194131A (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
JP2002361603A JP2004194131A (en) 2002-12-13 2002-12-13 Caption display method, reproducer, recording device, recording medium, and outputting device

Publications (1)

Publication Number Publication Date
JP2004194131A true JP2004194131A (en) 2004-07-08

Family

ID=32760270
