US5731847A - Subtitle encoding/decoding method and apparatus - Google Patents

Subtitle encoding/decoding method and apparatus

Info

Publication number
US5731847A
US5731847A (application US08/618,515)
Authority
US
United States
Prior art keywords
subtitles
subtitle data
subtitle
data
buffer
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Expired - Lifetime
Application number
US08/618,515
Other languages
English (en)
Inventor
Ikuo Tsukagoshi
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Sony Corp
Original Assignee
Sony Corp
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Family has litigation
US case filed in International Trade Commission litigation Critical, case 337-TA-713: https://portal.unifiedpatents.com/litigation/International%20Trade%20Commission/case/337-TA-713 ("Unified Patents Litigation Data" by Unified Patents is licensed under a Creative Commons Attribution 4.0 International License.)
US case filed in California Central District Court litigation, case 2:09-cv-07698: https://portal.unifiedpatents.com/litigation/California%20Central%20District%20Court/case/2%3A09-cv-07698
US case filed in New Jersey District Court litigation, case 2:08-cv-05029: https://portal.unifiedpatents.com/litigation/New%20Jersey%20District%20Court/case/2%3A08-cv-05029
US case filed in California Central District Court litigation, case 8:08-cv-01135: https://portal.unifiedpatents.com/litigation/California%20Central%20District%20Court/case/8%3A08-cv-01135
First worldwide family litigation filed: https://patents.darts-ip.com/?family=26426962&utm_source=google_patent&utm_medium=platform_link&utm_campaign=public_patent_search&patent=US5731847(A) ("Global patent litigation dataset" by Darts-ip is licensed under a Creative Commons Attribution 4.0 International License.)
US case filed in California Central District Court litigation, case 2:09-cv-02129: https://portal.unifiedpatents.com/litigation/California%20Central%20District%20Court/case/2%3A09-cv-02129
US case filed in International Trade Commission litigation, case 337-TA-765: https://portal.unifiedpatents.com/litigation/International%20Trade%20Commission/case/337-TA-765
US case filed in California Central District Court litigation, case 2:11-cv-01210: https://portal.unifiedpatents.com/litigation/California%20Central%20District%20Court/case/2%3A11-cv-01210
Application filed by Sony Corp
Assigned to SONY CORPORATION (assignment of assignors interest; assignor: TSUKAGOSHI, IKUO)
Application granted
Publication of US5731847A
Anticipated expiration
Legal status: Expired - Lifetime (current)

Classifications

    • H04N7/0122 — Conversion of standards involving conversion of the spatial resolution of the incoming video signal, the input and output signals having different aspect ratios
    • G11B20/10 — Digital recording or reproducing
    • G11B27/031 — Electronic editing of digitised analogue information signals, e.g. audio or video signals
    • G11B27/3063 — Subcodes (timing or synchronising information recorded on the same track as the main recording)
    • H04N21/235 — Processing of additional data, e.g. scrambling of additional data or processing content descriptors
    • H04N21/23614 — Multiplexing of additional data and video streams
    • H04N21/4348 — Demultiplexing of additional data and video streams
    • H04N21/47 — End-user applications
    • H04N21/4884 — Data services for displaying subtitles
    • H04N21/8126 — Monomedia components involving additional data, e.g. news, sports, stocks, weather forecasts
    • H04N5/278 — Subtitling (studio circuitry)
    • H04N5/445 — Receiver circuitry for displaying additional information
    • H04N5/76 — Television signal recording
    • H04N7/08 — Systems for the simultaneous or sequential transmission of more than one television signal
    • H04N7/0885 — Digital signal insertion during the vertical blanking interval for the transmission of subtitles
    • H04N9/8244 — Multiplexing of a character code signal with the colour video signal, involving the use of subcodes
    • G11B20/10527 — Audio or video recording; data buffering arrangements
    • G11B2220/2545 — CDs (disc-shaped record carriers)
    • H04N21/426 — Internal components of the client
    • H04N5/783 — Adaptations for reproducing at a rate different from the recording rate
    • H04N9/8042 — Pulse code modulation of the colour picture signal components involving data reduction
    • H04N9/8063 — Time division multiplex of the PCM audio and PCM video signals

Definitions

  • the present invention relates to encoding and decoding video data and, more particularly, to encoding and decoding subtitles superimposed on a video display screen.
  • subtitles are employed to convey textual information to the viewer.
  • the subtitles accompany an audio/video broadcast and provide supplemental information to the viewer that may not be perceivable from the broadcast.
  • Subtitles are frequently used, for example, to aid hearing impaired viewers by displaying the spoken language recorded in the audio soundtrack as written language.
  • subtitles are displayed in different languages than the spoken language recorded in the audio soundtrack.
  • subtitles may be employed to convey important information not related to the subject matter of the corresponding audio/video broadcast.
  • subtitles may represent late-breaking news, such as: emergency information; sports scores; weather reports; and other important information.
  • In some systems, the subtitles are superimposed on the broadcast in advance and become an inseparable part of the video picture. In this situation, a viewer has no control to turn the subtitles on or off. This is disadvantageous where a viewer desires to video record the broadcast without the subtitles. For example, the viewer may be recording a televised movie when, suddenly, news subtitles are superimposed on the broadcast, thereby ruining the recording.
  • Previously superimposed subtitles are also undesirable because a plurality of languages cannot be selected. Where a viewer does not comprehend the subtitle language, the subtitles are annoying surplusage. Where the viewer also does not comprehend the spoken language, the broadcast is incomprehensible to the viewer.
  • One technique, Compact Disc Graphics (CD-G), provides more flexibility in displaying subtitles because it records graphics on a compact disc (CD) by using subcodes.
  • CD-G has a serious disadvantage because this technique is limited to CD applications.
  • the CD-G technique does not lend itself to other recording formats and, thus, to the vast majority of audio/video broadcasts which employ such other recording formats, such as video tape.
  • FIGS. 13a-c and 14 demonstrate that the CD-G technique is not suitable for use with broadcasting subtitles during real-time broadcasts.
  • an analysis of the data format employed by CD-G reveals that this technique requires a transmission lead-time of several seconds (10.24 s) which generally is unacceptable for most real-time broadcasts.
  • FIG. 13a depicts the CD-G data format in which one frame includes 1 byte of a subcode and 32 bytes of audio channel data. Of the 32 bytes, 24 bytes are allocated for L and R audio channel data (each channel having 6 samples with 2 bytes per sample) and 8 bytes are allocated to an error correction code.
  • the frames are grouped as a block of 98 frames (Frame 0, Frame 1, . . . , Frame 96 and Frame 97) as shown in FIG. 13b.
  • Eight blocks P,Q,R,S,T,U,V and W are transmitted as shown in FIG. 13c.
  • the subcodes for Frames 0 and 1 in each block are defined as sync patterns S0, S1, whereas the remaining 96 frames store various subcode data.
  • the first 2 blocks P, Q are allocated to search data employed for searching through record tracks; and graphic data can be allocated to the subcodes in the remaining 6 blocks R,S,T,U,V and W.
  • since each block of 98 frames is transmitted at a repeating frequency of 75 Hz, the subcode data transmission rate for the blocks is 75 × 98 bytes = 7,350 bytes/s (7.35 kHz).
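The frame and block arithmetic above can be sketched numerically. The constants come straight from the description; the function names are illustrative, not CD-G terminology:

```python
# CD-G frame/block structure as described in the text.
SUBCODE_BYTES_PER_FRAME = 1
AUDIO_BYTES_PER_FRAME = 32      # 24 bytes of L/R samples + 8 bytes of ECC
FRAMES_PER_BLOCK = 98
BLOCK_RATE_HZ = 75              # blocks transmitted per second

def frame_bytes():
    """Total bytes in one CD-G frame (subcode + audio channel data)."""
    return SUBCODE_BYTES_PER_FRAME + AUDIO_BYTES_PER_FRAME

def subcode_rate_bytes_per_s():
    """Subcode bytes per second: 75 blocks x 98 frames x 1 byte = 7350."""
    return BLOCK_RATE_HZ * FRAMES_PER_BLOCK * SUBCODE_BYTES_PER_FRAME
```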
  • the transmission format for transmitting the information present in blocks R,S,T,U,V and W is shown in FIG. 14.
  • Each of the 96 frames (2, 3, . . . , 97) of the 6 blocks (R,S,T,U,V and W) is arranged as a packet including 6 channels (R to W) of 96 symbols per channel.
  • the packet is further subdivided into 4 packs of 24 symbols each (symbol 0 to symbol 23), with each symbol representing a frame.
  • a CD-G character is made up of 6 × 12 pixels. Since each pack is 6 × 24, a 6 × 12 character is easily accommodated in each pack.
  • the CD-G format allocates the six channels (R,S,T,U,V and W) of the 12 symbols 8 to 19 to a character. The remaining symbols in each pack store information about the character.
  • Mode information is stored in the first 3 channels (R, S, T) of symbol 0 in each pack, and item information is stored in the last 3 channels (U, V, W) of symbol 0.
  • An instruction is stored in all of the channels of symbol 1.
  • Corresponding mode, item, parity or additional information for the instruction is stored in all of the channels of symbols 2 to 7.
  • Parity for all of the data in the channels of symbols 0 to 19 is stored in all of the channels of the last 4 symbols (symbols 20 to 23) of each pack.
  • the data is transmitted at a repeating frequency of 75 Hz. Therefore, a packet which contains 4 packs is transmitted at a rate of 300 packs per second (75 Hz × 4 packs). That is, with 1 character allocated to the range of 6 × 12 pixels, 300 characters can be transmitted in 1 second.
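The pack layout and character rate described above can be summarized in a short sketch. The dictionary keys are my own labels for the symbol ranges, not terms from the CD-G format:

```python
PACKS_PER_PACKET = 4
PACKET_RATE_HZ = 75                      # packets repeat at 75 Hz

# Allocation of the 24 symbols (0..23) within one pack, per the text.
PACK_LAYOUT = {
    "mode_item":   [0],                  # channels R,S,T = mode; U,V,W = item
    "instruction": [1],
    "extra_info":  list(range(2, 8)),    # mode/item/parity/additional info
    "character":   list(range(8, 20)),   # 12 symbols x 6 channels = 6x12 pixels
    "parity":      list(range(20, 24)),  # parity over symbols 0..19
}

def characters_per_second():
    """One character fits in one pack, so 75 Hz x 4 packs = 300 chars/s."""
    return PACKET_RATE_HZ * PACKS_PER_PACKET
```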
  • a CD-G screen, however, requires more than 300 characters.
  • a CD-G screen is defined as 288 horizontal picture elements × 192 vertical picture elements, which amounts to more than twice the 300 characters transmitted in 1 second.
  • the total transmission time for a 288 × 192 screen is, therefore, 2.56 seconds, as shown by the following equation: (288/6) × (192/12) ÷ 300 = 2.56 seconds.
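The transmission-time figure above works out numerically as follows; the variable names are illustrative:

```python
CHAR_W, CHAR_H = 6, 12            # pixels per CD-G character
SCREEN_W, SCREEN_H = 288, 192     # CD-G screen size in picture elements
CHARS_PER_SECOND = 300

# Characters needed to cover one screen: 48 columns x 16 rows = 768.
chars_per_screen = (SCREEN_W // CHAR_W) * (SCREEN_H // CHAR_H)

# At 300 characters/s, one full screen takes 768 / 300 = 2.56 seconds.
transmission_time_s = chars_per_screen / CHARS_PER_SECOND
```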
  • the CD-G system also suffers from defects in reproducing the subtitles.
  • the CD-G system displays subtitles only upon normal reproduction and not during special reproduction such as a fast forward or fast reverse reproduction.
  • CD-G pictures are also subject to a singing phenomenon (in which oblique portions of a character appear ragged) or flickering, because this system allocates only one bit of data to each picture element.
  • the lag time of the CD-G picture also prevents switching the subtitle display on or off at a high speed.
  • In one type of system (known as the CAPTAIN system), dot patterns, as well as character codes, represent the subtitles. This system, however, does not appear to be any better than the CD-G system and suffers from some of the same disadvantages: in both systems, the subtitles lack refinement because neither provides sufficient resolution in displaying the subtitles.
  • the CAPTAIN system, for example, was developed for a 248 (horizontal picture elements) by 192 (vertical picture elements) display, not for high-resolution video pictures of 720 × 480.
  • An object of the invention is to provide an encoding/decoding method and apparatus for encoding and decoding subtitles with a greater degree of flexibility.
  • a further object of the invention is to encode the subtitles separately from the video data so that the subtitles may be independently manipulated.
  • a further object of the invention is to decode the subtitles in real time so that the subtitles may be contemporaneously superimposed with a video picture.
  • An even further object of the invention is to provide a processor that controls the encoding/decoding of the subtitles by regulating the flow rate of subtitle data read out from a buffer, such that the subtitle data is contemporaneously combined with the corresponding video data.
  • the encoding apparatus of the present invention provides a subtitle generator for generating the subtitles for display with a respective video picture.
  • the subtitles are encoded into encoded subtitle data and the flow rate of the data is regulated by a buffer to be contemporaneous with the respective video picture encoded by a video encoder.
  • a buffer regulates the flow rate of the encoded subtitle data (i.e., the rate at which bits are read from the buffer) so as to contemporaneously combine the encoded subtitle data with the respective video picture decoded by a video decoder.
  • the encoded subtitle data is decoded into decoded subtitle data and a mixer superimposes the decoded subtitle data and the respective video picture.
  • the invention also provides a processor for controlling the encoding/decoding.
  • a respective one of several bit streams of subtitle data is selectively buffered; and a time display stamp indicates the time when the respective bit stream is to be decoded. Decoding of the respective bit stream is initiated during the time indicated by the time display stamp.
  • a mixer mixes the respective decoded bit stream with video picture data.
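A minimal sketch of the buffering scheme summarized above: several subtitle bit streams are queued, each tagged with a time display stamp, and a stream is released for decoding only once the current time reaches its stamp. Class and method names are assumptions for illustration, not terms taken from the patent:

```python
from collections import deque

class SubtitleBuffer:
    """Selectively buffers subtitle bit streams; each carries a time
    display stamp indicating when its decoding should begin."""

    def __init__(self):
        self._streams = deque()   # (time_display_stamp, bitstream) pairs

    def buffer_stream(self, time_display_stamp, bitstream):
        """Buffer one bit stream of subtitle data with its stamp."""
        self._streams.append((time_display_stamp, bitstream))

    def poll(self, current_time):
        """Return the next bit stream whose stamp has been reached,
        or None if decoding should not yet be initiated."""
        if self._streams and self._streams[0][0] <= current_time:
            return self._streams.popleft()[1]
        return None
```

In use, the decoder would call `poll` once per display interval and hand any returned bit stream to the mixer for superimposition with the video picture.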
  • FIG. 1 is a block diagram of a data decoding apparatus of the present invention;
  • FIG. 2 is a block diagram of the subtitle decoder depicted in FIG. 1;
  • FIG. 3 is a table of communications between the system controller of FIG. 1 and the controller of FIG. 2;
  • FIG. 4 is a table of parameters for the communications between components of FIG. 1 and FIG. 2;
  • FIGS. 5a to 5c are signal diagrams demonstrating data encoding of the present invention;
  • FIG. 6 is a color look up table referred to when encoding subtitle data;
  • FIGS. 7, 7a and 7b constitute a block diagram of the encoding apparatus of the present invention;
  • FIG. 8 is a graph for the explanation of a code buffer operation;
  • FIG. 9 is a block diagram describing the internal operation of the code buffer in FIG. 2;
  • FIG. 10 is an explanatory depiction of streams of subtitle data;
  • FIGS. 11a-d depict the relationship between video and subtitle data relative to an aspect ratio of a monitor;
  • FIG. 12 is a color look up table referred to when conducting a color wipe operation;
  • FIGS. 13a to 13c depict the arrangement of data according to a CD-G format; and
  • FIG. 14 depicts a transmission format of the data in the CD-G format.
  • the data decoding apparatus which incorporates the present invention is shown in FIG. 1 and decodes a reproduction signal to generate a video picture superimposed with subtitles.
  • the system controller 14 of the data decoding apparatus causes the reproduction signal to be processed and sent to a subtitle decoder 7.
  • the system controller communicates with the controller 35 (FIG. 2) of the subtitle decoder to decode the subtitles and combine the decoded subtitles with decoded video data.
  • the combined subtitle and video data are, then, prepared for display on a television screen.
  • a data decoder and demultiplexer 1 receives a digital reproduction signal from, for example, a VCR.
  • the data decoder and demultiplexer 1 error decodes the reproduction signal preferably employing an Error Correcting Code (ECC) technique and demultiplexes the error decoded reproduction signal into video, subtitle and audio data.
  • a memory 2 may be used, for example, as a buffer memory and a work area for the purpose of error decoding and demultiplexing the reproduction signal.
  • a video decoder 3 decodes the demultiplexed video data from a video data stream.
  • a memory 4 may be employed for the operation of decoding the video data similar to the operation of the memory 2 employed with data decoder and demultiplexer 1.
  • a letter box circuit 5 converts a video picture with a 4:3 aspect ratio (a squeeze mode) to a 16:9 letter box ratio. The conversion is performed using a 4 to 3 decimation process, whereby every four horizontal lines are decimated to three horizontal lines, thus squeezing the video picture into a 3/4 picture. According to the letter box format, a vertical resolution component is derived from the remaining 1/4 of the video picture which is employed to enhance the vertical resolution of the decimated video picture. A timing control memory 6 ensures that the 1/4 of the letter box picture is not transmitted. When the decoded video data generated by the video decoder 3 is already in a 16:9 letter box format, the letter box circuit bypasses the decimation operation and sends the decoded video data directly to the subtitle decoder 7.
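The 4-to-3 vertical decimation performed by the letter box circuit can be sketched as follows. This is a naive illustration under the assumption that each group of four horizontal lines is blended down to three; the patent does not specify the decimation filter, so the blending weights here are hypothetical:

```python
def letterbox_decimate(lines):
    """Reduce every 4 input rows to 3 output rows, squeezing the picture
    to 3/4 of its original height (the 4:3 -> 16:9 letter box conversion).
    `lines` is a list of rows, each row a list of pixel values."""
    out = []
    # Process whole groups of 4 lines; any remainder is dropped here.
    for i in range(0, len(lines) - len(lines) % 4, 4):
        a, b, c, d = lines[i:i + 4]
        # Hypothetical blend: each output line mixes two adjacent inputs.
        out.append([(3 * x + y) // 4 for x, y in zip(a, b)])
        out.append([(x + y) // 2 for x, y in zip(b, c)])
        out.append([(x + 3 * y) // 4 for x, y in zip(c, d)])
    return out
```

A real circuit would also derive a vertical resolution component from the discarded portion, as the text notes; that enhancement step is omitted from this sketch.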
  • the decoded subtitle data demultiplexed by the data decoder and demultiplexer 1 is directly sent to the subtitle decoder 7.
  • the subtitle decoder 7 decodes the subtitle data according to instructions from the system controller 14 and mixes the decoded subtitle data with the decoded video data.
  • a composite encoder 8 encodes the mixed subtitle data and video data into a suitable video picture format, such as NTSC/PAL.
  • a mode display 9 interfaces with a user and indicates, for example, the mode of the television monitor connected thereto.
  • a D/A converter 10 converts the encoded signal received from the composite encoder 8 into an analog signal suitable for display in the indicated mode, such as NTSC or PAL.
  • the audio portion of the audio/video signal decoded by the data decoder and demultiplexer 1 is decoded by an audio decoder 11 which decodes the demultiplexed audio data using a memory 12, for example.
  • the decoded audio data output from the audio decoder is converted into an analog audio signal appropriate for broadcast through a television monitor by a D/A converter 13.
  • the subtitle decoder 7 of FIG. 1 communicates with the system controller 14 through a controller 35 as shown in FIG. 2. This communication controls the subtitle decoding performed by the subtitle decoder. Definitions of the communication signals between the system controller 14 and the controller 35 will be discussed with reference to FIG. 3.
  • the system controller 14 sends a reset command to the controller 35 to reset the subtitle decoder 7 and sends command signals indicating an operation mode of the subtitle decoder to initialize it.
  • a special command is sent to the controller 35, for example, when a user indicates through the mode display 9 (FIG. 1) that special reproduction, such as a fast-forward or fast-reverse reproduction, is to be commenced.
  • the user may also turn the subtitles on or off through the mode display, causing the system controller to issue a display ON/OFF command to the subtitle decoder.
  • the user may also control the subtitle display position in the vertical direction relative to the video picture on the television monitor, causing the system controller to issue a u -- position value to the subtitle decoder.
  • the subtitle data is grouped into streams of data comprising bits.
  • Each bit stream corresponds to a portion of a page making up the entire subtitle picture for one picture frame.
  • the bit streams are applied to a word detector 20. Since the word detector selects which bits to forward to the code buffer 22, different types of bit streams may be applied to the word detector contemporaneously. In the preferred embodiment, for example, bit streams of both a normal playback mode and a fast-forward, or fast-reverse, mode (special reproduction) are applied to the word detector.
  • the word detector 20 selects the channel indicated by a channel -- select signal sent from the system controller 14 and receives the appropriate bit streams.
  • the system controller 14 also issues a stream -- select signal to instruct the word detector 20 to select either the normal playback mode bit streams or the special reproduction mode bit streams. Thus, a viewer can switch between a normal playback mode and a special reproduction mode without delay.
  • the word detector 20 is also responsible for detecting both header and header -- error information received in the selected bit streams.
  • the header and header -- error information are sent as information signals, s.header and header -- error, to the system controller 14 (via the controller 35) for further processing.
  • error data representing a detected error is sent as a data error signal to the system controller 14 when the word detector detects errors in the bit stream subtitle data. If the data cannot be restored, a buffer clear signal is sent from the system controller to the controller and the erroneous subtitle data is dumped.
  • a scheduler 21 is provided to ensure that the data received from the demultiplexer 1 (FIG. 1) does not overflow the code buffer 22.
  • the scheduler controls read/write access to the code buffer by determining a bandwidth for an I/O port (not shown) which receives the bit streams selected by the word detector.
  • the bandwidth refers to the number of parallel bits supplied to the I/O port at one time and is calculated by dividing the rate at which the demultiplexer demultiplexes data by the rate at which data is read from the code buffer. For example, a data rate from the demultiplexer of 20 Mbps divided by a 2.5 Mbps rate of data read from the code buffer is equal to 8 bits. Therefore, the scheduler will set the I/O port to receive 8 bits in parallel in order to maintain a consistent flow rate of data into and out of the code buffer.
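The scheduler's bandwidth calculation reduces to a single division, reproduced here as a sketch (the function name is hypothetical):

```python
def io_port_width_bits(demux_rate_bps, buffer_read_rate_bps):
    """Number of parallel bits the scheduler assigns to the code-buffer
    I/O port: the demultiplexing rate divided by the code-buffer read rate."""
    return demux_rate_bps // buffer_read_rate_bps

# The example from the text: a 20 Mbps demultiplexer rate divided by a
# 2.5 Mbps code-buffer read rate gives an 8-bit-wide I/O port.
width = io_port_width_bits(20_000_000, 2_500_000)  # -> 8
```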
  • a read operation is commenced in real time and is triggered when the code buffer receives a decode start command from the system controller 14.
  • the timing for the reading is determined from horizontal and vertical sync signals stored in the headers of the subtitle data detected by the word detector 20.
  • the reading rate should correspond to a picture element sampling rate, preferably 13.5 MHz.
  • the subtitle data preferably is written into the code buffer at a rate of 2.5 MHz or more.
  • the 13.5 MHz sampling clock is divided into four clock cycles of 3.375 MHz each. One of these 3.375 MHz clock cycles is allocated to writing (because writing requires at least 2.5 MHz) and the remaining three clock cycles are allocated to reading data from the code buffer thus satisfying the requirement for real time display.
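The clock allocation above can be checked with a little arithmetic; the constant names below are illustrative:

```python
# The 13.5 MHz pixel sampling clock is divided into four 3.375 MHz cycles:
# one cycle is allocated to writing (which requires at least 2.5 MHz) and
# the remaining three to reading data from the code buffer.
SAMPLING_CLOCK_HZ = 13_500_000
SLOT_RATE_HZ = SAMPLING_CLOCK_HZ // 4        # 3,375,000 Hz per cycle
WRITE_SLOTS, READ_SLOTS = 1, 3

WRITE_RATE_HZ = WRITE_SLOTS * SLOT_RATE_HZ   # 3.375 MHz >= required 2.5 MHz
READ_RATE_HZ = READ_SLOTS * SLOT_RATE_HZ     # 10.125 MHz for real-time display
```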
  • each subtitle picture element may comprise six bits, which is more than sufficient to achieve a high quality of resolution for the subtitles.
  • a duration signal and a PTS signal are retrieved by the controller 35 when it is deemed that data will be read from the code buffer.
  • the duration signal indicates the duration that the subtitle data lasts and the PTS signal indicates the proper time that the subtitle data is to be superimposed with the video data.
  • the controller times the display of the subtitles using an internal system clock reference (SCR).
  • the system controller 14 sends the display ON command to the controller 35.
  • the system controller sends the display OFF signal as a subtitle decode termination signal to the controller 35 upon termination of the subtitle display.
  • the system controller may also initiate a special reproduction operation in the subtitle decoder by sending a special command to the controller 35.
  • the controller sends back an acknowledge signal (special -- ack), acknowledging that special reproduction is to be initiated.
  • the word detector must select bit streams at a special reproduction rate.
  • the code buffer will read out bit streams at a special reproduction rate.
  • the system clock reference SCR can be altered by adding or subtracting clock pulses. Subtraction pulses are created at an n times rate corresponding to the rate of fast-feeding or fast-reverse feeding.
  • the special reproduction operation may also correspond to a pause operation, wherein no subtraction pulses are created; and instead, an identical frame is continuously read from the code buffer repeatedly.
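As a toy model of this clock alteration (the function and its `speed` factor are hypothetical simplifications, not the patent's pulse-generation circuit):

```python
def advance_scr(scr, pulses, speed):
    """Advance the system clock reference (SCR) for special reproduction.

    speed == 1 models normal play; speed == n > 1 models n-times
    fast-forward or fast-reverse feeding, where pulses are applied at an
    n-times rate; speed == 0 models a pause, where no pulses alter the SCR
    and the same frame is read from the code buffer repeatedly.
    """
    return scr + pulses * speed
```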
  • Decoding of the subtitles also ends when the subtitle decoder 7 determines that an end of page (EOP) of the video picture is reached.
  • the system controller 14 sends a repeat time signal to the controller 35 which indicates the length of a page.
  • a run-length circuit 24 includes a counter and sends a display end signal to the controller 35 when the count value of the counter reaches the value indicated by the repeat time signal. The controller 35 thus determines that the repeat time is reached and stops reading from the code buffer.
  • the code buffer preferably stores two pages of subtitle data because one page will be read as another page is written into the code buffer.
  • the controller 35 issues a buffer overflow signal to the system controller 14 when an overflow of the code buffer occurs.
  • An overflow can be determined when the controller receives the display end signal from the run-length circuit 24 before the word detector 20 receives an end of page (EOP) signal on the following page.
  • the system controller 14 withholds transfer of subtitle data from the data decoder and demultiplexer 1 (FIG. 1) to the word detector to prevent an overflow of the code buffer.
  • the stream -- select signal from the system controller 14 designates the streams of subtitle data and the display start position is updated on every frame. Thus, after an overflow condition has passed, the next stream will be written into the code buffer and displayed at the correct display start position.
  • FIG. 8 graphically demonstrates the data flow into and out of the code buffer 22.
  • the t-axis (abscissa) represents time, while the D-axis (ordinate) represents a data size for each page of data.
  • the gradient (rise/run) represents the data flow rate of the subtitles into the code buffer.
  • Graph (C) represents the data flow of the subtitle data.
  • the vertical portions of graph (C) indicate a transfer of subtitle data from the code buffer when the display time stamp (PTS) is aligned with the synchronizing clock (SCR) generated internally by the subtitle decoder 7.
  • the horizontal portions of the graph (C) indicate the transfer of subtitle data into the code buffer.
  • the previous page of subtitle data is transferred from the code buffer and page (S0) is written into the code buffer.
  • the subtitle data of page (S0) is transferred out of the code buffer and page (S1) is written in.
  • the remaining pages (S2), (S3) are written into and read out of the code buffer as indicated.
  • An underflow condition exists when the code buffer has completed reading the subtitle data for an entire page and no further data exists in the code buffer.
  • a code buffer with a capacity of two pages is depicted by the "code buffer size" line in FIG. 8.
  • an underflow would appear in FIG. 8 as one of the vertical portions of line (C) which extends below the lower limit of the code buffer.
  • an overflow condition is graphically depicted in FIG. 8 when the subtitle data read into the code buffer is too large, i.e., the horizontal portion of line (C) extends beyond line (B).
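The overflow and underflow conditions of FIG. 8 can be modeled with a minimal buffer sketch; the class, its method names, and the byte-count units are illustrative assumptions:

```python
class CodeBuffer:
    """Toy model of the two-page code buffer of FIG. 8."""

    def __init__(self, page_size, pages=2):
        self.capacity = page_size * pages
        self.fill = 0

    def write(self, nbytes):
        # Horizontal portions of graph (C): subtitle data streams in.
        if self.fill + nbytes > self.capacity:
            raise OverflowError("code buffer overflow")
        self.fill += nbytes

    def read_page(self, nbytes):
        # Vertical portions of graph (C): a page is read out when the
        # display time stamp (PTS) aligns with the synchronizing clock (SCR).
        if nbytes > self.fill:
            raise ValueError("code buffer underflow")
        self.fill -= nbytes
```

One page can stream in while the previous page awaits display; writing a third page before any read would raise the overflow, mirroring the condition where the horizontal portion of line (C) extends beyond line (B).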
  • the code buffer must also perform delay compensation, especially where an external memory is employed, for decoding the video data. The delay compensation is achieved by controlling the timing of the decode start command from the system controller 14.
  • When the controller 35 of the subtitle decoder 7 sends the display time stamp (PTS) to the system controller upon writing the subtitle data to the code buffer 22, the system controller, in response, sends the decode start instruction to the controller 35.
  • the system controller 14 delays the decode start command by a time equal to the processing of a letter box picture (approximately one field) and a delay caused by video decoding at the instant the synchronizing clock of the controller (SCR) is aligned with the display time stamp (PTS). Delay compensation is particularly useful, since the video, audio and subtitle data are multiplexed on the premise that the decode delay in each of the video, audio and subtitle data signals is zero in the data encoding apparatus.
  • the inverse run-length circuit 24 conducts run-length decoding by generating the level of data from the number of run data elements.
  • the VLC circuit 23 and the run-length circuit 24 decompress the subtitle data which had been stored as compressed data in the code buffer 22.
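The inverse run-length step can be sketched as follows; the (run, level) pair representation is an assumed simplification of the patent's actual bit-stream syntax:

```python
def run_length_decode(pairs):
    """Expand (run, level) pairs into a flat row of pixel levels.

    The compressed stream records how many consecutive pixels share a
    level; the decoder regenerates the pixel row from those counts.
    """
    out = []
    for run, level in pairs:
        out.extend([level] * run)
    return out

# One scan line in the style of the letter "A" fill data:
# background, stroke at level E0 (hex), background.
line = run_length_decode([(3, 0x0), (4, 0xE0), (3, 0x0)])
```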
  • the decompressed subtitle data is then sent to a 3:4 filter 25.
  • the 3:4 filter receives an xsqueeze signal from the system controller 14 indicating the aspect ratio of the corresponding television monitor. Where the signal indicates that the monitor has a 4:3 aspect ratio, the 3:4 filter applies 3:4 filtration processing to the subtitle data to match the size of the subtitles to the size of a (16:9) video picture as shown in FIGS. 11c, d.
  • the controller 35 reads 90 pixels worth of subtitle data from the code buffer 22 before the H sync pulse is generated.
  • the 3:4 filter is bypassed as shown in FIGS. 11a, b.
  • a color look-up table 26 (which stores luminance data Y, color difference data Cr, Cb, background video data, and key data K representing a data mixing ratio for the Y, Cr and Cb color components) receives the subtitle data from the 3:4 filter 25.
  • FIG. 6 shows an example of a color look-up table where the components Y, Cr, Cb and K are arranged according to the addresses 0 . . . F (hexadecimal).
  • the color look-up table is employed to generate the correct color for each pixel of the subtitle characters. That is, the luminance value Y and the color difference values Cr, Cb for a particular pixel are mixed according to the ratio specified by the key data K.
  • a mixer 34 (FIG. 2) mixes the pixel from color look-up table 26 with video data from the video decoder 3 (FIG. 1). The resulting mixed data represents a video picture with superimposed subtitles and is ready to be output to a television monitor.
  • Background video data is incorporated in the arrangement of the color look-up table.
  • address 0 of the look-up table includes key data K having the value of 00 H; which means that the subtitle data will not be seen and the background video data will manifest, as shown by regions T1 and T5 in FIG. 5c.
  • Addresses 1 to 6 of the look-up table include values of the key data K which increase linearly (20, 40 . . . C0 hexadecimal); which means that the subtitle pixels according to these addresses are mixed with the background data as shown by the regions T2 and T4 in FIG. 5c.
  • addresses 8 to F of the look-up table include values of key data K of E0; which means that the components Y, Cr and Cb are mixed without any background video data as shown by region T3 in FIG. 5c.
  • the color look-up table data is generated from the system controller and is previously downloaded to the CLUT circuit before decoding. With the color look-up table, the filtered subtitle data is transformed into the appropriate color pixel for display on the television monitor.
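The key-data mixing performed via the color look-up table can be approximated as a linear blend of subtitle and background components; the normalization of K (here against 0xFF) and the function name are assumptions, not the patent's exact mixing circuit:

```python
def mix_pixel(clut_entry, background, key_max=0xFF):
    """Mix one subtitle pixel with the background video per the key data K.

    `clut_entry` is a (Y, Cr, Cb, K) tuple looked up from a table like
    that of FIG. 6; `background` is the (Y, Cr, Cb) of the video pixel.
    """
    y, cr, cb, k = clut_entry
    bg_y, bg_cr, bg_cb = background
    ratio = k / key_max
    return (
        round(y * ratio + bg_y * (1 - ratio)),
        round(cr * ratio + bg_cr * (1 - ratio)),
        round(cb * ratio + bg_cb * (1 - ratio)),
    )

# K = 0x00 (address 0): the subtitle is invisible and the background
# manifests, as in regions T1 and T5 of FIG. 5c.
# K at its maximum: pure subtitle colour with the background muted (T3).
```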
  • Color wiping is a display technique which "overlaps" previously displayed elements, such as subtitles, with another color, usually by performing the overlay in a left-to-right progression.
  • a viewer has control over the display of the subtitle through the mode display 9.
  • the system controller 14 upon command from the user, sends a control signal to the mixer 34 (FIG. 2), turning the subtitles on or off. Since the present invention generates subtitles in real time, the user does not experience any unpleasant delay when turning the subtitles on or off.
  • the subtitles can be controlled, by the user or otherwise, to fade-in/fade out at a variable rate. This is achieved by multiplying a fade coefficient to the pattern data representing the subtitles at a designated speed.
  • This function also allows an editor of the subtitles to present viewers with different sensations according to the broadcast audio/video picture. For example, news information may be "flashed" rapidly to draw the viewer's attention, whereas subtitles in a movie might "softly" appear in order not to detract from the enjoyment of the movie.
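A fade-in at a variable rate amounts to scaling the pattern data by a per-frame fade coefficient; the linear ramp and frame counts below are illustrative choices, not values from the patent:

```python
def fade_frames(pattern, n_frames, fade_in=True):
    """Fade subtitle pattern data in (or out) over n_frames frames by
    multiplying a per-frame fade coefficient into the pattern data."""
    frames = []
    for f in range(1, n_frames + 1):
        coeff = f / n_frames if fade_in else 1 - f / n_frames
        frames.append([round(p * coeff) for p in pattern])
    return frames

# A rapid "flash" uses few frames; a movie subtitle that appears "softly"
# would simply use a larger n_frames for a slower ramp.
flash = fade_frames([100, 200], 2)
```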
  • the mixer 34 is also operable for positioning the subtitles within the video picture. This is achieved by a u -- position signal sent from the system controller 14 to the mixer via controller 35 which designates the vertical direction for display on the screen. It will be noticed that the u -- position value may be varied, either by a user or otherwise. This provides additional control over the position of the subtitles and a user is free to place the subtitles anywhere along a vertical axis.
  • the decoding apparatus of the present invention may be practiced with the parameters for the different signals shown in FIG. 4.
  • the present invention is not limited to the parameters set forth in that figure and may be employed in different video systems.
  • the subtitle decoder 7 may be thought of as the subtitle decoder buffer model in FIG. 9.
  • the code buffer 22-1 accumulates streams of subtitle data until at least one page of subtitle data is accumulated in the code buffer.
  • the subtitle data for one page is transferred from the code buffer 22-1 to the display memory 22-2 (which acts as a buffer for the subtitle decoder) when the display time stamp (PTS) is aligned with the synchronizing clock (SCR).
  • the subtitles are transferred to the IVLC or run-length decoding section 23, 24 for decoding.
  • the headers of the bit streams are separated therefrom by a parser 22-3 and forwarded to the inverse variable-length code or run-length decoder 23, 24 during a vertical blanking period (V).
  • the decoded subtitle data is filtered by filter 25 and color adjusted according to the color look-up table circuit 26.
  • the streams applied to the code buffer 22-1 include subtitles for both normal and special reproduction, such as a fast-forward or fast-reverse mode.
  • the code buffer selectively writes the streams therein according to the stream -- select information supplied from the system controller 14 to select either the normal or special reproduction streams as will now be described.
  • FIG. 10 demonstrates the order of the streams for both normal and special reproduction.
  • the t-axis represents the time at which a frame of subtitle streams is written into the code buffer 22.
  • a frame includes streams which make up a page during normal play and streams that make up a page for special (or trick) play.
  • Streams (1) through (7) for example, make up one page of subtitle data for normal play.
  • These normal-play streams are written into the code buffer at a time along the t-axis corresponding to an "entry point".
  • the streams for special play (referred to in FIG. 10 as "trick play") are staggered in-between the streams for normal play as shown in the figure.
  • the code buffer selects between the streams of normal and special play depending upon the stream -- select signal sent from the system controller 14.
  • This arrangement is advantageous because pages for both normal and special reproduction are applied to the code buffer at the same time. That is, the mode of the subtitle decoder 7 can be changed instantly from normal to special reproduction, and the viewer experiences no lapse when subtitles are displayed first in a normal mode and then in a special mode, such as fast-forward reproduction.
  • The encoding technique employed in the present invention will be described in more particular detail with reference to FIGS. 5a, 5b and 5c and FIG. 6.
  • the technique for encoding the letter "A" of FIG. 5a will be explained.
  • the letter "A" is scanned along successive horizontal lines and the fill data of FIG. 5b is generated for the letter "A" along each horizontal line.
  • the level "E0" demarks the highest level for recreating a color pixel from the color look-up table shown in FIG. 6, whereas level "0" represents a lack of subtitle data.
  • the key data (K) determines the degree to which the fill data is mixed with background video.
  • Regions T1 and T5 of the key data correspond to areas in the video picture that are not superimposed with the fill data; therefore, these areas are designated as level 0 as indicated by address 0 in FIG. 6.
  • Regions T2 and T4 are mixed areas where the subtitles are gradually mixed with the background video picture so that the subtitles blend into the background video picture and do not sharply contrast therewith. Any of the fill data in this area is stored in addresses 1 through 6.
  • the main portion of the letter "A" is displayed within the T3 region where the background information is muted.
  • the subtitle information in region T3 is stored as addresses 7 to F hexadecimal.
  • the video camera 51 generates the video signal and supplies the same to a video encoding unit 52 which converts the video signal from analog to digital form.
  • the digitized video signal is then compressed for video transmission and forwarded to a rate controller 52a, which controls the rate that the compressed video data is transferred to the multiplexer in synchronism with the rate that the subtitles are sent to the multiplexer.
  • the compressed video data is combined with the subtitle data at the correct time.
  • audio information is obtained by the microphone 53 and encoded by an audio encoding unit 54 before being sent to the multiplexer.
  • the audio encoding unit does not necessarily include a rate controller because the audio data is ultimately recorded on a different track or transmitted over a different channel from the video data.
  • the subtitles are generated by either character generator 55 or flying spot scanner 56.
  • the character generator includes a monitor and a keyboard which allows an operator to manually insert subtitles into a video picture.
  • the operator edits the subtitles by typing the subtitles through the keyboard.
  • the flying spot scanner 56 is provided in the situation where subtitles are already provided in an external video picture.
  • the flying spot scanner scans the video picture and determines where the subtitles are positioned and extracts them from the video picture.
  • the subtitles from the flying spot scanner are pre-processed by the processing circuit 63 to conform with subtitles generated by the character generator and forwarded to the subtitle encoding circuit.
  • the subtitle data from either the character generator 55 or the processing circuit 63 are then selected for compression.
  • the character generator outputs blanking data, subtitle data and key data.
  • the subtitle data and key data are forwarded to a switch 61 which is switched according to a predetermined timing to select either the subtitle or key data.
  • the selected data from switch 61 is filtered by a filter 72 and supplied to another switch 62.
  • Switch 62 switches between the blanking data, the filtered data from the character generator and the processed data from the flying spot scanner. When it is determined that no subtitles are present, the blanking data is chosen by the switch 62. Where subtitles are present, the switch 62 chooses between the character generator data or the flying spot scanner data accordingly.
  • the selected data is then quantized by a quantization circuit 64, using a quantization based on data fed back from a subtitle buffer verifier 68.
  • the quantized data, which may be compressed data, are supplied to a switch 69 and, during normal operation, forwarded to a differential pulse code modulation (DPCM) circuit 65 for pulse code modulation.
  • the modulated data is run-length encoded by a run-length coding circuit 66 and variable-length encoded by a variable-length encoding circuit 67 and forwarded to the subtitle buffer verifier 68 for final processing before being sent to the multiplexer 58.
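The run-length coding stage of circuit 66 can be sketched as follows, assuming a simple (run, level) pairing; the DPCM stage (circuit 65) and the variable-length stage (circuit 67) are omitted for brevity:

```python
def run_length_encode(levels):
    """Collapse consecutive equal pixel levels into (run, level) pairs,
    the inverse of the decoder's run-length expansion."""
    pairs = []
    for level in levels:
        if pairs and pairs[-1][1] == level:
            pairs[-1][0] += 1      # extend the current run
        else:
            pairs.append([1, level])  # start a new run
    return [tuple(p) for p in pairs]

# A scan line with long flat runs (background / stroke / background)
# compresses to just three pairs.
encoded = run_length_encode([0, 0, 0, 14, 14, 14, 14, 0, 0, 0])
```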
  • the subtitle buffer verifier 68 assembles a load block which includes the encoded subtitle data.
  • the frame of the load block is generated by a loading block creator 70 and is referenced by the subtitle buffer verifier in assembling the data into the load block.
  • the subtitle buffer verifier references the load block by causing switch 69 to switch from the output of the quantization circuit 64 to the output of the loading block creator 70.
  • the loading block creator creates the load block in part with reference to the color look-up table in a color look-up table 71. For purposes of decoding, the color look-up table is forwarded directly to the subtitle buffer verifier and transferred to the multiplexer as part of the load block.
  • the subtitle buffer verifier 68 also prepares a header for the subtitle data which contains information indicating whether the data is to be decoded upon normal or special reproduction. Specifically, the subtitle display time (display duration) is determined either from the PTS signals at 90 kHz accuracy, from the upper several bits of those signals at 90 kHz, or from signals synchronized with the video vertical sync pulse. The header also indicates the subtitle display time as determined from the display start/termination time for the particular subtitle. The amount of information, display position, fade-in information and fade-out information are also stored in the header for transmission with the load block. The subtitle buffer verifier 68 also loads control information such as: normal/trick play information; position information; subtitle encoding information; time code information; EOP information; and an upper limit value.
  • the subtitle buffer verifier 68 verifies that the buffer is sufficiently filled with data without overflowing. This is done by feeding back a control signal (referred to in FIG. 7A as a filter signal) to the quantization circuit 64.
  • the control signal changes the quantization level of the quantization circuit, thereby changing the amount of data encoded for a particular subtitle. By increasing the quantization level, the amount of data required for the subtitle data is reduced and the bit rate of data flowing to the subtitle buffer verifier is consequently reduced.
  • the control signal decreases the quantization level and the amount of data output from the quantization circuit increases, thereby filling the subtitle buffer verifier.
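The feedback from the buffer verifier to the quantization circuit is essentially a rate-control loop; the single-step adjustment policy and the thresholds below are illustrative assumptions, not the patent's control law:

```python
def adjust_quantization(q_level, buffer_fill, upper_limit, lower_limit):
    """Feedback control from the subtitle buffer verifier to the quantizer.

    When the buffer fill nears overflow, the quantization level is raised
    (coarser quantization, fewer bits per subtitle, lower bit rate); when
    the buffer runs low, the level is lowered so more data flows in.
    """
    if buffer_fill > upper_limit:
        return q_level + 1             # coarser -> reduce bit rate
    if buffer_fill < lower_limit:
        return max(q_level - 1, 0)     # finer -> increase bit rate
    return q_level                     # within bounds: leave unchanged
```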
  • the subtitles may also be controlled by color wiping.
  • a wipe lever 81 is provided for an operator who operates the lever to control the color wiping of the subtitles.
  • An adapter 82 adapts the analog signals of the wipe lever to R,G,B color data.
  • the color data is forwarded to the loading block creator 70 to employ the color wiping look-up table in FIG. 12 instead of the normal color look-up table in FIG. 6.
  • the operator is also provided with a monitor 84 which displays the subtitles supplied thereto by a switcher 83 as they are color wiped.
  • the subtitle buffer verifier 68 may be considered to be symmetrical (meaning that the encoding and decoding circuits employ the same components, but in a reverse order) with the code buffer 22 (FIG. 8). That is, the subtitle buffer verifier accumulates streams of subtitle data for at least one page of subtitles and transfers each page to display buffer 22-2 when the system clock reference (SCR) is aligned with the subtitle display time stamp (PTS). In this manner, pages of subtitle data are forwarded to the multiplexer 58 for multiplexing with the audio/video data. The multiplexed data is then recorded on an optical disc 91, or transmitted to a television receiver or recorded on other suitable media.
  • the present invention provides a flexible encoding/decoding method and apparatus that encodes and decodes subtitles to be superimposed on video pictures in real time.
  • the subtitles are also manipulated during encoding, providing a different appearance for the subtitles with different video pictures.
  • the invention may also be employed to generate subtitle codes instead of actual text, allowing a receiving decoder to change between different languages. It will be appreciated that the present invention is applicable to other applications, such as interactive video where users can be singled out for special messages. It is, therefore, to be understood that, within the scope of the appended claims, the invention may be practiced otherwise than as specifically described herein.

Landscapes

  • Engineering & Computer Science (AREA)
  • Multimedia (AREA)
  • Signal Processing (AREA)
  • Human Computer Interaction (AREA)
  • Computer Graphics (AREA)
  • Studio Circuits (AREA)
  • Television Signal Processing For Recording (AREA)
  • Compression Or Coding Systems Of Tv Signals (AREA)
  • Error Detection And Correction (AREA)
  • Signal Processing For Digital Recording And Reproducing (AREA)
  • Color Television Systems (AREA)
  • Television Systems (AREA)
  • Compression, Expansion, Code Conversion, And Decoders (AREA)
US08/618,515 1995-03-20 1996-03-19 Subtitle encoding/decoding method and apparatus Expired - Lifetime US5731847A (en)

Applications Claiming Priority (4)

Application Number Priority Date Filing Date Title
JP8595695 1995-03-20
JP7-085956 1995-03-20
JP7295990A JPH08322046A (ja) Data encoding/decoding method and apparatus, and encoded-data recording medium
JP7-295990 1995-10-20

Publications (1)

Publication Number Publication Date
US5731847A true US5731847A (en) 1998-03-24

Family

ID=26426962

Family Applications (1)

Application Number Title Priority Date Filing Date
US08/618,515 Expired - Lifetime US5731847A (en) 1995-03-20 1996-03-19 Subtitle encoding/decoding method and apparatus

Country Status (10)

Country Link
US (1) US5731847A (fr)
EP (2) EP0734180B1 (fr)
JP (1) JPH08322046A (fr)
KR (1) KR100381989B1 (fr)
CN (2) CN1117485C (fr)
AT (2) ATE240627T1 (fr)
AU (1) AU706455B2 (fr)
CA (1) CA2172011C (fr)
DE (2) DE69628076T2 (fr)
MY (1) MY115006A (fr)

Cited By (25)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US5847770A (en) * 1995-09-25 1998-12-08 Sony Corporation Apparatus and method for encoding and decoding a subtitle signal
US20020087569A1 (en) * 2000-12-07 2002-07-04 International Business Machines Corporation Method and system for the automatic generation of multi-lingual synchronized sub-titles for audiovisual data

Families Citing this family (13)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
GB9716545D0 (en) * 1997-08-06 1997-10-08 Nds Ltd A method and apparatus for switching between digital bit-streams
JP3039472B2 (ja) 1997-08-29 2000-05-08 NEC Corporation Image and audio reproducing apparatus
JP2000100073A (ja) * 1998-09-28 2000-04-07 Sony Corp Recording apparatus and method, reproducing apparatus and method, recording medium, and providing medium
JP3670934B2 (ja) 2000-06-01 2005-07-13 Sanyo Electric Co., Ltd. Method of displaying character data in a digital television broadcast receiver
KR100641848B1 (ko) * 2000-11-02 2006-11-02 Yugen Kaisha Fujiyama Digital video content distribution system, reproduction method, and recording medium storing the reproduction program
KR100716970B1 (ko) * 2003-12-08 2007-05-10 Samsung Electronics Co., Ltd. Trick play method for digital storage media and digital storage media drive suitable therefor
JP4189883B2 (ja) * 2004-06-24 2008-12-03 International Business Machines Corporation Image compression apparatus, image processing system, image compression method, and program
KR101061115B1 (ko) * 2004-08-13 2011-08-31 LG Electronics Inc. Digital broadcast receiver and subtitle data processing method thereof
CN101112096A (zh) 2004-12-02 2008-01-23 Sony Corporation Encoding apparatus and method, decoding apparatus and method, program, recording medium, and data structure
KR100615676B1 (ko) * 2005-01-11 2006-08-25 Samsung Electronics Co., Ltd. Content reproducing apparatus and GUI screen display method thereof
JP5201692B2 (ja) * 2006-06-09 2013-06-05 Thomson Licensing System and method for closed captioning
KR101158436B1 (ko) * 2006-06-21 2012-06-22 LG Electronics Inc. Method for controlling synchronization of a digital broadcast with supplementary information, and digital broadcast terminal implementing the same
JP5444611B2 (ja) * 2007-12-18 2014-03-19 Sony Corporation Signal processing apparatus, signal processing method, and program

Citations (1)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US5065143A (en) * 1988-09-26 1991-11-12 Apple Computer, Inc. Apparatus for converting an RGB signal into a composite video signal and its use in providing computer generated video overlays

Family Cites Families (4)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
AU682045B2 (en) * 1993-06-10 1997-09-18 Sony Corporation Rational input buffer arrangements for auxiliary information in video and audio signal processing systems
DE69432685T2 (de) * 1993-06-30 2004-04-08 Sony Corp. Recording medium
US5461619A (en) * 1993-07-06 1995-10-24 Zenith Electronics Corp. System for multiplexed transmission of compressed video and auxiliary data
US5684542A (en) * 1993-12-21 1997-11-04 Sony Corporation Video subtitle processing system


Cited By (57)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US20030117529A1 (en) * 1995-07-21 2003-06-26 Wiebe De Haan Method of video information transmission, as well as an information carrier, a device for receiving and a device for transmitting video information
US8594204B2 (en) 1995-07-21 2013-11-26 Koninklijke Philips N.V. Method and device for basic and overlay video information transmission
US8588312B2 (en) 1995-07-21 2013-11-19 Koninklijke Philips N.V. Method and device for information transmission with time information for controlling a period of time for overlay information display
US20110234907A1 (en) * 1995-07-21 2011-09-29 Koninklijke Philips Electronics N.V. Method of video information transmission, as well as an information carrier, a device for receiving and a device for transmitting video information
US5847770A (en) * 1995-09-25 1998-12-08 Sony Corporation Apparatus and method for encoding and decoding a subtitle signal
USRE42441E1 (en) * 1996-06-21 2011-06-07 Lg Electronics Inc. Apparatus and method for an additional contents display of an optical disc player
USRE44651E1 (en) 1996-06-21 2013-12-17 Lg Electronics Inc. Apparatus and method for an additional contents display of an optical disc player
USRE44382E1 (en) * 1996-06-21 2013-07-16 Lg Electronics Inc. Character display apparatus and method for a digital versatile disc
US8756644B2 (en) 1997-03-21 2014-06-17 Inventor Holdings, Llc System and method for supplying supplemental audio information for broadcast television programs
US8402500B2 (en) 1997-03-21 2013-03-19 Walker Digital, Llc System and method for supplying supplemental audio information for broadcast television programs
US6801709B1 (en) * 1997-07-19 2004-10-05 Samsung Electronics Co., Ltd. Apparatus and method for synchronously decoding video data and sub-picture data in DVD player
US6901207B1 (en) * 2000-03-30 2005-05-31 Lsi Logic Corporation Audio/visual device for capturing, searching and/or displaying audio/visual material
US8644672B2 (en) * 2000-06-02 2014-02-04 Sony Corporation Apparatus and method for image coding and decoding
US8625958B2 (en) * 2000-06-02 2014-01-07 Sony Corporation Apparatus and method for image coding and decoding
US20070147789A1 (en) * 2000-06-02 2007-06-28 Sony Corporation Apparatus and method for image coding and decoding
US20070206932A1 (en) * 2000-06-02 2007-09-06 Sony Corporation Apparatus and method for image coding and decoding
US20070206930A1 (en) * 2000-06-02 2007-09-06 Sony Corporation Apparatus and method for image coding and decoding
US8625959B2 (en) * 2000-06-02 2014-01-07 Sony Corporation Apparatus and method for image coding and decoding
US20070053665A1 (en) * 2000-06-02 2007-03-08 Sony Corporation Apparatus and method for image coding and decoding
US20020087569A1 (en) * 2000-12-07 2002-07-04 International Business Machines Corporation Method and system for the automatic generation of multi-lingual synchronized sub-titles for audiovisual data
US7117231B2 (en) * 2000-12-07 2006-10-03 International Business Machines Corporation Method and system for the automatic generation of multi-lingual synchronized sub-titles for audiovisual data
US7400820B2 (en) 2001-04-27 2008-07-15 Matsushita Electric Industrial Co., Ltd. Signal processing apparatus and signal processing method for locating a lost position of auxiliary data
US7486876B2 (en) * 2001-11-29 2009-02-03 Samsung Electronics Co., Ltd. Optical recording medium and apparatus and method to play the optical recording medium
US20030099464A1 (en) * 2001-11-29 2003-05-29 Oh Yeong-Heon Optical recording medium and apparatus and method to play the optical recording medium
US20040168203A1 (en) * 2002-12-12 2004-08-26 Seo Kang Soo Method and apparatus for presenting video data in synchronization with text-based data
US20040141714A1 (en) * 2003-01-17 2004-07-22 Minolta Co., Ltd. Apparatus and method for processing a moving picture including a frame having information added thereto
US8978090B2 (en) * 2003-03-12 2015-03-10 Qualcomm Incorporated Multimedia transcoding proxy server for wireless telecommunication system
US20040179605A1 (en) * 2003-03-12 2004-09-16 Lane Richard Doil Multimedia transcoding proxy server for wireless telecommunication system
US7555207B2 (en) * 2003-11-10 2009-06-30 Samsung Electronics Co., Ltd. Storage medium storing text-based subtitle data including style information, and apparatus and method of playing back the storage medium
US8649661B2 (en) 2003-11-10 2014-02-11 Samsung Electronics Co., Ltd. Storage medium storing text-based subtitle data including style information, and apparatus and method of playing back the storage medium
US20080303945A1 (en) * 2003-11-10 2008-12-11 Samsung Electronics Co., Ltd. Storage medium storing text-based subtitle data including style information, and apparatus and method of playing back the storage medium
US20050117886A1 (en) * 2003-11-10 2005-06-02 Samsung Electronics Co., Ltd. Storage medium storing text-based subtitle data including style information, and apparatus and method of playing back the storage medium
US10986164B2 (en) 2004-01-13 2021-04-20 May Patents Ltd. Information device
US20110013758A1 (en) * 2004-01-13 2011-01-20 May Patents Ltd. Information device
US11095708B2 (en) 2004-01-13 2021-08-17 May Patents Ltd. Information device
US10986165B2 (en) 2004-01-13 2021-04-20 May Patents Ltd. Information device
US20100115571A1 (en) * 2004-01-13 2010-05-06 Yehuda Binder Information device
US20100115564A1 (en) * 2004-01-13 2010-05-06 Yehuda Binder Information device
US20070124418A1 (en) * 2004-01-13 2007-05-31 Yehuda Binder Information device
US20110007220A1 (en) * 2004-01-13 2011-01-13 May Patents Ltd. Information device
US20090198795A1 (en) * 2004-01-13 2009-08-06 Yehuda Binder Information device
US11032353B2 (en) 2004-01-13 2021-06-08 May Patents Ltd. Information device
US20050196146A1 (en) * 2004-02-10 2005-09-08 Yoo Jea Y. Method for reproducing text subtitle and text subtitle decoding system
US20090263106A1 (en) * 2004-02-10 2009-10-22 Kang Soo Seo Text subtitle decoder and method for decoding text subtitle streams
US20090009661A1 (en) * 2004-11-02 2009-01-08 Shizuo Murakami Captioned Still Picture Contents Producing Apparatus, Captioned Still Picture Contents Producing Program and Captioned Still Picture Contents Producing System
US20100150246A1 (en) * 2007-05-24 2010-06-17 Tatsuo Kosako Video signal processing device
US9426479B2 (en) * 2007-08-09 2016-08-23 Cisco Technology, Inc. Preserving captioning through video transcoding
US20110164673A1 (en) * 2007-08-09 2011-07-07 Gary Shaffer Preserving Captioning Through Video Transcoding
US20090295987A1 (en) * 2008-05-30 2009-12-03 Mediatek Inc. Apparatus and Method for Processing a Vertical Blanking Interval Signal
US20150222847A1 (en) * 2008-05-30 2015-08-06 Mediatek Inc. Apparatus and method for processing a vertical blanking interval signal
TWI487377B (zh) * 2008-05-30 2015-06-01 Mediatek Inc Apparatus and method for processing a vertical blanking interval signal and a video signal
TWI393086B (zh) * 2009-03-04 2013-04-11 Himax Media Solutions Inc Infrared signal decoding system and infrared signal decoding method
US8786781B2 (en) 2009-04-09 2014-07-22 Ati Technologies Ulc Detection and enhancement of in-video text
US9462352B2 (en) 2014-06-19 2016-10-04 Alibaba Group Holding Limited Managing interactive subtitle data
US9807466B2 (en) 2014-06-19 2017-10-31 Alibaba Group Holding Limited Managing interactive subtitle data
US10178439B2 (en) 2014-06-19 2019-01-08 Alibaba Group Holding Limited Managing interactive subtitle data
US10595067B2 (en) 2015-07-16 2020-03-17 Naver Business Platform Corporation Video providing apparatus, video providing method, and computer program

Also Published As

Publication number Publication date
DE69628076T2 (de) 2004-03-11
KR960035613A (ko) 1996-10-24
CN1141562A (zh) 1997-01-29
CA2172011C (fr) 2006-02-07
JPH08322046A (ja) 1996-12-03
AU4816796A (en) 1996-10-03
ATE240627T1 (de) 2003-05-15
CN1516477A (zh) 2004-07-28
EP1301043B1 (fr) 2007-04-25
AU706455B2 (en) 1999-06-17
MY115006A (en) 2003-03-31
CN1117485C (zh) 2003-08-06
EP0734180A2 (fr) 1996-09-25
EP0734180A3 (fr) 1999-01-13
EP1301043A1 (fr) 2003-04-09
ATE360961T1 (de) 2007-05-15
CA2172011A1 (fr) 1996-09-21
DE69628076D1 (de) 2003-06-18
CN1233176C (zh) 2005-12-21
DE69637052T2 (de) 2008-01-03
DE69637052D1 (de) 2007-06-06
KR100381989B1 (ko) 2003-08-21
EP0734180B1 (fr) 2003-05-14

Similar Documents

Publication Publication Date Title
US5731847A (en) Subtitle encoding/decoding method and apparatus
US6104861A (en) Encoding and decoding of data streams of multiple types including video, audio and subtitle data and searching therefor
US6424792B1 (en) Subtitle encoding/decoding method and apparatus
EP0737016B1 (fr) Color wiping and positioning of subtitles
AU702797B2 (en) Apparatus and method for encoding and decoding digital video data operable to remove noise from subtitle data included therewith
US5742352A (en) Video caption data decoding device
KR100390593B1 (ko) Subtitle data encoding/decoding method and apparatus, and recording medium therefor
MXPA96002842A (es) Method and system for searching multiple data streams
KR100666285B1 (ko) Data decoding apparatus and data decoding method
JPH07250279A (ja) Subtitle data decoding apparatus
JP4391187B2 (ja) Data decoding method and apparatus
AU726256B2 (en) Subtitle positioning method and apparatus

Legal Events

Date Code Title Description
AS Assignment

Owner name: SONY CORPORATION, JAPAN

Free format text: ASSIGNMENT OF ASSIGNORS INTEREST;ASSIGNOR:TSUKAGOSHI, IKUO;REEL/FRAME:008095/0510

Effective date: 19960729

STCF Information on status: patent grant

Free format text: PATENTED CASE

FEPP Fee payment procedure

Free format text: PAYOR NUMBER ASSIGNED (ORIGINAL EVENT CODE: ASPN); ENTITY STATUS OF PATENT OWNER: LARGE ENTITY

Free format text: PAYER NUMBER DE-ASSIGNED (ORIGINAL EVENT CODE: RMPN); ENTITY STATUS OF PATENT OWNER: LARGE ENTITY

FPAY Fee payment

Year of fee payment: 4

FPAY Fee payment

Year of fee payment: 8

FPAY Fee payment

Year of fee payment: 12